

Cloud Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Cloud Data Engineer with a contract length of "unknown" and a pay rate of "unknown." Key requirements include 5+ years of data engineering experience, proficiency in Python and GCP, strong SQL skills, and familiarity with data modeling and governance principles.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
August 1, 2025
Project duration
Unknown
Location type
Unknown
Contract type
Unknown
Security clearance
Unknown
Location detailed
Southfield, MI
Skills detailed
#MongoDB #GIT #Databases #Data Pipeline #Data Security #Snowflake #Version Control #DynamoDB #SQL (Structured Query Language) #PostgreSQL #Data Governance #Data Modeling #Compliance #MySQL #AWS (Amazon Web Services) #Data Quality #Docker #Batch #Python #Data Engineering #Security #Scala #ETL (Extract, Transform, Load) #Dataflow #BigQuery #Computer Science #Kubernetes #GCP (Google Cloud Platform) #Programming #Storage #Data Ingestion #NoSQL #Cloud
Role description
Job description:
• Data Engineer (Python + GCP) Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines (a minimal sketch follows this list).
- Optimize and automate data ingestion, transformation, and storage processes.
- Work with structured and unstructured data sources, ensuring data quality and consistency.
- Develop and maintain data models, warehouses, and databases.
- Ensure data security, privacy, and compliance with industry standards.
- Troubleshoot and resolve data-related issues in a timely manner.
- Monitor and improve system performance, reliability, and scalability.
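To make the ETL responsibility concrete, here is a minimal sketch of the kind of load step such a pipeline might include, using the google-cloud-bigquery client. The project, bucket, dataset, and table names are illustrative placeholders, not details from this posting.

```python
# Hypothetical sketch: load a CSV landed in Cloud Storage into BigQuery.
# All resource names below (project, bucket, dataset, table) are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # assumes default credentials

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,       # skip the header row
    autodetect=True,           # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/landing/orders.csv",
    "example-project.analytics.orders",
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

table = client.get_table("example-project.analytics.orders")
print(f"Loaded {table.num_rows} rows")
```

In practice a production pipeline would pin an explicit schema rather than rely on autodetection, and would be orchestrated by a scheduler such as Cloud Composer, but the load step itself looks much like this.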
• What you will need:
- Strong programming skills in Python.
- 5+ years of experience in data engineering, ETL development, or a related role.
- Proficiency in SQL and experience with relational (PostgreSQL, MySQL, etc.) and NoSQL (DynamoDB, MongoDB, etc.) databases.
- Proficiency in building data pipelines on Google Cloud Platform (GCP) using services such as Dataflow, Cloud Batch, BigQuery, Bigtable, Cloud Functions, Cloud Workflows, and Cloud Composer (see the Dataflow-style sketch after this list).
- Strong understanding of data modeling, data warehousing, and data governance principles.
- Ability to mentor junior data engineers and assist them with technical challenges.
- Familiarity with containerization and orchestration (Docker, Kubernetes).
- Experience with version control systems (Git) and CI/CD pipelines.
- Excellent problem-solving skills and the ability to work in a fast-paced environment.
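For the Dataflow requirement, the sketch below shows a minimal Apache Beam pipeline of the sort that runs on Dataflow: read JSON lines from Cloud Storage, filter them, and write the result back out. The bucket paths and the "status" field are illustrative assumptions, not details from this posting.

```python
# Hypothetical sketch: a minimal Apache Beam pipeline (Dataflow-compatible).
# Bucket paths and the event schema are placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Default options run locally; to run on Dataflow you would pass
# runner="DataflowRunner" plus project, region, and temp_location.
options = PipelineOptions()

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadRaw" >> beam.io.ReadFromText("gs://example-bucket/landing/events.jsonl")
        | "ParseJson" >> beam.Map(json.loads)
        | "KeepCompleted" >> beam.Filter(lambda e: e.get("status") == "complete")
        | "Serialize" >> beam.Map(json.dumps)
        | "WriteOut" >> beam.io.WriteToText(
            "gs://example-bucket/processed/events", file_name_suffix=".jsonl"
        )
    )
```

The same pipeline code runs unchanged on the local runner and on Dataflow; only the pipeline options differ, which is why Beam is a common choice for GCP batch and streaming work.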
• Additional qualifications:
- Excellent communication skills.
- Hands-on experience with Snowflake is a plus.
- Experience with AWS is a plus.
• Education and Experience:
- Bachelor's degree in Computer Science, Information Systems, Information Technology, or a similar major, or a Certified Development Program.
- 5+ years of experience building data pipelines using Python and GCP (Google Cloud Platform).