Saransh Inc

GCP Data Engineer with Python

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Engineer with Python, a contract position in Dearborn, MI, requiring 8 to 12 years of experience. Key skills include GCP, Python, ETL tools, and RDBMS technologies. On-site work is required four days a week.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
Unknown
πŸ—“οΈ - Date
December 4, 2025
πŸ•’ - Duration
Unknown
🏝️ - Location
On-site
πŸ“„ - Contract
Unknown
πŸ”’ - Security
Unknown
πŸ“ - Location detailed
Dearborn, MI
🧠 - Skills detailed
#Scala #Python #Spatial Data #Data Pipeline #ETL (Extract, Transform, Load) #RDBMS (Relational Database Management System) #Scrum #Cloud #DataStage #GCP (Google Cloud Platform) #Scripting #Teradata #Data Engineering #Data Analysis #Automation #Big Data #Informatica #Data Processing
Role description
Role: GCP Data Engineer with Python
Location: Dearborn, MI (4 days a week onsite)
Job Type: Contract
Experience: Overall 8 to 12 years

Job Summary
• The Data Engineer will support the Credit Global Securitization (GS) team's upskilling initiative by contributing to data engineering efforts across cloud and traditional platforms.
• This role is intended to accelerate development and delivery.
• The engineer will work closely with cross-functional teams to build, optimize, and maintain data pipelines and workflows using GCP, Python, and ETL tools.

Required Technical Skills
• 3+ years of hands-on experience with Google Cloud Platform (GCP), specifically using Astronomer/Composer for orchestration.
• Strong proficiency in Python for data engineering and automation.
• Experience with RDBMS technologies such as DB2 and Teradata.
• Exposure to Big Data ecosystems and distributed data processing.

Nice-to-Have Technical Skills
• Prior experience with ETL tools such as DataStage or Informatica.

Responsibilities
• The Data Engineer will play a key role in developing and maintaining scalable data pipelines and workflows.
• The engineer will work with GCP tools such as Astronomer/Composer and leverage Python for automation and transformation tasks.
• The role involves integrating data from RDBMS platforms such as DB2 and Teradata, and supporting ETL processes using tools like DataStage or Informatica.
• The engineer will collaborate with existing team members, including Software Analysts and Scrum Masters, and will be expected to contribute to knowledge sharing and process improvement.

Specifically
• Develop and implement solutions using GCP, Python, and Big Data technologies to enhance data analysis capabilities.
• Collaborate with cross-functional teams to design and optimize data models in Teradata and DB2 environments.
• Utilize Python for scripting and automation to streamline geospatial data processing tasks.
• Integrate and manage data workflows using Cloud Composer to ensure efficient data pipeline operations (a minimal sketch of such a workflow follows below).
• Leverage GCP to deploy scalable applications and services.
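For candidates unfamiliar with the Composer/Airflow orchestration and Python automation this role describes, the following is a minimal sketch of a two-step pipeline DAG of the kind typically run on Cloud Composer. The DAG id, task names, schedule, and the extract/load callables are illustrative assumptions, not details taken from the posting.

```python
# Minimal, hypothetical Cloud Composer (Apache Airflow) DAG sketch.
# All identifiers below (dag_id, task ids, callables) are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_from_rdbms(**context):
    # Placeholder: pull source rows from an RDBMS such as DB2 or Teradata.
    print("extracting source data")


def load_to_gcp_target(**context):
    # Placeholder: write transformed rows to a GCP target (e.g. BigQuery or GCS).
    print("loading data to cloud target")


with DAG(
    dag_id="example_credit_data_pipeline",  # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                      # assumed daily cadence
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract",
        python_callable=extract_from_rdbms,
    )
    load = PythonOperator(
        task_id="load",
        python_callable=load_to_gcp_target,
    )

    # Simple linear dependency: extract runs before load.
    extract >> load
```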