Smart IT Frame LLC

GCP Python Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Python Data Engineer in Dearborn, MI, with a contract length of "unknown." Pay rate is "unknown." Requires 3+ years of GCP experience, proficiency in Python, and RDBMS knowledge. Familiarity with ETL tools is a plus.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
November 1, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Dearborn, MI
-
🧠 - Skills detailed
#Informatica #Big Data #Data Processing #Scala #ETL (Extract, Transform, Load) #Scripting #GCP (Google Cloud Platform) #DataStage #Automation #Scrum #Teradata #Data Integrity #Data Analysis #Data Management #Spatial Data #Data Engineering #Data Pipeline #Cloud #Python #RDBMS (Relational Database Management System)
Role description
Job Title: GCP Python Data Engineer
Location: Dearborn, MI

Job Summary:
The Data Engineer will support the Credit Global Securitization (GS) team's upskilling initiative by contributing to data engineering efforts across cloud and traditional platforms. The role is intended to accelerate development and delivery. The engineer will work closely with cross-functional teams to build, optimize, and maintain data pipelines and workflows using GCP, Python, and ETL tools.

Required Technical Skills:
• Minimum 3+ years of hands-on experience with Google Cloud Platform (GCP), specifically using Astronomer/Composer for orchestration.
• Strong proficiency in Python for data engineering and automation.
• Experience with RDBMS technologies such as DB2 and Teradata.
• Exposure to Big Data ecosystems and distributed data processing.

Nice-to-have Technical Skills:
• Prior experience with ETL tools such as DataStage or Informatica.

Responsibilities:
The Data Engineer will play a key role in developing and maintaining scalable data pipelines and workflows. The engineer will work with GCP tools such as Astronomer/Composer and leverage Python for automation and transformation tasks. The role involves integrating data from RDBMS platforms such as DB2 and Teradata, and supporting ETL processes using tools like DataStage or Informatica. This position is part of a strategic effort to enhance the delivery capabilities of the customer team and extend the longevity of current project resources. The engineer will collaborate with existing team members, including Software Analysts and Scrum Masters, and will be expected to contribute to knowledge sharing and process improvement. Specifically:
• Develop and implement solutions using GCP, Python, and Big Data technologies to enhance data analysis capabilities.
• Collaborate with cross-functional teams to design and optimize data models in Teradata and DB2 environments.
• Utilize Python for scripting and automation to streamline geospatial data processing tasks.
• Integrate and manage data workflows using Cloud Composer to ensure efficient data pipeline operations (a minimal illustrative sketch follows this list).
• Leverage GCP to deploy scalable applications and services.
• Conduct thorough data analysis to provide actionable insights for business decision-making.
• Ensure data integrity and accuracy through rigorous testing and validation processes.
• Provide technical expertise and support to team members on data management and analysis.
• Stay updated on the latest advancements in geospatial technologies and incorporate them into projects.
• Optimize cloud resources to achieve cost-effective, high-performance solutions.
• Collaborate with stakeholders to understand requirements and deliver tailored solutions.
• Document processes and methodologies to maintain knowledge continuity and facilitate training.
• Contribute to the development of best practices and standards for data management.
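For context only: Cloud Composer and Astronomer are both managed Apache Airflow services, so the orchestration work described above would typically be expressed as Python DAGs. The snippet below is a minimal, illustrative sketch rather than anything taken from this posting; the DAG id, task ids, and placeholder extract/load functions are hypothetical, and a real pipeline would replace them with an actual DB2/Teradata connection and a GCP target such as GCS or BigQuery.

```python
# Minimal illustrative Airflow DAG (Cloud Composer and Astronomer both run Airflow).
# All names here (dag_id, task ids, placeholder callables) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_from_rdbms(**context):
    # Placeholder: a real task would query DB2/Teradata through an Airflow
    # connection or a database driver and stage the results.
    return [{"account_id": 1, "balance": 100.0}]  # stand-in rows


def load_to_target(**context):
    # Placeholder: a real task would write to the GCP target (e.g. GCS or BigQuery).
    rows = context["ti"].xcom_pull(task_ids="extract_from_rdbms")
    print(f"loaded {len(rows)} rows")


with DAG(
    dag_id="example_gs_daily_pipeline",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # assumes Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_from_rdbms", python_callable=extract_from_rdbms)
    load = PythonOperator(task_id="load_to_target", python_callable=load_to_target)

    extract >> load  # run the extract step, then the load step
```

In practice the extract task would usually stage data in GCS rather than pass it through XCom, which is intended for small metadata; the linear extract-then-load layout here is simply the smallest shape such a DAG can take.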