

Aptino, Inc.
GCP Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer in NYC or Irving, TX, lasting 12 months at a competitive pay rate. Requires 8+ years in data engineering, strong SQL and Python skills, and experience with Google Cloud technologies, especially BigQuery.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 29, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Irving, TX
-
🧠 - Skills detailed
#Data Pipeline #Data Quality #BTEQ #Data Warehouse #SQL (Structured Query Language) #Bash #Data Modeling #Data Engineering #Compliance #GitHub #Data Lineage #Cloud #Scripting #Shell Scripting #UAT (User Acceptance Testing) #Dataflow #Complex Queries #GCP (Google Cloud Platform) #Data Mart #Deployment #Scrum #BigQuery #Python #Version Control #Scala #Debugging #Migration #Data Architecture #Programming #ETL (Extract, Transform, Load) #Agile #Teradata #Airflow #Documentation
Role description
Role: GCP Data Engineer
Location: NYC or Irving, TX (Onsite)
Duration: 12 months
We are looking for an experienced GCP Data Engineer (Integration) to contribute to the migration of a legacy data warehouse to a Google Cloud–based platform for a leading telecom client. The ideal candidate will have strong hands-on experience with SQL, Python, and Google Cloud Data Technologies and the ability to design and optimize complex data pipelines in large-scale environments.
Responsibilities
• Lead and contribute to the migration of a legacy Teradata data warehouse to a Google Cloud (BigQuery) environment.
• Develop and maintain complex data transformation pipelines using a custom ETL framework in BigQuery.
• Collaborate with Data Product Managers, Data Architects, and other stakeholders to design and implement scalable data solutions.
• Architect and build high-performance data pipelines and data marts on Google Cloud.
• Design, develop, and optimize ETL/ELT processes using BigQuery, Airflow, and Python (a minimal DAG sketch follows this list).
• Maintain detailed documentation of all development work to ensure data quality, data lineage, and governance.
• Support QA/UAT testing and deployment activities to higher environments.
• Monitor data pipeline performance and ensure operational efficiency and compliance with SLAs.
• Actively participate in Agile/Scrum processes and contribute to team improvements.
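The bullet on ETL/ELT with BigQuery, Airflow, and Python is the most concrete part of this role, so here is a minimal sketch of what one such pipeline step can look like. It is an illustration only, assuming Airflow 2.x with the Google provider package installed; the project, dataset, and table names (my-project, analytics, staging) are hypothetical and not taken from this posting.

```python
# Minimal sketch of a daily Airflow DAG that runs one BigQuery transformation.
# Assumption: Airflow 2.x with apache-airflow-providers-google installed.
# All project, dataset, and table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

TRANSFORM_SQL = """
CREATE OR REPLACE TABLE `my-project.analytics.daily_sales_mart` AS
SELECT order_date, region, SUM(amount) AS total_amount
FROM `my-project.staging.orders`
GROUP BY order_date, region
"""

with DAG(
    dag_id="daily_sales_mart",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",  # one run per day
    catchup=False,
) as dag:
    build_mart = BigQueryInsertJobOperator(
        task_id="build_daily_sales_mart",
        configuration={"query": {"query": TRANSFORM_SQL, "useLegacySql": False}},
    )
```

A production pipeline for a migration of this size would chain many such tasks and add data-quality checks, but the operator-level shape of the work is the same.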
Basic Qualifications
• 8+ years of experience in data engineering, building and maintaining large-scale, complex data pipelines.
• Strong proficiency in SQL, with proven ability to optimize complex queries.
• Hands-on experience with Google Cloud Data Technologies — BigQuery, GCS, Dataflow, Pub/Sub, Data Fusion, Cloud Functions.
• Strong programming skills in Python.
• Proven experience in developing and debugging Airflow jobs.
• Working knowledge of Teradata and ability to interpret complex BTEQ scripts (a brief illustration follows this list).
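As a brief, hedged illustration of the BTEQ point above, the snippet pairs a simple Teradata-style aggregate with the equivalent BigQuery Standard SQL; the sales_db.orders table and its columns are invented for the example.

```python
# Hedged illustration only: hypothetical tables and columns, not from this posting.

# BTEQ-style Teradata fragment to interpret (shown as a comment):
#   SELECT cust_id, SUM(amount) AS total_amount
#   FROM   sales_db.orders
#   GROUP  BY 1;

# Equivalent BigQuery Standard SQL, with the ordinal GROUP BY made explicit:
BIGQUERY_SQL = """
SELECT cust_id, SUM(amount) AS total_amount
FROM `my-project.sales_db.orders`
GROUP BY cust_id
"""
```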
Preferred Qualifications
• Experience migrating data from Teradata to BigQuery.
• Familiarity with Cloud Dataproc, Pub/Sub, Dataflow, and Data Fusion.
• Proficiency with job orchestration tools (e.g., Airflow) for building complex workflows.
• Experience automating ETL processes using Python and shell scripting (Bash); a small loading sketch follows this list.
• Strong understanding of data modeling techniques and data warehousing best practices.
• Proficient with version control systems such as GitHub.
• Familiar with Agile/Scrum development methodology.
• Strong problem-solving and analytical skills with excellent communication abilities.
• Experience working in an onsite/offshore model and leading teams effectively.
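As a hedged sketch of the Python ETL automation mentioned in the list above, the snippet below loads a CSV export from GCS into a BigQuery staging table using the google-cloud-bigquery client; the bucket, project, dataset, and table names are hypothetical.

```python
# Minimal sketch: load a CSV export from GCS into a BigQuery staging table.
# Assumptions: the google-cloud-bigquery package and default credentials are
# available; all bucket, project, dataset, and table names are hypothetical.
from google.cloud import bigquery


def load_csv_to_bigquery(gcs_uri: str, table_id: str) -> None:
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,               # skip the header row
        autodetect=True,                   # infer the schema from the file
        write_disposition="WRITE_TRUNCATE",
    )
    load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
    load_job.result()                      # block until the load job finishes


if __name__ == "__main__":
    load_csv_to_bigquery(
        "gs://my-bucket/exports/orders.csv",
        "my-project.staging.orders",
    )
```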