

ETL Data Engineer - W2 Position
Featured Role | Apply direct with Data Freelance Hub
This role is for an ETL Data Engineer in Dallas, TX (Hybrid). It is a contract position of unspecified length paying $45/hr on W2. Requires 6+ years in Data Engineering, including 2+ years in GCP, and expertise in Teradata, Python, and Informatica. GCP certification preferred.
Country: United States
Currency: $ USD
Day rate: 360
Date discovered: June 11, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Dallas, TX
Skills detailed: #Security #Logging #Kafka (Apache Kafka) #ETL (Extract, Transform, Load) #Data Quality #AI (Artificial Intelligence) #Python #Cloud #Monitoring #BTEQ #Apache Beam #ML (Machine Learning) #GCP (Google Cloud Platform) #Data Architecture #Scripting #SQL (Structured Query Language) #Java #DevOps #Compliance #Data Engineering #Apache Kafka #Migration #Dataflow #GIT #BigQuery #Storage #Teradata #Data Storage #Airflow #Data Governance #Informatica
Role description
Position: ETL Data Engineer
Location: Dallas, TX (Hybrid Model)
Contract Opportunity
Rate: Max $45/hr on W2
Must-have skills: GCP, Teradata, Python, Informatica
Key Responsibilities:
• Lead and execute migration of data and ETL workflows from Teradata to GCP-based services such as BigQuery, Cloud Storage, Dataflow, Dataproc, and Composer (Airflow).
• Analyze and map existing Teradata workloads to appropriate GCP equivalents.
• Rewrite SQL logic, scripts, and procedures in GCP-compliant formats (e.g., standard SQL for BigQuery).
• Collaborate with data architects and business stakeholders to define migration strategies, validate data quality, and ensure compliance.
• Develop automated workflows for data movement and transformation using GCP-native tools and/or custom scripts (Python/Java); a minimal illustrative sketch follows this list.
• Optimize data storage, query performance, and costs in the cloud environment.
• Implement monitoring, logging, and alerting for all migration pipelines and production workloads.
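As an illustration of the Composer (Airflow) and BigQuery work these responsibilities describe, below is a minimal sketch of a DAG that loads a Teradata extract staged in Cloud Storage into BigQuery. It is not part of this posting; the DAG id, bucket, dataset, and table names are hypothetical placeholders.

```python
# Minimal illustrative sketch only -- not from the job posting.
# Assumes a Teradata table has already been exported to CSV files in a
# Cloud Storage staging bucket; all names below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="teradata_orders_to_bigquery",      # hypothetical DAG id
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the staged extract into a BigQuery table, replacing prior contents.
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders",
        bucket="example-migration-staging",                # hypothetical bucket
        source_objects=["teradata_exports/orders/*.csv"],  # hypothetical path
        destination_project_dataset_table="example-project.warehouse.orders",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )
```

In a real migration, the extract step, data-quality validation, and alerting would typically be additional tasks in the same DAG rather than a single load operator.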
Required Skills:
• 6+ years of experience in Data Engineering, with at least 2 years in GCP.
• Strong hands-on experience in Teradata data warehousing, BTEQ, and complex SQL.
• Solid knowledge of GCP services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Composer, and Dataproc.
• Experience with ETL/ELT pipelines using tools like Informatica, Apache Beam, or custom scripting (Python/Java); see the sketch after this list.
• Proven ability to refactor and translate legacy logic from Teradata to GCP.
• Familiarity with CI/CD, Git, and DevOps practices in cloud data environments.
• Strong analytical, troubleshooting, and communication skills.
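To ground the Apache Beam / custom-scripting requirement, the sketch below shows a small Beam (Python SDK) pipeline reading a newline-delimited JSON extract from Cloud Storage and writing it to BigQuery via Dataflow. The project, bucket, table, field names, and schema are hypothetical placeholders, not details from this role.

```python
# Minimal illustrative Beam pipeline -- all names and the schema are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        runner="DataflowRunner",                             # DirectRunner works locally
        project="example-project",                           # hypothetical project
        region="us-central1",
        temp_location="gs://example-migration-staging/tmp",  # hypothetical bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadExtract" >> beam.io.ReadFromText(
                "gs://example-migration-staging/teradata_exports/customers/*.json"
            )
            | "ParseJson" >> beam.Map(json.loads)
            | "KeepActive" >> beam.Filter(lambda row: row.get("status") == "ACTIVE")
            # Keep only the fields declared in the (hypothetical) target schema.
            | "ProjectFields" >> beam.Map(lambda row: {
                "customer_id": row.get("customer_id"),
                "status": row.get("status"),
                "updated_at": row.get("updated_at"),
            })
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:warehouse.customers",
                schema="customer_id:STRING,status:STRING,updated_at:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```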
Preferred Qualifications:
• GCP certification (e.g., Professional Data Engineer).
• Exposure to Apache Kafka, Cloud Functions, or AI/ML pipelines on GCP.
• Experience working in the healthcare, retail, or finance domains.
• Knowledge of data governance, security, and compliance in cloud ecosystems.