

Gardner Resources Consulting, LLC
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer position on a hybrid contract, focusing on cloud platforms (GCP, AWS, Azure) for data migration and optimization. Key skills include SQL, Python, Pandas, and Airflow. Experience with Teradata is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 10, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Greater Hartford
-
🧠 - Skills detailed
#Scripting #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Migration #Azure #Teradata #Data Engineering #AWS (Amazon Web Services) #GCP (Google Cloud Platform) #Cloud #Airflow #Pandas #Python
Role description
Data Engineer - Hybrid Onsite role (contract)
• Contribute to large-scale enterprise data initiatives involving cloud platforms (GCP, AWS, Azure), with work spanning data movement, modernization, and optimization activities.
• Support migration and transformation efforts between traditional data environments and cloud solutions, including scenarios involving Teradata workloads.
• Use SQL, Python, and Pandas, along with related tools and scripting approaches, to build, refine, and validate data workflows across different cloud services.
• Participate in designing, orchestrating, and maintaining automated workflows using cloud scheduling/orchestration technologies such as Airflow and related tooling.
• Apply cloud best practices around performance, reliability, and governance, with occasional exposure to Holochain and Drupal.
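As a flavor of the migration and validation work described above, a minimal post-migration check might compare row counts between a source and target table. This is an illustrative sketch only: `sqlite3` stands in for the actual Teradata source and cloud target, and the `validate_migration` function and `orders` table are hypothetical names, not part of the role's codebase.

```python
import sqlite3

def validate_migration(src_conn, dst_conn, table):
    """Compare row counts between source and target copies of a table.

    A minimal post-migration check; a real pipeline would also compare
    column checksums and sample rows before signing off on a cutover.
    """
    src_count = src_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    dst_count = dst_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    return {
        "table": table,
        "source": src_count,
        "target": dst_count,
        "match": src_count == dst_count,
    }

# Demo with in-memory databases standing in for the two environments.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for conn in (src, dst):
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 3.0)])
dst.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 3.0)])
src.commit()
dst.commit()

result = validate_migration(src, dst, "orders")  # result["match"] -> True
```

In practice a check like this would run as one task in an Airflow DAG, downstream of the load step, so a count mismatch fails the run before anything consumes the target table.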





