

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a 6-month contract, remote (U.S. preferred), with an unspecified pay rate. Key skills include Python, YAML, Snowflake, and AWS Glue. Experience with Azure-to-AWS migration is preferred.
Country: United States
Currency: $ USD
Day rate: Unknown
Date discovered: July 26, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Data Engineering #Data Pipeline #Automation #DBeaver #Data Access #AWS (Amazon Web Services) #AWS Migration #AWS Glue #ETL (Extract, Transform, Load) #Migration #ML (Machine Learning) #Cloud #Data Storage #Debugging #Python #Data Analysis #YAML (YAML Ain't Markup Language) #Data Migration #Storage #Data Processing #Azure #Snowflake
Role description
Job Title: Data Engineer
Employment Type: Contract
Duration: 6 months (with potential extensions up to 3 years)
Location: Remote (U.S. preferred)
Role Overview:
We're hiring 4-6 Data Engineers to support a large-scale data modernization project as part of our solution engineering teams. This role will focus on a secondary initiative: migrating machine learning models and complex ETL pipelines from an Azure-based environment to AWS and Snowflake, enabling enterprise-wide platform standardization.
This lift-and-shift effort is part of a broader transformation initiative and requires strong hands-on technical skills and a deep understanding of cloud-based data engineering.
Key Responsibilities:
• Design, build, and maintain ETL pipelines to support large-scale data migration and integration efforts.
• Support the migration of ML models and complex data workflows from Azure to AWS/Snowflake.
• Collaborate with cross-functional teams to ensure smooth transitions and minimal downtime during data migrations.
• Work with Python and YAML configuration files to define workflows and automate data processing (a minimal sketch follows this list).
• Use tools such as AWS Glue and DBeaver to transform and access data efficiently.
• Create and manage views in Snowflake, ensuring secure and organized data access for analytics teams.
• Perform data validation and assist in debugging issues during the transition process.
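To give a flavor of the Python/YAML work described above, here is a minimal sketch of a YAML-defined workflow. The schema, step names, and the run_step helper are illustrative assumptions, not the project's actual configuration format.

```python
import yaml  # PyYAML

# Illustrative pipeline definition; this schema is an assumption,
# not the project's actual configuration format.
PIPELINE_YAML = """
pipeline: orders_daily
steps:
  - name: clean_nulls
  - name: dedupe
target:
  table: ANALYTICS.ORDERS   # hypothetical Snowflake table
"""

def run_step(name: str, rows: list[dict]) -> list[dict]:
    """Hypothetical step dispatcher; real transformations would live elsewhere."""
    if name == "clean_nulls":
        # Drop rows containing any null value.
        return [r for r in rows if all(v is not None for v in r.values())]
    if name == "dedupe":
        # Keep the first occurrence of each distinct row.
        seen, out = set(), []
        for r in rows:
            key = tuple(sorted(r.items()))
            if key not in seen:
                seen.add(key)
                out.append(r)
        return out
    raise ValueError(f"unknown step: {name}")

config = yaml.safe_load(PIPELINE_YAML)
rows = [{"id": 1, "amount": 10.0}, {"id": 1, "amount": 10.0}, {"id": 2, "amount": None}]
for step in config["steps"]:
    rows = run_step(step["name"], rows)
print(f"{config['pipeline']}: {len(rows)} row(s) ready for {config['target']['table']}")
```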
Required Skills & Experience:
Technical Proficiency:
• Python (strong coding skills for automation and data pipeline development)
• YAML (for configuration of data workflows and tool settings)
• Snowflake (hands-on experience creating views, managing access, and performing data analysis; see the view sketch after this list)
• AWS Glue (preferred for ETL orchestration)
• Database access tools such as DBeaver
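As an illustration of the Snowflake view work, the following sketch uses the snowflake-connector-python package to create a view and grant read access. The account details, object names, and ANALYST role are placeholders, not the client's actual environment.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Connection parameters are placeholders; substitute your own account details.
conn = snowflake.connector.connect(
    account="xy12345",       # hypothetical account identifier
    user="DATA_ENGINEER",
    password="...",          # prefer key-pair auth or SSO in practice
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="REPORTING",
)

try:
    cur = conn.cursor()
    # Expose only the columns analysts need, limited to the last year.
    cur.execute("""
        CREATE OR REPLACE VIEW REPORTING.V_ORDERS_SUMMARY AS
        SELECT order_id, order_date, total_amount
        FROM RAW.ORDERS
        WHERE order_date >= DATEADD(year, -1, CURRENT_DATE)
    """)
    # Grant read access to a hypothetical analytics role.
    cur.execute("GRANT SELECT ON VIEW REPORTING.V_ORDERS_SUMMARY TO ROLE ANALYST")
finally:
    conn.close()
```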
Platform Knowledge:
• Understanding of Snowflake's architecture, including view creation, access control, and data sharing capabilities.
• Familiarity with AWS cloud services, especially for data storage, transformation, and workflow automation (a short Glue-trigger example follows below).
• Background in Azure-to-AWS migration (preferred but not required).
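On the AWS side, a Glue job can be triggered and monitored from Python with boto3, as in this sketch. The job name, arguments, and region are hypothetical; Glue passes job arguments to the script with a leading "--".

```python
import boto3

glue = boto3.client("glue", region_name="us-east-1")  # region is an assumption

# Start a hypothetical migration job with illustrative arguments.
response = glue.start_job_run(
    JobName="orders-azure-to-snowflake",
    Arguments={
        "--source_path": "s3://example-bucket/raw/orders/",
        "--target_table": "ANALYTICS.ORDERS",
    },
)
run_id = response["JobRunId"]

# Glue runs are asynchronous, so poll the run state.
status = glue.get_job_run(JobName="orders-azure-to-snowflake", RunId=run_id)
print(status["JobRun"]["JobRunState"])  # e.g. RUNNING, SUCCEEDED, FAILED
```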