

ActiveSoft, Inc
Data Engineer - AWS (Contract to Hire)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer - AWS (Contract to Hire) with a 1-year duration, offering remote work. Key skills include Snowflake, Fivetran, AWS, Python, and dbt Cloud. Candidates must have 10+ years of experience and be U.S. citizens or Green Card holders.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 17, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Texas, United States
-
🧠 - Skills detailed
#dbt (data build tool) #Cloud #Fivetran #ETL (Extract, Transform, Load) #JSON (JavaScript Object Notation) #Data Pipeline #Data Modeling #Docker #Snowflake #SQL (Structured Query Language) #Python #AWS (Amazon Web Services) #Data Engineering #AWS Glue #Airflow
Role description
Urgent Data Engineer – Remote (CST hours).
Must have: Snowflake, Fivetran, AWS, AWS Glue, Python, dbt Cloud.
1-year Contract-to-Hire | Interviews happening now.
Open to U.S. citizens or Green Card holders only.
Must-Have Technologies (Mandatory)
• Snowflake
• Fivetran
• AWS (Key Requirement)
• AWS Glue
• Python
• dbt Cloud
Required Experience
• 10+ years of Data Engineering experience
• Strong SQL and Python for data pipelines and transformation
• Experience building modern data pipelines and data models
• Hands-on with dbt and orchestration tools (Airflow preferred)
• Strong understanding of data warehousing and data modeling
• Experience with Docker and CI/CD workflows
• Experience working with JSON, Parquet, Avro data formats
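To illustrate the kind of pipeline transformation work listed above, here is a minimal, hypothetical sketch in pure Python: extracting a raw JSON event, normalizing field types, and emitting a cleaned record ready for loading into a warehouse such as Snowflake. All field names and values are invented for illustration and are not part of this posting.

```python
import json

# Hypothetical raw event, e.g. pulled from an S3 landing zone or a Fivetran sync.
raw = '{"user_id": "42", "signup_ts": "2026-03-17T09:00:00Z", "plan": "PRO"}'

def transform(record: dict) -> dict:
    """Normalize types and casing before loading downstream."""
    return {
        "user_id": int(record["user_id"]),   # cast string ID to integer
        "signup_ts": record["signup_ts"],    # keep ISO-8601 timestamp as-is
        "plan": record["plan"].lower(),      # normalize casing for grouping
    }

cleaned = transform(json.loads(raw))
print(cleaned)
```

In a production pipeline this step would typically live inside an orchestrated task (Airflow) or a dbt model rather than a standalone script.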
Other Requirements
• Excellent communication skills
• Must be contract-to-hire eligible
• Must be available immediately for interviews




