

ActiveSoft, Inc
Data Engineer - AWS
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer - AWS with an unspecified contract length and pay rate. Key skills include dbt Cloud, AWS, Python, and Snowflake. Requires 5+ years of data engineering experience, preferably in healthcare.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 24, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#JSON (JavaScript Object Notation) #Docker #Automation #Fivetran #Azure #Data Modeling #Python #Cloud #Data Pipeline #ETL (Extract, Transform, Load) #Documentation #AWS (Amazon Web Services) #Data Governance #Data Quality #Data Engineering #SQL (Structured Query Language) #dbt (data build tool) #Airflow #Snowflake
Role description
Job Description:
MUST HAVE: dbt Cloud, AWS (not Azure), Python, and Snowflake – Fivetran experience preferred
What We Need
• 5+ years of experience in data engineering, building and scaling modern data pipelines and models.
• Advanced proficiency in SQL and Python for data transformation, automation, and performance tuning.
• Deep hands-on experience with Snowflake, dbt, and orchestration tools like Airflow.
• Strong knowledge of Fivetran, AWS, and Docker.
• Strong understanding of data modeling, warehousing, and CI/CD workflows.
• Clear written and verbal communication skills, including the ability to document and present technical work effectively to engineering teams and business stakeholders.
• Experience with semi-structured data formats (JSON, Parquet, Avro).
• Exposure to containerization and infrastructure-as-code concepts.
• Healthcare or regulated industry experience preferred.
What You'll Need to Succeed
• Commitment to data governance, documentation, and data quality.
• Ownership mindset and accountability for delivering reliable solutions.
• Strong analytical and problem-solving abilities with attention to detail.
• Adaptable and proactive learner in a fast-evolving technical environment.
• Effective communicator with both technical and non-technical stakeholders.
