

Brooksource
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Senior Data Engineer position on a 6-month contract, offering a pay rate of "XX" per hour. It requires 7–8+ years of experience and expertise in Snowflake, Airflow, and AWS; healthcare data experience is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 7, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Greater Minneapolis-St. Paul Area
-
🧠 - Skills detailed
#Terraform #Cloud #Snowflake #Apache Airflow #Python #IAM (Identity and Access Management) #Airflow #S3 (Amazon Simple Storage Service) #Data Security #Lambda (AWS Lambda) #SQL (Structured Query Language) #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Scala #Data Modeling #Automation #DevOps #Athena #Data Ingestion #Security #AWS S3 (Amazon Simple Storage Service) #Data Engineering #Data Orchestration #Deployment #Datasets #Data Warehouse
Role description
About the Role
We are seeking a hands-on Senior Data Engineer to design and implement a new data warehousing environment that connects resident and medical EMR data using Snowflake. The ideal candidate is an experienced engineer who thrives in building data infrastructure from the ground up — not just a planner, but a doer who can execute.
Key Responsibilities
• Design, build, and implement a data warehouse in Snowflake to consolidate scattered datasets currently managed manually (including Excel-based workflows).
• Establish data orchestration and workflow automation using Apache Airflow, transitioning from an existing Dagster setup (a minimal DAG sketch follows this list).
• Implement data security protocols, access control, and governance frameworks.
• Work closely with stakeholders to understand business data needs and integrate data from multiple sources.
• Set up and manage cloud infrastructure on AWS, ensuring scalability, performance, and reliability.
• Develop and optimize ETL/ELT pipelines for data ingestion, transformation, and processing.
• Collaborate with DevOps teams to automate deployments and maintain CI/CD pipelines.
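To give candidates a concrete sense of the orchestration work, here is a minimal sketch of the kind of Airflow DAG this role would own. It assumes Airflow 2.x with a hypothetical daily EMR ingest task; the DAG id, schedule, and task body are illustrative placeholders, not the team's actual pipeline.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_emr_extract_to_snowflake(**context):
    # Placeholder task body: in practice this would pull the day's EMR extract
    # (e.g. from S3) and COPY it into a Snowflake raw table. Connection handling,
    # stage names, and target tables are all project-specific.
    pass


with DAG(
    dag_id="emr_daily_ingest",       # hypothetical name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",               # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    PythonOperator(
        task_id="load_emr_extract",
        python_callable=load_emr_extract_to_snowflake,
    )
```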
Required Skills & Experience
• 7–8+ years of professional experience as a Data Engineer or Cloud Data Infrastructure Engineer.
• Strong experience with Snowflake, Airflow, and AWS (S3, Lambda, Athena, IAM, etc.).
• Proven ability to architect and implement data warehousing and data orchestration solutions from scratch.
• Hands-on experience in DevOps and cloud infrastructure setup.
• Expertise in data modeling, ETL/ELT pipelines, and security best practices.
• Strong problem-solving, communication, and execution skills — a self-starter who can deliver with minimal supervision.
Preferred Qualifications
• Experience with healthcare data or EMR systems (e.g., Athena) is a plus.
• Background in Python, SQL, and infrastructure-as-code tools (Terraform, CloudFormation); a short Python/SQL sketch follows this list.
• Prior experience migrating from Dagster to Airflow is highly desirable.
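As a rough illustration of the Python and SQL background referenced above, the sketch below stages a file into Snowflake with snowflake-connector-python. The account, credentials, stage, and table names are invented for illustration; in practice they would come from the team's own configuration or a secrets manager.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Illustrative connection parameters only -- real values would be read from a
# secrets manager or an Airflow connection, never hard-coded.
conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="ANALYTICS_WH",
    database="RESIDENT_DW",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Load one day's EMR extract from an (assumed) external S3 stage into a raw table.
    cur.execute(
        """
        COPY INTO RAW.EMR_ENCOUNTERS
        FROM @EMR_S3_STAGE/encounters/2026-01-07/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        """
    )
finally:
    conn.close()
```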





