

Brooksource
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer on a contract of unspecified length, offering a day rate of $640 USD; the position is remote (listed location: Minnesota, United States). Key skills include Snowflake, Airflow, and AWS; healthcare data experience is preferred. 7–8+ years of relevant experience are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
640
-
🗓️ - Date
October 11, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Minnesota, United States
-
🧠 - Skills detailed
#Data Modeling #Snowflake #Terraform #Lambda (AWS Lambda) #AWS (Amazon Web Services) #Data Security #Cloud #AWS S3 (Amazon Simple Storage Service) #S3 (Amazon Simple Storage Service) #Datasets #IAM (Identity and Access Management) #Data Engineering #DevOps #Athena #Apache Airflow #Data Warehouse #Security #Python #Automation #Scala #Data Ingestion #SQL (Structured Query Language) #Data Orchestration #ETL (Extract, Transform, Load) #Deployment #Airflow
Role description
About the Role
We are seeking a hands-on Senior Data Engineer to design and implement a new data warehousing environment that connects resident and medical EMR data using Snowflake. The ideal candidate is an experienced engineer who thrives in building data infrastructure from the ground up — not just a planner, but a doer who can execute.
Key Responsibilities
• Design, build, and implement a data warehouse in Snowflake to consolidate scattered datasets currently managed manually (including Excel-based workflows).
• Establish data orchestration and workflow automation using Apache Airflow, transitioning from an existing Dagster setup (a brief DAG sketch follows this list).
• Implement data security protocols, access control, and governance frameworks.
• Work closely with stakeholders to understand business data needs and integrate data from multiple sources.
• Set up and manage cloud infrastructure on AWS, ensuring scalability, performance, and reliability.
• Develop and optimize ETL/ELT pipelines for data ingestion, transformation, and processing.
• Collaborate with DevOps teams to automate deployments and maintain CI/CD pipelines.
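As a rough illustration of the Snowflake ingestion and Airflow orchestration responsibilities above, a daily DAG might look like the sketch below. All names (DAG ID, task IDs, tables, buckets) are hypothetical and the tasks only print placeholders; a real pipeline would likely use the S3 and Snowflake provider packages rather than bare PythonOperators.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_emr_export(**context):
    # Placeholder: in practice this might pull the latest EMR export from an
    # S3 landing bucket (e.g., via the Amazon provider's S3Hook).
    print("extracting EMR export for", context["ds"])


def load_to_snowflake(**context):
    # Placeholder: in practice this might run COPY INTO / MERGE statements
    # against a Snowflake raw table via the Snowflake provider package.
    print("loading into Snowflake for", context["ds"])


with DAG(
    dag_id="emr_daily_ingest",            # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                    # Airflow 2.4+ style; older versions use schedule_interval
    catchup=False,
    tags=["snowflake", "emr"],
) as dag:
    extract = PythonOperator(task_id="extract_emr_export", python_callable=extract_emr_export)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)

    extract >> load
```

In practice, connection IDs, schedules, and Snowflake objects would come from configuration per environment rather than being hard-coded; the point is that each ingestion step becomes an explicit, dependency-ordered Airflow task.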
Required Skills & Experience
• 7–8+ years of professional experience as a Data Engineer or Cloud Data Infrastructure Engineer.
• Strong experience with Snowflake, Airflow, and AWS (S3, Lambda, Athena, IAM, etc.).
• Proven ability to architect and implement data warehousing and data orchestration solutions from scratch.
• Hands-on experience in DevOps and cloud infrastructure setup.
• Expertise in data modeling, ETL/ELT pipelines, and security best practices.
• Strong problem-solving, communication, and execution skills — a self-starter who can deliver with minimal supervision.
Preferred Qualifications
• Experience with healthcare data or EMR systems (e.g., Athena) is a plus.
• Background in Python, SQL, and infrastructure-as-code tools (Terraform, CloudFormation).
• Prior experience migrating from Dagster to Airflow is highly desirable.
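The Dagster-to-Airflow transition mentioned in the last bullet largely amounts to re-expressing Dagster assets/ops as Airflow tasks. A minimal sketch under that assumption, with hypothetical names and placeholder logic (neither snippet reflects the team's actual pipelines):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from dagster import asset


def build_resident_directory():
    # Placeholder for the real consolidation logic (Excel/EMR sources -> Snowflake).
    print("building resident directory")


# Dagster-style definition: the logic is exposed as a software-defined asset.
@asset
def resident_directory():
    build_resident_directory()


# Rough Airflow equivalent: the same callable becomes a task in a scheduled DAG.
with DAG(
    dag_id="resident_directory",          # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
):
    PythonOperator(task_id="build_resident_directory", python_callable=build_resident_directory)
```

In a real migration, scheduling, retries, and Snowflake connections would be configured per environment rather than hard-coded, and Dagster-specific features such as asset lineage would need an Airflow-side equivalent.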