

On-Demand Group
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a 6-month contract, offering $70 to $80 per hour, based on-site in Minneapolis, MN, Arlington, VA, Portland, OR, or Raleigh, NC. Requires 5+ years of experience, proficiency in GCP BigQuery, and healthcare data experience.
Country
United States
Currency
$ USD
-
Day rate
640
-
Date
January 6, 2026
Duration
More than 6 months
-
Location
On-site
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Minneapolis, MN
-
Skills detailed
#Python #Mathematics #Data Security #DevOps #Deployment #Infrastructure as Code (IaC) #Scala #Visualization #Data Modeling #Data Pipeline #Storage #Cloud #ML (Machine Learning) #Security #Data Analysis #AI (Artificial Intelligence) #Computer Science #ETL (Extract, Transform, Load) #Looker #SQL (Structured Query Language) #Scripting #Data Governance #Terraform #Data Ingestion #BigQuery #BI (Business Intelligence) #GCP (Google Cloud Platform) #Data Quality #Data Science #Data Engineering #Airflow #Tableau #Microsoft Power BI #Automation #Data Cleansing #Dataflow #Monitoring
Role description
On-Demand Group is currently seeking a Data Engineer for a 6-month contract engagement.
Job Title: Data Engineer
Job Location: On-site at any of 4 office locations: Minneapolis, MN; Arlington, VA; Portland, OR; or Raleigh, NC
Job Type: Contract to Hire
Job Summary:
The Senior Cloud Data Engineer plays a key role in designing, building, and maintaining
data pipelines and infrastructure using Google Cloud Platform (GCP) BigQuery. The
incumbent will collaborate with data analysts, data scientists, and other engineers to
ensure timely access to high-quality data for data-driven decision-making across the
organization.
Responsibilities:
• Design, develop, and implement data pipelines using GCP BigQuery, Dataflow, and Airflow for data ingestion, transformation, and loading.
• Optimize data pipelines for performance, scalability, and cost-efficiency.
• Ensure data quality through data cleansing, validation, and monitoring processes.
• Develop and maintain data models and schemas in BigQuery to support various data analysis needs.
• Automate data pipeline tasks using scripting languages like Python and tools like Dataflow.
• Collaborate with data analysts and data scientists to understand data requirements and translate them into technical data solutions.
• Leverage Terraform (Infrastructure as Code) and DevOps practices to ensure seamless integration of data pipelines with CI/CD workflows.
• Monitor and troubleshoot data pipelines and infrastructure to identify and resolve issues.
• Stay up to date with the latest advancements in GCP BigQuery and other related technologies.
• Document data pipelines and technical processes for future reference and knowledge sharing.
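To give a flavor of the data-quality work described above, here is a minimal sketch of a validation step a pipeline might run before loading rows into BigQuery. The field names and rules are purely illustrative assumptions, not part of the role description:

```python
# Illustrative data-quality check of the kind a pipeline's cleansing/
# validation stage might perform before loading rows. All field names
# and rules below are hypothetical examples.
REQUIRED_FIELDS = ("patient_id", "visit_date", "charge_amount")

def validate_row(row: dict) -> list:
    """Return a list of data-quality errors for one row (empty = clean)."""
    errors = []
    for field in REQUIRED_FIELDS:
        if not row.get(field):
            errors.append(f"missing required field: {field}")
    amount = row.get("charge_amount")
    if amount is not None:
        try:
            if float(amount) < 0:
                errors.append("charge_amount must be non-negative")
        except (TypeError, ValueError):
            errors.append("charge_amount is not numeric")
    return errors

def partition_rows(rows):
    """Split rows into (clean, rejected) so bad records can be quarantined."""
    clean, rejected = [], []
    for row in rows:
        (rejected if validate_row(row) else clean).append(row)
    return clean, rejected
```

In a real GCP pipeline this logic would typically live inside a Dataflow transform or an Airflow task, with rejected rows routed to a quarantine table for review rather than silently dropped.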
Basic Requirements:
• Bachelor's degree or equivalent experience in Computer Science, Mathematics, Information Technology, or a related field.
• 5+ years of solid experience as a data engineer.
• Strong understanding of data warehousing / data lake concepts and data modeling principles.
• Proven experience designing and implementing data pipelines using GCP BigQuery, Dataflow, and Airflow.
• Strong SQL skills and proficiency in a scripting language such as Python.
• Experience with data quality tools and techniques.
• Ability to work independently and as part of a team.
• Strong problem-solving and analytical skills.
• Passion for data and a desire to learn and adapt to new technologies.
• Experience with other GCP services such as Cloud Storage, Dataflow, and Pub/Sub.
• Experience with cloud deployment and automation tools like Terraform.
• Experience with data visualization tools such as Tableau, Power BI, or Looker.
• Experience with healthcare data.
• Familiarity with machine learning, artificial intelligence, and data science concepts.
• Experience with data governance and healthcare PHI data security best practices.
• Ability to work independently on tasks and projects to deliver data engineering solutions.
• Ability to communicate effectively and convey complex technical concepts as well as task / project updates.
The projected hourly range for this position is $70 to $80.
On-Demand Group (ODG) provides employee benefits, including healthcare, dental, and vision insurance. ODG is an equal opportunity employer that does not discriminate on the basis of race, color, religion, gender, sexual orientation, age, national origin, disability, or any other characteristic protected by law.
