On-Demand Group

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a contract-to-hire basis, offering $78 to $89 per hour. Key skills include GCP, SQL, Python, and Airflow. Candidates must have 5+ years of experience and a bachelor's degree. Location is onsite in Minneapolis, Arlington, Portland, or Raleigh.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
681
-
🗓️ - Date
November 14, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Minneapolis, MN
-
🧠 - Skills detailed
#Monitoring #Data Catalog #Microsoft Power BI #BI (Business Intelligence) #Security #Data Quality #Infrastructure as Code (IaC) #ML (Machine Learning) #BigQuery #Scripting #Cloud #Data Processing #Tableau #Data Governance #Deployment #Airflow #GCP (Google Cloud Platform) #Data Cleansing #Data Lake #Data Ingestion #SQL (Structured Query Language) #Data Analysis #Data Science #Data Security #Storage #Terraform #Visualization #Data Engineering #Data Pipeline #Looker #Data Modeling #Mathematics #Data Integrity #Automation #Dataflow #"ETL (Extract, Transform, Load)" #Computer Science #AI (Artificial Intelligence) #Python #DevOps #Scala
Role description
Job Title: Data Engineer
Job Location: On-site
Job Type: Contract to Hire

USC or GC holders only for this contract-to-hire need; sponsorship is not available.

Must-have requirements:
• GCP, SQL, Python, Airflow
• System design mindset
• Communication: the ability to articulate what they are doing and what/how they are achieving in their work. Accents are not an issue as long as the candidate is comprehensible.
• Healthcare experience is not required, but is a nice-to-have.
• Location: Onsite at any of four office locations, with a focus on Minneapolis, MN; Arlington, VA; Portland, OR; and Raleigh, NC
• 100% onsite to start, then a switch to 2-3x/week hybrid if the candidate does well

Job Summary:
The Senior Cloud Data Engineer plays a key role in designing, building, and maintaining data pipelines and infrastructure using Google Cloud Platform (GCP) BigQuery. The incumbent will collaborate with data analysts, data scientists, and other engineers to ensure timely access to high-quality data for data-driven decision-making across the organization.

The Senior Cloud Data Engineer is a highly technical person who has mastered hands-on coding of data processing solutions and scalable data pipelines to support analytics and exploratory analysis. This role ensures new business requirements are decomposed and implemented in cohesive end-to-end designs that enable data integrity and quality, and that best support the BI and analytic capability needs that power decision-making. This includes building data acquisition programs that handle the business's growing data volume as part of the data lake in the GCP BigQuery ecosystem, and maintaining a robust data catalog.

This is a senior data engineering role within Data & Analytics' Data Core organization, working closely with leaders across Data & Analytics. The incumbent will continually improve the business's data and analytic solutions, processes, and data engineering capabilities, embracing industry best practices and trends and, through acquired knowledge, driving process and system improvement opportunities.

Responsibilities:
• Design, develop, and implement data pipelines using GCP BigQuery, Dataflow, and Airflow for data ingestion, transformation, and loading (see the pipeline sketch after this list).
• Optimize data pipelines for performance, scalability, and cost-efficiency.
• Ensure data quality through data cleansing, validation, and monitoring processes (see the validation sketch after this list).
• Develop and maintain data models and schemas in BigQuery to support various data analysis needs.
• Automate data pipeline tasks using scripting languages like Python and tools like Dataflow.
• Collaborate with data analysts and data scientists to understand data requirements and translate them into technical data solutions.
• Leverage Terraform (IaC) to ensure seamless integration of data pipelines with DevOps CI/CD workflows.
• Monitor and troubleshoot data pipelines and infrastructure to identify and resolve issues.
• Stay up to date with the latest advancements in GCP BigQuery and other related technologies.
• Document data pipelines and technical processes for future reference and knowledge sharing.
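To give candidates a concrete sense of the first responsibility, here is a minimal sketch of the kind of Airflow DAG this role builds, assuming Airflow 2.x with the Google provider installed; the project, dataset, and table names are all hypothetical, not the employer's actual pipeline:

```python
# A minimal sketch of a daily BigQuery transform orchestrated by Airflow.
# All project, dataset, and table names below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

PROJECT = "example-project"  # hypothetical GCP project
DATASET = "analytics"        # hypothetical BigQuery dataset

with DAG(
    dag_id="daily_orders_elt",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",       # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Transform one day of raw lake data into a date-partitioned curated table.
    transform_orders = BigQueryInsertJobOperator(
        task_id="transform_orders",
        configuration={
            "query": {
                "query": (
                    f"SELECT order_id, customer_id, order_ts, total_amount "
                    f"FROM `{PROJECT}.{DATASET}.raw_orders` "
                    "WHERE DATE(order_ts) = '{{ ds }}'"  # Airflow-templated run date
                ),
                "destinationTable": {
                    "projectId": PROJECT,
                    "datasetId": DATASET,
                    "tableId": "orders_curated${{ ds_nodash }}",  # daily partition
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```

Writing into a date partition with WRITE_TRUNCATE keeps the task idempotent, so a rerun of any day simply replaces that day's slice.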
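Likewise for the data-quality responsibility, a sketch of one simple validation check using the google-cloud-bigquery client; the table and key column are again invented for illustration:

```python
# A minimal sketch of a single data-quality validation, not a full framework.
# The table and column names are hypothetical.
from google.cloud import bigquery

TABLE = "example-project.analytics.orders_curated"  # hypothetical table

def check_no_null_keys(client: bigquery.Client, table: str) -> None:
    """Raise if any row is missing its key, so the pipeline fails loudly."""
    query = f"SELECT COUNT(*) AS bad_rows FROM `{table}` WHERE order_id IS NULL"
    bad_rows = list(client.query(query).result())[0].bad_rows
    if bad_rows:
        raise ValueError(f"{bad_rows} rows in {table} have a NULL order_id")

if __name__ == "__main__":
    check_no_null_keys(bigquery.Client(), TABLE)
```

A check like this typically runs as its own task downstream of the transform, so bad data is caught before analysts and dashboards see it.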
Basic Requirements:
• Bachelor's degree or equivalent experience in Computer Science, Mathematics, Information Technology, or a related field.
• 5+ years of solid experience as a data engineer.
• Strong understanding of data warehousing / data lake concepts and data modeling principles.
• Proven experience designing and implementing data pipelines using GCP BigQuery, Dataflow, and Airflow.
• Strong skills in SQL and scripting languages like Python (or similar).
• Experience with data quality tools and techniques.
• Ability to work independently and as part of a team.
• Strong problem-solving and analytical skills.
• Passion for data and a desire to learn and adapt to new technologies.
• Experience with other GCP services like Cloud Storage, Dataflow, and Pub/Sub.
• Experience with cloud deployment and automation tools like Terraform.
• Experience with data visualization tools like Tableau, Power BI, or Looker.
• Experience with healthcare data.
• Familiarity with machine learning, artificial intelligence, and data science concepts.
• Experience with data governance and healthcare PHI data security best practices.
• Ability to work independently on tasks and projects to deliver data engineering solutions.
• Ability to communicate effectively and convey complex technical concepts as well as task/project updates.

The projected hourly range for this position is $78 to $89. On-Demand Group (ODG) provides employee benefits including healthcare, dental, and vision insurance. ODG is an equal opportunity employer that does not discriminate on the basis of race, color, religion, gender, sexual orientation, age, national origin, disability, or any other characteristic protected by law.