

Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer in Chesterfield, MO, for 6 months (possible extension), offering a hybrid work model. Requires strong skills in Power BI, BigQuery, SQL, ETL/ELT pipelines, and API integration. Local candidates only.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
🗓️ - Date
May 1, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Missouri, United States
-
🧠 - Skills detailed
#BigQuery #DAX #BI (Business Intelligence) #ETL (Extract, Transform, Load) #Data Processing #Data Science #Programming #Clustering #Microsoft Power BI #API (Application Programming Interface) #Scala #Data Engineering #Automation #Dataflow #Agile #SQL (Structured Query Language) #Cloud #Data Modeling #R #REST (Representational State Transfer) #Python #Datasets
Role description
Strictly W2 Only - No C2C
Job Title: IT Data Engineer
Location: Chesterfield, MO 63017 (local candidates only)
Duration: 6 Months (Extension Possible)
Work Type: Hybrid (3 Days Onsite Required)
⚠️ Hybrid – 3 days onsite mandatory - Only Locals
⚠️ Strong Power BI + BigQuery + SQL required
Key Responsibilities
Data Engineering & Pipeline Development
• Design, build, and maintain scalable ETL/ELT pipelines using: Google BigQuery, Microsoft Fabric dataflows, Cloud-based tools
• Optimize performance using: Partitioning, Clustering, Query cost optimization
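The cost-optimization bullet above hinges on partition pruning: filtering on BigQuery's partitioning column lets the engine skip whole partitions, so a query scans (and bills for) only the slices it needs. A minimal illustrative sketch of that idea, using a hypothetical table of daily partitions rather than any real BigQuery API:

```python
from datetime import date, timedelta

def prune_partitions(partitions, start, end):
    """Return only the daily partitions a date-range filter would scan.

    BigQuery applies the same idea server-side: a WHERE clause on the
    partitioning column prunes partitions outside the range, cutting
    scanned bytes (and cost). `partitions` maps partition date -> bytes.
    """
    return {d: size for d, size in partitions.items() if start <= d <= end}

# Hypothetical table: 30 daily partitions of 1 GB each (illustrative numbers).
table = {date(2026, 4, 1) + timedelta(days=i): 1_000_000_000 for i in range(30)}

# A 7-day filter touches 7 GB instead of the full 30 GB.
week = prune_partitions(table, date(2026, 4, 10), date(2026, 4, 16))
print(len(week), sum(week.values()))  # prints: 7 7000000000
```

Clustering works on top of this: within each surviving partition, BigQuery sorts data by the cluster columns so equality and range filters read fewer blocks.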
Power Platform Development
• Develop Power Apps (Canvas & Model-Driven) for business workflows.
• Build and optimize Power BI dashboards and reports (DAX, tuning).
Data Modeling & Analytics
• Create reusable data models and curated datasets.
• Deliver analytics-ready data for reporting and business insights.
API & Integration Work
• Integrate with APIs using: REST, OAuth, Rate limiting & retry strategies
• Load and transform data into BigQuery and Fabric OneLake.
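The retry-strategy bullet above usually means exponential backoff against rate-limited (HTTP 429) or transient 5xx responses. A minimal sketch under assumed names (`call_with_retry`, `flaky` are illustrative, not a real client); production code would also honor the server's Retry-After header:

```python
import time

def call_with_retry(fn, max_attempts=4, base_delay=0.01):
    """Retry `fn` on failure with exponential backoff.

    Waits base_delay * 2**attempt between tries and re-raises once
    the attempt budget is exhausted.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except RuntimeError:
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Fake endpoint that fails twice (as a rate-limited API would), then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("429 Too Many Requests")
    return {"status": "ok"}

print(call_with_retry(flaky), calls["n"])  # prints: {'status': 'ok'} 3
```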
Programming & Data Processing
• Use Python and/or R for: Data processing, Statistical analysis, Workflow automation
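The kind of Python data-processing task implied above, sketched with only the standard library and made-up sample data (column names and values are illustrative):

```python
import csv
import io
import statistics

# Hypothetical extract: daily load volumes from a source system.
raw = """region,rows_loaded
east,120
west,90
east,80
west,110
"""

# Group rows_loaded by region.
totals = {}
for rec in csv.DictReader(io.StringIO(raw)):
    totals.setdefault(rec["region"], []).append(int(rec["rows_loaded"]))

# Summarize each group with a total and a mean.
summary = {r: {"total": sum(v), "mean": statistics.mean(v)} for r, v in totals.items()}
print(summary)
```

At scale the same grouping and aggregation would typically move into pandas or into SQL pushed down to BigQuery, but the shape of the logic is the same.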
Collaboration & Delivery
• Work with: Data scientists, Analysts, Product owners
• Translate business requirements into technical solutions.
• Manage multiple projects in a fast-paced environment.
Must Haves
• Strong experience with: Power BI (DAX, reporting, optimization), Power Apps (Canvas & Model-Driven), Google BigQuery, Advanced SQL
• Experience building ETL/ELT pipelines.
• Strong API integration experience (REST/OAuth).
• Experience with Microsoft Fabric (dataflows).
• Proficiency in Python or R.
• Strong communication and problem-solving skills.
Nice to Have
• Experience with Fabric OneLake.
• Exposure to data science workflows.
• Experience working in fast-paced agile environments.
• Experience optimizing enterprise-scale analytics platforms.
