Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in Detroit, MI, offering a 12+ month W2 contract at a competitive pay rate. Key skills include Azure Data Factory, PySpark, and Power BI. Requires a Bachelor's degree and 7+ years of data experience.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
September 17, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Detroit, MI
🧠 - Skills detailed
#BI (Business Intelligence) #Visualization #ML (Machine Learning) #Data Engineering #Microsoft Power BI #Cloud #Data Quality #ETL (Extract, Transform, Load) #Scripting #Azure #Data Science #Data Lake #Agile #SQL (Structured Query Language) #Spark (Apache Spark) #Databricks #S3 (Amazon Simple Storage Service) #Azure SQL #PySpark #Spark SQL #Data Ingestion #Azure Data Factory #ADF (Azure Data Factory)
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Strategic Staffing Solutions, is seeking the following. Apply via Dice today!

STRATEGIC STAFFING SOLUTIONS (S3) HAS AN OPENING! Strategic Staffing Solutions is currently looking for a Sr. Data Engineer for a contract opportunity with one of our largest clients located in Detroit, MI.

Title: Senior Data Engineer
Location: Detroit, MI (Hybrid; onsite Tuesday, Wednesday, and Thursday)
Duration: 12+ months, with possible extension
Role Type: W2 contract

Key Responsibilities:
• Design, develop, and maintain ETL pipelines using Azure Data Factory and Databricks
• Implement data ingestion and transformation workflows using PySpark and Spark SQL
• Build and optimize data models and visualizations in Power BI for business reporting
• Collaborate with data scientists to support ML pipeline integration and feature engineering
• Migrate and manage data across cloud platforms, including Azure SQL and Data Lake, as well as local environments
• Ensure data quality, governance, and performance tuning across all data processes
• Participate in Agile ceremonies, including sprint planning and daily standups

Minimum Education & Requirements:
• Bachelor's degree and 7+ years of experience working in a data and analytical function, including in-depth quantitative analytics

Preferred:
• Experience in scripting with SQL, extracting large data sets, and designing ETL flows
• Work experience in an interdisciplinary/cross-functional field
• Utility or customer-oriented industry experience

Beware of scams: S3 never asks for money during its onboarding process.
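For a sense of the day-to-day work, here is a minimal, illustrative PySpark sketch of the kind of ingestion-and-transformation workflow the responsibilities describe. The storage path, column names, and target table are hypothetical placeholders, not details from the client's environment.

```python
# Minimal sketch: ingest raw CSV from a Data Lake path, apply a Spark SQL
# transformation, and persist a curated table for Power BI reporting.
# All paths, columns, and table names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example_ingestion").getOrCreate()

# Ingest raw data (hypothetical ADLS path)
raw = (
    spark.read.option("header", True)
    .csv("abfss://raw@examplelake.dfs.core.windows.net/meter_readings/")
)

# Basic data-quality filter and typed transformation
readings = (
    raw.filter(F.col("reading_kwh").isNotNull())
    .withColumn("reading_kwh", F.col("reading_kwh").cast("double"))
    .withColumn("reading_date", F.to_date("reading_ts"))
)

# Spark SQL aggregation for downstream reporting
readings.createOrReplaceTempView("readings")
daily = spark.sql(
    """
    SELECT customer_id, reading_date, SUM(reading_kwh) AS total_kwh
    FROM readings
    GROUP BY customer_id, reading_date
    """
)

# Persist to a curated table (hypothetical target, Delta on Databricks)
daily.write.format("delta").mode("overwrite").saveAsTable("curated.daily_usage")
```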