Data Engineer - W2 Role

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with 7+ years of experience in Data Engineering, including data modeling, ETL, and pipeline development. The 6+ month contract in Dearborn, MI offers $55-$65 hourly on W2, with a hybrid schedule.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
🗓️ - Date discovered
September 18, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Dearborn, MI
-
🧠 - Skills detailed
#Data Lake #Data Engineering #Data Pipeline #ETL (Extract, Transform, Load) #Data Lakehouse #Qlik #Licensing #Agile #Data Modeling #Scala #Data Warehouse #GCP (Google Cloud Platform) #Storage
Role description
Akkodis is seeking multiple Data Engineers (W2 role) for a 6+ month contract position with one of our leading Automotive Industry Direct Clients in Dearborn, MI (hybrid schedule). The ideal candidate has a minimum of 7 years of experience in Data Engineering, including data modeling, ETL, and pipeline development. Bonus skills include GCP, BigQuery, Dataform, and Qlik Replicate.

Pay Range: $55.00 - $65.00 hourly on W2 only; the pay range may be negotiable based on experience, education, geographic location, and other factors.

Schedule: Hybrid - 4 days onsite / 1 day remote

Position Description:
Employees in this job function are responsible for designing, building, and maintaining data solutions, including data infrastructure and pipelines, for collecting, storing, processing, and analyzing large volumes of data efficiently and accurately.

Key Responsibilities:
• Collaborate with business and technology stakeholders to understand current and future data requirements.
• Design, build, and maintain reliable, efficient, and scalable data infrastructure for data collection, storage, transformation, and analysis.
• Plan, design, build, and maintain scalable data solutions, including data pipelines, data models, and applications, for efficient and reliable data workflows.
• Design, implement, and maintain existing and future data platforms, such as data warehouses, data lakes, and data lakehouses, for structured and unstructured data.
• Design and develop analytical tools, algorithms, and programs to support data engineering activities, such as writing scripts and automating tasks.
• Ensure optimum performance and identify improvement opportunities.

Skills Required: Data/Analytics, ETL, Data Modeling, Data Warehousing, Analytical skills
Skills Preferred: GCP, BigQuery
Experience Required: 7+ years of Data Engineering work experience.
Experience Preferred: The ideal candidate will have strong curiosity and analytical skills; be a self-starter able to work in an agile environment and with ambiguity; have strong verbal and written communication skills; and be highly collaborative while able to work independently. Key skills include defining requirements, data modeling, ETL, and pipeline development. Bonus skills include GCP, BigQuery, Dataform, and Qlik Replicate.

Education Required: Bachelor's Degree
Education Preferred:
Additional Safety Training/Licensing/Personal Protection Requirements:

Thanks & Regards,
Aditya Agnihotri [Aadi]
Sr. Recruiter
Email: aditya.agnihotri@akkodisgroup.com
Direct: +1 (610) 472-0979
LinkedIn: https://www.linkedin.com/in/aditya-agnihotri-7300722061/
(An Adecco Group Company)
World Leader in IT and Engineering Workforce Solutions
www.akkodis.com

"Believe you can and you're halfway there." — Theodore Roosevelt