EPITEC

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 4+ years of data engineering experience, preferably in the automotive industry. The contract is on-site in Dearborn, MI, for 40 hours per week, with a pay rate of "$XX/hour." Key skills include Spark, GCP, and Python. A Bachelor's degree is required, and a Master's degree is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
November 27, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site (1 day remote)
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Dearborn, MI
-
🧠 - Skills detailed
#SonarQube #Data Warehouse #GCP (Google Cloud Platform) #Scala #ETL (Extract, Transform, Load) #Storage #Libraries #Computer Science #Data Engineering #Data Pipeline #Spark (Apache Spark) #Data Lake #Jira #Data Science #API (Application Programming Interface) #Data Lakehouse #Cloud #Kubernetes #Python #PyTorch #TensorFlow #ML (Machine Learning)
Role description
• W2 ONLY, NO C2C
Job Title: Automotive Data Engineer
Location: Dearborn, MI
Job Type: Engineer
Expected hours per week: 40
Schedule: On-site (1 day remote)
Job Description: Employees in this job function are responsible for designing, building, and maintaining data solutions, including data infrastructure, pipelines, and related systems, for collecting, storing, processing, and analyzing large volumes of data efficiently and accurately.
Key Responsibilities:
• Collaborate with business and technology stakeholders to understand current and future data requirements
• Design, build, and maintain reliable, efficient, and scalable data infrastructure for data collection, storage, transformation, and analysis
• Plan, design, build, and maintain scalable data solutions, including data pipelines, data models, and applications, for efficient and reliable data workflows
• Design, implement, and maintain existing and future data platforms, such as data warehouses, data lakes, and data lakehouses, for structured and unstructured data
• Design and develop analytical tools, algorithms, and programs to support data engineering activities, such as writing scripts and automating tasks (see the sketch after this description)
• Ensure optimum performance and identify improvement opportunities
Skills Required: Spark, SonarQube, GCP Cloud Run, Kubernetes, GCP (Google Cloud Platform), Tekton, Python, API, Jira
Experience Required: Engineer 2; 4+ years of data engineering work experience
Experience Preferred:
• 5+ years of experience in the automotive industry, particularly in auto remarketing and sales
• Master's degree in a relevant field (e.g., Computer Science, Data Science, Engineering)
• Proven ability to thrive in dynamic environments, managing multiple priorities and delivering high-impact results even with limited information
• Exceptional problem-solving skills, a proactive and strategic mindset, and a passion for technical excellence and innovation in data engineering
• Demonstrated commitment to continuous learning and professional development
• Familiarity with machine learning libraries such as TensorFlow, PyTorch, or Scikit-learn
• Experience with MLOps tools and platforms
Education Required: Bachelor's Degree
Education Preferred: Master's Degree
Benefits: 80 hours of paid time off, medical insurance contributions, dental and vision coverage, and a 401(k) retirement savings plan
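For illustration only (not part of the posting): a minimal sketch of the kind of Spark-on-GCP pipeline work the responsibilities describe, assuming a PySpark batch job that reads raw remarketing sale records from a Cloud Storage bucket, aggregates them, and writes a curated Parquet table back to the lake. The bucket paths, column names, and job name are hypothetical placeholders.

```python
# Hypothetical PySpark batch job; paths and column names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("remarketing-daily-aggregation")  # assumed job name
    .getOrCreate()
)

# Read raw CSV files landed in a (hypothetical) GCS raw zone.
sales = spark.read.csv(
    "gs://example-raw-zone/remarketing/sales/*.csv",
    header=True,
    inferSchema=True,
)

# Aggregate units sold and average sale price per model and sale date.
daily_summary = (
    sales
    .groupBy("model", "sale_date")
    .agg(
        F.count("*").alias("units_sold"),
        F.avg("sale_price").alias("avg_sale_price"),
    )
)

# Write the curated table to a (hypothetical) warehouse zone as partitioned Parquet.
(
    daily_summary.write
    .mode("overwrite")
    .partitionBy("sale_date")
    .parquet("gs://example-curated-zone/remarketing/daily_summary/")
)

spark.stop()
```

Reading and writing gs:// paths assumes the GCS connector is available on the Spark cluster (as it is on Dataproc, for example); running locally, file-system paths would be substituted.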