Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in Dearborn, MI, with a contract of unspecified duration and a pay rate based on experience. Key skills include GCP, Kubernetes, Spark, and Python. A Bachelor's degree is required; 5+ years in the automotive industry is preferred.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
September 10, 2025
πŸ•’ - Project duration
Unknown
-
🏝️ - Location type
On-site
-
πŸ“„ - Contract type
Unknown
-
πŸ”’ - Security clearance
Unknown
-
πŸ“ - Location detailed
Dearborn, MI
-
🧠 - Skills detailed
#Security #Project Management #Monitoring #Data Lakehouse #AWS (Amazon Web Services) #API (Application Programming Interface) #Data Pipeline #Python #Model Deployment #Automation #TensorFlow #ETL (Extract, Transform, Load) #Consulting #Data Quality #Kubernetes #Data Lake #Data Governance #Data Warehouse #Cloud #Storage #SonarQube #DevSecOps #Documentation #ML (Machine Learning) #Libraries #PyTorch #GCP (Google Cloud Platform) #Jira #Spark (Apache Spark) #Scala #Data Engineering #Deployment
Role description
Job Description

Stefanini Group is hiring! Stefanini is looking for a Data Engineer in Dearborn, MI (onsite). For quick apply, please reach out to Vasudha Lakshmi at 248-263-5273 / vasudha.l@stefanini.com.

We are seeking an experienced Data Engineer to design, implement, and maintain robust analytics pipeline solutions. These solutions will support the analysis, modeling, and prediction of upstream and downstream auction prices, directly benefiting the Vehicle Analytics team and its customers. The ideal candidate will excel at developing solutions, maintaining DevSecOps practices, and collaborating with cross-functional teams to improve processes and drive business performance.

Responsibilities
• Collaborate with business and technology stakeholders to understand current and future data requirements
• Design, build, and maintain reliable, efficient, and scalable data infrastructure for analytics models, data collection, storage, transformation, and monitoring
• Plan, design, build, and maintain scalable data solutions, including data pipelines, data models, and applications, for efficient and reliable workflows
• Design, implement, and maintain existing and future data platforms, such as data warehouses, data lakes, and data lakehouses, for structured and unstructured data
• Design and develop analytical tools, algorithms, and programs to support data engineering activities, such as writing scripts and automating tasks
• Ensure optimal performance and identify improvement opportunities
• Develop, build, and maintain the infrastructure required for analytics, including data pipelines, model deployment platforms, and model monitoring
• Develop and maintain tools and libraries to support the development and deployment of models
• Automate machine learning workflows using DevSecOps principles and practices
• Collaborate with development and operations teams to implement software solutions that improve system integration and the automation of analytics pipelines
• Design, develop, and manage data flows and APIs between upstream systems and applications
• Troubleshoot and resolve issues related to system communication, data flow, and data quality
• Collaborate with technical and non-technical teams to gather integration requirements and ensure successful deployment of data solutions
• Create and maintain comprehensive technical documentation of software components
• Work with IT to ensure systems meet evolving business needs and comply with data governance policies and security requirements
• Implement and enforce the highest standards of data quality and integrity across all data processes
• Manage deliverables through project management tools

Job Requirements

Experience Required
• GCP (Google Cloud Platform), Cloud Run, Kubernetes, Spark, SonarQube, Tekton, Python, APIs, Jira
• 4+ years of data engineering work experience
• AWS experience

Experience Preferred
• 5+ years of experience in the automotive industry, particularly in auto remarketing and sales
• Proven ability to thrive in dynamic environments, managing multiple priorities and delivering high-impact results even with limited information
• Exceptional problem-solving skills, a proactive and strategic mindset, and a passion for technical excellence and innovation in data engineering
• Demonstrated commitment to continuous learning and professional development
• Familiarity with machine learning libraries such as TensorFlow, PyTorch, or Scikit-learn
• Experience with MLOps tools and platforms
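For illustration only: a minimal sketch of the kind of analytics pipeline work this role describes, using the stack the posting names (Python and Spark reading from and writing to Google Cloud Storage). The bucket paths, column names, and aggregation below are hypothetical placeholders, not details from the posting.

```python
# Hypothetical PySpark job: read raw auction records from Cloud Storage,
# apply basic data-quality filtering and a simple transform, then write a
# partitioned, curated dataset back out. All paths and columns are made up.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("auction-price-etl").getOrCreate()

# Hypothetical source bucket (requires the GCS connector on the Spark classpath)
raw = spark.read.parquet("gs://example-raw-bucket/auctions/")

cleaned = (
    raw.dropna(subset=["vin", "sale_price"])           # enforce basic data quality
       .withColumn("sale_date", F.to_date("sale_ts"))  # normalize the sale timestamp
       .groupBy("model_year", "sale_date")
       .agg(F.avg("sale_price").alias("avg_sale_price"))
)

# Hypothetical curated output, partitioned for downstream analytics
cleaned.write.mode("overwrite").partitionBy("sale_date").parquet(
    "gs://example-curated-bucket/auction_prices/"
)
```

In practice, a job like this would be containerized and scheduled on the team's actual platform (the posting mentions Cloud Run, Kubernetes, and Tekton), but those deployment details depend on the environment and are not specified here.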
Education Required
• Bachelor's Degree

Education Preferred
• Master's Degree

Listed salary ranges may vary based on experience, qualifications, and local market. Some positions may also include bonuses or other incentives.

Stefanini takes pride in hiring top talent and developing relationships with our future employees. Our talent acquisition teams will never make an offer of employment without having a phone conversation with you. Those face-to-face conversations will involve a description of the job for which you have applied. We also speak with you about the process, including interviews and job offers.

About Stefanini Group
The Stefanini Group is a global provider of offshore, onshore, and nearshore outsourcing, IT digital consulting, systems integration, application, and strategic staffing services to Fortune 1000 enterprises around the world. We have a presence across the Americas, Europe, Africa, and Asia, serving more than four hundred clients across a broad spectrum of markets, including financial services, manufacturing, telecommunications, chemical services, technology, the public sector, and utilities. Stefanini is a CMM Level 5 IT consulting company with a global presence.