EPITEC

Senior Analytics Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Analytics Engineer on a full-time contract in Dearborn, MI (hybrid). Pay is $65-70 per hour. Requires 7+ years in Data Engineering, advanced Python, and strong data pipeline experience. Automotive or IoT experience preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
560
🗓️ - Date
March 26, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
Dearborn, MI
🧠 - Skills detailed
#Strategy #IoT (Internet of Things) #Data Engineering #Scala #Data Pipeline #Compliance #Embedded Systems #Data Warehouse #GitHub #Python #Data Lake #JSON (JavaScript Object Notation) #Scripting #Version Control #YAML (YAML Ain't Markup Language) #Pytest #API (Application Programming Interface)
Role description
Location: Dearborn, MI (Hybrid, 4 days on-site)
Employment Type: Full-Time Contract
Pay: $65-70 per hour

Description:
We are seeking a Senior Analytics Engineer to serve as the Technical Implementation Lead for Data Contracts within Digital Product Analytics. This is a highly hands-on, "engineering force multiplier" role where you will translate analytics strategy into production-ready code and tooling for feature engineering teams. While the analytics roadmap is defined, feature engineers often lack the bandwidth to operationalize analytics requirements. You will provide white-glove enablement: authoring schemas, generating code artifacts, and building automated validation, so that data contract adoption is seamless, scalable, and reliable.

Responsibilities:
• Collaborate with business and technology stakeholders to understand current and future data and instrumentation requirements
• Design, build, and maintain scalable data infrastructure, including pipelines, data models, and workflow applications
• Develop and own Data Contracts by translating measurement plans into production-ready YAML/JSON schemas
• Build validation frameworks by writing reusable Python/PyTest unit tests for CI/CD pipelines to verify event firing accuracy and payload structure/schema compliance
• Design and maintain modern data platforms, including data warehouses, data lakes, and lakehouse architectures
• Optimize instrumentation for constrained environments: minimize LTE bandwidth usage, reduce 12V battery drain, and avoid unnecessary vehicle wakeups
• Manage schema evolution and versioning using GitHub, ensuring backward compatibility so analytics changes never break feature builds
• Bridge analytics and embedded systems by translating business needs into vehicle signals (TCU/ECU, CAN), API logic, and event contracts
• Automate data engineering tasks through scripting and analytical tooling
• Monitor performance and continuously identify optimization opportunities

Requirements:
• 7+ years of professional Data Engineering experience
• Bachelor's Degree in a relevant field
• Python (advanced, production-grade use)
• Strong experience designing, building, and maintaining data pipelines and infrastructure
• GitHub-based development and version control

Preferred:
• 5+ years total technical experience in one or more of the following: Software Engineering, Data Engineering, or Embedded Systems
• Master's Degree
• Automotive or IoT domain experience
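To give candidates a feel for the data-contract validation work described above, here is a minimal sketch of the kind of reusable PyTest check a CI/CD pipeline might run against event payloads. Everything in it is hypothetical: the `trip_started` event, its fields, and the `validate_payload` helper are illustrative stand-ins, and the contract is shown as a Python dict standing in for a YAML/JSON schema.

```python
# Minimal sketch: validate an analytics event payload against a data contract.
# The contract dict below stands in for a production YAML/JSON schema; the
# event name and fields are hypothetical examples, not from this posting.

TRIP_STARTED_CONTRACT = {
    "event": "trip_started",
    "version": "1.0.0",
    "required_fields": {
        "vehicle_id": str,
        "timestamp_ms": int,
        "battery_voltage": float,
    },
}


def validate_payload(contract: dict, payload: dict) -> list[str]:
    """Return a list of violations; an empty list means the payload conforms."""
    errors = []
    for field, expected_type in contract["required_fields"].items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            errors.append(
                f"{field}: expected {expected_type.__name__}, "
                f"got {type(payload[field]).__name__}"
            )
    return errors


# PyTest-style unit tests (run with `pytest <this_file>.py` in CI):
def test_conforming_payload_passes():
    payload = {
        "vehicle_id": "VIN123",
        "timestamp_ms": 1700000000000,
        "battery_voltage": 12.6,
    }
    assert validate_payload(TRIP_STARTED_CONTRACT, payload) == []


def test_missing_field_is_reported():
    payload = {"vehicle_id": "VIN123", "timestamp_ms": 1700000000000}
    assert validate_payload(TRIP_STARTED_CONTRACT, payload) == [
        "missing field: battery_voltage"
    ]
```

In a real setup, the contract would be loaded from a versioned YAML/JSON file in GitHub rather than hard-coded, so that schema evolution and backward-compatibility checks ride the same review process as the code.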