DRS IT Solutions Inc

Autonomous Vehicle Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Autonomous Vehicle Engineer on a 6-month contract in Los Altos, CA, offering competitive pay. Key skills include 3+ years in data infrastructure, proficiency in Python and SQL, and strong AWS experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 25, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
1099 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Scala #ML (Machine Learning) #Data Quality #Airflow #Distributed Computing #Athena #Batch #ETL (Extract, Transform, Load) #EC2 #Storage #Metadata #Data Engineering #Python #AWS (Amazon Web Services) #AI (Artificial Intelligence) #SageMaker #SQL (Structured Query Language) #Spark (Apache Spark) #Data Pipeline #S3 (Amazon Simple Storage Service) #Data Storage #Datasets #Luigi
Role description
Data Engineer – Autonomous Vehicle AI Research Infrastructure
Los Altos, CA / Human Interactive Driving – Contractor
Candidates must be willing to work on a 1099 or W2 basis. C2C/vendor referrals will not be considered.

Responsibilities:
● Design, implement, and maintain robust data pipelines for ingesting, cleaning, and transforming large-scale autonomous vehicle datasets (camera, LiDAR, radar, GPS, simulation logs).
● Develop scalable storage and retrieval systems using AWS services (S3, EC2, SageMaker, Athena, etc.).
● Ensure data quality and consistency through automated validation, deduplication, and schema enforcement.
● Collaborate with ML researchers and engineers to provide efficient access to training data, labels, and metadata.
● Optimize data preprocessing and batching pipelines to support large-scale training and evaluation workflows.
● Build tools to manage and audit dataset versions, experiment tracking, and feature reproducibility.
● Implement and maintain CI/CD workflows for data and pipeline updates, ensuring minimal downtime and reproducible outputs.
● Monitor data pipeline performance and respond to bottlenecks or outages proactively.

Qualifications:
● 3+ years of experience building production-grade data infrastructure or ML data pipelines.
● Strong proficiency with Python and SQL, and experience with data workflow orchestration tools (e.g., Airflow, Prefect, Luigi).
● Deep experience with AWS services, especially S3 (data storage), EC2 (compute), and SageMaker (model training).
● Familiarity with distributed computing frameworks such as Spark, Dask, or Ray.
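The data-quality duties above (automated validation, deduplication, schema enforcement) can be sketched in plain Python. The record fields, schema, and function names below are illustrative assumptions for a sensor-log record, not part of the role description:

```python
# Hypothetical schema for one sensor-log record: field name -> expected type.
SCHEMA = {"frame_id": str, "sensor": str, "timestamp": float}
VALID_SENSORS = {"camera", "lidar", "radar", "gps"}

def validate(record: dict) -> bool:
    """Schema enforcement: exact field set, correct types, known sensor."""
    return (
        set(record) == set(SCHEMA)
        and all(isinstance(record[k], t) for k, t in SCHEMA.items())
        and record["sensor"] in VALID_SENSORS
    )

def dedupe(records: list[dict]) -> list[dict]:
    """Drop duplicates, keyed on (frame_id, sensor, timestamp), keeping first seen."""
    seen, out = set(), []
    for r in records:
        key = (r["frame_id"], r["sensor"], r["timestamp"])
        if key not in seen:
            seen.add(key)
            out.append(r)
    return out

def clean(records: list[dict]) -> list[dict]:
    """One pipeline stage: validate, then deduplicate."""
    return dedupe([r for r in records if validate(r)])
```

In a production setting each stage would typically run as a task in an orchestrator such as Airflow, Prefect, or Luigi, with rejected records logged for audit rather than silently dropped.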