

Imetris Corporation
Python Lead Developer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Python Lead Developer with 5+ years of experience in data engineering, including Python, PySpark, and Airflow. The contract is onsite for 5 days a week, with an immediate start and a competitive pay rate.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
January 9, 2026
-
Duration
Unknown
-
Location
On-site
-
Contract
W2 Contractor
-
Security
Unknown
-
Location detailed
Auburn Hills, MI
-
Skills detailed
#Computer Science #GitHub #Version Control #GIT #Spark (Apache Spark) #Deployment #Linux #PySpark #SQL (Structured Query Language) #Cloud #Python #GCP (Google Cloud Platform) #Unix #Code Reviews #Data Ingestion #GitLab #Data Pipeline #Programming #Airflow #Automated Testing #Data Engineering #Docker #Containers
Role description
No Corp-to-Corp; W2 only.
Onsite role (5 days a week)
Immediate hiring
Number of positions: 3
Required skills:
Hands-on Experience: 5+ years of practical experience building production-grade data pipelines using Python and PySpark.
Airflow Expertise: Proven track record of designing, deploying, and managing Airflow DAGs in enterprise environments.
CI/CD for Data Projects: Ability to build and maintain CI/CD pipelines for data engineering workflows, including automated testing and deployment.
Cloud & Containers: Experience with containerization (Docker) and cloud platforms (GCP) for data engineering workloads. Appreciation for twelve-factor design principles.
Python Fluency: Ability to write object-oriented Python code, manage dependencies, and follow industry best practices.
Version Control: Proficiency with Git for source code management and collaboration (commits, branching, merging, GitHub/GitLab workflows).
Unix/Linux: Strong command-line skills in Unix-like environments.
SQL: Solid understanding of SQL for data ingestion and analysis.
Collaborative Development: Comfortable with code reviews, pair programming, and using remote collaboration tools effectively.
Engineering Mindset: Writes code with an eye for maintainability and testability; excited to build production-grade software.
Education: Bachelor's or graduate degree in Computer Science, Data Analytics, or a related field, or equivalent work experience.
