

Ventures Unlimited Inc
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer based in Lansing, MI, on a 12+ month contract. Key skills include AWS, Python or Scala, and ETL processes, with 12+ years of experience in database systems. The hybrid work model requires onsite presence 2 days per week.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 10, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Lansing, MI
-
🧠 - Skills detailed
#AWS (Amazon Web Services) #Leadership #Cloud #Azure DevOps #Compliance #Python #Oracle #Agile #Data Pipeline #Scala #Visualization #Databases #DevOps #Azure #Data Processing #Data Engineering #Programming #Database Systems #Databricks #ETL (Extract, Transform, Load) #Documentation #Elasticsearch #Data Integrity #Security
Role description
Position Details:
• Job Title: Programmer Analyst 6 – Data Engineer
• Location: Lansing, MI
• Work Model: Hybrid (Onsite 2 days per week – Required)
• Interview Process: Candidates must be willing to attend the interview ONSITE in Lansing, MI
• Duration: 12+ Months with possible extension
The selected candidate will act as a technical lead, contributing to system enhancement, maintenance, and integration while ensuring compliance, security, and reliable data processing.
Key Responsibilities:
• Lead design and development of scalable, high-performance solutions using AWS services.
• Develop and maintain ETL processes and data pipelines.
• Write clean, efficient, and maintainable code using Python or Scala.
• Implement and manage Elasticsearch for efficient data retrieval and analytics.
• Design logical and physical database models and develop database objects (stored procedures, functions, etc.).
• Work across the full Software Development Life Cycle (SDLC) including requirement gathering and documentation.
• Collaborate with stakeholders to review and approve software design documentation.
• Ensure application security, data integrity, and system stability.
• Work within Agile development environments and contribute to large enterprise projects.
• Provide technical leadership and guidance to development teams.
Required Skills & Experience:
• 12+ years of experience developing complex database systems.
• 8+ years of experience with Databricks.
• 8+ years of experience with Elasticsearch and Kibana.
• 8+ years of programming experience with Python or Scala.
• 8+ years of experience with Oracle databases.
• 5+ years of experience in ETL processes and Data Pipeline development.
• 5+ years of experience working with AWS cloud services.
• 5+ years of experience with data warehousing, data visualization tools, and data integrity practices.
• 5+ years of experience using CMM/CMMI Level 3 methods and practices.
• 5+ years of experience implementing Agile development and Test-Driven Development (TDD).
Nice to Have:
• 3+ years of experience creating CI/CD pipelines using Azure DevOps.
• Experience with Electronic Health Records (EHR) and HL7 integrations.
