

Ventures Unlimited Inc
Senior Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Senior Data Engineer position on a 12-month hybrid contract in Lansing, MI, offering competitive pay. Key requirements include 12+ years developing complex database systems, 8+ years each with Databricks, ElasticSearch, and Python/Scala, and 5+ years with AWS.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 7, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Lansing, MI
-
🧠 - Skills detailed
#S3 (Amazon Simple Storage Service) #Database Systems #ETL (Extract, Transform, Load) #Compliance #Requirements Gathering #Oracle #Elasticsearch #Python #Visualization #Azure DevOps #Security #Agile #Azure #Scala #Data Engineering #AWS (Amazon Web Services) #Databricks #Data Pipeline #DevOps #Documentation #Data Integrity #Leadership
Role description
Job Title: Programmer Analyst 6 – Data Engineer | W2-CONTRACT ROLE
Location: Lansing, MI
Work Mode: Hybrid (Onsite 2 days/week – REQUIRED)
Duration: 12 months (with possible extension)
Key Responsibilities
• Lead design and development of scalable, high-performance solutions using AWS
• Develop and maintain ETL processes and data pipelines
• Work extensively with Databricks, ElasticSearch, Kibana, and S3 (see the illustrative pipeline sketch after this list)
• Write clean, efficient, and maintainable code using Python and/or Scala
• Ensure data integrity, security, and compliance (SEM/SUITE)
• Participate in full SDLC, including requirements gathering and technical documentation
• Design logical and physical database models and develop database objects
• Provide technical leadership and oversight to development teams
• Support CI/CD pipelines (Azure DevOps – nice to have)
• Work in an Agile environment with test-driven development practices
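For illustration only, below is a minimal PySpark sketch of the kind of pipeline these responsibilities describe: reading raw data from S3, applying a simple transformation, and indexing the result into Elasticsearch. The bucket path, host, and index name are hypothetical placeholders, and the Elasticsearch write assumes the elasticsearch-hadoop (elasticsearch-spark) connector library is attached to the Databricks cluster.

```python
# Illustrative ETL sketch: S3 -> transform -> Elasticsearch.
# All paths, hosts, and index names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

# On Databricks a `spark` session already exists; building one here keeps the
# sketch self-contained for local testing.
spark = SparkSession.builder.appName("s3-to-elasticsearch-etl").getOrCreate()

# Extract: read raw JSON event data from an S3 bucket (placeholder path).
raw = spark.read.json("s3a://example-bucket/raw/events/")

# Transform: keep valid records, normalize the timestamp, add a load date.
cleaned = (
    raw.filter(F.col("event_id").isNotNull())
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("load_date", F.current_date())
)

# Load: index into Elasticsearch via the elasticsearch-spark connector,
# assuming that library is available on the cluster.
(
    cleaned.write.format("org.elasticsearch.spark.sql")
        .option("es.nodes", "elasticsearch.example.internal")
        .option("es.port", "9200")
        .option("es.resource", "events-cleaned")
        .mode("append")
        .save()
)
```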
Required Skills & Experience
• 12+ years developing complex database systems
• 8+ years with Databricks
• 8+ years with ElasticSearch & Kibana
• 8+ years using Python and/or Scala
• 8+ years with Oracle
• 5+ years building ETL processes and data pipelines
• 5+ years with AWS
• 5+ years in data warehousing, data visualization, and data integrity
• 5+ years using CMM/CMMI Level 3 methods
• 5+ years working in Agile environments with TDD
• Nice to Have: 3+ years with CI/CD pipelines using Azure DevOps