

Jobs via Dice
Data Engineer, Location: Lansing, MI (Hybrid – 2 days a week), Duration: 12+ months contract
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer in Lansing, MI (Hybrid – 2 days a week) for a 12+ month contract, offering competitive pay. Requires 12+ years of experience, proficiency in AWS, Databricks, ETL processes, and Electronic Health Records (EHR) solutions.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 7, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Lansing, MI
-
🧠 - Skills detailed
#S3 (Amazon Simple Storage Service) #Database Systems #ETL (Extract, Transform, Load) #Oracle #Python #Visualization #Azure DevOps #Cloud #Agile #Azure #Scala #Data Engineering #AWS (Amazon Web Services) #Databricks #Data Pipeline #DevOps #Documentation #Data Integrity #Git
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Infomerica, Inc, is seeking the following. Apply via Dice today!
Hi,
Please find the role below and let us know your interest.
Role: Data Engineer-155145.
Location: Lansing, MI (Hybrid – 2 days a week)
NEED ONLY MI LOCALS
Experience: 12+ Years
Duration: 12+ Months contract
Interview Process: Interviews will be held in-person. Candidates MUST be available for an in-person interview.
JOB DESCRIPTION:
• Lead the design and development of scalable, high-performance solutions using AWS services.
• Experience with Databricks, Elasticsearch, Kibana, and S3.
• Experience with Extract, Transform, and Load (ETL) processes and data pipelines.
• Write clean, maintainable, and efficient code in Python/Scala.
• Experience with AWS cloud-based application development.
• Experience with Electronic Health Records (EHR) and HL7 solutions.
• Implement and manage an Elasticsearch engine for efficient data retrieval and analysis.
• Experience with data warehousing, data visualization tools, and data integrity.
• Execute the full software development life cycle (SDLC), including gathering requirements and writing functional/technical specifications for complex projects.
• Excellent knowledge of designing both logical and physical database models.
• Develop database objects, including stored procedures and functions.
• Extensive knowledge of source control tools such as Git.
• Develop software design documents and work with stakeholders for review and approval.
• Exposure to flowcharts, screen layouts, and documentation to ensure a logical flow of system requirements.
• Experience working on large agile projects.
• Experience with or knowledge of creating CI/CD pipelines using Azure DevOps.
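To illustrate the kind of work the bullets above describe, here is a minimal sketch of one ETL step in this stack: parse raw JSON records (as they might arrive from S3), normalize them, and produce documents ready for indexing into Elasticsearch. All field names and the sample record are hypothetical, not taken from the posting.

```python
import json

def transform(record: dict) -> dict:
    """Normalize one raw EHR-style record into an index-ready document.
    Field names here are illustrative assumptions only."""
    return {
        "patient_id": str(record["id"]).strip(),
        "name": record.get("name", "").title(),
        "visit_date": record.get("visit_date"),  # assume ISO-8601 upstream
    }

def run_pipeline(raw_lines):
    """Extract (parse JSON lines) and Transform; yields Load-ready docs."""
    for line in raw_lines:
        yield transform(json.loads(line))

# In a real pipeline the extract step would read objects from S3 (e.g. via
# boto3) and the load step would bulk-index into Elasticsearch; both are
# stubbed out here so the sketch stays self-contained.
docs = list(run_pipeline(['{"id": 7, "name": "jane doe", "visit_date": "2026-02-07"}']))
print(docs[0]["name"])  # → Jane Doe
```

Keeping the transform a pure function over plain dicts, as above, makes the step easy to unit-test independently of the S3 and Elasticsearch endpoints.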
Skills:
• 12+ years developing complex database systems.
• 8+ years with Databricks.
• 8+ years using Elasticsearch and Kibana.
• 8+ years using Python/Scala.
• 8+ years with Oracle.
• 5+ years of experience with Extract, Transform, and Load (ETL) processes and developing data pipelines.
• 5+ years of experience with AWS.
• 5+ years of experience with data warehousing, data visualization tools, and data integrity.
• 5+ years using CMM/CMMI Level 3 methods and practices.
• 5+ years implementing agile development processes, including test-driven development.
• 3+ years of experience with or knowledge of creating CI/CD pipelines using Azure DevOps (nice to have).





