

InfoStride
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Data Engineer position for 12 months, hybrid in Lansing, MI, with a pay rate of "unknown." Key skills include AWS, Databricks, ETL processes, and EHR (HL7) experience. Requires 12+ years with database systems and 8+ years with relevant technologies.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 7, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Lansing, MI
-
🧠 - Skills detailed
#S3 (Amazon Simple Storage Service) #Database Systems #ETL (Extract, Transform, Load) #Oracle #Python #Visualization #Azure DevOps #Cloud #Agile #Azure #Scala #Data Engineering #AWS (Amazon Web Services) #Databricks #Data Pipeline #DevOps #Documentation #Data Integrity #Git
Role description
Job Title: Programmer Analyst 6 – Data Engineer
Location: 235 S Grand Ave, Lansing, MI 48933
Work Model: Hybrid (Onsite 2 days/week – REQUIRED)
Duration: 12 Months (Possible Extension)
Responsibilities
• Lead the design and development of scalable and high-performance solutions using AWS services.
• Experience with Databricks, Elasticsearch, Kibana, and S3.
• Experience with Extract, Transform, and Load (ETL) processes and data pipelines.
• Write clean, maintainable, and efficient code in Python/Scala.
• Experience with AWS cloud-based application development.
• Experience with Electronic Health Records (EHR) and HL7 solutions.
• Implement and manage an Elasticsearch engine for efficient data retrieval and analysis.
• Experience with data warehousing, data visualization tools, and data integrity.
• Execute full software development life cycle (SDLC) including experience in gathering requirements and writing functional/technical specifications for complex projects.
• Excellent knowledge of designing both logical and physical database models.
• Develop database objects, including stored procedures and functions.
• Extensive knowledge of source control tools such as Git.
• Develop software design documents and work with stakeholders for review and approval.
• Create flowcharts, screen layouts, and documentation to ensure a logical flow of system requirements.
• Experience working on large agile projects.
• Experience with or knowledge of creating CI/CD pipelines using Azure DevOps.
Required Skills
• 12+ years developing complex database systems.
• 8+ years Databricks.
• 8+ years using Elasticsearch and Kibana.
• 8+ years using Python/Scala.
• 8+ years Oracle.
• 5+ years of experience with Extract, Transform, and Load (ETL) processes and developing Data Pipelines.
• 5+ years of experience with AWS.
• 5+ years of experience with data warehousing, data visualization tools, and data integrity.
• 5+ years using CMM/CMMI Level 3 methods and practices.
• 5+ years implementing agile development processes, including test-driven development.
• 3+ years of experience creating CI/CD pipelines using Azure DevOps (nice to have).





