

Data Engineer - Hybrid - NO C2C
Title: Data Engineer
Location: Lansing, MI - Hybrid
Rate: $70/hr
Note: This is a contract on W2. THIS IS NOT OPEN TO C2C.
Summary
This resource will also serve as a technical lead, providing technical guidance to the other developers in the department. As a technical lead, the resource participates in a variety of analytical assignments covering the enhancement, integration, maintenance, and implementation of projects, and provides technical oversight to other developers on the team who support other critical applications. Without this resource on staff, the modernized system cannot be properly maintained, enhanced, and supported, which can lead to errors causing application outages, data integrity issues, and eventually incorrect processing and reporting of patient information.
Job Duties:
• Lead the design and development of scalable and high-performance solutions using AWS services.
• Experience with Databricks, Elasticsearch, Kibana, and S3.
• Experience with Extract, Transform, and Load (ETL) processes and data pipelines.
• Write clean, maintainable, and efficient code in Python/Scala.
• Experience with AWS cloud-based application development.
• Experience with Electronic Health Records (EHR) HL7 solutions.
• Implement and manage Elasticsearch for efficient data retrieval and analysis.
• Experience with data warehousing, data visualization tools, and data integrity.
• Execute full software development life cycle (SDLC) including experience in gathering requirements and writing functional/technical specifications for complex projects.
• Excellent knowledge of designing both logical and physical database models.
• Develop database objects, including stored procedures and functions.
• Extensive knowledge of source control tools such as Git.
• Develop software design documents and work with stakeholders for review and approval.
• Exposure to flowcharts, screen layouts, and documentation to ensure logical flow of the system requirements.
• Experience working on large Agile projects.
• Experience or knowledge of creating CI/CD pipelines using Azure DevOps.
Job Qualifications:
• 12+ years of experience developing complex database systems
• 8+ years of experience with Databricks
• 8+ years of experience using Elasticsearch and Kibana
• 8+ years of experience using Python/Scala
• 8+ years of experience in Oracle
• 5+ years of experience with Extract, Transform, and Load (ETL) processes, including developing data pipelines
• 5+ years of experience with AWS
• 5+ years of experience with data warehousing, data visualization tools, and data integrity
• 5+ years of experience using CMM/CMMI Level 3 methods and practices
• 5+ years of experience implementing Agile development processes, including test-driven development
• Experience designing both logical and physical database models
• Exposure to flowcharts, screen layouts, and documentation to ensure logical flow of the system requirements
• Experience working on large, Agile projects
• 3+ years of experience in creating CI/CD pipelines using Azure DevOps preferred
• A minimum of a Bachelor’s Degree in Information Technology