Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with a contract length of "X months" at a pay rate of "$X/hour". Key skills include AWS, Databricks, ETL processes, and EHR HL7 solutions. Requires 12+ years in database systems and 8+ years in relevant technologies.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 24, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Lansing, MI
🧠 - Skills detailed
#Database Systems #GIT #Scala #Azure DevOps #Data Pipeline #Visualization #Agile #S3 (Amazon Simple Storage Service) #AWS (Amazon Web Services) #Cloud #Databricks #Python #DevOps #Data Integrity #Oracle #Azure #Data Engineering #Documentation #ETL (Extract, Transform, Load)
Role description

Lead the design and development of scalable and high-performance solutions using AWS services.

   • Experience with Databricks, Elasticsearch, Kibana, and S3.

   • Experience with Extract, Transform, and Load (ETL) processes and data pipelines.

   • Write clean, maintainable, and efficient code in Python/Scala.

   • Experience with AWS cloud-based application development.

   • Experience in Electronic Health Records (EHR) HL7 solutions.

   • Implement and manage an Elasticsearch engine for efficient data retrieval and analysis.

   • Experience with data warehousing, data visualization tools, and data integrity.

   • Execute full software development life cycle (SDLC) including experience in gathering requirements and writing functional/technical specifications for complex projects.

   • Excellent knowledge of designing both logical and physical database models.

   • Develop database objects, including stored procedures and functions.

   • Extensive knowledge of source control tools such as Git.

   • Develop software design documents and work with stakeholders for review and approval.

   • Exposure to flowcharts, screen layouts, and documentation to ensure the logical flow of system requirements.

   • Experience working on large agile projects.

   • Experience with or knowledge of creating CI/CD pipelines using Azure DevOps.
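The ETL responsibilities above can be sketched as a minimal, self-contained Python pipeline. This is an illustrative example only, not part of the role: plain stdlib is used, with an in-memory CSV standing in for an S3 object and SQLite standing in for the warehouse; the `vitals` schema and field names are invented for the sketch.

```python
import csv
import io
import sqlite3

# Extract: read raw records (an in-memory CSV stands in for an S3 object).
RAW = """patient_id,visit_date,heart_rate
p001,2025-04-01,72
p002,2025-04-01,
p003,2025-04-02,88
"""

def extract(raw: str) -> list[dict]:
    """Parse the raw CSV into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Drop incomplete rows and cast types, preserving data integrity."""
    return [
        (r["patient_id"], r["visit_date"], int(r["heart_rate"]))
        for r in rows
        if r["heart_rate"]  # skip rows with a missing reading
    ]

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Write cleaned rows into a SQLite table and return the row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS vitals "
        "(patient_id TEXT, visit_date TEXT, heart_rate INTEGER)"
    )
    conn.executemany("INSERT INTO vitals VALUES (?, ?, ?)", rows)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM vitals").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW)), conn)
print(loaded)  # → 2 (the row with the missing heart_rate is filtered out)
```

In a production Databricks setting the same extract/transform/load stages would typically run as Spark jobs against S3 and a managed warehouse, but the structure of the pipeline is the same.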

Skills Needed:

   • 12+ years developing complex database systems.

   • 8+ years Databricks.

   • 8+ years using Elasticsearch and Kibana.

   • 8+ years using Python/Scala.

   • 8+ years Oracle.

   • 5+ years' experience with Extract, Transform, and Load (ETL) processes and developing data pipelines.

   • 5+ years experience with AWS.

   • 5+ years' experience with data warehousing, data visualization tools, and data integrity.

   • 5+ years using CMM/CMMI Level 3 methods and practices.

   • 5+ years implementing agile development processes, including test-driven development.

   • 3+ years' experience with or knowledge of creating CI/CD pipelines using Azure DevOps (nice to have).