RICEFW Technologies Inc

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer in Lansing, Michigan, offering a hybrid work mode. Contract length and pay rate are unspecified. Candidates must have extensive experience with AWS, Databricks, ETL processes, and Electronic Health Records (EHR) solutions.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 7, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Lansing, MI
-
🧠 - Skills detailed
#S3 (Amazon Simple Storage Service) #Database Systems #ETL (Extract, Transform, Load) #Oracle #Python #Visualization #Azure DevOps #Cloud #Agile #Azure #Scala #Data Engineering #AWS (Amazon Web Services) #Databricks #Data Pipeline #DevOps #Documentation #Data Integrity #Git
Role description
ONLY CANDIDATES LOCAL TO MICHIGAN CAN APPLY
Mode of work: Hybrid
Mode of interview: In-person
Location: Lansing, Michigan

Position Summary
• Lead the design and development of scalable, high-performance solutions using AWS services.
• Experience with Databricks, Elasticsearch, Kibana, and S3.
• Experience with Extract, Transform, Load (ETL) processes and data pipelines.
• Write clean, maintainable, and efficient code in Python/Scala.
• Experience with AWS cloud-based application development.
• Experience with Electronic Health Records (EHR) HL7 solutions.
• Implement and manage an Elasticsearch engine for efficient data retrieval and analysis.
• Experience with data warehousing, data visualization tools, and data integrity.
• Execute the full software development life cycle (SDLC), including gathering requirements and writing functional/technical specifications for complex projects.
• Excellent knowledge of designing both logical and physical database models.
• Develop database objects, including stored procedures and functions.
• Extensive knowledge of source control tools such as Git.
• Develop software design documents and work with stakeholders for review and approval.
• Create flowcharts, screen layouts, and documentation to ensure a logical flow of system requirements.
• Experience working on large agile projects.
• Experience with or knowledge of creating CI/CD pipelines using Azure DevOps.

Skill Descriptions
• 12+ years developing complex database systems.
• 8+ years with Databricks.
• 8+ years using Elasticsearch and Kibana.
• 8+ years using Python/Scala.
• 8+ years with Oracle.
• 5+ years of experience with Extract, Transform, Load (ETL) processes and developing data pipelines.
• 5+ years of experience with AWS.
• 5+ years of experience with data warehousing, data visualization tools, and data integrity.
• 5+ years using CMM/CMMI Level 3 methods and practices.
• 5+ years implementing agile development processes, including test-driven development.
• 3+ years of experience with or knowledge of creating CI/CD pipelines using Azure DevOps (nice to have).