GSK Solutions Inc.

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in Lansing, MI, with a 12-month contract at $80-$85/hr. Key skills include AWS, Databricks, ETL processes, and Python/Scala. Candidates must have experience in Electronic Health Records (EHR) and data warehousing.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
680
-
🗓️ - Date
March 10, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Lansing, MI 48933
-
🧠 - Skills detailed
#S3 (Amazon Simple Storage Service) #AWS (Amazon Web Services) #Cloud #Azure DevOps #Python #Oracle #Agile #Data Pipeline #Scala #Visualization #DevOps #Azure #GIT #Data Engineering #Database Systems #Databricks #ETL (Extract, Transform, Load) #Documentation #Elasticsearch #Data Integrity
Role description
Job Title: Senior Data Engineer (Hybrid Onsite)
Location: Lansing, MI
Duration: 12 months
Interview: Candidates submitted must be willing to come onsite (Lansing, MI) for interviews.
Pay Rate: $85/hr on C2C / 1099 OR $80/hr on W2

Note: Hybrid position; onsite 2 days per week REQUIRED. Open to local candidates and to non-local candidates willing to relocate. No remote-only option. Repost of posting ID 155145, updated to accept local or non-local candidates. Please do not resubmit candidates already reviewed by the manager.

Job Description

Position Summary:
- Lead the design and development of scalable, high-performance solutions using AWS services.
- Experience with Databricks, Elasticsearch, Kibana, and S3.
- Experience with Extract, Transform, and Load (ETL) processes and data pipelines.
- Write clean, maintainable, and efficient code in Python/Scala.
- Experience with AWS cloud-based application development.
- Experience with Electronic Health Records (EHR) HL7 solutions.
- Implement and manage an Elasticsearch engine for efficient data retrieval and analysis.
- Experience with data warehousing, data visualization tools, and data integrity.
- Execute the full software development life cycle (SDLC), including gathering requirements and writing functional/technical specifications for complex projects.
- Excellent knowledge of designing both logical and physical database models.
- Develop database objects, including stored procedures and functions.
- Extensive knowledge of source control tools such as Git.
- Develop software design documents and work with stakeholders for review and approval.
- Exposure to flowcharts, screen layouts, and documentation to ensure logical flow of system requirements.
- Experience working on large agile projects.
- Experience or knowledge of creating CI/CD pipelines using Azure DevOps.

Skill Description:
- 12+ years developing complex database systems.
- 8+ years with Databricks.
- 8+ years using Elasticsearch and Kibana.
- 8+ years using Python/Scala.
- 8+ years with Oracle.
- 5+ years experience with Extract, Transform, and Load (ETL) processes and developing data pipelines.
- 5+ years experience with AWS.
- 5+ years experience with data warehousing, data visualization tools, and data integrity.
- 5+ years using CMM/CMMI Level 3 methods and practices.
- 5+ years implementing agile development processes, including test-driven development.
- 3+ years experience or knowledge of creating CI/CD pipelines using Azure DevOps (nice to have).

#GSKIT

Pay: $80.00 - $85.00 per hour
Expected hours: 40.0 per week
Benefits: Health insurance
Experience:
- Databricks: 3 years (Required)
- Elasticsearch: 5 years (Required)
- Python/Scala: 5 years (Required)
- Oracle: 5 years (Required)
- AWS: 5 years (Required)
Work Location: Hybrid remote in Lansing, MI 48933