

Global Business Ser. 4u
Sr Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr Data Engineer on a 6-12 month contract, fully remote, focusing on healthcare datasets. Requires 5+ years in data engineering, Azure Data Factory, ETL tools, and relational databases. Preferred skills include Python scripting and data modeling experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 8, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Batch #Informatica #REST (Representational State Transfer) #GitHub #Scripting #Snowflake #DevOps #Informatica PowerCenter #Spark (Apache Spark) #Statistics #Mathematics #Azure Data Factory #Security #SQL (Structured Query Language) #SSIS (SQL Server Integration Services) #Databases #Data Warehouse #Python #Datasets #EDW (Enterprise Data Warehouse) #Data Modeling #Data Engineering #ETL (Extract, Transform, Load) #ADF (Azure Data Factory) #Visualization #SQL Server #Data Pipeline #FHIR (Fast Healthcare Interoperability Resources) #Azure DevOps #Azure #Computer Science #Shell Scripting #Databricks #Oracle
Role description
Sr Data Engineer role
6-12 months contract role
Fully Remote
The Data Engineer will work with large healthcare datasets and will translate the client's business requirements into enterprise systems, applications, or process designs for large, complex health data solutions. The role will drive and support initiatives for the CT EDW and participate in the wider EDW group's areas of data usage and governance, information management, privacy and security, SOA, data analytics and visualization, and information modeling.
Requirements
• 5+ years of Data engineering experience with a focus on Data Warehousing
• 3+ years of experience creating pipelines in Azure Data Factory (ADF) and Databricks
• 5+ years developing ETL using Informatica PowerCenter, SSIS, Azure Data Factory, or similar tools.
• 5+ years of experience with Relational Databases, such as Oracle, Snowflake, SQL Server, etc.
• 5+ years of experience creating stored procedures with Oracle PL/SQL, SQL Server T-SQL, or Snowflake SQL
• 3+ years of experience with Azure DevOps, GitHub, SVN, or similar source control systems
• 2+ years of experience processing structured and unstructured data.
• Experience with HL7 and FHIR standards and processing files in these formats (a brief illustrative sketch follows this list).
• 3+ years analyzing project requirements and developing detailed ETL specifications.
• Excellent problem-solving and analytical skills, with the ability to troubleshoot and optimize data pipelines.
• Ability to adapt to evolving technologies and changing business requirements.
• Bachelor's or advanced degree in a related field such as Information Technology/Computer Science, Mathematics/Statistics, Analytics, or Business
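For illustration only, here is a minimal Python sketch of the kind of FHIR file handling referenced above, assuming bulk-export style newline-delimited JSON (NDJSON); the file name and the Patient fields pulled out are hypothetical and not taken from this posting.
```python
import json
from pathlib import Path

# Minimal sketch: read newline-delimited FHIR JSON (NDJSON) and collect Patient resources.
# The input path and the extracted fields are hypothetical examples, not from the posting.
def load_patients(ndjson_path: str) -> list[dict]:
    patients = []
    for line in Path(ndjson_path).read_text().splitlines():
        if not line.strip():
            continue
        resource = json.loads(line)
        if resource.get("resourceType") == "Patient":
            patients.append({
                "id": resource.get("id"),
                "birthDate": resource.get("birthDate"),
                "gender": resource.get("gender"),
            })
    return patients

if __name__ == "__main__":
    rows = load_patients("patients.ndjson")  # hypothetical input file
    print(f"Loaded {len(rows)} Patient resources")
```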
Preferred Skills
• 3+ years of batch or PowerShell scripting
• 3+ years of experience with Python scripting (a small scripting sketch follows this list).
• 3+ years of data modeling experience in a data warehouse environment
• Experience or familiarity with Spark and Azure Data Fabric.
• Experience designing and building APIs in Snowflake and ADF (e.g. REST, RPC)
• Experience with State Medicaid / Medicare / Healthcare applications
• Azure certifications related to data engineering or data analytics.
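To give a sense of the preferred scripting work, here is a minimal Python sketch of a batch-style task that reads a raw CSV extract and writes a cleaned copy for a warehouse staging load; the file names and column names are hypothetical, not from this posting.
```python
import csv

# Minimal scripting sketch: normalize a raw CSV extract into a cleaned staging file.
# File names and column names are hypothetical examples, not from the posting.
def clean_extract(src: str, dst: str) -> int:
    rows_written = 0
    with open(src, newline="") as fin, open(dst, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=["member_id", "service_date", "amount"])
        writer.writeheader()
        for row in reader:
            writer.writerow({
                "member_id": row["member_id"].strip(),
                "service_date": row["service_date"][:10],        # keep YYYY-MM-DD
                "amount": f'{float(row["amount"] or 0):.2f}',     # default missing amounts to 0.00
            })
            rows_written += 1
    return rows_written

if __name__ == "__main__":
    n = clean_extract("claims_raw.csv", "claims_clean.csv")  # hypothetical files
    print(f"Wrote {n} cleaned rows")
```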
Ideal background
• Strong technical experience in ADF and Snowflake
Top Skills Needed
• 5+ years of Data engineering experience with a focus on Data Warehousing
• 3+ years of experience creating pipelines in Azure Data Factory (ADF)
• 5+ years developing ETL using Informatica PowerCenter, SSIS, Azure Data Factory, or similar tools.
• 5+ years of experience with Relational Databases, such as Oracle, Snowflake, SQL Server, etc.
What experience will set candidates apart from one another?
Strong technical experience in ADF and Snowflake