Sr. Data Engineer (Remote), Only W2; No OPT or H1B

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. Data Engineer (Remote) with a 5+ year focus on Data Warehousing, 3+ years in Azure Data Factory and Databricks, and strong ETL development skills. Requires relevant degree and experience with HL7/FHIR standards.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
September 30, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Remote
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#ADF (Azure Data Factory) #Snowflake #DevOps #FHIR (Fast Healthcare Interoperability Resources) #Batch #SQL Server #Computer Science #Data Pipeline #GitHub #Azure DevOps #Data Warehouse #Azure Data Factory #Databricks #Data Engineering #Shell Scripting #REST (Representational State Transfer) #Informatica #Python #Statistics #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Mathematics #Spark (Apache Spark) #SSIS (SQL Server Integration Services) #Scripting #Azure #Oracle #Databases #Informatica PowerCenter #Data Modeling
Role description
Hi all, this is an immediate-close position. Please share your resumes at praveen.vasala@veritis.com

Job Title: Sr. Data Engineer (Remote), Only W2; No OPT or H1B
Location: Remote

Requirements:
• 5+ years of data engineering experience with a focus on data warehousing
• 3+ years of experience creating pipelines with Azure Data Factory (ADF) and Databricks
• 5+ years developing ETL using Informatica PowerCenter, SSIS, Azure Data Factory, or similar tools
• 5+ years of experience with relational databases such as Oracle, Snowflake, SQL Server, etc.
• 5+ years of experience creating stored procedures with Oracle PL/SQL, SQL Server T-SQL, or Snowflake SQL
• 3+ years of experience with Azure DevOps, GitHub, SVN, or similar source control systems
• 2+ years of experience processing structured and unstructured data
• Experience with HL7 and FHIR standards, and with processing files in these formats
• 3+ years analyzing project requirements and developing detailed specifications for ETL requirements
• Excellent problem-solving and analytical skills, with the ability to troubleshoot and optimize data pipelines
• Ability to adapt to evolving technologies and changing business requirements
• Bachelor's or advanced degree in a related field such as Information Technology/Computer Science, Mathematics/Statistics, Analytics, or Business

Preferred skills:
• 3+ years of batch or PowerShell scripting
• 3+ years of experience with Python scripting
• 3+ years of data modeling experience in a data warehouse environment
• Experience or familiarity with Spark and Azure Data Fabric
• Experience designing and building APIs in Snowflake and ADF (e.g., REST, RPC)
• Experience with state Medicaid / Medicare / healthcare applications
• Azure certifications related to data engineering or data analytics
Ideal background: strong technical experience in ADF and Snowflake

Top skills needed:
• 5+ years of data engineering experience with a focus on data warehousing
• 3+ years of experience creating pipelines in Azure Data Factory (ADF)
• 5+ years developing ETL using Informatica PowerCenter, SSIS, Azure Data Factory, or similar tools
• 5+ years of experience with relational databases such as Oracle, Snowflake, SQL Server, etc.

Regards,
Praveen Vasala
Email ID: praveen.vasala@veritis.com
Contact: +1 (469) 649 1786 Ext: 122