Horizontal Talent

Sr Data Analyst

⭐ - Featured Role | Apply direct with Data Freelance Hub
This 100% remote Sr Data Analyst role has an unspecified contract length and pay rate. Key requirements include 5+ years of data engineering experience, Azure Data Factory, ETL tools, and healthcare data standards.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 5, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Minneapolis, MN
-
🧠 - Skills detailed
#Batch #Snowflake #AI (Artificial Intelligence) #Oracle #Shell Scripting #SQL (Structured Query Language) #SQL Server #Scripting #Informatica PowerCenter #ETL (Extract, Transform, Load) #Data Integrity #Python #GitHub #SSIS (SQL Server Integration Services) #Version Control #Data Quality #ADF (Azure Data Factory) #Deployment #Data Analysis #Azure Data Factory #Informatica #FHIR (Fast Healthcare Interoperability Resources) #Azure #Data Engineering #Data Warehouse #Databases #Data Modeling #Data Processing
Role description
We are seeking a talented and motivated Sr Data Analyst to join our dynamic team. This role offers the opportunity to work 100% remotely while contributing to impactful data projects in the healthcare sector.

Responsibilities
• Develop and maintain effective working relationships with various departments to ensure seamless collaboration.
• Communicate effectively with ETL architects to understand requirements and business processes for data transformation.
• Assist in designing, implementing, and automating ETL flows to optimize data processing.
• Investigate and resolve data quality issues within ETL pipelines, providing solutions to end-users (see the illustrative sketch at the end of this posting).
• Build and manage ETL pipelines using Azure Data Factory and Snowflake toolsets.
• Test ETL system code and conduct root cause analysis on production issues to ensure data integrity.
• Document implementations and create deployment documents for CI/CD processes.

Skills
• 5+ years of experience in data engineering, specifically in data warehousing.
• Proficient in creating pipelines using Azure Data Factory (ADF).
• Extensive experience with ETL tools such as Informatica PowerCenter and SSIS.
• Strong knowledge of relational databases including Oracle, Snowflake, and SQL Server.
• Experience in writing stored procedures using PL/SQL, T-SQL, or Snowflake SQL.
• Proficient in using version control systems such as GitHub or SVN.

Preferred Skills
• Experience with batch or PowerShell scripting and with Python scripting.
• Familiarity with data modeling in a data warehouse environment.
• Knowledge of designing APIs in Snowflake and ADF.
• Experience with healthcare data standards such as HL7 and FHIR.

We are committed to fostering a diverse, equitable, and inclusive workplace where all individuals feel valued and empowered to contribute their unique perspectives. Once you apply for this position, you may receive a phone call, SMS, or email from our Virtual AI Recruiter, Alex, to conduct an initial interview.
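To illustrate the kind of data quality work called out in the responsibilities, below is a minimal Python sketch of a post-load check against a Snowflake table. It assumes the snowflake-connector-python package and uses hypothetical warehouse, database, table, and column names (ANALYTICS_WH, HEALTHCARE_DW, CLAIMS, MEMBER_ID, CLAIM_DATE); it is not the employer's actual pipeline or tooling.

```python
# Illustrative sketch only: a simple post-ETL data quality check.
# Assumes snowflake-connector-python; all names below are hypothetical.
import os

import snowflake.connector


def check_claims_load(min_expected_rows: int = 1) -> list[str]:
    """Run row-count and null checks on a staged table; return issues found."""
    issues: list[str] = []
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",   # hypothetical warehouse
        database="HEALTHCARE_DW",   # hypothetical database
        schema="STAGING",           # hypothetical schema
    )
    try:
        cur = conn.cursor()

        # 1. The load should have produced at least some rows.
        cur.execute("SELECT COUNT(*) FROM CLAIMS")
        (row_count,) = cur.fetchone()
        if row_count < min_expected_rows:
            issues.append(f"CLAIMS row count too low: {row_count}")

        # 2. Key identifier columns should not be null after the ETL step.
        cur.execute(
            "SELECT COUNT(*) FROM CLAIMS "
            "WHERE MEMBER_ID IS NULL OR CLAIM_DATE IS NULL"
        )
        (null_count,) = cur.fetchone()
        if null_count > 0:
            issues.append(
                f"{null_count} CLAIMS rows missing MEMBER_ID or CLAIM_DATE"
            )
    finally:
        conn.close()
    return issues


if __name__ == "__main__":
    for issue in check_claims_load():
        print("Data quality issue:", issue)
```

In practice, checks like these would typically run as a validation step after an Azure Data Factory pipeline completes, with any reported issues fed into root cause analysis.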