Hope Tech

Sr Data Analyst (ETL, Data Warehousing)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr Data Analyst (ETL, Data Warehousing) in Minneapolis, MN, on a contract basis for more than 6 months, at a day rate of $422. Key skills include 5+ years in data engineering, Azure Data Factory, and ETL tools, with a focus on healthcare data.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
422
-
🗓️ - Date
November 6, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Minneapolis, MN
-
🧠 - Skills detailed
#Shell Scripting #Data Warehouse #Mathematics #SQL Server #StreamSets #Cloud #Informatica PowerCenter #REST (Representational State Transfer) #Snowflake #GitHub #SQL (Structured Query Language) #Azure #Consulting #Oracle #SSIS (SQL Server Integration Services) #Data Engineering #Azure Data Factory #ETL (Extract, Transform, Load) #Security #Data Modeling #Data Pipeline #Informatica #Computer Science #Data Ingestion #ADF (Azure Data Factory) #Deployment #Scripting #IICS (Informatica Intelligent Cloud Services) #FHIR (Fast Healthcare Interoperability Resources) #Cybersecurity #Batch #Storage #Data Integration #SnowPipe #Data Quality #Statistics #Data Analysis #Python #Databases
Role description
Job Title: Sr Data Analyst (ETL, Data Warehousing)
Location: Minneapolis, MN
Employment Type: Contract

About Us:
DMV IT Service LLC, founded in 2020, is a trusted IT consulting firm specializing in IT infrastructure optimization, cybersecurity, networking, and staffing solutions. We partner with clients to achieve their technology goals through expert guidance, workforce support, and innovative solutions. With a client-focused approach, we also provide online training and job placements, ensuring long-term IT success.

Job Purpose:
The Senior Data Analyst is responsible for designing, building, and maintaining efficient ETL pipelines, transforming data for analytics, and ensuring the integrity and reliability of data flows. This role requires close collaboration with ETL architects, data engineers, and business stakeholders to analyze requirements, implement solutions, and troubleshoot production issues.

Key Responsibilities:
- Develop and maintain strong working relationships with other departments and technical teams to coordinate data initiatives.
- Communicate effectively with ETL architects to understand requirements and business processes for data transformation.
- Assist in ETL design and architecture, providing input on implementing and automating ETL workflows.
- Investigate and analyze data to identify issues in ETL pipelines, notify end users, and propose solutions.
- Build and manage ETL pipelines and data flows using Azure Data Factory (ADF) and Snowflake.
- Design idempotent ETL processes so that interrupted or failed workflows can be safely rerun without errors (a minimal sketch appears at the end of this posting).
- Work with Snowflake Virtual Warehouses and automate data ingestion using Snowpipe.
- Manage data versioning and change capture using tools like StreamSets, and schedule jobs with Snowflake Tasks.
- Optimize data movement and storage for performance improvements and accelerated response times.
- Build orchestration frameworks to schedule jobs, manage dependencies, perform data quality checks, and execute workflows.
- Test ETL pipelines, data flows, and system code; perform root cause analysis on production issues.
- Document implementations, test cases, and deployment processes for CI/CD workflows.

Required Skills & Experience:
- 5+ years of data engineering experience, with a strong focus on data warehousing.
- 2+ years of experience building pipelines in Azure Data Factory (ADF).
- 5+ years developing ETL using Informatica PowerCenter, SSIS, ADF, or similar tools.
- 5+ years working with relational databases such as Oracle, Snowflake, or SQL Server.
- 3+ years creating stored procedures using Oracle PL/SQL, SQL Server T-SQL, or Snowflake SQL.
- 2+ years of experience with source control systems such as GitHub or SVN.
- 2+ years processing structured and unstructured data, including HL7 and FHIR formats.
- 3+ years analyzing project requirements and creating detailed ETL specifications.
- Strong analytical and problem-solving skills, with the ability to troubleshoot and optimize data pipelines.
- Ability to adapt to evolving technologies and changing business requirements.
- Bachelor's or advanced degree in Information Technology, Computer Science, Analytics, Mathematics, Statistics, or a related field.

Preferred Skills & Attributes:
- 2+ years of experience with batch or PowerShell scripting.
- 2+ years of Python scripting experience.
- 3+ years of data modeling experience in a data warehouse environment.
- Experience with Informatica Intelligent Cloud Services (Data Integration).
- Experience designing and building APIs in Snowflake and ADF (REST, RPC).
- Experience with Healthcare/Medicaid/Medicare applications.
- Azure certifications related to data engineering or analytics.
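
For illustration only, and not part of the client's requirements: a minimal Snowflake SQL sketch of the idempotent ingestion pattern described under Key Responsibilities, using Snowpipe for continuous loading and a scheduled Task whose MERGE keeps reruns from creating duplicates. All object names (raw.claims_pipe, raw.claims_landing, core.claims, etl_wh) are hypothetical, and the landing table is assumed to hold a single VARIANT column, v, as a default JSON COPY target does.

    -- Continuous ingestion: Snowpipe copies newly staged JSON files into a landing table.
    CREATE PIPE IF NOT EXISTS raw.claims_pipe
      AUTO_INGEST = TRUE
    AS
    COPY INTO raw.claims_landing
    FROM @raw.claims_stage
    FILE_FORMAT = (TYPE = 'JSON');

    -- Idempotent transform: a scheduled Task merges landing rows into the target table,
    -- keyed on claim_id, so a rerun after a failure updates rows instead of duplicating them.
    CREATE TASK IF NOT EXISTS raw.merge_claims
      WAREHOUSE = etl_wh
      SCHEDULE = '15 MINUTE'
    AS
    MERGE INTO core.claims AS t
    USING (
        -- Deduplicate the landing data so each claim_id matches at most one source row.
        SELECT v:claim_id::STRING     AS claim_id,
               v:amount::NUMBER(12,2) AS amount,
               v:status::STRING       AS status
        FROM raw.claims_landing
        QUALIFY ROW_NUMBER() OVER (
            PARTITION BY v:claim_id
            ORDER BY v:updated_at::TIMESTAMP DESC) = 1
    ) AS s
    ON t.claim_id = s.claim_id
    WHEN MATCHED THEN UPDATE SET t.amount = s.amount, t.status = s.status
    WHEN NOT MATCHED THEN INSERT (claim_id, amount, status)
                          VALUES (s.claim_id, s.amount, s.status);

    -- Tasks are created suspended; resume to start the schedule.
    ALTER TASK raw.merge_claims RESUME;

The MERGE, rather than a plain INSERT, is what makes the rerun safe: replaying the same landing rows converges on the same target state instead of appending duplicates.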