Meduvi

Senior SQL Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior SQL Data Engineer on a long-term contract with a hybrid work arrangement. Requires 6–8+ years in SQL data engineering, ETL/ELT, and data warehousing. Must have expertise in Snowflake, Informatica, and healthcare data compliance.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
536
-
🗓️ - Date
February 6, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Boston, MA 02111
-
🧠 - Skills detailed
#Automation #Cloud #Documentation #Database Systems #JSON (JavaScript Object Notation) #Leadership #Data Modeling #Informatica #Data Management #DevOps #S3 (Amazon Simple Storage Service) #Agile #Data Warehouse #Scrum #Scala #Deployment #ETL (Extract, Transform, Load) #Storage #Data Governance #Metadata #Azure #Data Pipeline #Data Quality #Complex Queries #XML (eXtensible Markup Language) #Data Analysis #AWS (Amazon Web Services) #GCP (Google Cloud Platform) #Code Reviews #IICS (Informatica Intelligent Cloud Services) #Informatica PowerCenter #Snowflake #Security #FHIR (Fast Healthcare Interoperability Resources) #Data Engineering #SQL (Structured Query Language) #Data Science #Data Ingestion
Role description
Hours: 37.5 hours/week, M-F
Contract Duration: Long Term
Reporting Mode: Hybrid - 2 days onsite / 3 days remote per week

This role provides technical leadership in the design, development, optimization, and support of enterprise Business Applications, Data Warehouse solutions, and critical healthcare data processes and reporting platforms. The Senior Data Engineer owns end-to-end data solutions, leads complex data initiatives, and partners closely with business, analytics, and IT stakeholders to ensure secure, reliable, and high-quality data delivery in a regulated healthcare environment.

DETAILED LIST OF JOB DUTIES AND RESPONSIBILITIES
- Lead the design, development, and optimization of scalable ETL/ELT pipelines using Informatica to integrate data from diverse source systems into Snowflake.
- Provide technical ownership of Snowflake-based data warehouse solutions, including architecture, performance tuning, and cost optimization.
- Ensure data ingestion, storage, and processing fully comply with HIPAA, HITECH, and organizational privacy and security standards.
- Lead the development and evolution of enterprise data models, schemas, and analytics-ready views in Snowflake.
- Support and guide interoperability and data exchange initiatives using healthcare standards such as HL7, FHIR, and X12 EDI.
- Define and enforce data quality, validation, and reconciliation frameworks to ensure trusted and consistent data across environments.
- Proactively monitor, troubleshoot, and resolve pipeline performance, reliability, and data quality issues.
- Mentor and provide technical guidance to data engineers; conduct design and code reviews.
- Collaborate with data analysts, data scientists, architects, DevOps, and business stakeholders to translate requirements into scalable technical solutions.
- Lead documentation efforts for data flows, metadata, transformation logic, and operational procedures.
- Partner with security and governance teams to implement role-based access controls, data governance, and auditability within Snowflake and Informatica.
- Participate in and influence testing strategies, deployment automation, release management, and SDLC best practices.
- Contribute to tool selection, architectural standards, and continuous improvement initiatives across the data platform.

Required Qualifications
- 6–8+ years of experience in SQL-based data engineering, ETL/ELT development, or data warehousing.
- Demonstrated experience leading or owning complex data initiatives in enterprise environments.
- Hands-on expertise with:
   - Snowflake Cloud Data Platform, including advanced SQL development, data modeling, and performance optimization.
   - Informatica PowerCenter and/or Informatica Intelligent Cloud Services (IICS) for large-scale ETL/ELT design.
- Advanced SQL skills, including complex queries, procedures, and a deep understanding of data warehousing concepts.
- Strong experience designing, coding, testing, documenting, and maintaining database systems and data pipelines.
- Solid understanding of the SDLC, Agile/Scrum methodologies, and DevOps practices.
- Experience integrating structured and semi-structured data (JSON, XML, CSV).
- Experience working in cloud environments (AWS, Azure, or GCP) and with cloud storage platforms (e.g., S3, Blob Storage).
- Strong understanding of data governance, metadata management, and data quality frameworks.

Compensation Statement. Please see the pay rate within this job posting.

Employee Benefits Statement. Meduvi offers comprehensive medical health insurance (HMO/PPO), dental (PPO), a 401k, and weekly payroll with direct deposit.

EEO Statement. We welcome all applicants; qualified individuals will receive consideration for employment without regard to race, color, religion, national origin, sex, sexual orientation, gender identity, protected veteran status, or disability.