TalentBurst, an Inc 5000 company

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in Boston, MA (Hybrid), offering a long-term contract at 37.5 hours per week. Requires 3–5+ years in data engineering, expertise in Snowflake and Informatica, and knowledge of healthcare data standards.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
🗓️ - Date
October 30, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Boston, MA 02109
-
🧠 - Skills detailed
#Computer Science #Cloud #JSON (JavaScript Object Notation) #XML (eXtensible Markup Language) #Python #GCP (Google Cloud Platform) #Data Analysis #Scala #ETL (Extract, Transform, Load) #Data Accuracy #Scripting #Snowflake #Data Management #Automation #Security #Storage #FHIR (Fast Healthcare Interoperability Resources) #Data Governance #Data Pipeline #Deployment #Informatica #Data Engineering #GIT #Data Science #Azure #Documentation #Data Ingestion #Informatica IDQ (Informatica Data Quality) #S3 (Amazon Simple Storage Service) #IICS (Informatica Intelligent Cloud Services) #Data Catalog #Data Quality #AWS (Amazon Web Services) #Metadata #Agile #SQL (Structured Query Language) #Informatica PowerCenter #Data Security #Scrum
Role description
Data Engineer | Boston, MA (Hybrid) | Long-term contract | 37.5 hours per week

The client is seeking to hire an experienced Data Engineer to work with the EHS IT team supporting the Department of Mental Health and the Department of Public Health Hospitals. The Data Engineer will be responsible for developing, maintaining, and optimizing data pipelines and integration processes that support analytics, reporting, and business operations. This role focuses on designing and implementing scalable data solutions using Snowflake and Informatica, ensuring that data is accurate, reliable, and accessible to key stakeholders across the organization.

Key Responsibilities:
- Design, build, and maintain ETL/ELT pipelines using Informatica to move and transform data from various source systems into Snowflake.
- Ensure data handling and storage comply with HIPAA, HITECH, and organizational privacy/security standards.
- Support the development of data models, schemas, and views in Snowflake to enable efficient data querying and analytics.
- Support interoperability and data exchange initiatives using healthcare standards such as HL7, FHIR, and X12 EDI.
- Implement data ingestion, transformation, and quality processes to ensure consistent, trusted data across environments.
- Monitor and troubleshoot data pipelines to ensure high performance and reliability.
- Collaborate with data analysts, data scientists, other data engineers, and business users to understand data requirements and deliver solutions.
- Develop and maintain documentation for data flows, metadata, and transformation logic.
- Assist with data security, access control, and governance within Snowflake and Informatica.
- Participate in testing, deployment, and release management for new data workflows and enhancements.

Required Qualifications:
- 3–5+ years of experience in data engineering or ETL development.
- Hands-on experience with the Snowflake Cloud Data Platform and Informatica PowerCenter or IICS.
- Proficiency in SQL and a strong understanding of data warehousing concepts.
- Experience integrating structured and semi-structured data (e.g., JSON, XML, CSV).
- Familiarity with cloud platforms (AWS, Azure, or GCP) and storage services (e.g., S3, Blob Storage).
- Understanding of data governance, data quality, and metadata management principles.

Preferred Knowledge, Skills & Abilities:
- Experience with Python for scripting and automation.
- Knowledge of Informatica Data Quality (IDQ) or Data Catalog tools.
- Exposure to CI/CD pipelines, Git, and Agile/Scrum environments.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration with cross-functional teams.
- Detail-oriented, with a focus on data accuracy and process improvement.
- Eager to learn new tools and technologies in a fast-paced data environment.

Education and Certification:
- Bachelor's degree in Computer Science, Information Systems, Data Engineering, Health Informatics, or a related field, or equivalent experience.
- Snowflake or Informatica certifications are a plus.

#TB_EN