BCforward

Senior Data Engineer with Data Vault 2.0

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with Data Vault 2.0, requiring 8–12 years of experience, including 4+ years with Data Vault. The contract is for 3 months (with strong possibility of extension), at $60/hr, fully remote, focusing on ETL/ELT and cloud platforms.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
480
🗓️ - Date
February 7, 2026
🕒 - Duration
3 to 6 months
🏝️ - Location
Remote
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Data Management #Talend #dbt (data build tool) #ETL (Extract, Transform, Load) #Informatica #MDM (Master Data Management) #Storage #GCP (Google Cloud Platform) #Synapse #Metadata #Deployment #Collibra #Vault #Data Governance #Cloud #Agile #Azure #Datasets #Scala #Data Engineering #AWS (Amazon Web Services) #Data Science #Databricks #Azure Data Factory #ADF (Azure Data Factory) #Data Quality #Batch #Snowflake #SQL (Structured Query Language) #Computer Science #Data Integration #Physical Data Model #Version Control #Leadership #GIT #Data Vault
Role description
Job Title: Senior Data Engineer with Data Vault 2.0
Location: Fully Remote
Working hours: PST (Pacific Time)
Expected Duration: 3-month contract with strong possibility of extension
Wage: $60/hr on W2

Job Description: Senior Data Engineer – Data Vault (8–12 Years)

Position Summary
We are seeking an experienced Senior Data Engineer with strong expertise in Data Vault 2.0 architecture to design, implement, and optimize enterprise data warehousing solutions. The ideal candidate will have 8–12 years of experience in data engineering, with at least 4–5 years working specifically with Data Vault methodologies.

Key Responsibilities

Data Vault Architecture & Modeling
• Design and implement Data Vault 2.0 models, including Hubs, Links, and Satellites (a minimal sketch follows this description).
• Translate business requirements into scalable Data Vault components.
• Develop conceptual, logical, and physical data models adhering to DV 2.0 standards.
• Work closely with business SMEs to ensure accurate business-key identification and model alignment.

Data Engineering & ETL/ELT Development
• Build and optimize ETL/ELT pipelines to load Data Vault layers using tools such as Azure Data Factory, dbt, Informatica, Talend, Databricks, or similar.
• Implement robust data integration pipelines for batch and real-time ingestion.
• Ensure data quality rules, metadata management, and lineage tracking within the platform.

Data Platform & Cloud
• Design and maintain cloud-based data warehousing platforms such as Azure Synapse and Snowflake.
• Optimize storage and compute performance for large-scale datasets.
• Work with CI/CD pipelines to automate deployments and version control.

Governance, Quality & Performance
• Implement data governance standards and best practices.
• Perform data validation, reconciliation, and quality checks.
• Monitor and tune system performance and pipeline efficiency.

Collaboration & Leadership
• Work cross-functionally with architects, data scientists, analysts, and business stakeholders.
• Provide technical leadership, mentoring, and guidance to junior engineers.
• Participate in sprint planning, technical design discussions, and architecture reviews.

Required Qualifications
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
• 8–12 years of total experience in data engineering or data warehousing.
• 4+ years of hands-on experience with Data Vault 2.0 (mandatory).
• Strong expertise in SQL, data modeling, and performance tuning.
• Deep understanding of ETL/ELT frameworks and pipeline orchestration.
• Experience with at least one cloud platform: Azure, AWS, or GCP.
• Proficiency with version control (Git), Agile methodologies, and CI/CD tools.

Preferred Qualifications
• Data Vault 2.0 Practitioner or Master certification.
• Experience with dbt, Databricks, or Synapse pipelines.
• Knowledge of data governance frameworks (e.g., Collibra, Purview).
• Familiarity with MDM, data quality, and metadata management tools.
• Experience in the Healthcare industry domain.
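For context on the Hub/Link/Satellite pattern this role centers on, here is a minimal illustrative sketch in generic SQL. All table, column, and staging names (hub_customer, sat_customer_details, stg_customer, etc.) are hypothetical examples, not part of the client's actual model, and the DDL is a simplified ANSI-style approximation rather than a platform-specific Synapse or Snowflake implementation.

```sql
-- Hypothetical Data Vault 2.0 sketch for a "customer places order" subject area.

-- Hub: one row per unique business key, insert-only.
CREATE TABLE hub_customer (
    hub_customer_hk  CHAR(32)     NOT NULL PRIMARY KEY,  -- e.g. MD5 of customer_bk
    customer_bk      VARCHAR(50)  NOT NULL,              -- business key from the source
    load_date        TIMESTAMP    NOT NULL,
    record_source    VARCHAR(100) NOT NULL
);

CREATE TABLE hub_order (
    hub_order_hk     CHAR(32)     NOT NULL PRIMARY KEY,
    order_bk         VARCHAR(50)  NOT NULL,
    load_date        TIMESTAMP    NOT NULL,
    record_source    VARCHAR(100) NOT NULL
);

-- Link: one row per unique relationship between hubs, insert-only.
CREATE TABLE link_customer_order (
    link_customer_order_hk CHAR(32)     NOT NULL PRIMARY KEY,  -- hash of both business keys
    hub_customer_hk        CHAR(32)     NOT NULL REFERENCES hub_customer (hub_customer_hk),
    hub_order_hk           CHAR(32)     NOT NULL REFERENCES hub_order (hub_order_hk),
    load_date              TIMESTAMP    NOT NULL,
    record_source          VARCHAR(100) NOT NULL
);

-- Satellite: descriptive attributes tracked over time, insert-only.
CREATE TABLE sat_customer_details (
    hub_customer_hk  CHAR(32)     NOT NULL REFERENCES hub_customer (hub_customer_hk),
    load_date        TIMESTAMP    NOT NULL,
    hash_diff        CHAR(32)     NOT NULL,  -- hash of all attributes, for change detection
    customer_name    VARCHAR(200),
    customer_region  VARCHAR(50),
    record_source    VARCHAR(100) NOT NULL,
    PRIMARY KEY (hub_customer_hk, load_date)
);

-- Insert-only satellite load: add a row only when the key is new or its
-- attribute hash changed (assumes one staging row per key per batch).
INSERT INTO sat_customer_details
SELECT s.hub_customer_hk, s.load_date, s.hash_diff,
       s.customer_name, s.customer_region, s.record_source
FROM stg_customer s
LEFT JOIN (
    SELECT hub_customer_hk, MAX(load_date) AS max_load_date
    FROM sat_customer_details
    GROUP BY hub_customer_hk
) latest
  ON latest.hub_customer_hk = s.hub_customer_hk
LEFT JOIN sat_customer_details cur
  ON cur.hub_customer_hk = latest.hub_customer_hk
 AND cur.load_date = latest.max_load_date
WHERE cur.hash_diff IS NULL            -- new business key
   OR cur.hash_diff <> s.hash_diff;    -- attributes changed
```

The traits this sketch illustrates are what the DV 2.0 responsibilities above refer to: hubs and links are insert-only and keyed on hashes of business keys, which lets sources load in parallel, while satellites capture attribute history through the hash_diff comparison rather than updates.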