

Jobs via Dice
Data Architect with "Data Vault & Snowflake" (Hybrid Only - Oaks, PA)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Architect with expertise in "Data Vault & Snowflake," offering a 12+ month contract in Oaks, PA (Hybrid). Key skills required include Data Vault 2.0, Snowflake, ELT pipelines, and data modeling. A minimum of 10 years in Data Warehousing is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 10, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Oaks, PA
-
🧠 - Skills detailed
#Data Quality #Snowflake #Monitoring #dbt (data build tool) #Data Vault #Dimensional Modelling #Dimensional Data Models #Cloud #Leadership #Data Modeling #Azure DevOps #Big Data #Python #SQL (Structured Query Language) #Jira #Scrum #Oracle #Agile #Data Pipeline #Data Warehouse #GitLab #Scala #Visualization #DevOps #Data Architecture #Azure cloud #Data Mart #Azure #Data Engineering #GIT #Vault #Airflow #Database Performance #Deployment #EDW (Enterprise Data Warehouse) #ETL (Extract, Transform, Load) #Slowly Changing Dimensions #Metadata
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, LinQ Global Group, is seeking the following. Apply via Dice today!
Job Title - Data Architect
Location - Oaks, PA (Hybrid Onsite)
Job Type - Contract, 12+ Months
Key Skills - Data Architect, Data Vault, and Snowflake
Job Description
The Technical Data Architect is a leadership role, responsible for ensuring the use of common client architecture patterns and best practices and for providing direction to systems architects and scrum teams. The role involves leveraging cutting-edge cloud and data engineering technologies to deliver high-quality, scalable solutions.
What You Will Do
• Design, build, and maintain enterprise data warehouse solutions using modern data warehousing methodologies, with a strong focus on Data Vault 2.0
• Implement Data Vault 2.0 models, including Hubs, Links, Satellites, and downstream Information Marts, ensuring auditability, historization, and scalability
• Develop and maintain Raw Vault and Business Vault layers, applying appropriate business rules and transformations where required
• Design and implement dimensional data models (Star and Snowflake schemas) for analytics, reporting, and downstream consumption
• Own the end-to-end data modeling lifecycle, from source analysis through conceptual, logical, and physical models
• Ensure proper grain definition, conformed dimensions, and consistent business definitions across the data warehouse
• Partner with business and technical stakeholders to translate requirements into scalable and maintainable data models
• Architect, develop, and optimize ELT pipelines using Snowflake, Airflow, dbt, and Python, following modern cloud data warehousing best practices
• Implement metadata-driven, reusable pipeline patterns to support scalability and faster onboarding of new data sources
• Design and implement data quality checks, reconciliations, and validation frameworks to ensure accuracy, completeness, and consistency of warehouse data
• Monitor and optimize data warehouse performance, including query execution, data volumes, and pipeline runtimes
• Proactively identify and resolve performance bottlenecks in ELT processes and analytical workloads
• Ensure data pipelines support incremental loads, historization, and change data capture (CDC) patterns where applicable
• Work closely with Agile teams to deliver production-ready data solutions, from design through deployment
• Analyze data issues and defects, performing root cause analysis across pipelines, models, and transformations
• Mentor engineers on data warehousing best practices, Data Vault modeling, and performance optimization techniques
• Ensure quality for our customers by validating performance, stability, scalability, and reliability for all scrum initiatives
• Gain a technical and functional understanding of our product architecture and contribute to the ongoing improvement of enterprise application performance
• Work proactively with members of an Agile team to find and fix defects in our product architecture
• Analyze defects and test results, and deduce the chain of events leading to a failure
• Communicate critical issues and status updates to Agile teams in a timely manner
• Mentor technical resources on performance best practices
• Maintain a deep understanding of Application Performance Monitoring, with the ability to develop predictive and prescriptive solutions in the big data arena
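As a rough illustration of the Data Vault 2.0 loading pattern referenced above, the sketch below computes deterministic hash keys for a Hub and a hash diff for a Satellite. This is a minimal sketch, not the client's implementation; the entity (`customer_id`), attributes, and record source are hypothetical, and real Data Vault standards vary in their choice of hash function and normalization rules.

```python
import hashlib
from datetime import datetime, timezone

def hash_key(*business_keys: str) -> str:
    """Deterministic hash key over normalized business keys (Data Vault 2.0 style)."""
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest().upper()

def hash_diff(attributes: dict) -> str:
    """Hash of descriptive attributes, used to detect changes for Satellite loads."""
    payload = "||".join(str(attributes[k]).strip().upper() for k in sorted(attributes))
    return hashlib.md5(payload.encode("utf-8")).hexdigest().upper()

# Hypothetical source row feeding a Customer Hub and its Satellite
row = {"customer_id": "C-1001", "name": "Acme Corp", "tier": "Gold"}
load_dts = datetime.now(timezone.utc)

hub_customer = {
    "hub_customer_hk": hash_key(row["customer_id"]),  # surrogate hash key
    "customer_id": row["customer_id"],                # business key
    "load_dts": load_dts,
    "record_source": "CRM",
}

sat_customer = {
    "hub_customer_hk": hub_customer["hub_customer_hk"],
    "hash_diff": hash_diff({"name": row["name"], "tier": row["tier"]}),
    "load_dts": load_dts,
}
```

Because the key is derived purely from the normalized business key, the same customer always hashes to the same Hub key regardless of source-system casing or whitespace, which is what makes Hub loads idempotent and auditable.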
What We Need From You
• BA/BS in a related technical field, or the equivalent in education and work experience
• A minimum of 10 years of experience in Information Technology
• A minimum of 10 years of hands-on experience in Data Warehousing, Data Engineering, and Data Architecture, with proven delivery of production-grade enterprise data platforms
• Experience with data warehousing tools and technologies
• Strong, hands-on experience with Data Vault 2.0 in enterprise data warehouses, including practical implementation of Hubs, Links, and Satellites
• Strong knowledge of Raw Vault, Business Vault, and Information/Data Mart design
• Solid grounding in dimensional modelling (Kimball), including Star and Snowflake schemas, fact and dimension table design, slowly changing dimensions, and hybrid DW architectures
• Hands-on experience designing scalable, auditable, and historized data models that support enterprise reporting and analytics needs
• Ability to hit the ground running with minimal ramp-up on DW concepts
• Experience building and managing data pipelines using Snowflake, Airflow, dbt, and Python, following modern ELT patterns
• Expertise in performance tuning and optimization of data warehouse workloads, pipelines, and transformations
• Experience working with Bitbucket, Azure DevOps, GitLab, Jira, and Confluence
• Proficiency in analysis, design, and build of SQL and PL/SQL packages
• Ability to develop and implement strategies for data archiving, purging, and lifecycle management to maintain optimal performance in data warehousing environments
• Working experience with Azure cloud architecture, deployment, and optimization
• Proficiency in source control solutions such as Git/GitLab and the Git flow process
• Ability to identify performance bottlenecks and support the development team in speedy analysis and root-cause diagnosis of performance issues is preferred
• Ability to lead performance optimization efforts, with a strong command of complex performance issues and the ability to direct component owners, is a plus
• Oracle Database performance tuning expertise
• Experience with Data Analytics and Data Visualization services
• Knowledge of performance troubleshooting is preferred
• Experience effectively interacting with large development and delivery teams and application development managers providing full-lifecycle application support in complex, heterogeneous environments
• Strong understanding of SAFe
• Displays servant leadership skills
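The requirements above call out slowly changing dimensions (SCD) as part of Kimball-style modeling. As a hedged, self-contained sketch of the Type 2 pattern (expire the current row, append a new versioned row), assuming a hypothetical customer dimension with a single tracked attribute:

```python
from datetime import date

# Sentinel "open" end date marking the current version of a dimension row
HIGH_DATE = date(9999, 12, 31)

def scd2_apply(dim_rows: list, incoming: dict, as_of: date) -> list:
    """Apply a Type 2 SCD change: expire the current version if the tracked
    attribute changed, then append a new current version. In-memory sketch only;
    a warehouse implementation would use a MERGE or a dbt snapshot instead."""
    key = incoming["customer_id"]
    for row in dim_rows:
        if row["customer_id"] == key and row["end_date"] == HIGH_DATE:
            if row["tier"] == incoming["tier"]:
                return dim_rows  # no change: keep the current version as-is
            row["end_date"] = as_of  # expire the old version
            break
    dim_rows.append({**incoming, "start_date": as_of, "end_date": HIGH_DATE})
    return dim_rows

# A customer starts as Silver, then upgrades to Gold: history is preserved
dim = scd2_apply([], {"customer_id": "C-1001", "tier": "Silver"}, date(2024, 1, 1))
dim = scd2_apply(dim, {"customer_id": "C-1001", "tier": "Gold"}, date(2024, 6, 1))
```

After the second change the dimension holds two versions of the customer: the Silver row closed out on the change date, and the Gold row open-ended as the current version, so fact rows can join to whichever version was valid at their transaction date.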
What We Would Like From You
Experience in wealth management, investment processing, domestic and international Equity and / or Fixed Income workflows and analytics
Experience effectively interacting with large application development and delivery teams and application development managers who are providing full lifecycle application support in complex, heterogeneous environments
Experience working with senior leadership, ability to work comfortably with a wide range of people and skill sets
Data Vault 2.0 certification (e.g., Certified Data Vault 2.0 Practitioner CDVP2 or Data Vault Alliance certification)
Knowledge of core investment processing / wealth management transaction processing, trade flow, custody & accounting, cash processing, etc.
Ability to work effectively in a team environment, with a singular commitment to the accomplishment of team results
