Cignitix Global

Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Architect on a contract basis in Oaks, PA (Hybrid); the pay rate is undisclosed. Key skills include Data Vault 2.0, Snowflake, and data warehousing. The role requires 10+ years of experience and expertise in ELT pipelines; a Data Vault 2.0 certification is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
February 12, 2026
🕒 - Duration
Unknown
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Oaks, PA
🧠 - Skills detailed
#Azure #Azure DevOps #Python #SQL (Structured Query Language) #DevOps #ETL (Extract, Transform, Load) #Data Vault #Jira #Cloud #Slowly Changing Dimensions #Metadata #Data Architecture #Agile #dbt (data build tool) #Snowflake #Dimensional Modelling #Data Modeling #GitLab #Data Engineering #Data Warehouse #Database Performance #Oracle #Vault #Data Mart #Big Data #Azure cloud #GIT #Leadership #Scrum #EDW (Enterprise Data Warehouse) #Deployment #Scala #Visualization #Dimensional Data Models #Data Quality #Monitoring #Data Pipeline #Airflow
Role description
Position: Data Architect
Location: Oaks, PA (Hybrid)
Job Type: Contract
Key Skills: Data Architecture, Data Vault 2.0, Snowflake

Job Description:
• Design, build, and maintain enterprise data warehouse solutions using modern data warehousing methodologies, with a strong focus on Data Vault 2.0
• Implement Data Vault 2.0 models, including Hubs, Links, Satellites, and downstream Information Marts, ensuring auditability, historization, and scalability (a minimal hub-load sketch follows this list)
• Develop and maintain Raw Vault and Business Vault layers, applying appropriate business rules and transformations where required
• Design and implement dimensional data models (Star and Snowflake schemas) for analytics, reporting, and downstream consumption
• Own the end-to-end data modeling lifecycle, from source analysis through conceptual, logical, and physical models
• Ensure proper grain definition, conformed dimensions, and consistent business definitions across the data warehouse
• Partner with business and technical stakeholders to translate requirements into scalable and maintainable data models
• Architect, develop, and optimize ELT pipelines using Snowflake, Airflow, dbt, and Python, following modern cloud data warehousing best practices
• Implement metadata-driven, reusable pipeline patterns to support scalability and faster onboarding of new data sources
• Design and implement data quality checks, reconciliations, and validation frameworks to ensure the accuracy, completeness, and consistency of warehouse data
• Monitor and optimize data warehouse performance, including query execution, data volumes, and pipeline runtimes
• Proactively identify and resolve performance bottlenecks in ELT processes and analytical workloads
• Ensure data pipelines support incremental loads, historization, and change data capture (CDC) patterns where applicable
• Work closely with Agile teams to deliver production-ready data solutions, from design through deployment
• Analyze data issues and defects, performing root-cause analysis across pipelines, models, and transformations
• Mentor engineers on data warehousing best practices, Data Vault modeling, and performance optimization techniques
• Ensure quality for our customers by validating performance, stability, scalability, and reliability across all of our scrum initiatives
• Gain a technical and functional understanding of our product architecture and contribute to ongoing improvement of our enterprise application's performance
• Work proactively with members of an Agile team to find and fix defects in our product architecture
• Analyze defects and test results and deduce the chain of events leading to a failure
• Communicate critical issues and status updates to Agile teams in a timely manner
• Mentor technical resources on performance best practices
• Apply a deep understanding of Application Performance Monitoring to develop predictive and prescriptive solutions in the big data arena
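To make the Data Vault 2.0 responsibilities above concrete, the snippet below is a minimal, illustrative sketch of an insert-only Hub load in Python. It is not this team's actual pipeline: the table and column names (hub_customer, customer_hk, customer_id) and the record source are hypothetical, and the generated MERGE statement is printed rather than executed against Snowflake.

```python
import hashlib
from datetime import datetime, timezone

def hash_key(*parts: str) -> str:
    """Data Vault 2.0 style hash key: MD5 over normalized business key
    parts, joined with a delimiter to avoid concatenation collisions."""
    normalized = "||".join(p.strip().upper() for p in parts)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

def hub_merge_sql(hub: str, hk_col: str, bk_col: str,
                  business_key: str, record_source: str) -> str:
    """Build an insert-only Hub load (hypothetical table/column names).
    Hubs are append-only: each business key is inserted exactly once and
    never updated, which preserves auditability and historization."""
    hk = hash_key(business_key)
    load_ts = datetime.now(timezone.utc).isoformat()
    return (
        f"MERGE INTO {hub} t USING (\n"
        f"  SELECT '{hk}' AS {hk_col}, '{business_key}' AS {bk_col},\n"
        f"         '{load_ts}' AS load_date, '{record_source}' AS record_source\n"
        f") s ON t.{hk_col} = s.{hk_col}\n"
        f"WHEN NOT MATCHED THEN INSERT ({hk_col}, {bk_col}, load_date, record_source)\n"
        f"  VALUES (s.{hk_col}, s.{bk_col}, s.load_date, s.record_source);"
    )

# Example: load one customer business key into a hypothetical customer Hub.
print(hub_merge_sql("hub_customer", "customer_hk", "customer_id",
                    "C-10042", "CRM.EXPORT"))
```

A production version would bind parameters instead of interpolating literals and would drive Hub, Link, and Satellite loads from metadata (for example through Airflow and dbt), but the insert-only, hash-keyed pattern is the core of the Raw Vault work described above.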
Required Skills:
• A minimum of 10 years of hands-on experience in Data Warehousing, Data Engineering, and Data Architecture, with proven delivery of production-grade enterprise data platforms
• Experience with data warehousing tools and technologies
• Strong, hands-on experience with Data Vault 2.0 in enterprise data warehouses, including practical implementation of Hubs, Links, and Satellites
• Strong knowledge of Raw Vault, Business Vault, and Information/Data Mart design
• Solid grounding in dimensional modelling (Kimball), including Star and Snowflake schemas, fact and dimension table design, slowly changing dimensions, and hybrid DW architectures (see the SCD Type 2 sketch at the end of this posting)
• Hands-on experience designing scalable, auditable, and historized data models that support enterprise reporting and analytics needs
• Ability to hit the ground running with minimal ramp-up on DW concepts
• Experience building and managing data pipelines using Snowflake, Airflow, dbt, and Python, following modern ELT patterns
• Expertise in performance tuning and optimization of data warehouse workloads, pipelines, and transformations
• Experience working with Bitbucket, Azure DevOps, GitLab, Jira, and Confluence
• Proficiency in the analysis, design, and build of SQL and PL/SQL packages
• Ability to develop and implement strategies for data archiving, purging, and lifecycle management to maintain optimal performance in data warehousing environments
• Working experience with Azure cloud architecture, deployment, and optimization
• Proficiency in source control solutions such as Git/GitLab and the Git flow process
• Ability to identify performance bottlenecks and support the development team in rapid analysis and root-cause diagnosis of performance issues (preferred)
• Ability to lead performance optimization efforts, with a strong command of complex performance issues and the ability to direct component owners (a plus)
• Oracle database performance tuning expertise
• Experience with data analytics and data visualization services
• Knowledge of performance troubleshooting (preferred)
• Experience effectively interacting with large development and delivery teams and application development managers who provide full-lifecycle application support in complex, heterogeneous environments
• Strong understanding of SAFe
• Displays servant leadership skills

What we would like from you:
• Experience in wealth management, investment processing, and domestic and international Equity and/or Fixed Income workflows and analytics
• Experience working with senior leadership and the ability to work comfortably with a wide range of people and skill sets
• Data Vault 2.0 certification (e.g., Certified Data Vault 2.0 Practitioner (CDVP2) or a Data Vault Alliance certification)
• Knowledge of core investment processing / wealth management transaction processing: trade flow, custody and accounting, cash processing, etc.
• Ability to work effectively in a team environment, with a singular commitment to the accomplishment of team results
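Similarly, since slowly changing dimensions appear in the required skills, here is a small, self-contained Python sketch of SCD Type 2 versioning logic. The dimension rows and tracked attribute are hypothetical, and the logic is shown in-memory for clarity rather than as Snowflake SQL or a dbt snapshot:

```python
from dataclasses import dataclass, replace
from datetime import date
from typing import Optional

@dataclass
class DimRow:
    customer_id: str          # natural (business) key
    city: str                 # tracked attribute (hypothetical)
    valid_from: date
    valid_to: Optional[date]  # None marks the open, current version
    is_current: bool

def apply_scd2(dim: list, incoming: dict, as_of: date) -> list:
    """SCD Type 2: when a tracked attribute changes, close the current
    version and append a new one, so history is never overwritten."""
    out = []
    for row in dim:
        new_city = incoming.get(row.customer_id)
        if row.is_current and new_city is not None and new_city != row.city:
            out.append(replace(row, valid_to=as_of, is_current=False))        # close old version
            out.append(DimRow(row.customer_id, new_city, as_of, None, True))  # open new version
        else:
            out.append(row)
    return out

# Example: a customer moves, producing a closed historical row and a new current row.
dim = [DimRow("C-10042", "Oaks", date(2024, 1, 1), None, True)]
for r in apply_scd2(dim, {"C-10042": "Philadelphia"}, date(2026, 2, 12)):
    print(r)
```

In a Snowflake/dbt stack this close-and-append logic would typically live in a dbt snapshot or a MERGE statement; the sketch only illustrates the Type 2 pattern itself.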