Falcon Smart IT

Technical Architect — Data and Analytics

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Technical Architect — Data and Analytics, offering a contract of more than 6 months in Newport Beach, CA, with a focus on Snowflake, AWS S3, and data governance in a regulated environment. Requires 12–16 years of experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 22, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Newport Beach, CA
-
🧠 - Skills detailed
#AWS S3 (Amazon Simple Storage Service) #BI (Business Intelligence) #Data Ingestion #ETL (Extract, Transform, Load) #Macros #MDM (Master Data Management) #Monitoring #Leadership #SQL (Structured Query Language) #SSIS (SQL Server Integration Services) #Cloud #S3 (Amazon Simple Storage Service) #Azure #Data Engineering #Data Governance #Tableau #SQL Server #AWS (Amazon Web Services) #Strategy #Informatica #Oracle #Oracle Cloud #Matillion #Collibra #Security #DevOps #Python #AI (Artificial Intelligence) #Data Processing #Microsoft Power BI #Data Lake #Migration #GitHub #SnowPipe #Azure DevOps #Snowflake #Version Control #Clustering #Data Quality #dbt (data build tool) #GIT
Role description
Job Title: Technical Architect — Data and Analytics
Location: Newport Beach, CA (Onshore)
Job Type: Contract / FTE
Experience: 12–16 years

Role Summary
We are looking for a seasoned Technical Architect to own end-to-end solution architecture for a Fortune 500 insurance and financial services enterprise's large-scale, multi-wave data modernization program. You will design and govern a Snowflake + AWS S3 + Matillion + dbt platform built on medallion architecture principles, define the ACORD-based enterprise data model, and set the technical standards that all delivery workstreams will follow. This is a hands-on architecture role — you will be deeply embedded in the delivery team, not a distant reviewer.

Why This Role Matters
This engagement is a generational transformation — migrating 372 production database instances and 80+ source systems into a unified, governed, cloud-native data platform over 18–24 months. The architecture you design will directly enable three of the client's most critical strategic programs: real-time operations transformation, new business underwriting modernization, and the Finance Oracle Cloud implementation. Getting the foundation right in Wave 1 determines whether Waves 2 and 3 can scale without replatforming.
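As a rough illustration of the medallion (Bronze–Silver–Gold) principles the platform is built on, the sketch below promotes raw records through cleaned, typed records to a business-ready aggregate. This is a hypothetical Python simplification for orientation only — the record shapes and function names are invented, and in the actual stack this flow would run as Matillion ingestion plus dbt models inside Snowflake, not Python.

```python
"""Illustrative medallion layering: Bronze (raw) -> Silver (clean) -> Gold
(business-ready). All names and record shapes here are hypothetical."""

# Bronze: raw records landed as-is from a source system (here, a policy feed).
bronze = [
    {"policy_id": " P-001 ", "premium": "1200.50", "status": "ACTIVE"},
    {"policy_id": "P-002", "premium": "bad-value", "status": "active"},
    {"policy_id": "P-003", "premium": "860.00", "status": "LAPSED"},
]

def to_silver(rows):
    """Silver: cleaned, typed, conformed records; rows that fail typing are
    quarantined rather than silently dropped."""
    silver, quarantine = [], []
    for r in rows:
        try:
            silver.append({
                "policy_id": r["policy_id"].strip(),
                "premium": float(r["premium"]),
                "status": r["status"].strip().upper(),
            })
        except ValueError:
            quarantine.append(r)
    return silver, quarantine

def to_gold(rows):
    """Gold: a business-ready aggregate, e.g. total premium by status."""
    totals = {}
    for r in rows:
        totals[r["status"]] = totals.get(r["status"], 0.0) + r["premium"]
    return totals

silver, quarantine = to_silver(bronze)
gold = to_gold(silver)
print(gold)             # {'ACTIVE': 1200.5, 'LAPSED': 860.0}
print(len(quarantine))  # 1 row failed typing and was quarantined
```

The quarantine path mirrors the posting's emphasis on data quality patterns: bad rows stay visible for remediation instead of disappearing between layers.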
What You'll Do

Architecture & Solution Design
- Architect and deliver the enterprise data platform on Snowflake + AWS S3 using a medallion (Bronze–Silver–Gold) architecture, supporting 80+ source systems and a 7-year historical migration
- Design the ACORD Life & Annuity-based enterprise data model, customized for insurance domains: Policy, Claims, Finance, Actuarial, Agent/Distribution, Customer/Party
- Define the data mesh architecture with federated governance, domain ownership boundaries, and self-serve platform patterns for multi-wave delivery
- Establish reusable ingestion templates (Matillion), dbt transformation frameworks, and Snowflake-native quality patterns (Data Metric Functions) as cross-program standards
- Govern architecture decisions across all 5 parallel workstreams: Ingestion, Transformation, Data Quality/DRE, Consumption, and Governance/MDM

Data Platform & Engineering Standards
- Design the data ingestion strategy for structured (SQL Server CDC, Oracle, SFTP, APIs) and unstructured (EHR, APS notes, underwriting documents) source systems
- Define dbt project structure, modular macro patterns, and Git-integrated version control standards for all transformation logic
- Architect the Collibra integration strategy: automated catalog, end-to-end lineage (Matillion → dbt → Snowflake → Tableau/Power BI), business glossary, and certification workflows
- Specify the Profisee MDM integration architecture: bi-directional Snowflake Silver layer synchronization, golden record publishing, and insurance-specific match/merge patterns

AI-Driven Delivery & Acceleration
- Embed the client's WinAIDM AI accelerator framework into the delivery model: AI-powered ingestion, quality validation, dbt transformation, and test case generation
- Guide adoption of Snowflake Cortex AI capabilities (AI_EXTRACT, AI_CLASSIFY, AI_COMPLETE) for unstructured data processing within the platform's security perimeter
- Define the CI/CD quality gate architecture: automated dbt tests, reconciliation validation (99.9%+ match), performance benchmarks, and lineage completeness checks
- Champion AI-augmented engineering practices (GitHub Copilot, LLM-based accelerators) to drive a 40–50% reduction in development effort across the team

Technical Leadership & Governance
- Lead technical design reviews, architecture decision records (ADRs), and code standards across all engineering workstreams
- Mentor senior engineers, technical leads, and data modelers, building a high-performing delivery team capable of sustaining the platform post-engagement
- Drive the Data Reliability Engineering (DRE) framework: SLO/SLI definition, error budgets, automated monitoring, and incident response patterns for all certified data products
- Participate in the three-tier governance model, representing architecture at the Program Management and Steering Committee levels

Client Engagement
- Translate complex technical architecture into clear, decision-ready recommendations for client technology and business stakeholders
- Proactively surface trade-offs (performance vs. cost, speed vs. governance) and recommend options, acting as a trusted guide, not just an executor
- Collaborate with the customer's data engineering, infrastructure, and domain SME teams to align platform decisions to business outcomes

Tech Stack Snapshot
Snowflake, AWS S3, Matillion, dbt (data build tool), Collibra, Profisee MDM, Python, SQL, Snowflake Cortex AI, WinAIDM, SnowConvert AI, Tableau, Power BI, CI/CD (Azure DevOps), Git, ACORD Data Model

Must-Have Skills
- 12+ years in data engineering and analytics, with 3+ years in a solution/technical architect role on enterprise-scale programs
- Deep, hands-on expertise in Snowflake: query optimization, clustering, Data Metric Functions, Snowpipe, Streams, and native AI capabilities
- Proven experience designing medallion/lakehouse architectures with AWS S3 as the raw data lake layer
- Strong command of dbt: project structure, macros, testing frameworks, and CI/CD integration
- Experience architecting data governance solutions using Collibra: catalog, lineage, business glossary, and certification workflows
- Demonstrated ability to lead multi-wave, multi-workstream data modernization programs in a regulated (insurance, healthcare, or financial services) environment
- Hands-on experience migrating legacy ETL (Informatica or SSIS) to modern dbt/Matillion pipelines

Good to Have
- Familiarity with ACORD Life & Annuity data standards and insurance domain concepts (Policy, Claims, Actuarial, Reinsurance)
- Experience with Profisee MDM or equivalent enterprise MDM platforms
- Exposure to Snowflake Cortex AI, AI-assisted development tools (GitHub Copilot, Azure OpenAI), or LLM-based data engineering accelerators
- SnowPro Advanced certification (Data Engineer or Architect)

What Success Looks Like (6–12 Months)
- Wave 1 platform foundation (Snowflake environments, AWS S3 data lake, Matillion/dbt/Collibra/Profisee, CI/CD pipelines) delivered by Month 3 with no rework required
- All 9 Wave 1 certified data products (CMD Life, Finance, Actuarial) achieve SLO targets: 99.9% completeness, 99.5% accuracy, and end-to-end lineage in Collibra
- Architecture standards and dbt/Matillion templates adopted consistently across all 5 delivery workstreams, with no divergence in patterns
- WinAIDM accelerator framework implemented and contributing to a 40–50% reduction in data engineering effort vs. baseline
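The SLO and reconciliation targets quoted in this posting (99.9% completeness, 99.5% accuracy, 99.9%+ reconciliation match) can be sketched as a simple certification gate. Only the thresholds come from the posting; the function names, inputs, and structure below are hypothetical and not the client's actual DRE framework.

```python
"""Illustrative SLO gate for a certified data product. Thresholds are taken
from the posting; everything else is a hypothetical simplification."""
from dataclasses import dataclass


@dataclass
class SloResult:
    metric: str
    observed: float  # observed ratio, 0.0 - 1.0
    target: float    # SLO target ratio
    passed: bool


def check_slo(metric: str, good: int, total: int, target: float) -> SloResult:
    """Compare an observed good/total ratio against an SLO target."""
    observed = good / total if total else 0.0
    return SloResult(metric, observed, target, observed >= target)


def certify_data_product(source_rows: int, target_rows: int,
                         accurate_rows: int) -> list[SloResult]:
    """Run the three gates named in the posting: reconciliation match,
    completeness, and accuracy. A product is certified only if all pass."""
    return [
        check_slo("reconciliation", min(source_rows, target_rows),
                  source_rows, 0.999),
        check_slo("completeness", target_rows, source_rows, 0.999),
        check_slo("accuracy", accurate_rows, target_rows, 0.995),
    ]


if __name__ == "__main__":
    results = certify_data_product(source_rows=1_000_000,
                                   target_rows=999_950,
                                   accurate_rows=999_600)
    for r in results:
        print(f"{r.metric}: {r.observed:.4%} vs {r.target:.2%} "
              f"-> {'PASS' if r.passed else 'FAIL'}")
```

In the platform described above, checks like these would more likely run as dbt tests and Snowflake Data Metric Functions inside the CI/CD quality gates, with the results published to Collibra as part of product certification.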