Icon Global Technologies

Technical Architect - Data & Analytics

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Technical Architect - Data & Analytics, with a contract length of 3 to 6 months, offering a competitive pay rate. Key skills required include Snowflake, AWS S3, Matillion, dbt, and experience in insurance or financial services data modernization.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 4, 2026
-
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Newport Beach, CA
-
🧠 - Skills detailed
#Macros #Tableau #Collibra #Snowflake #Informatica #Clustering #ETL (Extract, Transform, Load) #Git #Python #AWS (Amazon Web Services) #S3 (Amazon Simple Storage Service) #Snowpipe #Data Engineering #SSIS (SQL Server Integration Services) #MDM (Master Data Management) #Microsoft Power BI #Azure #AWS S3 (Amazon Simple Storage Service) #Migration #DevOps #GitHub #Data Governance #SQL (Structured Query Language) #Data Lake #BI (Business Intelligence) #Computer Science #Data Quality #Matillion #AI (Artificial Intelligence) #dbt (data build tool)
Role description
Job Description: We are looking for a seasoned Technical Architect to own end-to-end solution architecture for an insurance and financial services enterprise's large-scale, multi-wave data modernization program. You will design and govern a Snowflake + AWS S3 + Matillion + dbt platform built on medallion architecture principles, define the ACORD-based enterprise data model, and set the technical standards that all delivery workstreams will follow. This is a hands-on architecture role: you will be deeply embedded in the delivery team, not a distant reviewer.

Skills / Experience:
• 12+ years in data engineering and analytics, including 3+ years in a solution/technical architect role on enterprise-scale programs
• Deep, hands-on expertise in Snowflake: query optimization, clustering, Data Metric Functions, Snowpipe, Streams, and native AI capabilities
• Experience in platform architecture, Snowflake data mesh, AI-augmented delivery, and enterprise-scale modernization
• Key skills: Snowflake, AWS S3, Matillion, dbt (data build tool), Collibra, Profisee MDM, Python, SQL, Snowflake Cortex AI, WinAIDM, SnowConvert AI, Tableau, Power BI, CI/CD (DevOps), Git, ACORD Data Model
• Proven experience designing medallion/lakehouse architectures with AWS S3 as the raw data lake layer
• Strong command of dbt: project structure, macros, testing frameworks, and CI/CD integration
• Experience architecting data governance solutions using Collibra: catalog, lineage, business glossary, and certification workflows
• Demonstrated ability to lead multi-wave, multi-workstream data modernization programs in a regulated environment (insurance, healthcare, or financial services)
• Hands-on experience migrating legacy ETL (Informatica or SSIS) to modern dbt/Matillion pipelines
• Bachelor's or Master's degree in Computer Science, Engineering, or a related technical discipline
• Strong communication skills, with the ability to explain technical concepts clearly
• Proactive, ownership-driven mindset with high accountability
• Ability to collaborate across engineering, operations, and support teams; adaptability to fast-paced, iterative environments

Key Responsibilities: Architecture & Solution Design
• Architect and deliver the enterprise data platform on Snowflake + AWS S3 using a medallion (Bronze–Silver–Gold) architecture, supporting 80+ source systems and a 7-year historical migration
• Design the ACORD Life & Annuity-based enterprise data model, customized for insurance domains: Policy, Claims, Finance, Actuarial, Agent/Distribution, Customer/Party
• Define the data mesh architecture with federated governance, domain ownership boundaries, and self-serve platform patterns for multi-wave delivery
• Establish reusable ingestion templates (Matillion), dbt transformation frameworks, and Snowflake-native quality patterns (Data Metric Functions) as cross-program standards
• Govern architecture decisions across all 5 parallel workstreams: Ingestion, Transformation, Data Quality/DRE, Consumption, and Governance/MDM

Expected Outcomes: The candidate is expected to deliver the following measurable outcomes:
• Wave 1 platform foundation (Snowflake environments, AWS S3 data lake, Matillion/dbt/Collibra/Profisee, CI/CD pipelines) delivered by Month 3 with no rework required
• All 9 Wave 1 certified data products (CMD Life, Finance, Actuarial) achieve SLO targets: 99.9% completeness, 99.5% accuracy, and end-to-end lineage in Collibra
• Architecture standards and dbt/Matillion templates adopted consistently across all 5 delivery workstreams, with no divergence in patterns
• WinAIDM accelerator framework implemented, deployed, and contributing to a 40–50% reduction in data engineering effort vs. baseline
• Client technical stakeholders describe WinWire as a "trusted architecture guide": proactive, decision-ready, and commercially aware

Secondary Skills:
• Familiarity with ACORD Life & Annuity data standards and insurance domain concepts (Policy, Claims, Actuarial, Reinsurance)
• Experience with Profisee MDM or equivalent enterprise MDM platforms; SnowPro Advanced certification (Data Engineer or Architect)
• Exposure to Snowflake Cortex AI, AI-assisted development tools (GitHub Copilot, Azure OpenAI), or LLM-based data engineering accelerators
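To give candidates a concrete feel for the "dbt transformation frameworks" and medallion layering this role governs, below is a minimal sketch of a Silver-layer dbt model that cleans a Bronze policy feed. All table, column, and source names here are hypothetical illustrations, not artifacts of the actual program:

```sql
-- models/silver/policy/slv_policy.sql
-- Illustrative only: source/table/column names are assumptions, not the client's schema.
{{ config(materialized='incremental', unique_key='policy_id') }}

select
    policy_id,
    upper(trim(policy_status))   as policy_status,   -- standardize status codes
    cast(effective_date as date) as effective_date,  -- enforce typing at the Silver layer
    _ingested_at                                     -- ingestion timestamp carried from Bronze
from {{ source('bronze_policy', 'raw_policy') }}
{% if is_incremental() %}
-- on incremental runs, only process rows newer than what is already loaded
where _ingested_at > (select max(_ingested_at) from {{ this }})
{% endif %}
```

In a medallion setup like the one described, Bronze holds raw, ingested data (here landed from S3 via Matillion/Snowpipe), Silver applies typing and standardization as above, and Gold exposes certified, consumption-ready data products.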
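The "Snowflake-native quality patterns (Data Metric Functions)" mentioned in the responsibilities refer to Snowflake's built-in mechanism for scheduled data quality checks. A hedged sketch of how a completeness check might be defined and attached (database, schema, table, and column names are hypothetical):

```sql
-- Define a custom data metric function that counts null keys.
-- All object names below are illustrative assumptions.
CREATE OR REPLACE DATA METRIC FUNCTION governance.dmf.null_policy_id_count(
    arg_t TABLE(arg_c NUMBER)
)
RETURNS NUMBER
AS
'SELECT COUNT(*) FROM arg_t WHERE arg_c IS NULL';

-- Schedule metric evaluation on a Silver-layer table (every 6 hours).
ALTER TABLE silver.policy.policy_master
    SET DATA_METRIC_SCHEDULE = 'USING CRON 0 */6 * * * UTC';

-- Attach the metric to the column it should monitor.
ALTER TABLE silver.policy.policy_master
    ADD DATA METRIC FUNCTION governance.dmf.null_policy_id_count
    ON (policy_id);
```

Results land in Snowflake's event tables, which is one way completeness/accuracy SLOs like those listed above can be measured continuously rather than via ad hoc queries.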