UnivEdge Consulting LLC

Lead Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Architect with 12+ years of experience, focusing on Azure data services and Databricks. It is a full-time position in the USA (listed as remote, though the role description asks for onsite availability in Houston, TX), lasting more than 6 months, and requires strong Python and CI/CD skills.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
150
-
🗓️ - Date
May 13, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Collibra #Terraform #GIT #Data Lake #Python #Deployment #Data Modeling #Data Pipeline #Data Architecture #Infrastructure as Code (IaC) #Spark (Apache Spark) #ADF (Azure Data Factory) #S3 (Amazon Simple Storage Service) #Scala #Kafka (Apache Kafka) #Vault #ETL (Extract, Transform, Load) #ADLS (Azure Data Lake Storage) #Observability #Automated Testing #Azure Data Factory #DevOps #Azure DevOps #Data Governance #Databricks #Cloud #Redshift #Strategy #Informatica #Delta Lake #AWS (Amazon Web Services) #Synapse #Azure SQL #Azure #SQL (Structured Query Language) #Data Quality #Informatica BDM (Big Data Management) #PySpark #Data Engineering
Role description
Lead Data Architect / Enterprise Data Architect
Remote, USA | Experience: 12+ years | Type: Full-time

About Vericence
Vericence is a global IT services company partnering with leading enterprises across technology, healthcare, and financial services. We build high-performing data and engineering teams that power mission-critical platforms for some of the world's most recognized brands.

The Role
We're hiring a Lead / Enterprise Data Architect to anchor the data platform for one of our marquee enterprise clients. This is a hands-on senior architect role, not a pure governance seat. You'll own the architecture of cloud-native data pipelines, lakehouse design, and the modernization roadmap. You'll work directly with onshore engineering, product, and business stakeholders, and lead offshore delivery teams.

What you'll do
• Architect end-to-end data solutions on Azure: Data Lake, Lakehouse, Warehouse, and real-time pipelines
• Design and govern Databricks-based data platforms: workspace structure, Unity Catalog, cluster strategy, performance tuning, cost optimization
• Build scalable ingestion, transformation, and curation frameworks using PySpark / Python
• Define data modeling standards (medallion architecture, dimensional, vault) and enforce them across the team
• Lead CI/CD and DevOps practices for data: Azure DevOps, Git-based workflows, automated deployments, IaC (Terraform / ARM / Bicep)
• Partner with stakeholders to translate business needs into reference architectures and delivery roadmaps
• Mentor data engineers, conduct design reviews, and own technical decisions
• Drive data quality, observability, lineage, and governance across the platform
• Evaluate and recommend tools, services, and architectural patterns; lead POCs

What you bring
• 12+ years in data engineering / architecture, with at least 4-5 years in a lead or architect role
• Deep hands-on expertise with Azure data services: ADLS Gen2, Azure Data Factory, Synapse, Azure SQL, Event Hubs, Azure Functions
• Strong Databricks experience: Delta Lake, Unity Catalog, Workflows, DLT, performance optimization
• Strong Python / PySpark for production-grade pipelines
• Solid SQL, data modeling, and warehousing fundamentals
• Experience designing lakehouse / medallion architectures at enterprise scale
• Working knowledge of CI/CD for data: Azure DevOps, Git branching strategies, automated testing for data pipelines
• Cloud certifications preferred: Azure Data Engineer Associate, Databricks Certified Data Engineer / Architect
• Excellent communication: comfortable presenting architecture to senior stakeholders and leading technical conversations
• Onsite availability in Houston, TX

Nice to have
• Exposure to AWS data services (Glue, Redshift, S3, EMR); multi-cloud is a plus
• Experience with Informatica BDM / IDQ / PowerCenter for legacy modernization initiatives
• Streaming / real-time experience (Kafka, Event Hubs, Structured Streaming)
• Data governance tooling: Purview, Collibra, or equivalent
• Experience leading offshore delivery teams