Experis

Enterprise Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an "Enterprise Data Architect" based in Austin, Texas, on a 6-month contract with a pay rate of $75-$87/hr. It requires 8+ years of experience in Data Architecture, expertise in Databricks and Azure, and strong communication skills.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
🗓️ - Date
April 15, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Austin, TX
-
🧠 - Skills detailed
#Azure DevOps #Microsoft Power BI #Azure #Data Quality #Batch #Data Engineering #DMP (Data Management Platform) #Data Architecture #Data Modeling #Data Management #ADLS (Azure Data Lake Storage) #DevOps #BI (Business Intelligence) #Automation #Data Ingestion #Spark (Apache Spark) #AI (Artificial Intelligence) #Leadership #Spark SQL #GitHub #Agile #Databricks #Azure Databricks #Documentation #Python #Forecasting #ETL (Extract, Transform, Load) #ML (Machine Learning) #Data Lake #MLflow #SQL (Structured Query Language) #Microsoft Azure #Data Lakehouse
Role description
Job Title: Enterprise Data Architect
Location: Austin, Texas
Contract: 6 months
Pay Range: $75/hr-$87/hr

Role Summary
The Enterprise Data Architect will lead the design of a Data Lakehouse architecture, modernizing the data environment into the single source of truth for a multi-sourcing service provider landscape. The role requires deep expertise in data architecture and modeling, Microsoft Azure, Databricks, Unity Catalog, and semantic modeling, built and operated within an Agile software engineering methodology using Product-Oriented Delivery (POD). The architect must also be able to articulate high-level technology benefits in business terms to drive stakeholder alignment and encourage adoption of modern data management techniques.

The Data Architect partners closely with the technology office, platform teams, providers, and stakeholders to translate business requirements and technology considerations into architectural designs that guide the development of performant, governed, production-ready data solutions, expanding the platform using Agile software engineering practices (e.g., GitHub and a CI/CD-based SDLC). This role owns data architecture, semantic modeling standards, and governance patterns, while enabling engineering teams to implement at scale.

Key Responsibilities
• Architect and govern the data management platform for data ingestion, data consumption, semantic modeling, analytics, and AI readiness.
• Design master data and canonical data models that represent the single source of truth for data consumers.
• Design reference architectures, patterns, standards, and lower-level technical solution documentation for the data system and its ingestion, transformation, and reporting pipelines.
• Integrate ServiceNow SLA data and ApptioOne financial models into the Lakehouse.
• Define and enforce governance policies using Unity Catalog.
• Collaborate with the client, Capgemini resources, and providers to align the architecture with business goals.
• Design Data Quality and exception frameworks that data engineers can reuse.
• Build reusable frameworks that let data engineers develop near real-time and batch data products (such as customer consumption and cost data) for upstream consumption, detailed analysis, and executive dashboards.
• Champion the integration and use of data management practices with providers and stakeholders.
• Provide architectural direction for CI/CD pipeline development and DevOps automation for data workflows.
• Explain the high-level technology benefits of data management to leadership and stakeholders.
• Recommend and implement the most efficient operational practices for Databricks in the standard build configuration.

Required Skills & Experience
• 8+ years in Data Architecture, with 3+ years in Databricks and Power BI.
• Proven experience designing and implementing Platinum or Semantic Layers.
• Strong understanding of Medallion Architecture and Unity Catalog.
• Hands-on integration experience with ServiceNow (SLA, CMDB, and incident data).
• Hands-on integration experience with ApptioOne (cost modeling, allocations).
• Strong working knowledge of Python, Spark, SQL, and Power BI, sufficient to define architectural patterns, review implementations, and guide engineering teams.
• Experience with MLflow, Feature Store, and AI/ML pipelines.
• Experience with Azure data platform services (e.g., ADLS Gen2, Azure-native orchestration, and integration patterns).
• Familiarity with Azure DevOps and GitHub Actions.
• Excellent communication and stakeholder engagement skills.

Preferred Qualifications
• Experience in public sector environments.
• ITIL 4 / ITIL 5 certification.
• ApptioOne and ServiceNow development experience.
• Experience with SLA breach forecasting and cost optimization models.
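To give candidates a sense of the "Data Quality and exception frameworks" responsibility above, here is a minimal sketch of the pattern: declarative checks that split records into passing rows and an exception feed. All names here (QualityCheck, run_checks, the sample rules) are illustrative assumptions, not the client's actual framework; in this role the equivalent logic would typically run as PySpark/Databricks jobs over Delta tables rather than plain Python.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical reusable data-quality/exception framework sketch.
# Each check is a named predicate; failing records are routed to an
# exception feed with the list of checks they failed.

@dataclass
class QualityCheck:
    name: str
    predicate: Callable[[dict], bool]  # True when the record passes

def run_checks(records, checks):
    """Split records into passing rows and per-record exceptions."""
    passed, exceptions = [], []
    for rec in records:
        failures = [c.name for c in checks if not c.predicate(rec)]
        if failures:
            exceptions.append({"record": rec, "failed_checks": failures})
        else:
            passed.append(rec)
    return passed, exceptions

# Illustrative rules a data engineer might register once and reuse.
checks = [
    QualityCheck("non_null_id", lambda r: r.get("id") is not None),
    QualityCheck("positive_cost", lambda r: (r.get("cost") or 0) > 0),
]

rows = [
    {"id": 1, "cost": 120.0},
    {"id": None, "cost": 40.0},
    {"id": 3, "cost": -5.0},
]

passed, exceptions = run_checks(rows, checks)
print(len(passed), len(exceptions))  # → 1 2
```

The value of the pattern is reusability: engineering teams add predicates without touching the framework, and the exception feed becomes its own governed data product for remediation dashboards.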