

Ampstek
Digital Data Architect || Irving, TX (2 days/week)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Digital Data Architect in Irving, TX (2 days/week) on a 12-month contract, offering competitive pay. Requires 8+ years in data architecture, proficiency in Azure technologies, SQL, Python, and relevant certifications.
Country
United States
Currency
$ USD
Day rate
Unknown
Date
October 24, 2025
Duration
More than 6 months
Location
Hybrid
Contract
Unknown
Security
Unknown
Location detailed
Irving, TX
Skills detailed
#Data Engineering #Data Catalog #BI (Business Intelligence) #Compliance #ML (Machine Learning) #Data Modeling #NoSQL #Storage #Big Data #AWS (Amazon Web Services) #Synapse #Azure #AI (Artificial Intelligence) #GCP (Google Cloud Platform) #Microsoft Power BI #Databricks #Data Architecture #Kafka (Apache Kafka) #Scala #Python #Spark (Apache Spark) #Data Quality #Computer Science #Data Lake #Security #SQL (Structured Query Language) #Data Processing #Tableau #ETL (Extract, Transform, Load) #Azure Data Factory #Batch #Physical Data Model #Data Storage #Cloud #ADF (Azure Data Factory) #GDPR (General Data Protection Regulation)
Role description
Title :: Digital Data Architect
Location :: Irving, TX (2 days/week)
Job Type :: 12 Months Contract
Job Description: Digital Data Architect - Individual Contributor
Role Overview
We are looking for a hands-on Digital Data Architect to design and implement modern data architectures that enable analytics, AI/ML, and digital transformation. This role focuses on technical execution, ensuring scalable, secure, and high-performing data solutions across cloud and hybrid environments.
Key Responsibilities
• Design conceptual, logical, and physical data models for enterprise and domain-specific solutions.
• Architect data platforms leveraging Azure Synapse, Databricks, Data Lake, and Cosmos DB.
• Build robust ETL/ELT pipelines using Azure Data Factory, Event Hubs, and streaming technologies.
• Enable real-time and batch data processing for analytics and operational systems.
• Implement data cataloging, lineage, and quality frameworks using tools like Azure Purview.
• Ensure compliance with security and regulatory requirements (GDPR, CCPA).
• Optimize data storage, query performance, and cost efficiency across cloud environments.
• Collaborate with solution architects, data engineers, and analytics teams to deliver end-to-end solutions.
• Provide technical guidance and best practices for data platform adoption.
Required Skills & Qualifications
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• 8+ years in data engineering/architecture roles with strong hands-on expertise.
• Hands-on experience with cloud data platforms (Azure preferred; AWS/GCP a plus).
• Data modeling across relational, NoSQL, and big data stores.
• Experience with ETL/ELT and streaming frameworks.
• Proficiency in SQL, Python, and Spark.
• Experience with Azure Synapse, Databricks, Data Lake, Cosmos DB, Azure Data Factory, Event Hubs, and Kafka.
• Familiarity with BI tools such as Power BI or Tableau.
Preferred Qualifications
Experience with Data Mesh or Data Fabric concepts.
Familiarity with ML pipeline integration.
Certifications: Azure Data Engineer Associate, Azure Solutions Architect Expert.
KPIs and Success Metrics
Successful implementation of scalable data architectures within agreed timelines.
Reduction in data processing time and costs by X%.
Improved data quality and governance compliance (measured by audit scores).
Adoption rate of new data platforms and tools across teams.
Achievement of pipeline SLAs for batch and real-time data flows.





