AppXcelerate Solutions Pvt Ltd
Senior Data Architect
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Architect on a contract basis, paying $70.95 - $85.44 per hour. It requires expertise in data modeling, SQL, cloud platforms (Azure, AWS, GCP), and data governance, with a focus on delivering secure, scalable data architectures.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
680
-
🗓️ - Date
December 5, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, NY 10004
-
🧠 - Skills detailed
#Java #Vault #Snowflake #Data Vault #Scala #Migration #Batch #PCI (Payment Card Industry) #Data Lake #Data Governance #Metadata #Kafka (Apache Kafka) #Spark (Apache Spark) #Apache Iceberg #Cloud #Azure #SQL (Structured Query Language) #Storage #Programming #MDM (Master Data Management) #Scripting #AWS (Amazon Web Services) #Airflow #Trino #Infrastructure as Code (IaC) #Security #BI (Business Intelligence) #Databricks #Data Quality #Data Pipeline #Python #BigQuery #dbt (data build tool) #AI (Artificial Intelligence) #Data Modeling #ETL (Extract, Transform, Load) #GCP (Google Cloud Platform) #Compliance #ML (Machine Learning) #Terraform #Data Architecture
Role description
Role Overview
The Data Architect designs and governs enterprise data architectures across cloud and on-prem platforms, ensuring trusted, secure, scalable, and cost-efficient data for analytics, AI, and operational reporting.
Core Responsibilities
Define data architecture patterns (warehouse, lake/lakehouse, streaming, ODS).
Own enterprise data models (conceptual, logical, physical), canonical definitions, and metadata standards.
Architect data pipelines (batch + streaming): ingestion, transformation, enrichment, distribution (see the orchestration sketch after this list).
Establish data governance: cataloging, lineage, data quality, MDM, access controls.
Drive cloud/on-prem platform design and service selection for performance and cost efficiency.
Implement security & privacy by design (RBAC/ABAC, encryption, masking, retention).
Set standards for SQL, schema evolution, event design, orchestration, and CI/CD.
Partner with engineering and business teams to translate requirements into data interfaces.
Lead cloud/lakehouse migration and modernization initiatives.
Define SLOs/SLAs, monitor cost/performance, and drive continuous improvement.
Mentor teams and support architectural governance and best practices.
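To ground the pipeline and orchestration items above, here is a minimal sketch of a daily batch pipeline expressed as an Airflow DAG. It is illustrative only: the DAG id, task names, and callables are hypothetical placeholders, not part of this role's actual stack.

```python
# Minimal Airflow DAG sketch: a daily batch pipeline with an
# ingestion -> transformation -> data-quality gate.
# All names and callables are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_raw():
    """Land the day's source extracts in the raw zone."""
    ...


def transform_to_curated():
    """Apply canonical-model mappings and enrichment rules."""
    ...


def run_quality_checks():
    """Fail the run if row counts or null-rate thresholds are breached."""
    ...


with DAG(
    dag_id="daily_batch_pipeline",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_raw", python_callable=ingest_raw)
    transform = PythonOperator(task_id="transform", python_callable=transform_to_curated)
    quality = PythonOperator(task_id="quality_checks", python_callable=run_quality_checks)

    # Linear dependency chain; the quality gate guards the curated layer.
    ingest >> transform >> quality
```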
Required Skills
Data modeling expertise (3NF, dimensional, Data Vault).
Strong SQL and experience with distributed compute/storage.
Azure, Azure Data Lake, Databricks, Dell on-prem storage.
Experience with AWS/GCP, Snowflake, BigQuery, Starburst/Trino, Spark.
ETL/ELT orchestration (Airflow, dbt, cloud-native tools) and streaming systems (Kafka).
Proven data governance implementation (catalog, lineage, quality, MDM, access controls).
Strong security/compliance understanding (PII/PHI/PCI), encryption, auditability (see the masking sketch after this list).
Programming/scripting: Python (or Scala/Java).
Excellent communication and stakeholder management.
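To make the security and Spark expectations concrete, here is a minimal PySpark sketch of column-level PII masking applied before publishing a curated table. The paths and column names (email, ssn) are invented for illustration.

```python
# Minimal PySpark sketch: mask PII columns before publishing a
# curated table. Paths and column names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pii-masking-sketch").getOrCreate()

customers = spark.read.parquet("/data/raw/customers")  # hypothetical path

masked = customers.withColumn(
    # Replace the raw address with a stable, non-reversible token.
    "email", F.sha2(F.col("email"), 256)
).withColumn(
    # Expose only the last four digits, e.g. ***-**-1234.
    "ssn", F.concat(F.lit("***-**-"), F.substring(F.col("ssn"), -4, 4))
)

masked.write.mode("overwrite").parquet("/data/curated/customers")
```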
Preferred Skills
Lakehouse architectures, Apache Iceberg/Delta, data-sharing patterns (see the Iceberg sketch after this list).
Semantic layers, metadata-driven design, BI acceleration.
ML/AI data readiness (feature engineering, model data pipelines).
IaC (Terraform) and CI/CD for data platforms.
Cost optimization / FinOps.
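The lakehouse preference can be illustrated with Spark's DataFrameWriterV2, a standard way to create Apache Iceberg tables from PySpark. This sketch assumes a session already configured with an Iceberg catalog (via spark.sql.catalog.* settings); the catalog, schema, and table names are hypothetical.

```python
# Minimal sketch: publish a curated dataset as an Apache Iceberg
# table via Spark's DataFrameWriterV2. Assumes the session is
# configured with an Iceberg catalog; all names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("iceberg-sketch").getOrCreate()

orders = spark.read.parquet("/data/curated/orders")  # hypothetical path

(
    orders.writeTo("lakehouse.sales.orders")   # catalog.schema.table
    .using("iceberg")
    .partitionedBy(F.col("order_date"))        # identity partitioning
    .createOrReplace()
)
```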
Key Outcomes
Deliver a scalable, secure, well-governed data platform.
Establish data standards to improve interoperability.
Enable trusted analytics and AI with high-quality, discoverable data.
Job Type: Contract
Pay: $70.95 - $85.44 per hour
Expected hours: 40 per week
Work Location: In person