

Senior Data Architect
Featured Role | Apply directly with Data Freelance Hub
This is a contract role for a Senior Data Architect, offering competitive pay. It requires 14+ years of experience in data architecture, expertise in Databricks, Snowflake, and Azure Data Services, and strong Python skills. Azure certifications are preferred.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 19, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: Austin, TX
Skills detailed: #Microsoft Power BI #ADF (Azure Data Factory) #Terraform #dbt (data build tool) #Distributed Computing #Spark (Apache Spark) #Compliance #Data Engineering #Data Lake #ETL (Extract, Transform, Load) #Scala #DevOps #Storage #Data Processing #Azure #Databricks #Snowflake #Python #PySpark #SQL (Structured Query Language) #Automation #Big Data #Data Lakehouse #Synapse #Data Architecture #BI (Business Intelligence) #Data Quality #Data Integration #Data Pipeline #Airflow
Role description
Key Responsibilities:
- Design, develop, and maintain enterprise-grade data architecture and pipelines.
- Architect and optimize data lakehouse solutions using Databricks on Azure.
- Implement secure, efficient, and scalable data integration patterns using Snowflake and Azure Data Services (ADF, Data Lake, Synapse, etc.).
- Collaborate with cross-functional teams (Data Engineers, Analysts, DevOps, Product Owners) to gather requirements and deliver end-to-end data solutions.
- Ensure data quality, governance, and compliance across all solutions.
- Develop data models and optimize storage for performance and cost efficiency.
- Create automation scripts and frameworks using Python and Spark for data processing (see the sketch after this list).
- Conduct performance tuning and troubleshooting of data pipelines and queries.
- Evaluate and integrate new tools and technologies to continuously improve the data architecture.
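To ground the Databricks and PySpark responsibilities above, here is a minimal sketch of a lakehouse ingestion step. The storage path, the Delta table name silver.orders, and the order_id/order_date columns are assumptions for illustration, not details from the posting.

```python
# Minimal PySpark sketch of a lakehouse ingestion step on Databricks.
# Paths, table names, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_bronze_to_silver").getOrCreate()

# Read raw files landed in an Azure Data Lake Storage container (path is assumed).
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/orders/")

# Basic data-quality gates: drop rows missing the key and deduplicate on it.
clean = (
    raw.filter(F.col("order_id").isNotNull())
       .dropDuplicates(["order_id"])
       .withColumn("ingested_at", F.current_timestamp())
)

# Write to a Delta table partitioned by date for cheaper downstream scans.
(
    clean.write.format("delta")
         .mode("append")
         .partitionBy("order_date")
         .saveAsTable("silver.orders")
)
```

On recent Databricks runtimes Delta is already the default table format, so the explicit format("delta") call mainly documents intent.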
Must-Have Skills:
- 14+ years of experience in data architecture, data engineering, or data platform design.
- Expert-level hands-on experience with Databricks, Snowflake, and Azure Data Services.
- Strong command of Python and PySpark for ETL/ELT processes.
- Proficient in SQL and performance tuning for complex data sets (see the tuning sketch after this list).
- Deep understanding of big data architecture, distributed computing, and data lakehouse patterns.
- Experience building and managing CI/CD pipelines and infrastructure-as-code for data platforms.
- Strong problem-solving and communication skills.
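The SQL and performance-tuning expectations above often come down to a few recurring Spark patterns: pruning partitions early and broadcasting small dimension tables. The sketch below illustrates both; the table and column names (silver.orders, silver.customers, customer_id, region, amount) are assumptions.

```python
# Illustrative Spark tuning patterns; table and column names are assumed.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("query_tuning_example").getOrCreate()

orders = spark.table("silver.orders")        # large fact table (hypothetical)
customers = spark.table("silver.customers")  # small dimension table (hypothetical)

# Prune partitions early: filter on the partition column before joining.
recent = orders.filter(F.col("order_date") >= "2025-06-01")

# Broadcast the small dimension to avoid a shuffle-heavy sort-merge join.
joined = recent.join(F.broadcast(customers), "customer_id")

# Inspect the physical plan to confirm partition pruning and the broadcast join.
joined.explain()

revenue = joined.groupBy("region").agg(F.sum("amount").alias("revenue"))
```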
Preferred Qualifications:
- Azure certifications (e.g., Azure Data Engineer Associate, Azure Solutions Architect)
- Experience with tools such as Airflow, Terraform, dbt, or Power BI (a minimal Airflow example follows)
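For orchestration, the posting lists Airflow among the preferred tools. Below is a minimal DAG sketch assuming Airflow 2.4 or later; the DAG id, schedule, and task bodies are placeholders rather than anything specified in the role.

```python
# Minimal Airflow DAG sketch; DAG id, schedule, and task logic are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull source data")  # placeholder for a real extract step


def load():
    print("load into the warehouse")  # placeholder for a real load step


with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2025, 6, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run extract before load.
    extract_task >> load_task
```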