

Dash Technologies Inc.
Senior Data Architect
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Architect on a contract of unknown length at a day rate of $720 USD, based in Los Angeles, CA. Key skills include Azure Data Factory, Databricks, ETL processes, and strong experience in data governance. A Bachelor's degree and 7+ years of relevant experience are required.
Country
United States
Currency
$ USD
-
Day rate
720
-
Date
January 31, 2026
Duration
Unknown
-
Location
Unknown
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Los Angeles, CA
-
Skills detailed
#Azure cloud #Storage #Cloud #YAML (YAML Ain't Markup Language) #Computer Science #Data Quality #BI (Business Intelligence) #Data Engineering #Tableau #ADF (Azure Data Factory) #Scala #Big Data #GitHub #Data Loss Prevention #Data Security #Databricks #Data Modeling #Python #Data Integration #Security #Version Control #Terraform #DevOps #Data Lake #Spark (Apache Spark) #Data Pipeline #Deployment #Monitoring #R #Leadership #API (Application Programming Interface) #Azure #PySpark #Schema Design #Data Lineage #Compliance #Microsoft Power BI #Data Architecture #Data Warehouse #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Automation #Azure Data Factory
Role description
Job Overview
We are seeking an experienced Senior Data Engineer to design, build, and manage scalable enterprise data platforms. This role requires strong hands-on expertise in cloud-based data engineering, big data technologies, and modern data architecture, along with the ability to lead technical initiatives and collaborate across teams.
Key Responsibilities
• Design, develop, and maintain enterprise-level data pipelines and ETL/ELT processes
• Build and optimize Lakehouse and data warehouse architectures
• Develop scalable solutions using Azure cloud services
• Implement and manage APIs and data integrations
• Ensure data quality, security, governance, and compliance
• Perform performance tuning for OLAP/OLTP systems
• Lead technical design discussions and mentor junior engineers
• Implement CI/CD pipelines and infrastructure automation
• Support business intelligence and analytics platforms
Required Skills & Technologies
Cloud & Data Platforms
• Azure Data Factory, Azure Data Lake, Blob Storage, Azure Functions
• Databricks (Spark, PySpark)
• API Management (Apigee or equivalent)
Data Engineering & Big Data
• ETL/ELT, data pipelines, data modeling, schema design
• Lakehouse architectures
• SQL, Python (Scala or R is a plus)
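As a rough illustration of the ETL/ELT and data-quality work this role involves, here is a minimal transform-step sketch. In practice this logic would typically run as a PySpark job orchestrated by Azure Data Factory; plain Python is used so the example is self-contained, and all record fields and quality rules are hypothetical.

```python
# Hypothetical transform step: deduplicate on a primary key and apply
# basic data-quality rules, the kind of logic an ETL/ELT pipeline enforces.

def transform(records):
    """Drop rows failing quality checks, then deduplicate by id."""
    seen = set()
    out = []
    for r in records:
        # Data-quality rules (hypothetical): require an id, reject negative amounts.
        if r.get("id") is None or r.get("amount", 0) < 0:
            continue
        # Deduplicate on the primary key.
        if r["id"] in seen:
            continue
        seen.add(r["id"])
        out.append({"id": r["id"], "amount": round(float(r["amount"]), 2)})
    return out

raw = [
    {"id": 1, "amount": 10.5},
    {"id": 1, "amount": 10.5},    # duplicate row
    {"id": None, "amount": 3.0},  # missing primary key
    {"id": 2, "amount": -4.0},    # fails quality rule
    {"id": 3, "amount": 7.125},
]
clean = transform(raw)
```

In a real Databricks pipeline the same shape of logic would be expressed with DataFrame operations (`dropDuplicates`, `filter`) rather than a Python loop.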
Data Warehousing & BI
• Strong ERD and dimensional modeling concepts
• OLAP/OLTP systems
• BI tools such as Power BI or Tableau
Governance & Security
• RBAC / ABAC
• Data lineage, auditing, monitoring
• Data security, compliance, and data loss prevention
DevOps & Infrastructure
• GitHub version control
• CI/CD pipelines
• Terraform, YAML templates
• Script-based cloud deployments
Experience & Education
• 7+ years of experience applying Enterprise Architecture principles
• 5+ years in a lead or senior technical role
• 5+ years hands-on experience with:
  • Azure Data Factory & Databricks
  • API implementation and management
  • Python-based data pipelines
  • CI/CD, Terraform, and infrastructure automation
  • Data warehousing and BI integrations
• Bachelor's degree in IT, Computer Science, Engineering, or related field
Nice to Have
• Experience in large enterprise or public-sector environments
• Strong communication and stakeholder collaboration skills
• Ability to balance hands-on development with technical leadership






