

CloudIngest
Data Architect (Databricks) (REMOTE) (W2 Only) (USC & GC)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a W2 contract role for a Data Architect (Databricks) at $70/hr. It requires 13+ years of experience, expertise in Databricks and the Azure Data Platform, and strong skills in data modeling, ETL, and Spark technologies. The position is fully remote.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
560
-
🗓️ - Date
April 24, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Pipeline #Triggers #Data Security #Data Engineering #Data Modeling #Azure SQL #Data Migration #Compliance #Data Governance #Delta Lake #ETL (Extract, Transform, Load) #Data Strategy #Scala #SQL (Structured Query Language) #Azure #Apache Spark #BI (Business Intelligence) #Python #Security #Databricks #Strategy #Cloud #Azure Data Factory #Migration #Microsoft Power BI #Spark (Apache Spark) #Data Processing #ADF (Azure Data Factory) #Data Architecture
Role description
NOTE: Please note that this is a Data Architect position, not a Data Engineer role (13+ years of experience required).
Please send relevant profiles to srikanth@cloudingest.com
Role: Data Architect (Databricks)
Rate: $70/hr on W2
Location: Remote
Overview
We are seeking a highly skilled Data Architect to design, implement, and optimize scalable data architecture solutions for enterprise-level, data-driven projects. This role will focus on building modern data platforms using Databricks, Azure, and Spark technologies, enabling real-time analytics and robust data pipelines.
Key Responsibilities
• Architect and implement scalable data architecture solutions for enterprise systems
• Define and lead enterprise data strategy, standards, and governance frameworks
• Build real-time analytics models using Databricks for actionable insights
• Design and manage ETL/ELT pipelines using Azure Data Factory
• Optimize large-scale data processing using Apache Spark (Scala/Python)
• Implement Delta Lake and Adaptive Query Execution for performance optimization
• Ensure data security & compliance via dynamic data masking in Azure SQL DB
• Develop interactive dashboards and reports using Power BI
• Monitor pipelines, schedules, triggers, and logs for reliability
• Collaborate with cross-functional teams to enable data-driven decision making
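As an illustration of the Delta Lake and Adaptive Query Execution responsibilities above, a candidate would typically be comfortable with Databricks SQL along these lines (the table and column names here are hypothetical, chosen only for the sketch):

```sql
-- Enable Adaptive Query Execution so Spark re-optimizes join strategies
-- and shuffle partition counts at runtime
SET spark.sql.adaptive.enabled = true;
SET spark.sql.adaptive.coalescePartitions.enabled = true;

-- Create a Delta Lake table (hypothetical schema) for a curated layer
CREATE TABLE IF NOT EXISTS sales_curated (
  order_id   BIGINT,
  region     STRING,
  amount     DECIMAL(18, 2),
  order_date DATE
) USING DELTA
PARTITIONED BY (order_date);

-- Compact small files and co-locate rows on a frequently filtered column
OPTIMIZE sales_curated ZORDER BY (region);
```

The same pattern is available from Scala or Python via the Spark session, which matches the Scala/Python expectation in the responsibilities.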
Required Skills
• Strong experience with Databricks & Azure Data Platform
• Expertise in Data Modeling & Data Migration
• Hands-on with Spark, Scala, Python
• Experience with Azure Data Factory (ADF)
• Knowledge of Delta Lake & performance tuning techniques
• Strong understanding of data governance & security
• Proficiency in Power BI or similar BI tools
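For the data security and compliance bullet, dynamic data masking in Azure SQL Database is configured with T-SQL such as the following (table, column, and role names are hypothetical):

```sql
-- Mask all but the last four digits of a hypothetical SSN column;
-- users without the UNMASK permission see masked output
ALTER TABLE dbo.Customers
  ALTER COLUMN Ssn ADD MASKED WITH (FUNCTION = 'partial(0, "XXX-XX-", 4)');

-- Apply the built-in email mask to a hypothetical email column
ALTER TABLE dbo.Customers
  ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');

-- Grant a hypothetical reporting role the right to see unmasked values
GRANT UNMASK TO ReportingAdmins;
```

Masking applies at query time for unprivileged users, so downstream tools such as Power BI can read the same tables without exposing sensitive values.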
