Data Architect – Databricks, AWS & Snowflake - USC/GC Only

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Architect specializing in Databricks, AWS, and Snowflake, requiring 12+ years of experience. The contract length is unspecified, with a focus on data architecture, cloud data engineering, and analytics solutions. USC/GC only.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
August 19, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Santa Clara, CA
🧠 - Skills detailed
#Security #Snowflake #Data Pipeline #Leadership #Spark (Apache Spark) #AI (Artificial Intelligence) #Data Science #Data Quality #Visualization #S3 (Amazon Simple Storage Service) #Tableau #Data Engineering #MLflow #Data Integration #Data Management #BI (Business Intelligence) #AWS (Amazon Web Services) #Microsoft Power BI #Data Architecture #Metadata #Redshift #Data Storage #Compliance #Storage #Data Warehouse #Delta Lake #Scala #IAM (Identity and Access Management) #Databricks #ML (Machine Learning) #Data Governance #Cloud #Data Modeling #Python #ETL (Extract, Transform, Load) #Big Data #Lambda (AWS Lambda) #SQL (Structured Query Language) #Terraform #PySpark
Role description
We are seeking an experienced Data Architect with strong expertise in Databricks, AWS Cloud, and Snowflake to lead the design and implementation of modern data platforms. The ideal candidate will have deep knowledge of data architecture, cloud data engineering, and analytics solutions, with a proven track record of delivering large-scale data systems.

Status: USC/GC only

Key Responsibilities
• Design and architect end-to-end data solutions leveraging Databricks, AWS services, and Snowflake.
• Define data modeling, data integration, and data warehousing strategies to meet business needs.
• Develop scalable data pipelines and frameworks for ingestion, transformation, and analytics.
• Collaborate with business stakeholders, data engineers, and data scientists to ensure efficient data usage.
• Optimize data storage, query performance, and cost efficiency in AWS and Snowflake environments.
• Ensure data governance, security, compliance, and best practices across the data ecosystem.
• Provide technical leadership, mentoring, and guidance to engineering teams.
• Stay current with emerging trends in data architecture, cloud services, and big data technologies.

Required Skills & Experience
• 12+ years of experience in data architecture, data engineering, or related fields.
• Strong expertise in Databricks (PySpark, Delta Lake, MLflow, Unity Catalog).
• Hands-on experience with AWS cloud services: S3, Glue, Redshift, Lambda, EMR, IAM, CloudFormation/Terraform.
• Proficiency in Snowflake (data modeling, performance tuning, security, and governance).
• Expertise in ETL/ELT pipelines and data integration tools.
• Strong understanding of data warehouse and lakehouse architecture.
• Proficiency in SQL, Python, and Spark.
• Knowledge of data governance, metadata management, and data quality frameworks.
• Excellent communication, leadership, and stakeholder management skills.

Nice-to-Have
• Experience with modern BI/visualization tools (Tableau, Power BI, QuickSight).
• Exposure to machine learning and AI-based data pipelines.
• Certifications in AWS, Snowflake, or Databricks.