

HMG AMERICA LLC
Snowflake Data Architect
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Data Architect, a contract position in Los Angeles, CA, requiring expertise in data architecture, Snowflake, dbt, Apache Kafka, and AWS services. Strong SQL skills and experience with large datasets are essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 6, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Los Angeles Metropolitan Area
-
🧠 - Skills detailed
#Cloud #Data Modeling #S3 (Amazon Simple Storage Service) #dbt (data build tool) #Scala #Airflow #Apache Kafka #Data Architecture #Datasets #Data Pipeline #Data Quality #Apache Airflow #Data Processing #Data Analysis #AWS (Amazon Web Services) #IAM (Identity and Access Management) #EC2 #Snowflake #Security #Kafka (Apache Kafka) #Data Engineering #SQL (Structured Query Language)
Role description
Job Title: Snowflake Data Architect
Location: Los Angeles, CA (Onsite)
Type: Contract
Job Summary
We are seeking a highly skilled Snowflake Data Architect to design, build, and maintain scalable data platforms and pipelines. The ideal candidate will have strong expertise in modern data architecture, cloud-based analytics platforms, and real-time data processing technologies.
Key Responsibilities
• Design and implement end-to-end data architecture and data models for analytics and reporting use cases
• Build and optimize data pipelines using Snowflake and dbt
• Develop and maintain real-time and streaming data pipelines using Kafka
• Orchestrate workflows and manage dependencies using Apache Airflow
• Contribute to the overall Data & Insights platform architecture, ensuring scalability, reliability, and performance
• Collaborate with data analysts, product teams, and stakeholders to deliver high-quality data solutions
• Ensure data quality, governance, and security best practices across the platform
• Leverage AWS cloud services to build and manage data infrastructure
Required Skills & Qualifications
• Strong experience in Data Architecture and Data Modeling
• Hands-on expertise with Snowflake and dbt
• Solid experience with Apache Kafka for event streaming
• Proficiency in Apache Airflow for workflow orchestration
• Experience designing enterprise-scale Data & Insights platforms
• Working knowledge of AWS Cloud services (S3, EC2, IAM, etc.)
• Strong SQL skills and experience working with large datasets