Lead Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Architect; the contract length and pay rate are unspecified, and the position is remote. Key skills include SQL mastery, Snowflake expertise, and DBT proficiency. Data warehousing experience is required.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
August 12, 2025
πŸ•’ - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
πŸ“„ - Contract type
Unknown
-
πŸ”’ - Security clearance
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Data Modeling #Snowflake #Data Architecture #Data Integration #Data Analysis #SQL (Structured Query Language) #Data Quality #Debugging #Documentation #Quality Assurance #BI (Business Intelligence) #Datasets #Data Engineering #SQL Queries #Data Science #ETL (Extract, Transform, Load) #dbt (data build tool) #Scala #Data Pipeline #Cloud
Role description
Job Description – Data Engineer

We are looking for a highly skilled Senior Data Engineer to join our team and help us build innovative data solutions using Snowflake and DBT.

Key Responsibilities
• Data Pipeline Development: Design, develop, and maintain scalable data pipelines utilizing Snowflake and DBT.
• Data Modeling: Create and optimize efficient data models in Snowflake to support business intelligence and analytics requirements.
• ETL Processes: Implement ETL workflows to transform raw data into analytics-ready datasets using DBT.
• Performance Optimization: Tune Snowflake queries and DBT models for maximum performance and scalability.
• Data Integration: Seamlessly integrate Snowflake with diverse data sources and third-party tools.
• Collaboration: Partner with data analysts, data scientists, and other stakeholders to gather requirements and deliver impactful data solutions.
• Data Quality Assurance: Develop and enforce data quality checks to ensure accuracy, consistency, and reliability across pipelines.
• Documentation: Maintain detailed documentation of data models, transformation processes, and pipelines.

Required Skills & Qualifications
• SQL Mastery: Exceptional skills in writing, optimizing, and debugging complex SQL queries.
• Snowflake Expertise: Hands-on experience with Snowflake, including data modeling, query optimization, and system integration.
• DBT Proficiency: In-depth knowledge of DBT for data transformation, modeling, and workflow orchestration.
• Data Warehousing: Strong understanding of data warehousing concepts and methodologies, particularly in cloud environments.
• Analytical Thinking: Proven ability to analyze large, complex datasets and derive actionable insights.
• Problem Solving: Adept at identifying issues and implementing effective solutions with attention to detail.
• Collaboration & Communication: Excellent interpersonal and communication skills to work effectively with cross-functional teams.
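To give candidates a flavor of the ETL/modeling responsibilities above: in dbt, a transformation model is just a SELECT statement that dbt materializes as a table or view in Snowflake. A minimal sketch of a staging model (the source, table, and column names here are hypothetical, not from this posting):

```sql
-- models/staging/stg_orders.sql (hypothetical model)
-- dbt resolves {{ source(...) }} to the raw Snowflake table and
-- materializes this SELECT per the project's configuration.
with source as (

    select * from {{ source('raw', 'orders') }}

)

select
    order_id,
    customer_id,
    cast(amount as number(10, 2)) as amount_usd,
    order_date
from source
where order_id is not null
```

Downstream models would then reference this one with `{{ ref('stg_orders') }}`, which is how dbt builds its dependency graph for orchestration.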
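The data quality assurance responsibility above corresponds closely to dbt's built-in schema tests such as `not_null` and `unique`. A minimal, self-contained sketch in Python (using an in-memory SQLite table as a stand-in for a Snowflake source; the table and columns are hypothetical) of what those two checks compute:

```python
import sqlite3

# Hypothetical raw table standing in for a Snowflake source.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE raw_orders (order_id INTEGER, customer_id INTEGER, amount REAL)"
)
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 10, 99.5), (2, 11, 12.0), (2, 11, 12.0), (3, None, 7.25)],
)

def check_not_null(conn, table, column):
    """Like dbt's not_null test: count rows where the column is NULL."""
    (n,) = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()
    return n

def check_unique(conn, table, column):
    """Like dbt's unique test: count values appearing more than once."""
    (n,) = conn.execute(
        f"SELECT COUNT(*) FROM (SELECT {column} FROM {table} "
        f"GROUP BY {column} HAVING COUNT(*) > 1)"
    ).fetchone()
    return n

null_customers = check_not_null(conn, "raw_orders", "customer_id")  # → 1
duplicate_ids = check_unique(conn, "raw_orders", "order_id")        # → 1
```

A failing check (nonzero count) would halt or flag the pipeline, which is exactly how dbt treats a failing schema test.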