Aezion, Inc

Data Engineer/Architect–Snowflake

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer/Architect specializing in Snowflake, requiring 12+ months onsite in Frisco, TX. Key skills include Snowflake, DBT, Airbyte, Airflow, and Kimball modeling. Advanced SQL and Python proficiency are essential.
🌎 - Country
United States
💱 - Currency
Unknown
💰 - Day rate
Unknown
🗓️ - Date
December 3, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Frisco, TX 75034
🧠 - Skills detailed
#Data Engineering #BI (Business Intelligence) #ETL (Extract, Transform, Load) #Programming #Azure #GCP (Google Cloud Platform) #Data Quality #Data Pipeline #Python #Microsoft Power BI #Documentation #Data Warehouse #Agile #Scala #Data Modeling #Airflow #AWS (Amazon Web Services) #Security #SQL (Structured Query Language) #Compliance #Scripting #Data Governance #Data Integration #dbt (data build tool) #Tableau #Databases #Cloud #Snowflake
Role description
Title: Data Engineer/Architect – Snowflake
Location: Frisco, TX (Onsite)
Duration: 12+ months, long term

About the Role
We are seeking a highly skilled Snowflake Data Engineer with expertise in DBT, Airbyte, and Airflow, combined with strong experience in Kimball dimensional modeling. The ideal candidate will design and implement scalable data pipelines, integrate diverse data sources, and build a robust data warehouse that supports business intelligence and analytics initiatives.

Key Responsibilities

Data Integration & Extraction
o Develop and maintain ETL/ELT pipelines using Airbyte and Airflow.
o Extract data from multiple sources, including APIs, direct database connections, and flat files.
o Ensure data quality, consistency, and reliability across all ingestion processes.

Data Modeling & Warehousing
o Design and implement Kimball-style dimensional models in Snowflake.
o Build and optimize fact and dimension tables to support analytical workloads.
o Collaborate with business teams to define and maintain the bus matrix for subject areas.

Transformation & Orchestration
o Use DBT to develop modular, testable, and version-controlled transformations.
o Implement data quality checks and documentation within DBT workflows.

Collaboration & Governance
o Work closely with business stakeholders to understand requirements and translate them into technical solutions.
o Ensure compliance with data governance, security, and privacy standards.

Required Skills & Qualifications

Technical Expertise
o Strong proficiency in Snowflake architecture and performance tuning.
o Hands-on experience with DBT, Airbyte, and Airflow.
o Solid understanding of the Kimball methodology for data warehousing.

Programming & Querying
o Advanced SQL skills and familiarity with Python for ETL scripting.
o Experience integrating data from APIs and relational databases.

Soft Skills
o Excellent communication and collaboration skills.
o Ability to work in an agile environment and manage multiple priorities.

Preferred Qualifications
o Experience with cloud platforms (AWS, Azure, or GCP).
o Familiarity with BI tools (e.g., Tableau, Power BI).
o Knowledge of data governance and security best practices.

Job Type: Contract
Work Location: In person
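To give candidates a feel for the Kimball-style modeling this role calls for, here is a minimal sketch in plain Python of the core idea: assigning surrogate keys to a dimension and keying fact rows on them. All names here (build_dim, dim_customer, fact_orders, customer_sk) are hypothetical illustrations; in the actual role this would be SQL DDL and DBT models in Snowflake.

```python
# Illustrative sketch of Kimball dimensional modeling: a dimension table
# with surrogate keys, and a fact table that references those keys.
# Names and structures are hypothetical, not a specific Snowflake schema.

def build_dim(rows, natural_key):
    """Deduplicate source rows and assign a surrogate key per natural key."""
    dim, sk_by_nk = [], {}
    for row in rows:
        nk = row[natural_key]
        if nk not in sk_by_nk:
            sk_by_nk[nk] = len(dim) + 1  # monotonically increasing surrogate key
            dim.append({"sk": sk_by_nk[nk], **row})
    return dim, sk_by_nk

def build_fact(events, sk_by_nk, natural_key):
    """Replace each event's natural key with the dimension surrogate key."""
    return [
        {"customer_sk": sk_by_nk[e[natural_key]], "amount": e["amount"]}
        for e in events
    ]

source_customers = [
    {"customer_id": "C1", "name": "Acme"},
    {"customer_id": "C2", "name": "Globex"},
    {"customer_id": "C1", "name": "Acme"},  # duplicate row in the source feed
]
orders = [
    {"customer_id": "C1", "amount": 100.0},
    {"customer_id": "C2", "amount": 250.0},
]

dim_customer, sk_map = build_dim(source_customers, "customer_id")
fact_orders = build_fact(orders, sk_map, "customer_id")
```

The separation mirrors the star schema the posting describes: dimensions carry descriptive attributes behind stable surrogate keys, while facts stay narrow and join back to them for analysis.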
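The "data quality checks within DBT workflows" responsibility refers to tests such as not_null and unique that dbt runs against model columns. The sketch below expresses the same two checks in plain stdlib Python so the logic is visible; in actual dbt work these are declared in YAML schema files, and the function names here are hypothetical.

```python
# Sketch of dbt-style generic data quality checks (not_null, unique),
# expressed as plain Python over lists of row dicts. A check "fails"
# when it returns a non-empty list of offending rows or values.

def not_null(rows, column):
    """Return rows where the column is missing or NULL."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """Return values that appear more than once in the column."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

staged_orders = [
    {"order_id": 1, "status": "shipped"},
    {"order_id": 2, "status": None},    # would fail a not_null(status) test
    {"order_id": 2, "status": "open"},  # would fail a unique(order_id) test
]

null_failures = not_null(staged_orders, "status")
dupe_failures = unique(staged_orders, "order_id")
```

Returning the failing rows rather than a boolean matches how dbt reports test results: a failing test surfaces the offending records for triage rather than just a pass/fail flag.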