

Senior Data Engineer (Snowflake)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer (Snowflake) with 10–15 years of experience, offering a 12+ month contract at $75 to $90/hr. It requires expertise in Python, SQL, Snowflake, and DBT, with a hybrid work location in Phoenix, AZ or Raleigh, NC.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
760
🗓️ - Date discovered
June 18, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Phoenix, AZ
🧠 - Skills detailed
#Data Quality #Scala #Cloud #Azure #Data Analysis #GCP (Google Cloud Platform) #Git #Data Warehouse #Airflow #Fivetran #SQL (Structured Query Language) #Data Engineering #AWS (Amazon Web Services) #dbt (data build tool) #Documentation #Data Science #ETL (Extract, Transform, Load) #Version Control #Data Pipeline #Data Governance #Snowflake #Data Lineage #Data Processing #Compliance #Security #Data Modeling #Python
Role description
Job Title: Senior Data Engineer
Location: Hybrid – Phoenix, AZ / Raleigh, NC (4 days per week)
Experience Level: 10–15 Years
Employment Type: 12+ Month Contract
Pay Rate: $75 to $90/hr
Job Overview:
We are seeking a highly experienced Senior Data Engineer to join our growing data team. The ideal candidate will have 10–15 years of experience in data engineering with a strong focus on Python, SQL, Snowflake, and DBT. This role requires a deep understanding of building robust, scalable data pipelines in a cloud-based, modern data stack environment.
Key Responsibilities:
Design, build, and maintain scalable and reliable data pipelines to support analytics, reporting, and data science initiatives.
Develop and optimize DBT (Data Build Tool) models for efficient data transformation and data lineage.
Work with Snowflake to design schemas, optimize queries, and manage cloud-based data warehouses.
Write efficient, reusable, and testable Python and SQL code for ETL/ELT processes (an illustrative sketch follows this list).
Collaborate with cross-functional teams including data analysts, scientists, and business stakeholders to understand data needs and deliver effective solutions.
Ensure data quality, consistency, and integrity across all data platforms.
Implement data governance and best practices for security, privacy, and compliance.
Troubleshoot and debug production issues in data workflows and pipelines.
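The sketch below is a rough illustration of the Python-based ETL/ELT and data quality work described above: a simple row-count validation against a Snowflake table using the snowflake-connector-python library. The connection settings, warehouse, database, schema, table name, and threshold are illustrative placeholders, not details from this posting.

```python
# Minimal sketch: a Python data quality check against Snowflake.
# All connection details and object names below are illustrative placeholders.
import os

import snowflake.connector


def check_row_count(table: str, min_rows: int = 1) -> bool:
    """Fail fast if a freshly loaded table is unexpectedly empty."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",   # placeholder warehouse
        database="ANALYTICS",       # placeholder database
        schema="STAGING",           # placeholder schema
    )
    try:
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table}")  # table name is a trusted placeholder
        (row_count,) = cur.fetchone()
        return row_count >= min_rows
    finally:
        conn.close()


if __name__ == "__main__":
    ok = check_row_count("ORDERS_STG", min_rows=1000)
    print("row count check passed" if ok else "row count check FAILED")
```

In practice a check like this would typically run as a post-load step in the pipeline, alongside DBT tests, so that downstream models never build on an empty or partially loaded table.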
Required Skills & Qualifications:
10–15 years of hands-on experience in data engineering roles.
Strong experience with DBT, including building modular SQL-based transformation models on cloud data warehouses such as Snowflake.
Proven expertise in Snowflake data warehouse architecture and management.
Advanced proficiency in Python and SQL for data processing.
Experience with modern data pipeline and orchestration tools.
Solid understanding of data modeling, performance tuning, and best practices in cloud data engineering.
Familiarity with CI/CD processes and version control tools like Git.
Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment.
Preferred Qualifications:
Experience with tools such as Airflow, Fivetran, or Cloud Composer (see the orchestration sketch after this list).
Familiarity with cloud platforms such as AWS, GCP, or Azure.
Experience in financial, healthcare, or retail data environments (optional).
Excellent communication and documentation skills.
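As a hedged sketch of the orchestration tooling mentioned above, the Airflow DAG below schedules a daily dbt deps / run / test sequence with BashOperator tasks. The DAG name, dbt project path, target, and schedule are assumptions made for illustration and are not specified by this role.

```python
# Minimal sketch: an Airflow DAG that runs dbt models on a daily schedule.
# The dbt project path, target, and schedule below are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_build",          # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_deps = BashOperator(
        task_id="dbt_deps",
        bash_command="cd /opt/dbt/analytics && dbt deps",  # placeholder project path
    )
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --target prod",
    )

    # Resolve packages, build models, then validate them before downstream use.
    dbt_deps >> dbt_run >> dbt_test
```

Running dbt tests as a dedicated task after the build keeps data quality failures visible in the orchestrator, so alerting and retries can be handled in one place rather than inside the transformation code.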