VeriiPro
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with 8+ years of experience, focusing on AWS, Snowflake, and DBT. The contract length is unspecified and the pay rate is competitive. Key skills include ETL/ELT development, data governance, and data architecture principles.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 15, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Data Pipeline #Snowflake #Data Science #dbt (data build tool) #Data Architecture #Python #Airflow #AWS (Amazon Web Services) #S3 (Amazon Simple Storage Service) #Compliance #Data Governance #Redshift #ML (Machine Learning) #Security #Automation #Data Engineering #Documentation #Version Control #Leadership #Vault #Batch #ETL (Extract, Transform, Load) #Cloud #Data Vault #SQL (Structured Query Language) #Deployment #Lambda (AWS Lambda) #GIT #Agile #BI (Business Intelligence) #Migration #Scala #Data Analysis #Monitoring #IAM (Identity and Access Management)
Role description
Job Description
We are seeking a highly skilled Architect / Senior Data Engineer to design, build, and optimise our modern data ecosystem. The ideal candidate will have deep experience with AWS cloud services, Snowflake, and DBT, along with a strong understanding of scalable data architecture, ETL/ELT development, and data modelling best practices.
Responsibilities
• Architect, design, and implement scalable, reliable, and secure data solutions using AWS, Snowflake, and DBT.
• Develop end-to-end data pipelines (batch and streaming) to support analytics, machine learning, and business intelligence needs.
• Lead the modernisation and migration of legacy data systems to cloud-native architectures.
• Define and enforce data engineering best practices, including coding standards, CI/CD, testing, and monitoring.
• Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and translate them into technical solutions.
• Optimise Snowflake performance through query tuning, warehouse sizing, and cost management (a minimal sketch of this kind of work follows the list).
• Establish and maintain data governance, security, and compliance standards across the data platform.
• Mentor and guide junior data engineers, providing technical leadership and direction.
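A minimal sketch of what the Snowflake tuning responsibility above can look like day to day, using the snowflake-connector-python package. The posting does not prescribe this approach; the connection values, the SYSADMIN role, and the TRANSFORM_WH warehouse name are all placeholders.

import os
import snowflake.connector

# Connect with a role that can read ACCOUNT_USAGE and alter warehouses.
# Account, user, and warehouse names here are placeholders.
conn = snowflake.connector.connect(
    account="my_account",
    user="data_engineer",
    password=os.environ["SNOWFLAKE_PASSWORD"],
    role="SYSADMIN",
    warehouse="TRANSFORM_WH",
)
try:
    cur = conn.cursor()
    # Surface the ten slowest queries of the past week as tuning candidates.
    cur.execute("""
        SELECT query_id, warehouse_name, total_elapsed_time / 1000 AS seconds
        FROM snowflake.account_usage.query_history
        WHERE start_time > DATEADD('day', -7, CURRENT_TIMESTAMP())
        ORDER BY total_elapsed_time DESC
        LIMIT 10
    """)
    for query_id, warehouse, seconds in cur.fetchall():
        print(f"{query_id} on {warehouse}: {seconds:.1f}s")
    # If a transform workload is consistently slow, resize its warehouse;
    # cost management means sizing it back down once the load passes.
    cur.execute("ALTER WAREHOUSE transform_wh SET warehouse_size = 'LARGE'")
finally:
    conn.close()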
Required Skills & Qualifications
• 8+ years of experience in Data Engineering, with at least 3 years in a cloud-native data environment.
• Hands-on expertise in AWS services such as S3, Glue, Lambda, Step Functions, Redshift, and IAM.
• Strong experience with Snowflake – data modelling, warehouse design, performance optimisation, and cost governance.
• Proven experience with DBT (data build tool) – model development, documentation, and deployment automation.
• Proficient in SQL, Python, and ETL/ELT pipeline development.
• Experience with CI/CD pipelines, version control (Git), and workflow orchestration tools (Airflow, Dagster, Prefect, etc.); a minimal Airflow sketch follows this list.
• Familiarity with data governance and security best practices, including role-based access control and data masking.
• Strong understanding of data modelling techniques (Kimball, Data Vault, etc.) and data architecture principles.
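To make the orchestration and DBT expectations above concrete, here is a minimal sketch of an Airflow DAG that runs and then tests dbt models on a daily schedule. It assumes Airflow 2.4+ with dbt installed on the worker, and the /opt/dbt project path is a placeholder; a Dagster or Prefect equivalent would follow the same shape.

from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # "schedule" requires Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Build the dbt models, then run their tests; a failed build blocks the tests.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )
    dbt_run >> dbt_test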
Preferred Qualifications
• AWS Certification (e.g., AWS Certified Data Analytics – Specialty, Solutions Architect).
• Strong communication and collaboration skills, with a track record of working in agile environments.