

Intellectt Inc
Sr. AWS Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. AWS Data Engineer with a contract length of [Specify Duration]. The pay rate is [Specify Rate]. Key skills include Snowflake, dbt, and Fivetran, along with 15+ years of retail industry experience. The work location is remote.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 11, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Python #Data Warehouse #Data Transformations #Data Quality #Data Ingestion #AWS S3 (Amazon Simple Storage Service) #Fivetran #Redshift #DevOps #Lambda (AWS Lambda) #Airflow #Data Engineering #S3 (Amazon Simple Storage Service) #Scala #Security #ETL (Extract, Transform, Load) #Data Lake #Datasets #Data Governance #Data Analysis #SQL (Structured Query Language) #AWS (Amazon Web Services) #BI (Business Intelligence) #Data Pipeline #IAM (Identity and Access Management) #Data Modeling #dbt (data build tool) #Cloud #Deployment #Snowflake
Role description
Job Title: Sr. AWS Data Engineer
Location: Remote
Must have: Snowflake, dbt, and Fivetran experience, along with 15+ years of retail industry background
Job Summary
Albertsons is seeking a highly skilled AWS Data Engineer with strong hands-on experience in Snowflake, dbt, and Fivetran to build and optimize modern cloud data pipelines and analytics platforms. The ideal candidate will have deep expertise in designing scalable ELT/ETL frameworks on AWS and in implementing data transformation workflows with dbt while leveraging Fivetran for data ingestion.
Key Responsibilities
• Design, develop, and maintain scalable data pipelines using AWS services
• Implement ELT processes using Fivetran for ingesting data from multiple sources into Snowflake
• Develop and manage data transformation models using dbt
• Build and tune Snowflake data warehouse architecture and schemas for performance
• Work with large datasets and ensure data quality, reliability, and integrity
• Develop reusable data models and transformation frameworks
• Monitor, troubleshoot, and optimize data workflows and performance
• Collaborate with data analysts, BI teams, and stakeholders to deliver data solutions
• Implement data governance, security, and best practices
• Automate deployment using CI/CD pipelines
Required Skills
• 7+ years of experience in Data Engineering
• Strong hands-on experience with AWS (S3, Glue, Lambda, Redshift, IAM, CloudWatch)
• 3+ years of experience with Snowflake
• Strong experience with dbt for data transformations
• Hands-on experience with Fivetran or similar ELT tools
• Expertise in SQL and performance optimization
• Experience building data lakes and data warehouses
• Knowledge of Python for data engineering tasks
• Experience with orchestration tools like Airflow
• Understanding of data modeling (Star/Snowflake schema)
• Experience with CI/CD and DevOps practices
• Excellent troubleshooting and performance tuning skills
Preferred Skills
• Experience in Retail / Grocery domain (highly preferred for Albertsons)
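For candidates reviewing the data modeling requirement above, a minimal sketch of the star schema pattern the role refers to: one fact table joined to narrow dimension tables for analytics queries. This is illustrative only, using SQLite in place of Snowflake; all table and column names are hypothetical, not from the posting.

```python
import sqlite3

# In-memory SQLite database stands in for a Snowflake warehouse (illustrative).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Star schema: a central fact table referencing small dimension tables.
cur.executescript("""
CREATE TABLE dim_store (store_id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    store_id INTEGER REFERENCES dim_store(store_id),
    product_id INTEGER REFERENCES dim_product(product_id),
    amount REAL
);
INSERT INTO dim_store VALUES (1, 'West'), (2, 'East');
INSERT INTO dim_product VALUES (10, 'Grocery'), (11, 'Dairy');
INSERT INTO fact_sales VALUES
    (100, 1, 10, 19.99), (101, 1, 11, 4.50), (102, 2, 10, 7.25);
""")

# Typical BI query shape: join the fact table to its dimensions and aggregate.
rows = cur.execute("""
    SELECT s.region, p.category, ROUND(SUM(f.amount), 2)
    FROM fact_sales f
    JOIN dim_store s USING (store_id)
    JOIN dim_product p USING (product_id)
    GROUP BY s.region, p.category
    ORDER BY s.region, p.category
""").fetchall()
print(rows)  # [('East', 'Grocery', 7.25), ('West', 'Dairy', 4.5), ('West', 'Grocery', 19.99)]
```

In a dbt project, each transformation step (staging, dimensions, facts) would typically live in its own SQL model rather than in one script; the schema shape itself is the same.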






