

Mastech Digital
Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with a contract length of "unknown," offering a pay rate of "$X/hour." Key skills include 10+ years of experience in data warehousing, ELT/ETL pipelines, AWS services, and strong programming in Python or Java.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
October 11, 2025
Duration
Unknown
-
Location
Unknown
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Smithfield, RI
-
Skills detailed
#Data Modeling #Snowflake #Java #AWS (Amazon Web Services) #Data Mart #Cloud #Ansible #S3 (Amazon Simple Storage Service) #IAM (Identity and Access Management) #Jenkins #Data Engineering #DevOps #Computer Science #Docker #Kubernetes #Data Analysis #Python #Vault #EC2 #Agile #Programming #Kanban #Maven #SQL (Structured Query Language) #Scrum #ETL (Extract, Transform, Load) #Data Vault
Role description
Bachelor's or Master's degree in a technology-related field (e.g., Engineering, Computer Science) required, with 10+ years of experience
6+ years of experience with data warehousing and data mart concepts and implementations
4+ years of experience developing ELT/ETL pipelines to move data to and from a Snowflake data store
4+ years of experience using AWS services such as EC2, IAM, S3, EKS, KMS, SMS, CloudWatch, CloudFormation, etc.
4+ years of experience with object-oriented programming languages (strong programming skills required in Python or Java)
Passion for data analysis, with the ability to navigate and master complex transactional and warehouse databases
Hands-on experience with SQL query optimization and performance tuning preferred
Experience with job scheduling tools (Control-M preferred)
Advanced SQL/SnowSQL knowledge preferred
Strong data modeling skills with either Dimensional or Data Vault models
Experience with container technologies such as Docker and Kubernetes
Experience with DevOps and Continuous Integration/Continuous Delivery tooling (Maven, Jenkins, Stash, Ansible, Docker)
Experience with Agile methodologies (Kanban and Scrum) a plus