

Sr. Data Engineer Python / Snowflake / SQL / DBT / CI/CD - 100% Remote
Featured Role | Apply directly with Data Freelance Hub
This role is a 6-month remote contract for a Sr. Data Engineer with expertise in Python, Snowflake, SQL, and CI/CD. Key skills include data pipeline development, GitHub CI/CD setup, and experience with Docker, Kubernetes, and dbt.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 13, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Fixed Term
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Data Accuracy #dbt (data build tool) #Kubernetes #Data Engineering #Deployment #Python #Monitoring #Snowflake #Data Pipeline #Docker #SQL (Structured Query Language) #GitHub #Programming #Airflow #Data Ingestion #ETL (Extract, Transform, Load)
Role description
We have a 6-month contract position (with possible extension) for a Sr. Data Engineer. The position is fully remote.
As a Sr. Data Engineer, you will collaborate with the team to build data pipelines and to orchestrate and manage the Snowflake data warehouse. You will play a key role in ensuring that existing pipelines meet their SLAs and in integrating CI/CD practices into our operations.
Responsibilities:
• Set up and administer the Snowflake environment, including role-based access control (roles and users), warehouses, and resource monitors.
• Develop GitHub CI/CD pipelines and deploy to development and production environments; set up CI/CD pipelines with Kubernetes and Docker.
• Write and maintain the scripts needed for data ingestion, orchestrated with Airflow.
• Develop testing frameworks to validate data accuracy and consistency at each layer.
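The last responsibility above, validating accuracy and consistency between pipeline layers, can be sketched as a minimal check in plain Python. The in-memory "layers" and the `id` key are hypothetical stand-ins; in practice each check would run a query against the corresponding Snowflake table.

```python
# Minimal sketch of layer-to-layer validation checks. The in-memory rows
# stand in for query results from raw and staged tables (hypothetical).

def row_count(layer: list[dict]) -> int:
    """Count rows in a layer (stand-in for SELECT COUNT(*))."""
    return len(layer)

def check_consistency(raw: list[dict], staged: list[dict]) -> list[str]:
    """Return a list of validation failures between two layers."""
    failures = []
    # Consistency check: no rows gained or lost across the transform.
    if row_count(raw) != row_count(staged):
        failures.append(
            f"row count mismatch: raw={row_count(raw)} staged={row_count(staged)}"
        )
    # Accuracy spot check: key identifiers must survive the transform.
    raw_ids = {r["id"] for r in raw}
    staged_ids = {r["id"] for r in staged}
    if raw_ids != staged_ids:
        failures.append(f"missing ids: {sorted(raw_ids - staged_ids)}")
    return failures
```

A testing framework like this would typically run after each load and fail the Airflow task when the returned list is non-empty.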
Required Skills:
• Hands-on experience with Snowflake, including administration of role-based access control and monitoring.
• Hands-on experience setting up GitHub CI/CD pipelines and development and production environments.
• Experience with Docker and Kubernetes is a plus.
• Hands-on experience developing dbt models.
• Ability to document end-to-end data flow, test cases, and deployments.
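As an illustration of the role-based access control setup named above, a small helper can assemble the Snowflake bootstrap statements. The role, warehouse, and user names are hypothetical examples; in a real pipeline these statements would be executed through the Snowflake connector or a migration tool rather than built as strings.

```python
# Sketch: assemble Snowflake RBAC bootstrap statements as strings.
# Role/warehouse/user names are hypothetical illustrations.

def rbac_statements(role: str, warehouse: str, user: str) -> list[str]:
    """Return DDL/GRANT statements to bootstrap a role and warehouse."""
    return [
        f"CREATE ROLE IF NOT EXISTS {role};",
        f"CREATE WAREHOUSE IF NOT EXISTS {warehouse} "
        f"WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60;",
        f"GRANT USAGE ON WAREHOUSE {warehouse} TO ROLE {role};",
        f"GRANT ROLE {role} TO USER {user};",
    ]
```

Keeping the statements in one place makes the grants reviewable in a pull request, which fits the GitHub CI/CD workflow the role describes.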
Preferred Qualifications:
• Proficiency in programming languages such as Python.
• Proficiency in SQL.
• Strong analytical and problem-solving skills.
• Familiarity with data models and ELT/ETL processes.
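The ELT/ETL familiarity listed above amounts to the extract, transform, and load steps shown in this toy pass over in-memory records. The sample rows and the `warehouse` list are invented for illustration only.

```python
# Toy ETL pass: extract in-memory records, transform them, load into a target.
# The sample rows are invented for illustration.

def extract() -> list[dict]:
    """Stand-in for pulling rows from a source system."""
    return [{"name": " Alice ", "amount": "10"}, {"name": "Bob", "amount": "5"}]

def transform(rows: list[dict]) -> list[dict]:
    """Normalize whitespace and cast string amounts to integers."""
    return [{"name": r["name"].strip(), "amount": int(r["amount"])} for r in rows]

def load(rows: list[dict], target: list[dict]) -> None:
    """Stand-in for writing rows to a warehouse table."""
    target.extend(rows)

warehouse: list[dict] = []
load(transform(extract()), warehouse)
```

In an ELT variant, as is common with Snowflake and dbt, the raw rows would be loaded first and the transform step would run inside the warehouse.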