

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This is a remote Data Engineer role; contract length and pay rate are unspecified. Key skills include Snowflake management, CI/CD with GitHub, Airflow automation, and dbt development. Python and SQL proficiency is preferred.
Country: United States
Currency: $ USD
Day rate: Unknown
Date discovered: June 16, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: California, United States
Skills detailed
#Data Ingestion #GitHub #Kubernetes #Airflow #Scala #Snowflake #SQL (Structured Query Language) #Data Pipeline #Data Accuracy #dbt (data build tool) #Python #Monitoring #ETL (Extract, Transform, Load) #Automation #Docker #Data Engineering #Deployment #Documentation
Role description
Elevait Solutions is hiring a Data Engineer to build and manage scalable data pipelines and lead the orchestration of our Snowflake data platform. If you're passionate about data infrastructure, automation, and engineering excellence, we want to hear from you!
What You'll Do
• Build and manage data pipelines with strong SLAs.
• Administer and monitor Snowflake (RBAC, warehouses, usage).
• Develop CI/CD pipelines using GitHub, Docker, and Kubernetes.
• Automate data ingestion workflows via Airflow.
• Design and maintain dbt models.
• Write test cases to ensure data accuracy and consistency across layers.
• Document full data flows, test strategies, and deployment processes.
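The testing responsibility above can be illustrated with a small cross-layer consistency check. This is a minimal sketch in plain Python; the layer names, key column, and row shape are assumptions for the example, not details from the role.

```python
# Hypothetical cross-layer consistency check: compare a staging layer against
# a downstream mart layer on row counts and key coverage. Rows are modeled as
# plain dicts; in practice these would come from warehouse queries.

def check_layer_consistency(staging_rows, mart_rows, key="id"):
    """Return a list of human-readable failures; an empty list means the layers agree."""
    failures = []
    if len(staging_rows) != len(mart_rows):
        failures.append(
            f"row count mismatch: staging={len(staging_rows)} mart={len(mart_rows)}"
        )
    staging_keys = {row[key] for row in staging_rows}
    mart_keys = {row[key] for row in mart_rows}
    missing = staging_keys - mart_keys
    if missing:
        failures.append(f"keys missing from mart: {sorted(missing)}")
    return failures

staging = [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}]
mart = [{"id": 1, "amount": 10}]
print(check_layer_consistency(staging, mart))
```

Checks like this can run as dbt tests or as standalone assertions in an Airflow task after each load.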
Must-Have Skills
• Strong hands-on experience with Snowflake, especially RBAC and warehouse monitoring.
• Proven ability to set up GitHub-based CI/CD pipelines.
• Working knowledge of Docker; Kubernetes experience is a bonus.
• Experience building robust workflows using Airflow.
• Solid development experience with dbt.
• Excellent documentation habits.
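A GitHub-based CI/CD pipeline of the kind listed above might look like the sketch below: a GitHub Actions workflow that runs dbt tests on every pull request. The repository layout, secret names, and `ci` target are assumptions for illustration, not details from the posting.

```yaml
# Hypothetical GitHub Actions workflow: run dbt against a CI target on PRs.
name: data-pipeline-ci
on:
  pull_request:
    branches: [main]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Install dependencies
        run: pip install dbt-snowflake
      - name: Build and test dbt models against a CI schema
        env:
          SNOWFLAKE_ACCOUNT: ${{ secrets.SNOWFLAKE_ACCOUNT }}  # assumed secret name
        run: dbt build --target ci
```

Gating merges on this job keeps untested model changes out of the main branch.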
Nice-to-Have
• Proficient in Python and SQL.
• Understanding of data models, ETL/ELT pipelines, and analytical workflows.
• Strong analytical mindset with a problem-solving attitude.
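The Python-and-SQL combination above can be sketched as a tiny ETL pass. This example uses the standard library's sqlite3 as a stand-in for a warehouse; the table, columns, and transform rule are made up for illustration.

```python
# Hypothetical ETL sketch: extract raw records, drop invalid rows (a light
# transform), load into a SQL table, then query the loaded layer.
import sqlite3

def run_etl(raw_orders):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    # Transform: keep only rows with a positive amount.
    cleaned = [(o["id"], o["amount"]) for o in raw_orders if o["amount"] > 0]
    # Load.
    conn.executemany("INSERT INTO orders VALUES (?, ?)", cleaned)
    # A simple analytical query over the loaded layer.
    total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
    conn.close()
    return total

raw = [{"id": 1, "amount": 9.5}, {"id": 2, "amount": -1.0}, {"id": 3, "amount": 0.5}]
print(run_etl(raw))  # 10.0
```

The same extract/transform/load shape carries over directly to Snowflake plus dbt, with the transform step expressed as SQL models instead of Python.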