

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a contract length of "unknown" and a pay rate of "$XX/hour." Work location is "remote." Key skills include Snowflake, Databricks, dbt, SQL, and Python. Experience in foodservice or similar industries is preferred.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 11, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: Charlotte Metro
Skills detailed: #Data Integration #Scala #ETL (Extract, Transform, Load) #Automation #AWS (Amazon Web Services) #Python #Cloud #Databricks #Data Processing #Agile #Data Pipeline #GCP (Google Cloud Platform) #Scrum #SQL (Structured Query Language) #Data Engineering #Snowflake #Data Modeling #Datasets #Azure #dbt (data build tool)
Role description
As part of ongoing data transformation efforts, we're seeking a Data Engineer to help build and optimize data pipelines for critical business applications for one of our leading clients. This role is hands-on, focusing on implementing modern data engineering solutions that streamline data flows, enhance internal operations, and improve customer-facing experiences.
You'll work with tools like Snowflake, Databricks, dbt, and Fivetran to develop and maintain scalable data pipelines, collaborating closely with senior engineers and business stakeholders.
Key Responsibilities:
• Data Pipeline Development: Build and optimize end-to-end data pipelines using Snowflake, Databricks, dbt, and Fivetran, ensuring data is structured and ready for business applications.
• Data Transformation & Modeling: Develop dbt models to transform raw data into structured datasets for analytics and operational use.
• System Optimization: Support the design and implementation of efficient data structures that improve internal data flows and performance.
• Collaboration & Communication: Work with cross-functional teams, including senior engineers, analysts, and business stakeholders, to ensure data solutions meet business needs.
• Code Quality & Best Practices: Write clean, efficient SQL and Python code, following best practices for data engineering, testing, and performance optimization.
• Agile Development: Participate in Scrum ceremonies, contributing to sprint planning, standups, and retrospectives.
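To make the "raw data into structured datasets" responsibility concrete, here is a minimal sketch in plain Python of the kind of staging/aggregation step a dbt model would typically express in SQL. The field names (sku, qty, unit_cost) are illustrative only, not taken from the client's actual schema:

```python
# Hypothetical sketch: aggregate raw order rows into a per-SKU summary,
# the sort of raw -> structured transformation a dbt model performs.
from collections import defaultdict

def build_unit_cost_summary(raw_orders):
    """Roll up raw order rows into total quantity and average unit cost per SKU."""
    totals = defaultdict(lambda: {"qty": 0, "cost": 0.0})
    for row in raw_orders:
        totals[row["sku"]]["qty"] += row["qty"]
        totals[row["sku"]]["cost"] += row["qty"] * row["unit_cost"]
    return {
        sku: {"qty": t["qty"], "avg_unit_cost": t["cost"] / t["qty"]}
        for sku, t in totals.items()
    }

# Illustrative raw rows (as Fivetran might land them from a source system)
raw = [
    {"sku": "BUN-01", "qty": 10, "unit_cost": 0.25},
    {"sku": "BUN-01", "qty": 30, "unit_cost": 0.50},
    {"sku": "PATTY-02", "qty": 5, "unit_cost": 1.25},
]
summary = build_unit_cost_summary(raw)
```

In practice this logic would live as SQL in a dbt model materialized in Snowflake or Databricks, with Fivetran landing the raw rows upstream.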
Skills & Experience:
• Hands-on data engineering experience building data pipelines.
• Strong knowledge of Snowflake and Databricks for data processing and warehousing.
• Experience with dbt for creating transformations and building reusable data models.
• Familiarity with Fivetran for data integration and pipeline automation.
• Proficiency in SQL and data modeling, with a focus on optimizing query performance.
• Experience with cloud platforms (AWS, GCP, or Azure) and ETL processes.
• Ability to troubleshoot data pipeline issues and optimize performance.
• Strong problem-solving skills and the ability to work both independently and collaboratively.
• Experience with Agile/Scrum methodologies.
Preferred:
• Experience with Python for automation or additional data engineering tasks.
• Knowledge of business-specific data concepts such as unit costs, SKUs, and menu data.
• Previous experience in foodservice or a similar industry.

This is an exciting opportunity to grow your expertise in modern data engineering while working on impactful projects within a collaborative team environment!