Harvey Nash

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in Charlotte, NC, requiring 5+ years of experience, strong skills in Snowflake, dbt, Python, and SQL. The contract is onsite, 5 days/week, with a focus on data pipeline development and optimization.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
520
-
🗓️ - Date
April 23, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#Scala #Data Engineering #SQL (Structured Query Language) #Data Pipeline #Python #BI (Business Intelligence) #Data Quality #Datasets #ETL (Extract, Transform, Load) #Data Architecture #Snowflake #dbt (data build tool)
Role description
Job Title: Data Engineer
Location: Charlotte, NC (Onsite, 5 days/week)
Note: This position is open exclusively to candidates who do not require sponsorship.

Role Overview
We are seeking a skilled Data Engineer with over 5 years of hands-on experience building, optimizing, and maintaining scalable data pipelines and data infrastructure. The ideal candidate is comfortable working in a fast-paced, high-ownership environment and capable of making informed decisions while proactively identifying gaps and asking the right questions.

Key Responsibilities
• Design, develop, and maintain robust, scalable data pipelines.
• Work extensively with modern data stack tools, including Snowflake, dbt, Python, and SQL.
• Build and optimize data models to support analytics and business intelligence use cases.
• Ensure data quality, integrity, and reliability across systems.
• Collaborate closely with analytics, product, and business teams to understand data requirements.
• Troubleshoot data issues and implement efficient solutions in a timely manner.
• Continuously improve data architecture and workflows for performance and scalability.

Required Skills & Qualifications
• 5+ years of experience in Data Engineering or a related field.
• Strong proficiency in Snowflake, dbt, Python, and SQL.
• Solid understanding of data warehousing concepts and ETL/ELT processes.
• Experience working with large datasets and optimizing query performance.
• Ability to thrive in a fast-paced environment with minimal supervision.
• Strong problem-solving skills and a proactive mindset.
• Excellent communication skills, with the ability to ask the right questions and clarify requirements.

Work Environment
• This is an in-office role requiring presence 5 days a week.
• The role demands adaptability, quick decision-making, and ownership in a dynamic setting.

If you are interested, please send your resume to zulker.ali@harveynash.com.