Gardner Resources Consulting, LLC

Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with 10+ years of experience, focused on building scalable data solutions. Contract length and pay rate are unspecified. Expertise in Snowflake, Python, and AI/ML data ecosystems is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 10, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dallas-Fort Worth Metroplex
-
🧠 - Skills detailed
#dbt (data build tool) #AI (Artificial Intelligence) #ML (Machine Learning) #Cloud #Data Science #Python #Datasets #Security #Data Pipeline #Strategy #Data Strategy #Azure #Data Engineering #Snowflake #Data Transformations #ETL (Extract, Transform, Load) #Data Governance #SnowSQL #AWS (Amazon Web Services) #GCP (Google Cloud Platform) #Scala
Role description
We are seeking a Senior Data Engineer with 10+ years of experience to support a growing Data & Analytics team. This role will focus on building scalable, modern data solutions that enable advanced analytics and AI/ML use cases.

Key Responsibilities:
• Design, build, and optimize data pipelines and workflows across modern data platforms.
• Leverage Snowflake, cloud-native solutions, and dbt to deliver scalable data transformations (an illustrative sketch follows this description).
• Develop and maintain high-quality, production-ready datasets to support AI/ML initiatives.
• Collaborate with cross-functional stakeholders, including Data Scientists and Engineers, to deliver integrated data solutions.
• Mentor junior team members and contribute to best practices in data engineering.
• Participate in architecture discussions and provide input on data strategy and platform evolution.

Must-Haves:
• 10+ years of experience in data engineering, with strong expertise in data warehousing and modern data platforms (Snowflake preferred).
• Proficiency with SnowSQL, Python, and dbt for building and maintaining data pipelines.
• Experience designing AI/ML-ready data ecosystems and supporting advanced analytics.
• Proven success leading cross-functional initiatives and mentoring engineering teams.
• Strong understanding of cloud environments (AWS, Azure, or GCP).
• Excellent communication and collaboration skills.

Nice-to-Have:
• Experience with MLOps tools and practices.
• Familiarity with data governance and security best practices.
• Background in healthcare, CPG, or regulated industries.
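For a rough sense of the stack named above, here is a minimal sketch of a Python-driven Snowflake transformation using the snowflake-connector-python package. It assumes credentials supplied via environment variables; the warehouse, schema, and table names (ANALYTICS_WH, raw_orders, stg_orders) are hypothetical placeholders, not details from this posting.

```python
import os

import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical example: connection details come from environment variables;
# warehouse, database, schema, and table names are placeholders.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Deduplicate a raw landing table into an analysis-ready staging table,
    # keeping only the latest record per order_id.
    cur.execute(
        """
        CREATE OR REPLACE TABLE stg_orders AS
        SELECT order_id, customer_id, order_ts, order_total
        FROM raw_orders
        QUALIFY ROW_NUMBER() OVER (
            PARTITION BY order_id ORDER BY order_ts DESC
        ) = 1
        """
    )
finally:
    conn.close()
```

In practice, a transform like this would more likely be expressed as a dbt model and run on a schedule rather than executed ad hoc; the snippet is only meant to indicate the Snowflake + Python + SQL work described in the responsibilities.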