Gardner Resources Consulting, LLC

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with 10+ years of experience, focusing on scalable data solutions. Contract length is "unknown," with a pay rate of "unknown." Key skills include Snowflake, Python, dbt, and cloud environments (AWS, Azure, GCP).
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 21, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dallas-Fort Worth Metroplex
-
🧠 - Skills detailed
#Data Engineering #Python #GCP (Google Cloud Platform) #ETL (Extract, Transform, Load) #Data Transformations #Snowflake #AWS (Amazon Web Services) #Data Governance #Cloud #Azure #SnowSQL #Security #Strategy #AI (Artificial Intelligence) #ML (Machine Learning) #Data Strategy #Scala #Data Science #dbt (data build tool) #Data Pipeline #Datasets
Role description
We are seeking a Senior Data Engineer with 10+ years of experience to support a growing Data & Analytics team. This role will focus on building scalable, modern data solutions that enable advanced analytics and AI/ML use cases.

Key Responsibilities:
• Design, build, and optimize data pipelines and workflows across modern data platforms.
• Leverage Snowflake, cloud-native solutions, and dbt to deliver scalable data transformations.
• Develop and maintain high-quality, production-ready datasets to support AI/ML initiatives.
• Collaborate with cross-functional stakeholders, including Data Scientists and Engineers, to deliver integrated data solutions.
• Mentor junior team members and contribute to best practices in data engineering.
• Participate in architecture discussions and provide input on data strategy and platform evolution.

Must-Haves:
• 10+ years of experience in data engineering, with strong expertise in data warehousing and modern data platforms (Snowflake preferred).
• Proficiency with SnowSQL, Python, and dbt for building and maintaining data pipelines.
• Experience designing AI/ML-ready data ecosystems and supporting advanced analytics.
• Proven success in leading cross-functional initiatives and mentoring engineering teams.
• Strong understanding of cloud environments (AWS, Azure, or GCP).
• Excellent communication and collaboration skills.

Nice-to-Haves:
• Experience with MLOps tools and practices.
• Familiarity with data governance and security best practices.
• Background in healthcare, CPG, or regulated industries.