Pioneer IT Systems

Data Engineering Support Lead

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineering Support Lead with 12+ years of experience, remote (Charlotte, NC; EST hours preferred). Key skills include Snowflake, SQL, Airflow, dbt, Fivetran, AWS, and Python. U.S. citizenship or Green Card required. Pay rate: unknown; contract length: unknown.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 22, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#Complex Queries #Monitoring #Data Engineering #AWS (Amazon Web Services) #Data Ingestion #Airflow #Fivetran #Debugging #Python #Data Pipeline #dbt (data build tool) #Triggers #ETL (Extract, Transform, Load) #Lambda (AWS Lambda) #Snowflake #AWS Lambda #SQL (Structured Query Language)
Role description
Job Title: Data Engineering Support Lead
Location: Charlotte, NC (Remote, EST hours preferred)
Experience Required: 12+ years
Visa: U.S. citizens or Green Card holders only

Job Overview:
We are seeking a Data Engineering Support Lead with deep experience in modern data stack tools and ETL/ELT pipelines. The ideal candidate will have a strong background in Snowflake, SQL, Airflow, dbt, Fivetran, and AWS, with hands-on troubleshooting and production support expertise.

Core Technical Skills (Required):
• Snowflake: Querying, monitoring query history, task scheduling, data validation
• SQL: Strong ability to write, debug, and optimize complex queries
• Airflow: DAG dependencies, task retries, scheduling, manual triggers
• dbt: Running models, debugging transformation errors, understanding project structure
• Fivetran: Connector monitoring, log review, manual refreshes
• Python: Ability to read and modify scripts used in ETL jobs or Lambda functions
• AWS Lambda: Reviewing logs and execution results for event-based jobs
• ETL/ELT: Strong understanding of data ingestion and transformation flows

Representative Support Scenarios (illustrative sketches follow below):
• Investigate Snowflake table/view refresh failures and review logs
• Check and re-run Airflow DAGs or dbt models
• Validate ingestion via Fivetran and confirm source loads using SQL
• Review ETL/ELT job dependencies and manually trigger failed runs
• Identify and resolve timing or sequencing issues in scheduled jobs

Preferred Background:
• Strong analytical and troubleshooting mindset
• Experience supporting data pipelines in production environments
• Excellent communication and coordination skills across teams
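For the Snowflake refresh-failure scenario, a minimal sketch using snowflake-connector-python and Snowflake's built-in TASK_HISTORY table function is shown below. The environment variable names, warehouse, and database are placeholders, not values from this posting.

```python
# Sketch: list recently failed Snowflake tasks via TASK_HISTORY.
# All connection values are placeholders, not real credentials.
import os
import snowflake.connector

FAILED_TASKS_SQL = """
    SELECT name, state, error_message, scheduled_time
    FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY(RESULT_LIMIT => 100))
    WHERE state NOT IN ('SUCCEEDED', 'SCHEDULED', 'EXECUTING')
    ORDER BY scheduled_time DESC
"""

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],   # placeholder env vars
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="SUPPORT_WH",                    # hypothetical warehouse
    database="ANALYTICS",                      # hypothetical database
    schema="PUBLIC",
)
try:
    for name, state, error, scheduled in conn.cursor().execute(FAILED_TASKS_SQL):
        print(f"{scheduled} {name}: {state} - {error}")
finally:
    conn.close()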
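For re-running a failed Airflow DAG, one option is the Airflow 2.x stable REST API; a hedged sketch follows. The base URL, credentials, and dag_id are hypothetical.

```python
# Sketch: trigger a new run of a failed Airflow DAG via the Airflow 2.x
# stable REST API. Base URL, credentials, and dag_id are placeholders.
import requests

AIRFLOW_BASE = "https://airflow.example.com/api/v1"  # hypothetical host
AUTH = ("support_user", "change_me")                 # placeholder basic auth

def trigger_dag(dag_id: str) -> dict:
    """POST a new DAG run; returns the API's dag_run payload."""
    resp = requests.post(
        f"{AIRFLOW_BASE}/dags/{dag_id}/dagRuns",
        json={"conf": {}},  # empty run config
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    run = trigger_dag("daily_snowflake_refresh")  # hypothetical dag_id
    print(run["dag_run_id"], run["state"])
```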
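For the dbt side of the same scenario, a support script might re-run a failed model and its downstream dependents with the dbt CLI; the model name below is hypothetical, and the script assumes dbt is on PATH.

```python
# Sketch: re-run a single failed dbt model plus everything downstream of it.
# Run from inside the dbt project directory; the model name is hypothetical.
import subprocess

def rerun_model(model: str) -> None:
    # "--select model+" selects the model and all downstream dependents.
    result = subprocess.run(
        ["dbt", "run", "--select", f"{model}+"],
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    if result.returncode != 0:
        # dbt prints compilation/runtime errors to stdout; surface them.
        raise RuntimeError(f"dbt run failed for {model}:\n{result.stderr}")

rerun_model("fct_orders")  # hypothetical model name
```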
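For validating ingestion via Fivetran, a sketch against the Fivetran REST API is below; the connector id and API credentials are placeholders, and confirming the source load would follow with a SQL row-count check like the Snowflake snippet above.

```python
# Sketch: check a Fivetran connector's sync status via the Fivetran REST API.
# Connector id and credentials are placeholders.
import os
import requests

FIVETRAN_AUTH = (os.environ["FIVETRAN_API_KEY"], os.environ["FIVETRAN_API_SECRET"])
CONNECTOR_ID = "bellowing_snail"  # hypothetical connector id

resp = requests.get(
    f"https://api.fivetran.com/v1/connectors/{CONNECTOR_ID}",
    auth=FIVETRAN_AUTH,
    timeout=30,
)
resp.raise_for_status()
data = resp.json()["data"]
print("sync state:", data["status"]["sync_state"])
print("last success:", data["succeeded_at"])
print("last failure:", data["failed_at"])
```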
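Finally, for reviewing AWS Lambda logs and execution results, a sketch using boto3 and CloudWatch Logs is shown; the function name is a placeholder, and AWS credentials are assumed to come from the environment.

```python
# Sketch: scan the last hour of an AWS Lambda function's CloudWatch logs
# for error-like messages. The function name is a placeholder.
import time
import boto3

logs = boto3.client("logs")
FUNCTION_NAME = "fivetran-postload-hook"  # hypothetical Lambda function

events = logs.filter_log_events(
    logGroupName=f"/aws/lambda/{FUNCTION_NAME}",
    filterPattern="?ERROR ?Exception ?Timeout",  # match any of these terms
    startTime=int((time.time() - 3600) * 1000),  # epoch millis, one hour back
)
for event in events["events"]:
    print(event["timestamp"], event["message"].strip())
```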