Brooksource

ETL Developer / Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an ETL Developer/Data Engineer on a 6-month contract in Charlotte, NC, offering competitive pay. It requires 5+ years of experience in data integration, advanced SQL, Python, Airflow, and cloud infrastructure, preferably with media/advertising expertise.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 17, 2026
-
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte Metro
-
🧠 - Skills detailed
#Scripting #ML (Machine Learning) #Docker #Scala #Teradata #Agile #Data Extraction #SnowSQL #Triggers #Data Pipeline #Data Mart #Libraries #Airflow #S3 (Amazon Simple Storage Service) #Version Control #MongoDB #Data Architecture #Data Engineering #Snowflake #Automation #Data Mapping #Databases #AI (Artificial Intelligence) #ETL (Extract, Transform, Load) #Data Quality #Data Warehouse #Documentation #AWS (Amazon Web Services) #Scrum #Data Integration #GitHub #Data Modeling #Monitoring #Cloud #SQL (Structured Query Language) #Python
Role description
POSITION OVERVIEW

Brooksource is seeking Data Engineers / ETL Developers for our Fortune 100 telecommunications client in Charlotte, NC. As a Data Engineer, you will be an integral member of the Data Operations Architecture team, supporting an advertising/media organization that powers large-scale, high-visibility marketing and advertising campaigns.

Your primary responsibility is to design and develop reusable components, code, automation frameworks, and supporting documentation. You will maintain and enhance existing scripts and frameworks that support continuous delivery pipelines, ensuring rapid and reliable product availability. The role requires close collaboration with your Scrum team (Scrum Master, Product Owner, Lead Architect, Developers, and QA) to understand business requirements and deliver scalable, accurate, and efficient data integration solutions.

KEY RESPONSIBILITIES
• Design and develop reusable components, code, and documentation for automation frameworks
• Collaborate with Product Owners and the Scrum Master to define data extraction and transformation needs
• Develop data mapping documentation between source and target systems
• Write and analyze complex SQL for data extraction and processing
• Design, develop, and deploy data movement workflows and pipelines
• Extract and transform data from multiple sources, loading it into a cloud data warehouse
• Monitor pipeline performance to ensure stability and data quality
• Create, schedule, and maintain data pipelines using Airflow and Python (see the sketch after this description)
• Build and test ETL pipelines supporting the data warehouse and data marts
• Validate data quality through comprehensive testing approaches (see the row-count check after this description)
• Resolve escalated incidents and code-level issues in coordination with operations teams

MINIMUM REQUIREMENTS
5+ years of experience with:
• Building data integration and workflow solutions, including ETL pipelines for data warehousing
• Data architecture: enterprise data warehousing concepts, SQL development and optimization, data integration
• Advanced SQL scripting and querying: tables, complex stored procedures, triggers, views, indexes, UDFs
• Writing Python scripts using relevant frameworks and libraries
• Creating, scheduling, and monitoring workflows (Airflow preferred)
• GitHub for version control (hands-on)
• ETL performance testing and data validation
• Agile methodologies (Scrum, SAFe)
• MPP and NoSQL databases (Teradata, Netezza, Snowflake, MongoDB, etc.)
• Cloud infrastructure (AWS preferred)
Strong team collaboration skills; adaptable and a fast learner.

NICE TO HAVE
• Experience with media/advertising MSO data and applications
• Experience with Snowflake and SnowSQL
• Mentoring and guiding others in data integration
• Experience with S3 buckets
• Docker experience
• Semantic modeling or advanced data modeling knowledge
• AI/ML familiarity

WORK ENVIRONMENT
Company: Fortune 100 telecommunications
Hours: Standard 40-hour work week
Dress code: Business casual
Schedule: Hybrid (4 days onsite / 1 day remote)
Location: Charlotte, NC

TECHNICAL ENVIRONMENT
You will work with a modern data stack including Snowflake, AWS (Airflow, S3), Python, SQL, and GitHub. The team follows Agile/Scrum methodologies with CI/CD pipelines and is exploring cutting-edge technologies, including AI-powered data pipeline automation and advanced semantic modeling.
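To make the Airflow responsibility above concrete, here is a minimal sketch of a daily pipeline that copies staged S3 files into Snowflake. It assumes Airflow 2.4+ and the snowflake-connector-python package; the DAG id, table, stage, and credentials are hypothetical placeholders, not the client's actual environment.

```python
# A minimal sketch, not the client's actual pipeline. Assumes Airflow 2.4+
# and snowflake-connector-python; every name below is a placeholder.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

import snowflake.connector  # pip install snowflake-connector-python


def load_campaign_data(**context):
    """Copy staged campaign files from S3 into a Snowflake raw table."""
    conn = snowflake.connector.connect(
        account="example_account",    # hypothetical credentials; in practice
        user="example_user",          # these come from a secrets backend,
        password="example_password",  # never from source code
        warehouse="ETL_WH",
        database="MARKETING",
        schema="RAW",
    )
    try:
        # COPY INTO pulls staged files into the target table; @campaign_stage
        # is a hypothetical external stage pointing at an S3 bucket.
        conn.cursor().execute(
            "COPY INTO raw_campaign_events FROM @campaign_stage "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
    finally:
        conn.close()


with DAG(
    dag_id="campaign_events_daily",
    start_date=datetime(2026, 1, 1),
    schedule="@daily",  # run once per day
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    PythonOperator(task_id="load_campaign_data",
                   python_callable=load_campaign_data)
```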
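And a similarly hedged sketch of one basic data-quality check from the responsibilities list: reconciling row counts between a source extract and its warehouse target. It works with any DB-API 2.0 cursor; the table names are whatever your pipeline uses.

```python
def row_counts_match(source_cursor, target_cursor,
                     source_table: str, target_table: str) -> bool:
    """Return True when the source and target tables hold the same row count."""
    # Both cursors follow the DB-API 2.0 interface (e.g. snowflake-connector,
    # teradatasql), so the same check works across source and target systems.
    source_cursor.execute(f"SELECT COUNT(*) FROM {source_table}")
    target_cursor.execute(f"SELECT COUNT(*) FROM {target_table}")
    return source_cursor.fetchone()[0] == target_cursor.fetchone()[0]
```

In practice this is one check among many; null-rate, referential-integrity, and distribution checks would sit alongside it in the pipeline's validation step.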
WHY JOIN THIS TEAM
• Work on large-scale data integration challenges powering major advertising campaigns
• Exposure to emerging AI/ML technologies in data engineering
• Opportunity for contract extension based on performance
• Collaborative team environment with experienced data leaders
• Modern tech stack with continuous learning opportunities
• Potential exposure to merger integration work

BENEFITS OF WORKING WITH BROOKSOURCE
• Direct communication with hiring managers for a faster interview process
• Transparent communication throughout the hiring process
• Dedicated support from an experienced recruiting team
• Established relationship with the client for smoother onboarding