

Brooksource
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer/ETL Engineer on a long-term W2 rolling contract-to-hire basis, paying up to $75/hour. Based three days per week on-site in Greenwood Village, CO, it requires strong SQL, AWS, and Apache Airflow skills, plus experience building end-to-end data pipelines.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
🗓️ - Date
April 30, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Greenwood Village, CO
-
🧠 - Skills detailed
#Batch #BI (Business Intelligence) #Python #Data Pipeline #Databases #Leadership #AI (Artificial Intelligence) #Code Reviews #Storage #GitLab #Apache Airflow #AWS (Amazon Web Services) #Spark (Apache Spark) #Data Engineering #Airflow #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Scala #S3 (Amazon Simple Storage Service) #Observability #Cloud #AWS S3 (Amazon Simple Storage Service) #Kafka (Apache Kafka)
Role description
Senior Data Engineer/ETL Engineer
Long-term W2 rolling contract-to-hire
3 days on-site in Greenwood Village
$75/hour maximum pay
Overview
Our client is seeking an ETL/Data Engineer to join their High Speed Data Engineering team, part of the broader WiFi & Network Data organization. This role sits upstream of WiFi analytics and focuses on large‑scale, end‑to‑end data pipelines that power visibility into our client's national high‑speed data network.
What You’ll Do
• Build and support end‑to‑end ETL pipelines from source ingestion (including Kafka streams) through transformation and storage in AWS (S3 and databases)
• Own pipelines that help analyze and improve network performance across the full HSD ecosystem (latency, upload/download speeds, reliability)
• Work primarily in AWS, leveraging SQL‑driven transformations and Python‑based workflows
• Develop and orchestrate pipelines using Apache Airflow
• Manage and optimize large‑scale databases supporting analytics and reporting teams
• Participate in a structured on‑call rotation during regular business hours only. Rotation is one work week every four weeks (Monday–Friday)
• Contribute to a highly collaborative engineering culture with GitLab‑based code reviews and shared ownership
Partner closely with:
• WiFi analytics teams downstream
• Business analytics and product teams using your data to inform network and product decisions
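The responsibilities above center on Kafka-to-S3 style ETL with SQL-driven transformations and Python-based workflows. As a rough illustration only (the posting does not specify the actual telemetry schema or tooling; field names such as node_id, latency_ms, and download_mbps are hypothetical), the transform step of such a pipeline might look like:

```python
from collections import defaultdict

def aggregate_node_metrics(records):
    """Roll raw speed-test records up into per-node averages.

    Illustrative sketch only: field names (node_id, latency_ms,
    download_mbps) are assumptions, not the client's real schema.
    """
    # Accumulate running sums and counts per network node
    sums = defaultdict(lambda: {"latency_ms": 0.0, "download_mbps": 0.0, "count": 0})
    for rec in records:
        node = sums[rec["node_id"]]
        node["latency_ms"] += rec["latency_ms"]
        node["download_mbps"] += rec["download_mbps"]
        node["count"] += 1
    # Emit per-node averages for downstream analytics/reporting
    return {
        node_id: {
            "avg_latency_ms": agg["latency_ms"] / agg["count"],
            "avg_download_mbps": agg["download_mbps"] / agg["count"],
        }
        for node_id, agg in sums.items()
    }

sample = [
    {"node_id": "den-01", "latency_ms": 12.0, "download_mbps": 940.0},
    {"node_id": "den-01", "latency_ms": 18.0, "download_mbps": 900.0},
    {"node_id": "den-02", "latency_ms": 25.0, "download_mbps": 480.0},
]
print(aggregate_node_metrics(sample))
```

In a production setup like the one described here, logic of this shape would typically run as an Airflow task downstream of a Kafka ingestion step, with results landed in S3 or a warehouse table for the analytics teams.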
Team & Environment
• Join a tight‑knit, collaborative team of ~7 engineers
• Heavy emphasis on code quality, peer reviews, and knowledge sharing
• High volume, high scale data sets—this team supports reporting for the entire network
• Strong leadership that actively encourages growth and career development
• Increased adoption of AI‑assisted tooling within data workflows
Required Qualifications
• Senior‑level experience as an ETL / Data Engineer
• Strong SQL skills (this is a core requirement)
• Hands‑on cloud experience (AWS strongly preferred)
• Proven experience building production‑grade, end‑to‑end data pipelines
• Experience with Apache Airflow
• Comfort working with streaming and batch data sources (e.g., Kafka → S3)
• Experience supporting data pipelines used by analytics and business stakeholders
Preferred / Nice to Have
• Python for data engineering and pipeline logic
• Familiarity with additional AWS services, such as queueing and orchestration tools
• Exposure to or interest in AI‑enabled data tooling (e.g., accelerating SQL, pipeline development, or observability)
• Background working with high‑volume network or telemetry data
• Experience collaborating with analytics, product, or business intelligence teams
• (Note: Minimal Spark usage; no Scala required)
Interview Process
1. Initial SQL screening exercise
2. Live SQL interview (45 minutes):
• ~25 minutes hands‑on SQL
• Remainder focused on projects and background
3. Final in‑person panel interview (1 hour)
Why This Role:
• Work on critical infrastructure data that impacts how millions of customers experience the internet
• Help define and scale a new and expanding HSD data scope
• Strong leadership, collaborative culture, and opportunities to grow into broader architectural ownership
Disclaimer: Brooksource, Medasource, and Calculated Hire are part of the Eight Eleven Group family of companies and operate under Eight Eleven Group, LLC. All employees receive the same benefits, policies, and terms of employment.
EEO:
We are committed to creating an inclusive environment for all employees and applicants. We do not discriminate on the basis of race, color, religion, creed, sex, sexual orientation, gender identity or expression, national origin, ancestry, age, disability, genetic information, marital status, military or veteran status, citizenship, pregnancy (including childbirth, lactation, and related conditions), or any other protected status in accordance with applicable federal, state, and local laws.
Benefits & Perks:
Brooksource offers competitive medical, dental, vision, Health Savings Account, Dependent Care FSA, and supplemental coverage with plans that can fit each employee’s needs. We offer a 401k plan that includes a company match and is fully vested after you become eligible, paid time off, sick time, and paid company holidays. We also offer an Employee Assistance Program (EAP) that provides services like virtual counseling, financial services, legal services, life coaching, etc.
Pay Disclaimer:
The pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.






