

Brooksource
Senior Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer on a 12-month W2 contract, paying competitively, located in Charlotte, NC or Denver, CO. Key skills include Prefect, AWS, Python, Terraform, and GCP. A Bachelor's degree in Computer Science is required.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: October 1, 2025
Duration: More than 6 months
Location: Hybrid
Contract: W2 Contractor
Security: Unknown
Location detailed: Charlotte, NC
Skills detailed: #Athena #Computer Science #Python #Data Science #Visualization #Scala #Cloud #Automation #JavaScript #AWS (Amazon Web Services) #Lambda (AWS Lambda) #Datasets #S3 (Amazon Simple Storage Service) #Data Engineering #ML (Machine Learning) #ETL (Extract, Transform, Load) #Redshift #Data Processing #Scripting #Storage #Data Integrity #Data Pipeline #Terraform #Deployment #GCP (Google Cloud Platform) #Data Storage #Infrastructure as Code (IaC)
Role description
We are seeking an experienced Advanced Data Engineer to join our team, focusing on supporting and enhancing our data pipelines, visualizations, and analytics capabilities. The ideal candidate will have a robust background in data engineering and analytics, with a deep understanding of modern data technologies and frameworks. This role demands a strong technical skill set, the ability to work collaboratively with cross-functional teams, and a keen eye for detail.
Key Responsibilities
• Design, develop, and maintain scalable data pipelines using Prefect to ensure efficient data flow across systems.
• Implement and optimize data storage solutions using AWS services, including Athena, for high-performance querying and analysis.
• Utilize Python for scripting and automation to enhance data processing workflows and integrations.
• Employ Terraform for infrastructure as code, ensuring reliable and consistent deployment of data-related resources.
• Develop CI/CD pipelines to automate testing, integration, and deployment processes for data applications.
• Create interactive and insightful data visualizations using JavaScript frameworks to support decision-making processes.
• Apply advanced analytics techniques to extract valuable insights from complex datasets, driving business growth and innovation.
• Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver tailored solutions.
• Monitor and troubleshoot data pipelines, ensuring data integrity and reliability across platforms.
• Implement cloud optimization strategies to maximize efficiency and reduce costs across cloud services.
• Leverage Google Cloud Platform (GCP) for data solutions and integration with existing AWS infrastructure.
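The pipeline responsibilities above follow the familiar extract-transform-load pattern. As a rough, hypothetical illustration (not part of the posting), the shape of such a pipeline in plain Python might look like the sketch below; in Prefect, each step would typically be decorated with `@task` and the driver with `@flow`. All names and data here are invented for illustration.

```python
# Hypothetical sketch of the extract-transform-load shape a Prefect-style
# pipeline takes. In Prefect, each step would be a @task and run_pipeline a @flow.

def extract() -> list[dict]:
    # Stand-in for pulling rows from a source (e.g. an Athena query or S3 object).
    return [
        {"user_id": 1, "amount": "19.99"},
        {"user_id": 2, "amount": "5.00"},
        {"user_id": 1, "amount": "12.50"},
    ]

def transform(rows: list[dict]) -> dict[int, float]:
    # Cast string amounts to floats and aggregate per user.
    totals: dict[int, float] = {}
    for row in rows:
        totals[row["user_id"]] = totals.get(row["user_id"], 0.0) + float(row["amount"])
    return totals

def load(totals: dict[int, float]) -> int:
    # Stand-in for writing results to a warehouse table; returns rows written.
    return len(totals)

def run_pipeline() -> int:
    return load(transform(extract()))

print(run_pipeline())
```

Keeping each step a small, pure function is what makes it easy to wrap in an orchestrator later: the orchestration layer (Prefect, or any scheduler) only adds retries, logging, and scheduling around the same call graph.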
Minimum Qualifications
• Proven experience as a Data Engineer or similar role, with a focus on data pipelines and analysis.
• Strong expertise in Prefect, AWS, Athena, Python, Terraform, and JavaScript.
• Solid understanding of CI/CD practices and tools to streamline data engineering workflows.
• Familiarity with advanced analytics techniques and their application in business contexts.
• Experience with cloud optimization strategies and tools to enhance performance and cost-effectiveness.
• Proficiency in Google Cloud Platform (GCP) and its integration with other cloud services.
• Excellent problem-solving skills, with the ability to diagnose and resolve complex data issues.
• Strong communication skills to collaborate effectively with technical and non-technical teams.
• Bachelor's degree in Computer Science, Engineering, or a related field; advanced degrees are a plus.
Preferred Qualifications
• Experience with additional AWS services such as Redshift, S3, and Lambda.
• Knowledge of machine learning frameworks and their integration with data engineering processes.
• Ability to work in a fast-paced environment and adapt to changing requirements and priorities.
Additional Details:
• Location: Charlotte, NC or Denver, CO
• Hybrid position (4 days/week on site)
• 12-month W2 contract position with potential for extension or full-time conversion
Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws.