

Compunnel Inc.
Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 12+ month contract, located in Charlotte, NC or Denver, CO (4x/week onsite). Key skills include Python, Prefect, AWS, Athena, Terraform, and JavaScript. A FinOps background and experience with Redshift, S3, and Lambda are preferred.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: October 7, 2025
Duration: More than 6 months
Location: On-site
Contract: Unknown
Security: Unknown
Location detailed: Charlotte, NC
Skills detailed: #Storage #AWS (Amazon Web Services) #Visualization #Terraform #Lambda (AWS Lambda) #Athena #Data Pipeline #Data Science #ML (Machine Learning) #Cloud #Tableau #Deployment #JavaScript #Data Integrity #ETL (Extract, Transform, Load) #GCP (Google Cloud Platform) #Automation #Data Processing #SAP #Data Storage #Redshift #S3 (Amazon Simple Storage Service) #Azure #Data Engineering #Infrastructure as Code (IaC) #Scala #Python #Computer Science #Datasets #Scripting #Data Analysis
Role description
Job Description:
Position: Data Engineer
Location: Charlotte, NC or Denver, CO
• 4x/week onsite
Contract Length: 12+ months
Top Requirements:
1. Python
2. Prefect
3. AWS
4. Athena
5. Terraform
6. JavaScript
Pluses:
1. FinOps background
2. Redshift, S3, and Lambda
Day-to-Day Responsibilities/Project Specifics: This resource will initially help automate projects and do hands-on code work to improve processes. The resource will work on data pipelines, assembling analysis and feeding it into visualization tools such as Tableau. We are seeking support in analyzing how we pull data, working with SAP systems, and evolving with the organization as it moves down the cloud pathway. The job will include data analysis/engineering, with responsibilities around reporting on cloud costs for AWS, Google, and Azure. An internal team has already created the AWS costing blueprint, and those resources will be available. The role requires understanding data flows, how data ties together, and how to get pipelines working and automated.
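To make the cloud-cost reporting piece concrete, here is a minimal sketch in Python that pulls monthly AWS spend grouped by service using the Cost Explorer API via boto3. The date range and grouping are illustrative assumptions, not details from this posting, and it presumes credentials with Cost Explorer read access are already configured.

import boto3

def monthly_aws_cost_by_service(start: str, end: str) -> dict:
    """Return unblended AWS cost per service between two ISO dates, e.g. "2025-09-01"."""
    ce = boto3.client("ce")  # Cost Explorer client; region/credentials come from the environment
    resp = ce.get_cost_and_usage(
        TimePeriod={"Start": start, "End": end},
        Granularity="MONTHLY",
        Metrics=["UnblendedCost"],
        GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
    )
    costs: dict = {}
    for period in resp["ResultsByTime"]:
        for group in period["Groups"]:
            service = group["Keys"][0]
            amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
            costs[service] = costs.get(service, 0.0) + amount
    return costs

if __name__ == "__main__":
    # Print September 2025 spend, largest services first (dates are placeholders)
    report = monthly_aws_cost_by_service("2025-09-01", "2025-10-01")
    for service, cost in sorted(report.items(), key=lambda kv: -kv[1]):
        print(f"{service}: ${cost:,.2f}")

A report like this is the kind of raw input that typically feeds the Tableau visualizations mentioned above; the AWS costing blueprint the internal team built would define the actual queries and groupings.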
Job Description:
We are seeking an experienced Advanced Data Engineer to join our team, focusing on supporting and enhancing our data pipelines, visualizations, and analytics capabilities. The ideal candidate will have a robust background in data engineering and analytics, with a deep understanding of modern data technologies and frameworks. This role demands a strong technical skill set, the ability to work collaboratively with cross-functional teams, and a keen eye for detail.
Key Responsibilities
• Design, develop, and maintain scalable data pipelines using Prefect to ensure efficient data flow across systems (a brief sketch follows this list).
• Implement and optimize data storage solutions using AWS services, including Athena, for high-performance querying and analysis.
• Utilize Python for scripting and automation to enhance data processing workflows and integrations.
• Employ Terraform for infrastructure as code, ensuring reliable and consistent deployment of data-related resources.
• Develop CI/CD pipelines to automate testing, integration, and deployment processes for data applications.
• Create interactive and insightful data visualizations using JavaScript frameworks to support decision-making processes.
• Apply advanced analytics techniques to extract valuable insights from complex datasets, driving business growth and innovation.
• Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver tailored solutions.
• Monitor and troubleshoot data pipelines, ensuring data integrity and reliability across platforms.
• Implement cloud optimization strategies to maximize efficiency and reduce costs across cloud services.
• Leverage Google Cloud Platform (GCP) for data solutions and integration with existing AWS infrastructure.
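As a rough outline of the first two responsibilities, below is a minimal Prefect 2.x flow that runs an Athena query through boto3 and hands the result location to a downstream step. The database name, table, and S3 output location are placeholders, and the polling loop is simplified; treat it as a sketch under those assumptions, not the team's actual pipeline.

import time
import boto3
from prefect import flow, task

ATHENA_OUTPUT = "s3://example-bucket/athena-results/"  # placeholder output location

@task(retries=2, retry_delay_seconds=30)
def run_athena_query(sql: str, database: str) -> str:
    """Start an Athena query, block until it finishes, and return the result's S3 path."""
    athena = boto3.client("athena")
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": ATHENA_OUTPUT},
    )["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(5)  # simple polling; production code might use backoff
    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query {qid} ended in state {state}")
    return f"{ATHENA_OUTPUT}{qid}.csv"  # Athena writes results as <query-id>.csv

@task
def publish_result(result_path: str) -> None:
    """Placeholder for handing the extract to Tableau or another visualization tool."""
    print(f"Result ready for downstream visualization: {result_path}")

@flow(name="daily-cost-extract")
def daily_cost_extract():
    # Table and database names are hypothetical examples
    path = run_athena_query(
        sql="SELECT service, SUM(cost) AS total FROM cost_usage GROUP BY service",
        database="analytics",
    )
    publish_result(path)

if __name__ == "__main__":
    daily_cost_extract()

Prefect's task-level retries handle transient AWS failures, and the same flow can be scheduled via a deployment, which is the usual way such pipelines are automated.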
Minimum Qualifications
• Proven experience as a Data Engineer or similar role, with a focus on data pipelines and analysis.
• Strong expertise in Prefect, AWS, Athena, Python, Terraform, and JavaScript.
• Solid understanding of CI/CD practices and tools to streamline data engineering workflows.
• Familiarity with advanced analytics techniques and their application in business contexts.
• Experience with cloud optimization strategies and tools to enhance performance and cost-effectiveness.
• Proficiency in Google Cloud Platform (GCP) and its integration with other cloud services.
• Excellent problem-solving skills, with the ability to diagnose and resolve complex data issues.
• Strong communication skills to collaborate effectively with technical and non-technical teams.
• Bachelor's degree in Computer Science, Engineering, or a related field; advanced degrees are a plus.
Preferred Qualifications
• Experience with additional AWS services such as Redshift, S3, and Lambda.
• Knowledge of machine learning frameworks and their integration with data engineering processes.
• Ability to work in a fast-paced environment and adapt to changing requirements and priorities.