

Compunnel Inc.
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (AWS) with a contract length of "Unknown", offering a pay rate of "Unknown". Key skills required include AWS services (S3, Glue, Redshift), Python, SQL, and data architecture experience.
Country
United States
Currency
$ USD
Day rate
Unknown
Date
November 13, 2025
Duration
Unknown
Location
Unknown
Contract
Unknown
Security
Unknown
Location detailed
North Carolina, United States
Skills detailed
#IAM (Identity and Access Management) #ML (Machine Learning) #Data Warehouse #S3 (Amazon Simple Storage Service) #Normalization #Data Engineering #Data Lineage #PySpark #Data Modeling #Scala #Data Quality #Observability #Strategy #Data Lake #Data Ingestion #Jenkins #Snowflake #Airflow #SQL (Structured Query Language) #AWS (Amazon Web Services) #DevOps #Athena #Code Reviews #Batch #ETL (Extract, Transform, Load) #Terraform #Python #Redshift #Docker #Data Pipeline #GitHub #Security #Spark (Apache Spark) #Lambda (AWS Lambda) #Infrastructure as Code (IaC) #Programming #Data Architecture
Role description
Senior Data Engineer (AWS)
Overview
A Senior Data Engineer designs, builds, and optimizes large-scale data systems on AWS. They lead the data infrastructure strategy, mentor junior engineers, and ensure that the company's data is secure, scalable, and analytics-ready.
Key Responsibilities
• Architect and implement end-to-end data pipelines using AWS services (Glue, Lambda, Step Functions, Kinesis, Redshift, S3, EMR, etc.).
• Design data lake and data warehouse solutions with performance, cost, and scalability in mind.
• Build and manage real-time and batch data ingestion frameworks.
• Optimize ETL/ELT workflows, ensuring reliability and quality.
• Collaborate with analytics, ML, and business teams to define data models and governance standards.
• Implement infrastructure as code (IaC) using tools like Terraform or AWS CDK.
• Establish data quality, lineage, and observability best practices.
• Mentor junior data engineers and lead code reviews.
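To make the "reliability and quality" responsibility above concrete, here is a minimal sketch of a batch-ingestion step with a data-quality gate, using only the Python standard library. The column names (`user_id`, `event`) and the `validate` rule are hypothetical, not taken from this posting; a production pipeline on AWS would typically run such checks inside a Glue or Lambda job.

```python
import csv
import io

# Hypothetical validation rule: user_id must be numeric, event non-empty.
def validate(row):
    return row["user_id"].isdigit() and row["event"] != ""

# Stand-in for a file landed in S3; in practice this would be a real object.
raw = io.StringIO("user_id,event\n1,login\n,click\n2,logout\n")

# Quality gate: valid rows proceed to the load, invalid rows are quarantined
# for inspection instead of silently corrupting downstream tables.
good, rejected = [], []
for row in csv.DictReader(raw):
    (good if validate(row) else rejected).append(row)

print(len(good), len(rejected))  # 2 1
```

Routing rejects to a quarantine rather than dropping them preserves lineage: every input record is accounted for on exactly one path.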
Required Skills
• AWS Expertise: Deep experience with S3, Glue, Athena, Redshift, Lambda, EMR, Kinesis, and IAM.
• Programming: Proficient in Python, SQL, and PySpark.
• Data Architecture: Experience designing scalable data lakes, warehouses, and pipelines.
• DevOps / CI-CD: Familiarity with Docker, GitHub Actions, or Jenkins.
• Workflow Orchestration: Airflow, Step Functions, or similar tools.
• Data Modeling: Experience with star/snowflake schemas, normalization, and partitioning strategies.
• Security & Governance: Implement encryption, access control, and data lineage tracking.
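The star-schema item in the skills list can be illustrated with a small self-contained example using Python's built-in `sqlite3`. The table and column names here are hypothetical; the point is the shape — a fact table of measures joined to a descriptive dimension table, queried with an aggregate grouped by a dimension attribute.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per customer, descriptive attributes only.
cur.execute("""
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,
        name TEXT,
        region TEXT
    )
""")

# Fact table: numeric measures plus foreign keys into the dimensions.
cur.execute("""
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        amount REAL
    )
""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme", "East"), (2, "Globex", "West")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0)])

# Typical star-schema query: aggregate the facts, group by a dimension.
rows = cur.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer d ON f.customer_key = d.customer_key
    GROUP BY d.region
    ORDER BY d.region
""").fetchall()
print(rows)  # [('East', 150.0), ('West', 75.0)]
```

A snowflake schema would further normalize the dimension (e.g. splitting region into its own table); the trade-off is fewer redundant values against more joins per query.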
