

Brooksource
AWS Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Data Engineer on a 1-year hybrid contract in Charlotte, NC. It requires 5+ years of AWS experience and proficiency in data warehousing, ETL processes, and a broad range of AWS services. W-2 only; C2C/1099 is not accepted.
Country: United States
Currency: $ USD
Day rate: 640
Date: December 4, 2025
Duration: More than 6 months
Location: Hybrid
Contract: W2 Contractor
Security: Unknown
Location detailed: Charlotte Metro
Skills detailed: #Python #Data Quality #Kafka (Apache Kafka) #BitBucket #Data Warehouse #Spark (Apache Spark) #Data Science #IP (Internet Protocol) #SQL (Structured Query Language) #Data Processing #VPN (Virtual Private Network) #Infrastructure as Code (IaC) #Data Pipeline #Terraform #Pandas #Airflow #SQS (Simple Queue Service) #Athena #Redshift #Monitoring #REST API #DevOps #ETL (Extract, Transform, Load) #Vault #S3 (Amazon Simple Storage Service) #PySpark #Batch #Amazon Redshift #REST (Representational State Transfer) #DynamoDB #IAM (Identity and Access Management) #RDBMS (Relational Database Management System) #Security #Migration #Lambda (AWS Lambda) #Aurora #Databases #Cloud #Data Modeling #API (Application Programming Interface) #SNS (Simple Notification Service) #AWS (Amazon Web Services) #Data Engineering #Database Management
Role description
• Hybrid in Charlotte, NC; 2-3 days/week on site
• 1-year contract with potential for extension or full-time conversion
• W-2 only (C2C/1099 is not possible for this role)
REQUIRED EXPERIENCE:
• 5+ years of AWS experience
• AWS services: S3, EMR, Glue Jobs, Lambda, Athena, CloudTrail, SNS, SQS, CloudWatch, Step Functions, QuickSight
• Experience with Kafka/messaging, preferably Confluent Kafka
• Experience with AWS data stores and catalog services such as the Glue Data Catalog, Lake Formation, Redshift, DynamoDB, and Aurora
• Experience with AWS data warehousing tools such as Amazon Redshift and Amazon Athena
• Proven track record designing and implementing data warehouse solutions on AWS
• Skilled in data modeling and in building ETL processes tailored for data warehousing
• Competence in developing and refining data pipelines within AWS (a minimal batch sketch follows this list)
• Proficiency with both real-time and batch data processing
• Extensive understanding of database management fundamentals
• Expertise in creating alerts and automated remediation for production problems (an alerting sketch appears after the CORE EXPERIENCE AND ABILITIES section)
• Tools and languages: Python, Spark, PySpark, and Pandas
• Infrastructure as Code: Terraform/CloudFormation
• Experience with secrets management platforms such as HashiCorp Vault and AWS Secrets Manager
• Experience with event-driven architecture
• DevOps pipeline (CI/CD): Bitbucket; Concourse
• Experience with RDBMS platforms and strong proficiency in SQL
• Experience with REST APIs and Amazon API Gateway
• Deep knowledge of IAM roles and policies
• Experience with AWS monitoring services such as CloudWatch, CloudTrail, and CloudWatch Events
• Deep understanding of networking fundamentals: DNS, TCP/IP, and VPN
• Experience with an AWS workflow orchestration tool such as Airflow or Step Functions
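As a taste of the pipeline work described above, here is a minimal PySpark batch sketch: read raw JSON from S3, apply a simple data-quality gate, and write partitioned Parquet that Athena can query through the Glue Data Catalog. The bucket paths, column names, and app name are hypothetical placeholders, not details from this posting.

```python
# Minimal PySpark batch job (illustrative sketch; paths and columns are
# hypothetical). Reads raw JSON from S3, filters rows failing a basic
# data-quality check, and writes partitioned Parquet for Athena/Glue.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-batch-etl").getOrCreate()

# Read raw events from a landing bucket (placeholder path).
raw = spark.read.json("s3://example-landing-bucket/orders/")

# Basic data-quality gate: require the key fields to be present.
clean = (
    raw.filter(F.col("order_id").isNotNull() & F.col("event_ts").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Partitioning by date lets Athena/Redshift Spectrum prune partitions
# once the table is registered in the Glue Data Catalog.
(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-curated-bucket/orders/"))
```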
RESPONSIBILITIES:
• Where applicable, collaborate with lead developers (Data Engineer, Software Engineer, Data Scientist, Technical Test Lead) to understand requirements and use cases, outline the technical scope, and lead delivery of the technical solution
• Confirm the developers and skill sets required for the product
• Collaborate with data and solution architects on key technical decisions
• Develop data pipelines with a focus on long-term reliability and high data quality
• Design data warehousing solutions with the end user in mind, ensuring ease of use without compromising performance
• Manage and resolve issues in production data warehouse environments on AWS
CORE EXPERIENCE AND ABILITIES:
• Perform hands-on development and peer review for specific components/tech stacks on the product
• Stand up development instances and a migration path (with the required security and access/roles)
• Develop components and related processes (e.g., data pipelines with their associated ETL processes and workflows)
• Build new data pipelines, identify existing data gaps, and provide automated solutions that deliver analytical capabilities and enriched data to applications
• Implement data pipelines with the right attention to durability and data quality
• Implement data warehousing products with the end user's experience in mind (ease of use with the right performance)
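For the alerting and production-support expectations above, the following minimal boto3 sketch creates a CloudWatch alarm on a Glue job's failed-task metric and routes it to an SNS topic for on-call triage. The job name, topic ARN, and account details are hypothetical placeholders.

```python
# Illustrative boto3 sketch (names and ARNs are hypothetical): alarm on a
# Glue job's failed-task count and notify an SNS topic when it fires.
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_alarm(
    AlarmName="orders-batch-etl-failed-tasks",
    Namespace="Glue",
    MetricName="glue.driver.aggregate.numFailedTasks",
    Dimensions=[
        {"Name": "JobName", "Value": "orders-batch-etl"},
        {"Name": "JobRunId", "Value": "ALL"},
        {"Name": "Type", "Value": "count"},
    ],
    Statistic="Sum",
    Period=300,                       # evaluate over 5-minute windows
    EvaluationPeriods=1,
    Threshold=0,
    ComparisonOperator="GreaterThanThreshold",
    TreatMissingData="notBreaching",  # no data simply means no failures
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:data-alerts"],
)
```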
Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty, or status as a covered veteran in accordance with applicable federal, state, and local laws.





