Primary Talent Partners

AWS Data Engineer III

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AWS Data Engineer III on a 12-month contract in Charlotte, NC, with a pay rate of $65.00 - $75.00/hr. Requires 5+ years of AWS experience, proficiency in data pipelines, and expertise in AWS services such as Redshift and S3.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
πŸ—“οΈ - Date
January 27, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#Vault #Data Pipeline #Infrastructure as Code (IaC) #S3 (Amazon Simple Storage Service) #REST (Representational State Transfer) #Data Processing #Amazon Redshift #Airflow #Athena #Cloud #Databases #API (Application Programming Interface) #Pandas #SQS (Simple Queue Service) #Data Warehouse #Migration #PySpark #IAM (Identity and Access Management) #Aurora #Monitoring #Redshift #Database Management #IP (Internet Protocol) #REST API #Terraform #Data Engineering #Kafka (Apache Kafka) #Batch #Security #Spark (Apache Spark) #Data Science #Data Quality #Lambda (AWS Lambda) #SNS (Simple Notification Service) #Data Modeling #AWS (Amazon Web Services) #Python #ETL (Extract, Transform, Load) #RDBMS (Relational Database Management System) #SQL (Structured Query Language) #DevOps #VPN (Virtual Private Network) #DynamoDB #BitBucket
Role description
Primary Talent Partners has a new contract opening for an AWS Data Engineer III with our large power and utilities client in Charlotte, NC. This is a 12-month contract with potential for extension. Pay: $65.00 - $75.00/hr; W2 contract, no PTO, no benefits. An ACA-compliant supplemental package is available for enrollment. Candidates must be legally authorized to work in the United States and must be able to sit on Primary Talent Partners' W2 without sponsorship.

Description

Core Responsibilities:
• Where applicable, collaborate with Lead Developers (Data Engineer, Software Engineer, Data Scientist, Technical Test Lead) to understand requirements/use cases, outline technical scope, and lead delivery of the technical solution
• Confirm required developers and skill sets specific to the product
• Collaborate with Data and Solution Architects on key technical decisions
• Develop data pipelines with a focus on long-term reliability and high data quality
• Design data warehousing solutions with the end user in mind, ensuring ease of use without compromising performance
• Manage and resolve issues in production data warehouse environments on AWS

Core Experience and Abilities:
• Perform hands-on development and peer review for specific components/tech stack on the product
• Stand up development instances and migration paths (with required security and access/roles)
• Develop components and related processes (e.g., data pipelines and associated ETL processes, workflows)
• Build new data pipelines, identify existing data gaps, and provide automated solutions that deliver analytical capabilities and enriched data to applications
• Implement data pipelines with attention to durability and data quality
• Implement data warehousing products with the end user's experience in mind (ease of use with the right performance)

Core Technical Skills:
• 5+ years of AWS experience
• AWS services: Redshift, S3, EMR, Glue Jobs, Lambda, Athena, CloudTrail, SNS, SQS, CloudWatch, Step Functions, QuickSight
• Experience with Kafka/messaging, preferably Confluent Kafka
• Experience with EMR databases such as Glue Catalog, Lake Formation, Redshift, DynamoDB, and Aurora
• Experience with AWS data warehousing tools such as Amazon Redshift and Amazon Athena
• Proven track record in the design and implementation of data warehouse solutions on AWS
• Skilled in data modeling and executing ETL processes tailored for data warehousing
• Competence in developing and refining data pipelines within AWS
• Proficient in handling both real-time and batch data processing
• Extensive understanding of database management fundamentals
• Expertise in creating alerts and automated solutions for handling production problems
• Tools and languages: Python, Spark, PySpark, and Pandas
• Infrastructure as Code: Terraform/CloudFormation
• Experience with secrets management platforms such as Vault and AWS Secrets Manager
• Experience with event-driven architecture
• DevOps pipelines (CI/CD): Bitbucket; Concourse
• Experience with RDBMS platforms and strong proficiency with SQL
• Experience with REST APIs and API Gateway
• Deep knowledge of IAM roles and policies
• Experience with AWS monitoring services such as CloudWatch, CloudTrail, and CloudWatch Events
• Deep understanding of networking: DNS, TCP/IP, and VPN
• Experience with AWS workflow orchestration tools such as Airflow or Step Functions

Primary Talent Partners is an Equal Opportunity / Affirmative Action employer committed to diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, age, national origin, disability, protected veteran status, gender identity, or any other factor protected by applicable federal, state, or local laws.
If you are a person with a disability needing assistance with the application or at any point in the hiring process, please contact us at info@primarytalentpartners.com #PTPJobs