

Senior Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is a Senior Data Engineer for a 12-month W2 contract in Charlotte, NC, offering $80.00 to $90.00 per hour. Requires 5+ years in Data Engineering, expertise in Data LakeHouse, AWS, Kubernetes, and Agile environments.
Country
United States
Currency
$ USD
Day rate
720
Date discovered
July 11, 2025
Project duration
More than 6 months
Location type
On-site
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
Charlotte, NC
Skills detailed
#Data Lakehouse #Big Data #Agile #Kafka (Apache Kafka) #Scala #Vault #Redshift #API (Application Programming Interface) #Cloud #GIT #AWS (Amazon Web Services) #Spark (Apache Spark) #ETL (Extract, Transform, Load) #IAM (Identity and Access Management) #Data Engineering #Data Quality #Deployment #Logging #Debugging #Hadoop #Security #Lambda (AWS Lambda) #Java #Data Lake #PostgreSQL #SNS (Simple Notification Service) #Python #Aurora #Data Security #Athena #SQL (Structured Query Language) #Terraform #S3 (Amazon Simple Storage Service) #RDS (Amazon Relational Database Service) #SQS (Simple Queue Service) #Data Encryption #AWS IAM (AWS Identity and Access Management) #Containers #Kubernetes
Role description
Primary Talent Partners has an open 12-month W2 contract with a large utilities client of ours in the Charlotte, NC area.
Pay Range: $80.00 to $90.00 per hour (no PTO or benefits, but a self-funded ACA-compliant healthcare plan is available to opt in to)
Our Enterprise Data Platforms Team is seeking a Subject Matter Expert to help develop our Data Fabric: an interconnected network of data capabilities and data products designed to deliver data efficiently and at scale. Candidates should have hands-on expertise in developing and building data platforms, with demonstrated experience overcoming obstacles and avoiding common pitfalls. They should also be skilled at optimizing and automating deliverables to production using the required tech stack, and be adaptable to changing demands and priorities in an Agile development environment.
Experience and Skills Required
• At least 5 years of experience in Data Engineering and/or Software Engineering roles.
• Experience building and optimizing a Data Lakehouse with open table formats.
• Kubernetes deployments and cluster administration.
• Transitioning on-premises big data platforms to scalable cloud-based platforms such as AWS.
• Experience with distributed systems, microservice architecture, and containers.
• Cloud streaming use cases in big data ecosystems (e.g., EMR, EKS, Hadoop, Spark, Hudi, Kafka/Kinesis).
Technical Expertise
• GitHub and GitHub Actions.
• AWS (IAM, API Gateway, Lambda, Step Functions, Lake Formation, EKS & Kubernetes; Glue: Catalog, ETL, Crawler; Athena, S3).
• Apache Hudi and Apache Flink.
• PostgreSQL and SQL.
• RDS (Relational Database Service).
• Python and Java.
• Terraform Enterprise, with an understanding of modules, providers, and functions, and the ability to write and debug Terraform code.
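As a rough illustration of the Terraform skills listed above (a provider block, a module call, and built-in functions), a minimal sketch might look like the following. All names here are hypothetical assumptions for illustration, not part of this posting: the local module path `./modules/s3-bucket`, its `bucket_name` input and output, and the tag values are invented.

```hcl
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

# Provider configured from an input variable.
provider "aws" {
  region = var.region
}

variable "region" {
  type    = string
  default = "us-east-1"
}

variable "env" {
  type    = string
  default = "dev"
}

# Hypothetical local module wrapping an S3 data-lake bucket.
module "lake_bucket" {
  source = "./modules/s3-bucket"

  # Built-in functions: lower() and format() compose a normalized name,
  # merge() combines static and per-environment tags.
  bucket_name = lower(format("data-lake-%s", var.env))
  tags        = merge({ Team = "EnterpriseDataPlatforms" }, { Env = var.env })
}

output "bucket_name" {
  value = module.lake_bucket.bucket_name
}
```

Being able to read, write, and debug this kind of configuration (e.g., with `terraform plan` and `terraform console`) is the level of familiarity the bullet above describes.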
Helpful Tech Stack Experience
• Helm.
• Kafka and Kafka Schema Registry.
• AWS services: CloudTrail, SNS, SQS, CloudWatch, Step Functions, Aurora, EMR, Redshift; Apache Iceberg.
• Secrets management platforms: Vault, AWS Secrets Manager.
Core Responsibilities and Soft Skills
• Provide technical direction, engage the team in discussion on how best to guide and build features on key technical aspects, and be responsible for product tech delivery.
• Work closely with the Product Owner and team to align on delivery goals and timing.
• Collaborate with architects on key technical decisions for data and overall solutions.
• Lead the design and implementation of data quality check methods.
• Ensure data security and permissions solutions, including data encryption, user access controls, and logging.
• Think unconventionally to find the best way to solve a defined use case with fuzzy requirements.
• Bring a self-starter mentality: willing to do your own research to solve problems and able to present findings clearly.
• Thrive in a fail-fast environment involving mini PoCs, and participate in an inspect-and-adapt process.
• Bring a questioning and improvement mindset, ready to ask questions about current processes and suggest alternative solutions.
• Apply customer-facing skills, interfacing with stakeholders and other product teams via pairing, troubleshooting support, and debugging issues.
Primary Talent Partners is an Equal Opportunity / Affirmative Action employer committed to diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, age, national origin, disability, protected veteran status, gender identity, or any other factor protected by applicable federal, state, or local laws.
If you are a person with a disability needing assistance with the application or at any point in the hiring process, please contact us at info@primarytalentpartners.com.
#PTPJobs