

Strategic Staffing Solutions
Senior Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in Charlotte, NC (hybrid) on a 12-month contract; the pay rate is listed as "unknown." It requires 5+ years of experience in Data Engineering and proficiency in Apache Flink, AWS, Kubernetes, Python, and Terraform.
Country: United States
Currency: $ USD
Day rate: 760
Date: October 11, 2025
Duration: More than 6 months
Location: Hybrid
Contract: W2 Contractor
Security: Unknown
Location detailed: Charlotte, NC
Skills detailed: #Debugging #Terraform #Lambda (AWS Lambda) #Java #Big Data #AWS (Amazon Web Services) #API (Application Programming Interface) #Data Security #Cloud #S3 (Amazon Simple Storage Service) #IAM (Identity and Access Management) #Aurora #Data Quality #Data Engineering #Containers #RDS (Amazon Relational Database Service) #GitHub #Athena #Logging #Redshift #SNS (Simple Notification Service) #Kubernetes #Storage #Security #Python #Vault #Scala #Spark (Apache Spark) #Data Lake #Hadoop #Kafka (Apache Kafka) #SQL (Structured Query Language) #AWS Glue #ETL (Extract, Transform, Load) #Deployment #SQS (Simple Queue Service) #PostgreSQL #Data Encryption
Role description
STRATEGIC STAFFING SOLUTIONS (S3) HAS AN OPENING!
Strategic Staffing Solutions is currently looking for a Data Engineer IV for one of its clients!
Job Title: Senior Data Engineer
Location: Charlotte, NC
Setting: Hybrid (Open to candidates local to North Carolina.)
Duration: 12 months
Test: Candidates must complete the Glider assessment to be considered for this role.
Time Zone: EST/CST
Candidates must be willing to work on our W2 only; no C2C, no 1099. Candidates must be authorized to work for any employer without sponsorship; no visa transfers.
Top Skills:
Apache Flink
AWS Lake Formation
Kubernetes
Python
Terraform
AWS Glue
Job Summary:
We are specifically looking for individuals with at least 5 years of experience in Data Engineering and/or Software Engineering roles who can provide knowledge and support to our existing engineers.
Must have experience with similar platform engineering/management solutions:
Building/optimizing Data Lake House with Open Table formats
Kubernetes deployments/cluster administration
Transitioning on-premises big data platforms to scalable cloud-based platforms such as AWS
Distributed Systems, Microservice architecture, and containers
Cloud Streaming use cases in Big Data Ecosystems (e.g., EMR, EKS, Hadoop, Spark, Hudi, Kafka/Kinesis)
Must have experience with the below tech stack:
GitHub and GitHub Actions
AWS
- IAM
- API Gateway
- Lambda
- Step Functions
- Lake Formation
- EKS & Kubernetes
- Glue: Catalog, ETL, Crawler
- Athena
- S3 (strong foundational concepts such as object data store vs. block data store, encryption/decryption, storage tiers, etc.)
Apache Hudi
Apache Flink
PostgreSQL and SQL
RDS (Relational Database Service)
Python
Java
Terraform Enterprise
- Must be able to explain what Terraform (TF) is used for
- Understand and explain basic principles (e.g., modules, providers, functions)
- Must be able to write and debug TF
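The Terraform basics named above (providers, modules, built-in functions) can be sketched in a minimal configuration. The module path, bucket name, and tag values below are illustrative placeholders, not details from this role:

```hcl
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"   # provider: the plugin that talks to AWS
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  region = "us-east-1"
}

# Built-in function: merge() combines two tag maps into one
locals {
  common_tags = merge(
    { team = "data-platform" },
    { env = "dev" }
  )
}

# Module: a reusable unit of configuration, called with input variables
# (the ./modules/s3-bucket path is a hypothetical local module)
module "data_lake_bucket" {
  source      = "./modules/s3-bucket"
  bucket_name = "example-data-lake"
  tags        = local.common_tags
}
```

Being able to walk through a fragment like this, and to debug errors from `terraform plan`, is the level of fluency the bullets above describe.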
Helpful tech stack experience would include:
Helm
Kafka and Kafka Schema Registry
AWS Services: CloudTrail, SNS, SQS, CloudWatch, Step Functions, Aurora, EMR, Redshift, Iceberg
Secrets Management Platforms: Vault, AWS Secrets Manager
Core Responsibilities and Soft Skills
• Provides technical direction, engages the team in discussion on how best to guide/build features on key technical aspects, and is responsible for product tech delivery
• Works closely with the Product Owner and team to align on delivery goals and timing
• Collaborates with architects on key technical decisions for data and the overall solution
• Leads design and implementation of data quality check methods
• Ensures data security and permissions solutions, including data encryption, user access controls, and logging
• Thinks unconventionally to find the best way to solve a defined use case with fuzzy requirements
• Does independent research to solve problems, presents findings clearly, and engages in discussion on what makes one solution better than another
• Thrives in a fail-fast environment involving mini PoCs and participates in an inspect-and-adapt process
• Questioning and improvement mindset: ready to ask why something is currently done the way it is and to suggest alternative solutions
• Customer-facing skills: interfaces with stakeholders and other product teams via pairing, troubleshooting support, and debugging issues they encounter with our products
"Beware of scams. S3 never asks for money during its onboarding process."