

Appex Innovations
AWS Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Data Engineer in Fort Mill, SC, on a contract basis, requiring 10+ years of experience at $60/hr. Key skills include AWS Glue, PySpark, Kafka, and data quality frameworks, specifically in financial services.
Country: United States
Currency: $ USD
Day rate: $480
Date: March 5, 2026
Duration: Unknown
Location: Hybrid
Contract: W2 Contractor
Security: Unknown
Location detailed: Fort Mill, SC 29707
Skills detailed: #PySpark #Data Quality #Terraform #Cloud #Data Engineering #AWS Glue #AWS (Amazon Web Services) #Data Pipeline #Kafka (Apache Kafka) #Python #Spark (Apache Spark) #Anomaly Detection #Infrastructure as Code (IaC) #Lambda (AWS Lambda) #Scala #Monitoring #Batch #Data Processing #DevOps #ETL (Extract, Transform, Load) #Data Integrity #SQL (Structured Query Language) #Data Lake
Role description
Job Information
Date Opened
03/05/2026
Job Type
Contract
Industry
Financial Services
Work Experience
10+ years
Salary
$60/Hr
City
Fort Mill
State/Province
South Carolina
Country
United States
Zip/Postal Code
29707
About Us
At Appex Innovations, we believe in the power of technology to transform healthcare. We believe in being nimble and in continual innovation, that every customer is unique, and in listening to our clients. With our deep expertise in the healthcare domain, a solid resource base, constant industry interface, and the drive to be the best in the industry, we strive to provide services that redefine healthcare in the years to come. The secret to our success is Appex Innovations' commitment to our people and our work. We thrive on teamwork, intelligence, and innovation. Our bright and energetic employees, hailing from different parts of the world, share a passion for leading the way to improved healthcare outcomes. Together we work diligently to add value to our clients.
Job Description
We have a job opportunity as an AWS Data Engineer with our Partner in Fort Mill, SC for a Hybrid role.
Job Details:
Job Title: Sr. Data Engineer (Mid-Senior Level) - AWS & Streaming
Experience Level: Mid-Senior (10+ years preferred)
Location: Fort Mill, SC (hybrid - 3 days in office)
Type: Contract-W2
Domain: Financial Services
Role Summary:
We are seeking a Mid-Senior Data Engineer with strong expertise in AWS-based data engineering, real-time streaming technologies, and enterprise-grade data quality frameworks. The ideal candidate will design, build, and optimize scalable batch and streaming data pipelines, implement robust data validation and monitoring processes, and support mission-critical analytics platforms.
Key Responsibilities:
Develop and maintain scalable ETL/ELT pipelines using AWS Glue, PySpark, and Python
Build event-driven workflows using AWS Lambda
Design and manage real-time streaming solutions using Kafka, KSQL, and Apache Flink
Implement and enforce comprehensive data quality frameworks, including validation, profiling, monitoring, and reconciliation
Optimize data processing performance, scalability, reliability, and cost in cloud environments
Collaborate with cross-functional teams to deliver reliable, production-grade data platforms and ensure data integrity across the pipeline
Required Skills:
Strong hands-on experience with Python and PySpark
Proven expertise in AWS Glue, Lambda, and other cloud-native data services
Solid experience with the Kafka ecosystem (topics, partitions, consumer groups, streaming patterns)
Demonstrated experience building and supporting data quality frameworks (validation rules, reconciliation checks, profiling, anomaly detection)
Strong understanding of distributed data processing and scalable architecture patterns
Good-to-Have Skills:
Experience with Apache Flink for real-time stream processing and stateful computations
Knowledge of KSQL or other streaming SQL engines
Exposure to CI/CD pipelines, IaC (Terraform/CloudFormation), and DevOps practices
Familiarity with data lake/lakehouse architectures and table formats such as Iceberg, Delta, or Hudi
Experience working in enterprise or financial data environments
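To illustrate the kind of data quality framework work this role describes (validation rules, reconciliation checks), here is a minimal sketch in plain Python. All function and field names are hypothetical examples, not part of any specific framework or this employer's codebase.

```python
# Illustrative sketch of two common data-quality building blocks:
# row-level validation rules and a source-vs-target count reconciliation.
# All names here are hypothetical.

def validate_rows(rows, rules):
    """Apply each named rule to each row; return (valid_rows, failures)."""
    valid, failures = [], []
    for row in rows:
        broken = [name for name, check in rules.items() if not check(row)]
        if broken:
            failures.append({"row": row, "failed_rules": broken})
        else:
            valid.append(row)
    return valid, failures

def reconcile(source_count, target_count, tolerance=0):
    """Simple reconciliation: did as many records land as were sent?"""
    return abs(source_count - target_count) <= tolerance

# Example rules for a hypothetical transactions feed
rules = {
    "amount_positive": lambda r: r.get("amount", 0) > 0,
    "has_account_id": lambda r: bool(r.get("account_id")),
}

rows = [
    {"account_id": "A1", "amount": 100.0},
    {"account_id": "", "amount": 50.0},    # fails has_account_id
    {"account_id": "A3", "amount": -5.0},  # fails amount_positive
]

valid, failures = validate_rows(rows, rules)
print(len(valid), len(failures))                       # 1 2
print(reconcile(len(rows), len(valid) + len(failures)))  # True
```

In production these checks would typically run inside a Glue/PySpark job or a streaming consumer, with failures routed to monitoring and quarantine rather than printed.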