

The Glove
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This contract-to-hire (C2H) role is for a Senior Data Engineer in Reading, PA (Hybrid); contract length is unspecified. The work focuses on ETL/ELT pipelines using Talend, PySpark, and AWS services, requires 10+ years of Data Engineering experience, and is open to local candidates only.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
440
-
🗓️ - Date
January 29, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Reading, PA
-
🧠 - Skills detailed
#Lambda (AWS Lambda) #Talend #AWS Glue #SQL (Structured Query Language) #Cloud #Data Engineering #Data Quality #PySpark #Python #Athena #Data Pipeline #Spark SQL #S3 (Amazon Simple Storage Service) #ETL (Extract, Transform, Load) #Datasets #Data Lake #Scala #Spark (Apache Spark) #AWS S3 (Amazon Simple Storage Service) #AWS (Amazon Web Services)
Role description
🚀 Hiring: Senior Data Engineer (C2H)
📍 Location: Reading, PA, US (Hybrid) – local candidates only
🕒 Hire Type: Contract to Hire
About the Role:
We are looking for a Senior Data Engineer to join a high-performing Data Engineering & Analytics team. You will design and build scalable, cloud-native data pipelines on AWS, working with large datasets to support analytics and business insights.
Key Responsibilities:
• Build and maintain ETL/ELT pipelines using Talend Cloud (8.0)
• Develop and optimize PySpark & Spark SQL jobs
• Ingest data from APIs, files, and streaming sources into AWS S3 data lakes (see the sketch after this list)
• Work with AWS Glue, Lambda, Athena, and EMR
• Ensure performance, scalability, and data quality
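For context, here is a minimal sketch of the kind of PySpark ingestion job these responsibilities describe: read raw records landed in S3, apply basic data-quality checks, and write partitioned Parquet to a curated data lake zone that AWS Glue can catalog and Athena can query. The bucket names, paths, and the order_id column are illustrative assumptions, not details from this posting.

```python
# Minimal PySpark sketch of an S3 ingestion job with basic data-quality checks.
# Bucket names, paths, and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders_ingest")  # hypothetical job name
    .getOrCreate()
)

# Read raw JSON landed in S3 by an upstream API extract (hypothetical path).
raw = spark.read.json("s3://example-raw-bucket/orders/2026/01/")

# Basic data-quality rules: drop records missing the key and deduplicate on it.
clean = (
    raw
    .filter(F.col("order_id").isNotNull())
    .dropDuplicates(["order_id"])
    .withColumn("ingest_date", F.current_date())
)

# Write Parquet to the curated zone, partitioned by ingest date, so the
# dataset can be crawled by AWS Glue and queried from Athena.
(
    clean.write
    .mode("append")
    .partitionBy("ingest_date")
    .parquet("s3://example-curated-bucket/orders/")
)

spark.stop()
```

On EMR the s3:// scheme resolves through EMRFS; a self-managed Spark cluster would typically use s3a:// paths instead.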
Required Skills:
• 10+ years of experience in Data Engineering / ETL
• Strong hands-on experience with Talend (8.x / Cloud)
• Advanced PySpark, Spark SQL, and Python
• Solid experience with AWS data services (S3, Glue, Lambda, Athena, EMR)
• Strong understanding of data lakes and distributed systems
👉 Candidates local to Reading, PA only
👉 Hybrid work model
Interested candidates can apply or share profiles directly at seema.bisht@glovetalent.com





