

Charter Global
Sr. Cloud Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr. Cloud Data Engineer with a 12-month+ contract located in Malvern, PA/Charlotte, NC (Hybrid 3 days onsite). Requires 9–10 years of AWS data engineering experience, proficiency in AWS Glue, PySpark, and CI/CD tools. Wealth Asset Management experience is a plus.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 18, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Malvern, PA
-
🧠 - Skills detailed
#PySpark #Data Pipeline #Spark (Apache Spark) #Cloud #Data Engineering #DynamoDB #GitHub #BitBucket #AWS Glue #AWS (Amazon Web Services) #Code Reviews
Role description
Job Title: Sr. Cloud Data Engineer
Location: Malvern, PA/Charlotte, NC, USA (Hybrid, 3 days a week onsite)
Duration: 12 months+ Contract
Contract Description:
Responsibilities:
• Maintain and optimize AWS-based data pipelines to ensure timely and reliable data delivery.
• Develop and troubleshoot workflows using AWS Glue, PySpark, Step Functions, and DynamoDB.
• Collaborate on code management and CI/CD processes using Bitbucket, GitHub, and Bamboo.
• Participate in code reviews and repository management to uphold coding standards.
• Provide technical guidance and mentorship to junior engineers and assist in team coordination.
Qualifications:
• 9–10 years of experience in data engineering with strong hands-on AWS expertise.
• Proficient in AWS Glue, PySpark, Step Functions, and DynamoDB.
• Skilled in managing code repositories and CI/CD pipelines (Bitbucket, GitHub, Bamboo).
• Experience in team coordination or mentoring roles.
• Familiarity with Wealth Asset Management, especially personal portfolio performance, is a plus.
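For context on the workflow stack named above (AWS Glue, Step Functions, DynamoDB), a typical pattern is a Step Functions state machine that runs a Glue job synchronously and then records the run status in a DynamoDB table. A minimal sketch in Amazon States Language follows; the job name and table name are hypothetical placeholders, not part of this role's actual environment:

```json
{
  "Comment": "Hypothetical pipeline: run a Glue ETL job, then record run status in DynamoDB",
  "StartAt": "RunGlueJob",
  "States": {
    "RunGlueJob": {
      "Type": "Task",
      "Resource": "arn:aws:states:::glue:startJobRun.sync",
      "Parameters": { "JobName": "example-etl-job" },
      "Next": "RecordStatus"
    },
    "RecordStatus": {
      "Type": "Task",
      "Resource": "arn:aws:states:::dynamodb:putItem",
      "Parameters": {
        "TableName": "example-pipeline-runs",
        "Item": {
          "runId": { "S.$": "$$.Execution.Name" },
          "status": { "S": "SUCCEEDED" }
        }
      },
      "End": true
    }
  }
}
```

The `.sync` suffix on the Glue integration makes the state machine wait for the job to finish before moving on, which is the common choice when downstream steps depend on the ETL output.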