

Randstad Digital Americas
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in Malvern, Pennsylvania, on a contract for $65-$70 per hour. Requires 7+ years of experience, expertise in Python, SQL, AWS services, API development, and data pipeline management. Bachelor's degree required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
560
-
🗓️ - Date
March 26, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
1099 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Malvern, PA
-
🧠 - Skills detailed
#RDS (Amazon Relational Database Service) #Programming #SQL (Structured Query Language) #S3 (Amazon Simple Storage Service) #Redshift #Athena #Data Quality #Data Engineering #Scala #Data Pipeline #Monitoring #Cloud #Lambda (AWS Lambda) #Web Services #Data Lake #Datasets #TypeScript #Data Integrity #AWS S3 (Amazon Simple Storage Service) #GraphQL #Data Architecture #Swagger #Microservices #"ETL (Extract #Transform #Load)" #Data Processing #Prometheus #API (Application Programming Interface) #Observability #AI (Artificial Intelligence) #Anomaly Detection #Grafana #ML (Machine Learning) #DynamoDB #Python #OpenSearch #AWS (Amazon Web Services) #Load Balancing
Role description
Job Summary:
Required Skills - Top 3-5 Key Words
Required Technical Skills: Python, SQL, AWS web services (Glue, S3, Lambda)
Core Programming Skills:
Expert proficiency in Python, with experience building data pipelines and back-end systems.
Advanced knowledge of SQL for querying and optimizing large datasets.
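As a toy illustration of the Python-plus-SQL combination above, here is a self-contained sketch using the standard library's sqlite3 as a stand-in for a production warehouse such as Redshift; the table and column names are hypothetical, not from the posting.

```python
import sqlite3

# In-memory database stands in for a production warehouse (e.g., Redshift).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("east", 120.0), ("east", 80.0), ("west", 200.0)],
)

# Aggregate in SQL rather than looping in Python, so the database
# engine does the heavy lifting on large datasets.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # → [('east', 200.0), ('west', 200.0)]
```

Pushing aggregation down into SQL, instead of fetching raw rows and summing in application code, is the kind of query optimization the role calls out.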
AWS Cloud Services Expertise:
DynamoDB, S3, Athena, Glue ETL, Lambda, ECS, Glue Data Quality, EventBridge, Redshift ML, OpenSearch, and RDS.
API And Resilience Engineering:
Proven expertise in designing fault-tolerant APIs using Swagger/OpenAPI, GraphQL, and RESTful standards.
Robust understanding of distributed systems, load balancing, and failover strategies.
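To make the resilience patterns named above concrete, here is a minimal, hypothetical circuit breaker; the class name, thresholds, and error handling are illustrative, not a prescribed implementation.

```python
import time

class CircuitBreaker:
    """Minimal circuit breaker: open after `max_failures` consecutive
    failures, then allow a trial call after `reset_after` seconds."""

    def __init__(self, max_failures=3, reset_after=30.0):
        self.max_failures = max_failures
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at = None

    def call(self, fn, *args, **kwargs):
        # While open, fail fast until the reset window has elapsed.
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_after:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None  # half-open: permit one trial call
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0
        return result
```

Wrapped around a flaky downstream call, the breaker converts repeated failures into fast, predictable errors that load-balancing and failover logic can route around.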
Monitoring And Orchestration:
Hands-on experience with Prometheus and Grafana for observability and monitoring.
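For context, Prometheus scrapes plain-text metrics from a /metrics endpoint. The sketch below renders one counter in that text exposition format using only the standard library; the metric name and value are made up for illustration.

```python
def render_prometheus(metrics):
    """Render a dict of counters in the Prometheus text exposition
    format (the format a /metrics endpoint serves to the scraper)."""
    lines = []
    for name, (help_text, value) in sorted(metrics.items()):
        lines.append(f"# HELP {name} {help_text}")
        lines.append(f"# TYPE {name} counter")
        lines.append(f"{name} {value}")
    return "\n".join(lines) + "\n"

metrics = {"pipeline_rows_processed_total": ("Rows processed by the ETL job.", 12345)}
print(render_prometheus(metrics))
```

In practice a client library (e.g., prometheus_client) handles this, and Grafana dashboards and alert rules are built on top of the scraped series.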
Job Duties
Senior Data Engineer - 7+ Years of Experience
We are seeking a highly experienced Senior Data Engineer with 7+ years of expertise in designing, building, and optimizing robust data solutions. The ideal candidate must possess top-tier skills in Python, AWS services, API development, and TypeScript, and have significant hands-on experience with anomaly detection systems.
The candidate should have a proven ability to work at both strategic and tactical levels, from designing data architectures to hands-on implementation.
location: Malvern, Pennsylvania
job type: Contract
salary: $65 - $70 per hour
work hours: 8am to 5pm
education: Bachelor's degree
Responsibilities:
Job Requirements
Key Responsibilities:
Data Pipeline Development
• Independently design, build, and maintain complex ETL pipelines, ensuring scalability and efficiency for large-scale data processing needs.
• Manage pipeline complexity and orchestration, delivering high-performance data products accessible via APIs for business-critical applications.
• Archive processed data products into data lakes (e.g., AWS S3) for analytics and machine learning use cases.
Anomaly Detection and Data Quality
• Implement advanced anomaly detection systems and data validation techniques, ensuring data integrity and quality.
• Leverage AI/ML methodologies, including Large Language Models (LLMs), to detect and address data inconsistencies.
• Develop and automate robust data quality and validation frameworks.
Cloud and API Engineering
• Architect and manage resilient APIs using modern patterns, including microservices, RESTful design, and GraphQL.
• Configure API gateways, circuit breakers, and fault-tolerant mechanisms for distributed systems.
• Ensure horizontal and vertical scaling strategies for API-driven data products.
Monitoring and Observability
• Implement comprehensive monitoring and observability solutions using Prometheus and Grafana to optimize system reliability.
• Establish proactive alerting systems and ensure real-time system health visibility.
Cross-functional Collaboration and Innovation
• Collaborate with stakeholders to understand business needs and translate them into scalable, data-driven solutions.
• Continuously research and integrate emerging technologies to enhance data engineering practices.
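As a rough sketch of the anomaly-detection and data-validation duties above, a z-score gate can be written with the standard library alone; the function name, sample data, and threshold are all hypothetical.

```python
from statistics import mean, stdev

def flag_anomalies(values, threshold=3.0):
    """Return indices of values more than `threshold` standard
    deviations from the mean: a simple data-quality gate a pipeline
    can run before publishing a dataset."""
    if len(values) < 2:
        return []
    mu = mean(values)
    sigma = stdev(values)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]

# With only six rows the attainable z-score is capped near 2.04,
# so a lower threshold is used for this tiny sample.
daily_row_counts = [1010, 990, 1005, 998, 1002, 5000]
print(flag_anomalies(daily_row_counts, threshold=2.0))  # → [5]
```

Production systems would layer on richer checks (distribution drift, schema validation, or the LLM-assisted triage the posting mentions), but the shape is the same: compute a statistic over the load and gate publication on it.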
Qualifications:
• 7+ years of experience designing, building, and optimizing data solutions.
• Expert proficiency in Python and advanced SQL for querying and optimizing large datasets.
• Hands-on experience with AWS services: DynamoDB, S3, Athena, Glue ETL, Lambda, ECS, Glue Data Quality, EventBridge, Redshift ML, OpenSearch, and RDS.
• Proven expertise designing fault-tolerant APIs with Swagger/OpenAPI, GraphQL, and RESTful standards, plus a solid grasp of distributed systems, load balancing, and failover strategies.
• Experience with Prometheus and Grafana for observability and monitoring.
• Bachelor's degree.
Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.
At Randstad Digital, we welcome people of all abilities and want to ensure that our hiring and interview process meets the needs of all applicants. If you require a reasonable accommodation to make your application or interview experience a great one, please contact HRsupport@randstadusa.com.
Pay offered to a successful candidate will be based on several factors including the candidate's education, work experience, work location, specific job duties, certifications, etc. In addition, Randstad Digital offers a comprehensive benefits package, including: medical, prescription, dental, vision, AD&D, and life insurance offerings, short-term disability, and a 401K plan (all benefits are based on eligibility).
This posting is open for thirty (30) days.
Any consideration of a background check would be an individualized assessment based on the applicant or employee's specific record and the duties and requirements of the specific job.