JSR Tech Consulting

AWS Data Pipeline Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AWS Data Pipeline Developer on a contract basis in Newark, NJ, offering up to $55/hour (W2) or $60/hour (C2C). Key skills include AWS services, PostgreSQL, CI/CD tools, and experience in financial services.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
440
-
🗓️ - Date
January 30, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Newark, NJ
-
🧠 - Skills detailed
#Data Modeling #Data Extraction #Automation #ETL (Extract, Transform, Load) #Lambda (AWS Lambda) #SQL (Structured Query Language) #Jenkins #Python #Security #BitBucket #Databases #Deployment #Data Pipeline #AWS (Amazon Web Services) #Cloud #Java #Agile #Data Integration #S3 (Amazon Simple Storage Service) #Documentation #PostgreSQL #Monitoring #Scala
Role description
AWS Data Pipeline Developer (Platform Enablement)
Location: Newark, NJ (Hybrid – 3 days onsite per week)
Engagement: Contract
Compensation: Up to $55/hour on W2 or $60/hour on C2C
Industry: Financial Services / Investment Management
Residency Requirement: Candidates must currently reside within commutable distance of Newark, NJ

Role Overview
We are seeking an AWS-focused Developer to join the Platform Enablement team at a major investment firm. This role centers on designing, building, and optimizing cloud-based data feed solutions using AWS services. The team enables feature groups to generate operational data feeds through low-code and configuration-driven approaches, with a strong emphasis on scalability, reliability, and adherence to service-level agreements.

Key Responsibilities

Design & Development
• Design, develop, test, and deploy AWS-based components supporting data pipelines
• Build solutions using AWS services including Lambda, S3, EventBridge, and ECS Fargate
• Implement parameter-driven feed generation logic across multiple data domains (e.g., Contract, Customer)
• Build and maintain PostgreSQL-based CFGenDB schemas for transformed domain data

Data Integration & Transformation
• Configure and manage data extraction from on-prem and cloud-based databases
• Apply transformation rules using configuration files to ensure accurate feed generation for downstream systems

Automation & Deployment
• Create and maintain CI/CD pipelines using tools such as Jenkins
• Ensure deployments comply with security, governance, and regulatory standards

Monitoring & Troubleshooting
• Monitor feed generation workflows, Lambda executions, and ECS tasks
• Troubleshoot issues related to data validation, file tagging, access permissions, and processing failures

Collaboration & Documentation
• Partner with feature teams, DBAs, and platform engineers to validate deployments and resolve defects
• Produce and maintain technical documentation, deployment procedures, and operational runbooks
• Communicate progress, risks, and issues clearly to stakeholders

Required Skills & Experience
• Strong hands-on experience with AWS services, including Lambda, S3, EventBridge, and ECS Fargate
• Solid experience with PostgreSQL and SQL for data modeling and transformation
• Familiarity with CI/CD tools such as Jenkins and Bitbucket
• Understanding of Infrastructure-as-Code concepts
• Experience with event-driven architectures and data pipeline orchestration
• Hands-on development experience using Python or Java in AWS environments
• Working knowledge of cloud security best practices

Soft Skills
• Strong analytical and problem-solving abilities
• Comfortable working in Agile environments
• Clear and effective written and verbal communication skills
• Ability to collaborate across engineering, data, and platform teams
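To give candidates a feel for the "parameter-driven feed generation" work described above, here is a minimal Python sketch of configuration-driven transformation: rules live in a config structure rather than in code, and a generic function applies them per data domain. All names here (FEED_CONFIG, generate_feed, the Customer columns) are illustrative assumptions, not the firm's actual schema or API.

```python
import json

# Hypothetical parameter-driven config: per-domain column mapping and
# simple row filters, analogous to the configuration files the role
# uses to drive feed generation for downstream systems.
FEED_CONFIG = {
    "Customer": {
        "columns": {"cust_id": "id", "full_name": "name"},  # output -> source
        "filters": {"active": True},                        # rows must match
    }
}

def generate_feed(domain: str, rows: list[dict]) -> list[dict]:
    """Apply the configured filters and column mapping for one data domain."""
    cfg = FEED_CONFIG[domain]
    feed = []
    for row in rows:
        # Keep only rows matching every configured filter value.
        if all(row.get(key) == value for key, value in cfg["filters"].items()):
            # Rename/select columns according to the mapping.
            feed.append({dst: row[src] for dst, src in cfg["columns"].items()})
    return feed

rows = [
    {"id": 1, "name": "Acme", "active": True},
    {"id": 2, "name": "Beta", "active": False},
]
print(json.dumps(generate_feed("Customer", rows)))
# → [{"cust_id": 1, "full_name": "Acme"}]
```

In the actual role this logic would typically run inside a Lambda or ECS Fargate task triggered via S3 or EventBridge events, with the config loaded from a file and the transformed rows written to the PostgreSQL feed database.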