JSR Tech Consulting

AWS Data Pipeline Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AWS Data Pipeline Developer on a contract basis in Newark, NJ, offering up to $55/hour (W2) or $60/hour (C2C). Key requirements include AWS services, PostgreSQL, CI/CD tooling, and financial-services experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
440
-
πŸ—“οΈ - Date
January 30, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Newark, NJ
-
🧠 - Skills detailed
#Data Modeling #Data Extraction #Automation #ETL (Extract, Transform, Load) #Lambda (AWS Lambda) #SQL (Structured Query Language) #Jenkins #Python #Security #BitBucket #Databases #Deployment #Data Pipeline #AWS (Amazon Web Services) #Cloud #Java #Agile #Data Integration #S3 (Amazon Simple Storage Service) #Documentation #PostgreSQL #Monitoring #Scala
Role description
AWS Data Pipeline Developer (Platform Enablement)
Location: Newark, NJ (Hybrid – 3 days onsite per week)
Engagement: Contract
Compensation: Up to $55/hour on W2 or $60/hour on C2C
Industry: Financial Services / Investment Management
Residency Requirement: Candidates must currently reside within commutable distance of Newark, NJ

Role Overview
We are seeking an AWS-focused developer to join the Platform Enablement team at a major investment firm. This role centers on designing, building, and optimizing cloud-based data feed solutions using AWS services. The team enables feature groups to generate operational data feeds through low-code and configuration-driven approaches, with a strong emphasis on scalability, reliability, and adherence to service-level agreements.

Key Responsibilities

Design & Development
• Design, develop, test, and deploy AWS-based components supporting data pipelines
• Build solutions using AWS services including Lambda, S3, EventBridge, and ECS Fargate
• Implement parameter-driven feed generation logic across multiple data domains (e.g., Contract, Customer)
• Build and maintain PostgreSQL-based CFGenDB schemas for transformed domain data

Data Integration & Transformation
• Configure and manage data extraction from on-prem and cloud-based databases
• Apply transformation rules defined in configuration files to ensure accurate feed generation for downstream systems

Automation & Deployment
• Create and maintain CI/CD pipelines using tools such as Jenkins
• Ensure deployments comply with security, governance, and regulatory standards

Monitoring & Troubleshooting
• Monitor feed generation workflows, Lambda executions, and ECS tasks
• Troubleshoot issues related to data validation, file tagging, access permissions, and processing failures

Collaboration & Documentation
• Partner with feature teams, DBAs, and platform engineers to validate deployments and resolve defects
• Produce and maintain technical documentation, deployment procedures, and operational runbooks
• Communicate progress, risks, and issues clearly to stakeholders

Required Skills & Experience
• Strong hands-on experience with AWS services, including Lambda, S3, EventBridge, and ECS Fargate
• Solid experience with PostgreSQL and SQL for data modeling and transformation
• Familiarity with CI/CD tools such as Jenkins and Bitbucket
• Understanding of Infrastructure-as-Code concepts
• Experience with event-driven architectures and data pipeline orchestration
• Hands-on development experience using Python or Java in AWS environments
• Working knowledge of cloud security best practices

Soft Skills
• Strong analytical and problem-solving abilities
• Comfortable working in Agile environments
• Clear and effective written and verbal communication skills
• Ability to collaborate across engineering, data, and platform teams
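To give candidates a feel for the work, the "parameter-driven feed generation" described above can be sketched in miniature. This is a hedged illustration only, not the firm's actual implementation: `FEED_CONFIG`, `generate_feed`, and the Contract columns are all hypothetical names invented for this example. In the real pipeline, the configuration would live in a file (e.g., in S3), and a Lambda handler would write the resulting feed object back to S3 rather than returning a string.

```python
import csv
import io

# Hypothetical transformation rules for one data domain. In practice this
# would be loaded from a configuration file, so adding a new domain feed
# means adding config, not code (the "low-code" approach the role describes).
FEED_CONFIG = {
    "Contract": {
        "columns": ["contract_id", "customer_id", "notional"],
        "rename": {"notional": "notional_usd"},
        "filters": {"status": "ACTIVE"},
    },
}


def generate_feed(domain: str, rows: list, config: dict = FEED_CONFIG) -> str:
    """Apply the domain's configured filters, column selection, and renames,
    returning the feed as CSV text. A Lambda handler would upload this to S3."""
    spec = config[domain]

    # Keep only rows matching every configured filter (e.g., status == ACTIVE).
    selected = [
        r for r in rows
        if all(r.get(k) == v for k, v in spec.get("filters", {}).items())
    ]

    # Output headers use the configured renames; data still comes from
    # the source column names.
    out_cols = [spec.get("rename", {}).get(c, c) for c in spec["columns"]]

    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerow(out_cols)
    for r in selected:
        writer.writerow([r.get(c, "") for c in spec["columns"]])
    return buf.getvalue()
```

For example, feeding two contract rows where only one has `status == "ACTIVE"` yields a CSV with the renamed `notional_usd` header and a single data row; the closed contract is filtered out by configuration alone.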