Senior Data Engineer

โญ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 12+ month contract located in Jersey City, NJ. It requires 10+ years in data engineering, 5+ years in AWS, expertise in Kafka and Glue, and wealth management domain knowledge.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
๐Ÿ—“๏ธ - Date discovered
September 18, 2025
🕒 - Project duration
Unknown
-
๐Ÿ๏ธ - Location type
On-site
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
๐Ÿ“ - Location detailed
Jersey City, NJ
-
🧠 - Skills detailed
#Monitoring #Lambda (AWS Lambda) #Data Engineering #Databases #Data Processing #Leadership #SQS (Simple Queue Service) #DynamoDB #Scala #Spark (Apache Spark) #Storage #Strategy #DevOps #Data Storage #PySpark #RDS (Amazon Relational Database Service) #Dynatrace #GIT #Logging #AWS (Amazon Web Services) #Aurora #Observability #Data Warehouse #IAM (Identity and Access Management) #Terraform #Python #Data Lake #Security #Infrastructure as Code (IaC) #S3 (Amazon Simple Storage Service) #Cloud #Data Architecture #Data Pipeline #AWS Glue #Data Lakehouse #Kafka (Apache Kafka) #SNS (Simple Notification Service)
Role description
Role: Lead AWS Data Engineer / AWS Data Engineer with Banking/Financial Domain
Location: Jersey City, NJ
Duration: Contract
Experience: Minimum 12+ years

Position Overview:
We are seeking a highly skilled and experienced Lead AWS Data Engineer to join our team. This role requires deep expertise in modern AWS cloud data engineering, with a focus on real-time streaming systems, large-scale data processing, and wealth management domain knowledge. The ideal candidate will be a hands-on technologist with proven leadership skills, able to architect, build, and optimize resilient data pipelines supporting mission-critical financial systems.

Roles and Responsibilities:
- Lead the design, development, and implementation of data streaming solutions leveraging Kafka, Kinesis, and AWS Glue (PySpark / Spark Streaming), with Lambda and related AWS services for orchestration.
- Develop and maintain Python code for Glue jobs, utilities, and data processing frameworks; enforce coding standards, testing, and performance tuning for large-scale workloads.
- Architect and manage data storage and processing across S3, RDS (Postgres), Aurora, DynamoDB, and Iceberg, optimizing partitioning, schema evolution, and lakehouse patterns.
- Evaluate and recommend data architectures (data warehouse, data lake, and data lakehouse), clearly articulating strengths, weaknesses, and trade-offs (latency, cost, governance, scalability) for WM/TBAR use cases.
- Implement robust monitoring and logging using Dynatrace and CloudWatch to ensure system performance, reliability, and availability.
- Define and enforce IAM policies and cloud security best practices across AWS environments.
- Oversee CI/CD pipelines and infrastructure as code using Terraform, a Git branching strategy, and Octopus Deploy; automate build and deploy for Glue/Spark workloads.
- Partner with stakeholders to translate business requirements into technical solutions, especially in wealth management, trading (NSCC, BETA), and book-of-records platforms.
- Deliver scalable, high-performance, real-time data pipelines capable of handling large volumes of financial market and client data.
- Provide technical leadership and mentoring to junior engineers, promoting best practices in data engineering, streaming, and DevOps.

Required Skills & Experience:
- 10+ years of professional experience in data engineering, with at least 5 years in AWS cloud environments.
- Strong expertise in real-time data streaming technologies: Kafka, Kinesis, Glue, PySpark, Lambda, SQS/SNS.
- Hands-on experience with AWS storage and databases: S3, RDS (Postgres), Aurora, DynamoDB, Iceberg.
- Proficiency in infrastructure as code (Terraform) and CI/CD (Git branching strategy, Octopus Deploy).
- Strong background in monitoring and observability: Dynatrace, CloudWatch.
- Excellent understanding of IAM policies, roles, and cloud security best practices.
- Domain expertise in wealth management, trading data (NSCC, BETA), and book-of-records platforms.
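The posting also asks for a strong understanding of IAM policies and cloud security best practices. As one illustrative sketch only (the bucket name, prefix, and statement IDs are placeholders, not from the posting), a narrowly scoped read-only policy for a single data-lake prefix might look like:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadTradesLakePrefix",
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::example-data-lake/lake/trades/*"
    },
    {
      "Sid": "ListTradesLakePrefix",
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::example-data-lake",
      "Condition": {"StringLike": {"s3:prefix": ["lake/trades/*"]}}
    }
  ]
}
```

Note the split: `s3:GetObject` applies to object ARNs, while `s3:ListBucket` applies to the bucket ARN with a prefix condition, a common point of confusion when scoping S3 access.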
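The responsibilities above mention optimizing S3 partitioning and lakehouse patterns. As a minimal illustrative sketch (the function name, `lake/trades` prefix, and daily layout are hypothetical, not from the posting), a Hive-style date-partitioned key can be derived from an event timestamp so engines like Spark, Glue, or Iceberg-aware readers can prune partitions instead of scanning a whole bucket:

```python
from datetime import datetime, timezone

def s3_partition_key(event_time_iso: str, prefix: str = "lake/trades") -> str:
    """Build a Hive-style, date-partitioned S3 key prefix from an ISO timestamp.

    `key=value` path segments (year=/month=/day=) are the convention that
    partition-aware query engines use for partition pruning.
    """
    # Normalize the event time to UTC so partition boundaries are consistent
    # regardless of the source system's timezone.
    ts = datetime.fromisoformat(event_time_iso).astimezone(timezone.utc)
    return f"{prefix}/year={ts:%Y}/month={ts:%m}/day={ts:%d}/"

# Route a trade event captured on 2025-09-18 (UTC) to its daily partition.
print(s3_partition_key("2025-09-18T14:30:00+00:00"))
# lake/trades/year=2025/month=09/day=18/
```

Normalizing to UTC before partitioning matters for late-arriving, cross-timezone market data: an event stamped 23:59 in New York lands in the next UTC day's partition.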