

Data & Risk Engineer (Networking Focus)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data & Risk Engineer focusing on networking; the contract length and pay rate are unspecified. It requires expertise in AWS, Kafka, and PySpark, plus application development in Angular, Kotlin, and Java, ideally within financial services.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 5, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: Boston, MA
Skills detailed: #Spark (Apache Spark) #Data Engineering #Angular #Compliance #Kafka (Apache Kafka) #Security #PySpark #AWS (Amazon Web Services) #Cloud #Data Pipeline #Java #Network Security
Role description
Job Title: Data & Risk Engineer (Networking Focus)
Location: Boston, MA (Hybrid, 2-3 days onsite)
Overview
We are seeking a skilled Data & Risk Engineer with a strong background in networking to support the integration efforts between two large banking institutions. This role is ideal for an engineer with experience building secure data pipelines, navigating network risks, and contributing to enterprise-grade applications in a hybrid cloud environment.
Key Responsibilities
• Design, build, and maintain secure data pipelines across cloud and on-premise systems (primarily AWS).
• Support integration efforts between two major banking systems with a focus on audit and risk mitigation.
• Implement and manage Kafka and PySpark pipelines to enable high-throughput, reliable data transfers.
• Build applications and services using technologies such as Angular, Kotlin, and Java.
• Assess and resolve network and firewall risks when migrating data or services between environments.
• Collaborate with audit and compliance teams to ensure systems align with governance requirements.
• Leverage knowledge of audit technologies (e.g., AuditBoard or similar tools) to support integration and reporting needs.
Required Skills
• Proven experience with AWS cloud services and building cloud-native pipelines.
• Strong understanding of network infrastructure, firewall navigation, and network security risks.
• Hands-on experience with Kafka and PySpark for data engineering tasks.
• Background in application/software development with Angular, Kotlin, and Java.
• Experience integrating systems in highly regulated environments such as financial services.
Preferred Qualifications
• Experience with audit tools or audit technology integrations (e.g., AuditBoard).
• Familiarity with risk management frameworks and secure software development practices.
• Prior experience in banking, finance, or enterprise integrations is a strong plus.