

TechLadderInc
Data Architect with AWS Certification & Capital Markets
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Architect with AWS Certification and Capital Markets experience, located in New York City. Contract length is unspecified, with a pay rate of $80/hr. Key skills include Python, Azure Databricks, and REST API development.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
640
-
🗓️ - Date
March 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Margaretville, NY 12455
-
🧠 - Skills detailed
#API (Application Programming Interface) #AWS (Amazon Web Services) #ADLS (Azure Data Lake Storage) #Agile #Documentation #Data Ingestion #Storage #IAM (Identity and Access Management) #REST (Representational State Transfer) #SQL (Structured Query Language) #Data Architecture #Flask #Databricks #BI (Business Intelligence) #Data Engineering #Computer Science #GIT #S3 (Amazon Simple Storage Service) #Jira #DevOps #Spark (Apache Spark) #Deployment #Python #Observability #Azure #Cloud #Monitoring #REST API #Azure Databricks #Jenkins #PySpark #FastAPI #Azure DevOps #Data Pipeline #Django #Security #Batch
Role description
Job Title: Sr. Data Architect with AWS Certification
Location: New York City (on-site)
Max Rate: $80/hr W2
About the Role:
We are seeking a highly skilled and experienced Application Engineer and Data Architect to join our dynamic team. As a senior member of the team, you will play a critical role in designing, implementing, and maintaining the application infrastructure. Your expertise will help drive innovative data solutions and ensure platform reliability, security, and performance.
Key Responsibilities:
Lead architecture and technical design discussions, considering industry-standard technologies and best practices.
Support production operations and resolve complex production issues as a senior developer within the Credit Risk application team.
Design and implement batch and ad-hoc data pipelines based on Medallion Lakehouse architecture using modern cloud data engineering patterns, primarily in Databricks.
Build and maintain data ingestion flows from upstream systems into object storage (e.g., S3, ADLS) using formats like Parquet, including advanced features such as partitioning, z-ordering, and schema evolution.
Integrate with external XVA/risk engines and implement orchestration logic to manage long-running external computations.
Model and optimize risk measures (e.g., EPE, PFE) for efficient querying and consumption by BI tools, notebooks, and downstream applications.
Ensure platform reliability, observability, security (IAM roles, OIDC/Bearer token authentication, encryption), and auditability.
Contribute to API design for internal and external customers, focusing on versioning, error handling, and SLAs, with proper documentation.
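As a rough illustration of the risk measures named above: EPE (Expected Positive Exposure) is conventionally the average of positive exposure across simulated paths at each time step, and PFE (Potential Future Exposure) a high quantile of the same distribution. A minimal sketch in plain Python — the simulated exposure values, the 95% quantile level, and the nearest-rank quantile method are illustrative assumptions; a production pipeline would compute these at scale in PySpark:

```python
from statistics import mean

def epe(exposures_by_time):
    """Expected Positive Exposure per time step:
    average of max(V, 0) over simulated paths."""
    return [mean(max(v, 0.0) for v in paths) for paths in exposures_by_time]

def pfe(exposures_by_time, q=0.95):
    """Potential Future Exposure per time step:
    q-quantile of positive exposure over paths (nearest-rank)."""
    out = []
    for paths in exposures_by_time:
        pos = sorted(max(v, 0.0) for v in paths)
        idx = min(len(pos) - 1, int(q * len(pos)))
        out.append(pos[idx])
    return out

# Toy example: 2 time steps, 4 simulated paths each (values are made up).
sims = [[-1.0, 2.0, 3.0, -4.0], [1.0, 1.0, 5.0, -2.0]]
print(epe(sims))  # [1.25, 1.75]
print(pfe(sims))  # [3.0, 5.0]
```

Once measures like these land in the gold layer partitioned by as-of date and counterparty, BI tools and notebooks can query them directly without re-running the simulation.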
Required Qualifications and Skills:
12-15 years of work experience as an application developer.
AWS Certified Cloud Practitioner (or an equivalent cloud certification; Level to be confirmed).
Proficiency in REST API development using frameworks such as Django, Flask, FastAPI, or similar.
Strong domain expertise in Credit Risk and Counterparty Risk.
Expert-level proficiency in Python, including experience with PySpark/Spark for data engineering and analytics.
Hands-on experience with Azure Databricks, including Medallion Lakehouse Architecture.
Solid understanding of SQL, including joins, unions, stored procedures, and query optimization.
Familiarity with front-end and back-end development (experience is a plus).
In-depth knowledge of CI/CD pipelines utilizing Git, Jenkins, and Azure DevOps.
Exposure to technical architecture design (preferred).
Experience in creating product specifications, architecture diagrams, and design documents.
Proven experience working in an Agile environment using tools like JIRA, Confluence, and Zephyr.
Strong communication skills to clearly articulate complex ideas.
Collaborative team player with a proactive, self-starter attitude.
Demonstrated ability to quickly learn new technologies.
Passion for coding, development, and continuous improvement.
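To make the SQL expectations above concrete, here is a small in-memory sqlite3 session showing a join with aggregation, plus an index on the join key so the planner can do indexed lookups instead of scanning every trade row. The table and column names are hypothetical, chosen only for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE counterparty (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE trade (id INTEGER PRIMARY KEY,
                        counterparty_id INTEGER,
                        notional REAL);
    -- An index on the join key lets the optimizer use a lookup
    -- instead of a full scan of the trade table.
    CREATE INDEX idx_trade_cp ON trade (counterparty_id);
""")
conn.executemany("INSERT INTO counterparty VALUES (?, ?)",
                 [(1, "ACME"), (2, "Globex")])
conn.executemany("INSERT INTO trade VALUES (?, ?, ?)",
                 [(10, 1, 5.0), (11, 1, 7.5), (12, 2, 3.0)])

# Total notional per counterparty via an inner join.
rows = conn.execute("""
    SELECT c.name, SUM(t.notional)
    FROM counterparty c
    JOIN trade t ON t.counterparty_id = c.id
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()
print(rows)  # [('ACME', 12.5), ('Globex', 3.0)]
```

Running `EXPLAIN QUERY PLAN` on the same SELECT is a quick way to confirm the index is actually used, which is the kind of query-optimization check the role calls for.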
Preferred but Not Mandatory:
Advanced degree in Finance, Computer Science, or related discipline.
Experience with risk modeling and financial analytics.
Knowledge of deployment, operational support, and monitoring tools.
Pay: $75.00 - $80.00 per hour
Expected hours: 40 per week
Benefits:
401(k)
401(k) matching
Experience:
Data Architecting: 10 years (Required)
Capital Markets: 10 years (Required)
Azure Databricks: 10 years (Required)
Medallion Lakehouse: 5 years (Required)
Credit Risk/Counterparty Risk: 8 years (Required)
Python Django: 10 years (Required)
PySpark/Spark: 10 years (Required)
Front- & Back-End Development: 5 years (Required)
Work Location: In person






