Vaiticka Solution

Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Data Architect (Credit Risk) position based in New York City on a contract basis. Requires 12-15 years of application development experience, AWS certification, expertise in Credit Risk, and proficiency in Python, Databricks, and SQL.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
640
πŸ—“οΈ - Date
January 16, 2026
πŸ•’ - Duration
Unknown
🏝️ - Location
Hybrid
πŸ“„ - Contract
Unknown
πŸ”’ - Security
Unknown
πŸ“ - Location detailed
New York City, NY
🧠 - Skills detailed
#S3 (Amazon Simple Storage Service) #Azure Databricks #Jira #REST (Representational State Transfer) #Databricks #Data Pipeline #Security #Data Architecture #ADLS (Azure Data Lake Storage) #Azure DevOps #Data Engineering #REST API #Python #Agile #PySpark #BI (Business Intelligence) #SQL (Structured Query Language) #AWS (Amazon Web Services) #GIT #API (Application Programming Interface) #Spark (Apache Spark) #Flask #Observability #Batch #Azure #Jenkins #DevOps #Django #FastAPI #Storage #IAM (Identity and Access Management) #Cloud
Role description
Position: Data Architect (Credit Risk)
Location: New York City (Hybrid)
Employment: Contract

Job Description:
We are seeking a highly skilled and experienced Application Engineer and Data Architect to join our dynamic Credit Risk Capital IT team within the Credit Risk Technology Group. As a senior resource, you will play a crucial role in designing, implementing, and maintaining the application and infrastructure for Credit Risk Capital.

Job Responsibilities and Requirements
• Lead architecture and technical design discussions grounded in industry-standard technologies and practices; support production operations and resolve production issues as a senior developer on the Credit Risk application team
• Design and implement batch and ad-hoc data pipelines with the Medallion Lakehouse architecture using modern cloud data engineering patterns, primarily in Databricks
• Build and maintain ingestion flows from upstream systems into object storage (e.g., S3/ADLS) using formats such as Parquet, including partitioning, z-ordering, and schema evolution (a PySpark sketch of this pattern follows the description)
• Integrate with external XVA / risk engines and implement orchestration logic to manage long-running external computations (a polling sketch follows the description)
• Model and optimize risk measures (such as EPE and PFE) for efficient querying and consumption by BI tools, notebooks, and downstream applications
• Ensure platform reliability, observability, security (IAM roles, OIDC/Bearer-token authentication, encryption), and auditability
• Contribute to API design for internal consumers and external services (versioning, error handling, SLAs) and document designs thoroughly (a FastAPI sketch follows the description)

Required Qualifications, Skills and Capabilities
• 12-15 years of work experience as an application developer
• AWS Certified Cloud Practitioner (or equivalent certification; level to be confirmed)
• Proficient in REST API development using frameworks such as Django, Flask, or FastAPI
• Strong domain expertise in Credit Risk and Counterparty Risk
• Expert-level proficiency in Python, including PySpark / Spark for data engineering and analytics
• Hands-on experience with Azure Databricks, including the Medallion Lakehouse architecture
• Solid understanding of SQL, including joins, unions, stored procedures, and query optimization
• Familiarity with front-end and back-end development (hands-on experience is a plus)
• Exposure to technical architecture design (hands-on experience is a plus)
• In-depth knowledge of CI/CD pipelines using Git, Jenkins, and Azure DevOps
• Experience creating product specifications, architecture diagrams, and design documents
• Proven experience in Agile software development using JIRA, Confluence, and Zephyr
• Strong communication skills: able to convey complex ideas clearly and concisely
• Highly collaborative team player with a proactive, self-starter mindset
• Demonstrated ability to learn new technologies quickly
• Passionate about coding, development, and continuous improvement
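To illustrate the Medallion ingestion responsibilities above, here is a minimal PySpark sketch of a bronze-to-silver flow on Databricks Delta tables. The bucket, table names, and columns (trade_id, as_of_date, counterparty_id, notional) are hypothetical placeholders, not details from this posting.

```python
# Minimal bronze -> silver Medallion ingestion sketch (hypothetical paths/columns).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # already provided on Databricks

# Bronze: land raw Parquet drops from an upstream system as-is, stamped on arrival.
raw = spark.read.parquet("s3://risk-landing/trades/2026-01-16/")  # hypothetical bucket
(raw.withColumn("_ingested_at", F.current_timestamp())
    .write.format("delta")
    .mode("append")
    .option("mergeSchema", "true")   # tolerate upstream schema evolution
    .partitionBy("as_of_date")       # partition on the common filter column
    .saveAsTable("bronze.trades"))

# Silver: deduplicate and cleanse for downstream risk aggregation.
bronze = spark.read.table("bronze.trades")
silver = (bronze
          .dropDuplicates(["trade_id", "as_of_date"])
          .filter(F.col("notional").isNotNull()))
(silver.write.format("delta")
       .mode("overwrite")
       .partitionBy("as_of_date")
       .saveAsTable("silver.trades"))

# Z-order to co-locate rows frequently filtered by counterparty.
spark.sql("OPTIMIZE silver.trades ZORDER BY (counterparty_id)")
```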
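The posting does not say how the external XVA / risk engines are invoked; one common orchestration shape for long-running external computations is submit-then-poll over REST, sketched below. The engine URL, endpoints, and payload fields are all hypothetical.

```python
# Submit-then-poll orchestration sketch for a long-running external risk computation.
import time
import requests

ENGINE_URL = "https://xva-engine.internal/api/v1"  # hypothetical service

def run_xva_job(portfolio_id: str, timeout_s: int = 3600, poll_s: int = 30) -> dict:
    # Kick off the computation; the engine returns a job handle immediately.
    resp = requests.post(f"{ENGINE_URL}/jobs",
                         json={"portfolio_id": portfolio_id}, timeout=30)
    resp.raise_for_status()
    job_id = resp.json()["job_id"]

    # Poll until the engine reports completion, failure, or the deadline passes.
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = requests.get(f"{ENGINE_URL}/jobs/{job_id}", timeout=30).json()
        if status["state"] == "SUCCEEDED":
            return status["result"]   # e.g., EPE/PFE profiles per netting set
        if status["state"] == "FAILED":
            raise RuntimeError(f"XVA job {job_id} failed: {status.get('error')}")
        time.sleep(poll_s)
    raise TimeoutError(f"XVA job {job_id} did not finish within {timeout_s}s")
```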
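For the API-design responsibility, here is a minimal FastAPI sketch showing URL-path versioning and structured error handling, two of the concerns that bullet names. The exposure resource, its fields, and the in-memory store are hypothetical stand-ins for the curated data layer.

```python
# Minimal versioned REST endpoint sketch in FastAPI (hypothetical resource/fields).
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Credit Risk Capital API")

class ExposureResponse(BaseModel):
    counterparty_id: str
    epe: float  # expected positive exposure
    pfe: float  # potential future exposure

# Hypothetical in-memory store standing in for the silver/gold layer.
_EXPOSURES = {
    "CP-001": ExposureResponse(counterparty_id="CP-001", epe=1.2e6, pfe=4.8e6),
}

# Version lives in the path so breaking changes can ship as /api/v2 later.
@app.get("/api/v1/exposures/{counterparty_id}", response_model=ExposureResponse)
def get_exposure(counterparty_id: str) -> ExposureResponse:
    # Return a structured 404 detail rather than leaking a stack trace.
    exposure = _EXPOSURES.get(counterparty_id)
    if exposure is None:
        raise HTTPException(status_code=404,
                            detail=f"Unknown counterparty: {counterparty_id}")
    return exposure
```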