iPivot

Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Architect in New York City, NY, on a W2 contract of unspecified length. The pay rate is competitive. Key skills include Python, SQL, AWS certification, and experience in Credit Risk. Hybrid work environment.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
February 7, 2026
🕒 - Duration
Unknown
🏝️ - Location
Hybrid
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
New York City Metropolitan Area
🧠 - Skills detailed
#S3 (Amazon Simple Storage Service) #Jira #REST API #Storage #Deployment #Azure Databricks #IAM (Identity and Access Management) #Django #Python #Flask #PySpark #Azure DevOps #Security #Cloud #Agile #Azure #Monitoring #Data Ingestion #Data Engineering #REST (Representational State Transfer) #AWS (Amazon Web Services) #Databricks #Observability #API (Application Programming Interface) #Data Pipeline #Batch #DevOps #FastAPI #Documentation #ADLS (Azure Data Lake Storage) #SQL (Structured Query Language) #Spark (Apache Spark) #BI (Business Intelligence) #Computer Science #Jenkins #Data Architecture #GIT
Role description
Greetings,

Role: Data Architect
Location: New York City, NY
Only on W2 (No C2C)
Hybrid Work

About the Role:
We are seeking a highly skilled and experienced Application Engineer and Data Architect to join our dynamic team. As a senior member of the team, you will play a critical role in designing, implementing, and maintaining the application infrastructure. Your expertise will help drive innovative data solutions and ensure platform reliability, security, and performance.

Key Responsibilities:
• Lead architecture and technical design discussions, considering industry-standard technologies and best practices.
• Support production operations and resolve complex production issues as a senior developer within the Credit Risk application team.
• Design and implement batch and ad-hoc data pipelines based on the Medallion Lakehouse architecture using modern cloud data engineering patterns, primarily in Databricks.
• Build and maintain data ingestion flows from upstream systems into object storage (e.g., S3, ADLS) using formats like Parquet, including advanced features such as partitioning, z-ordering, and schema evolution.
• Integrate with external XVA/risk engines and implement orchestration logic to manage long-running external computations.
• Model and optimize risk measures (e.g., EPE, PFE) for efficient querying and consumption by BI tools, notebooks, and downstream applications.
• Ensure platform reliability, observability, security (IAM roles, OIDC/Bearer token authentication, encryption), and auditability.
• Contribute to API design for internal and external customers, focusing on versioning, error handling, and SLAs, with proper documentation.

Required Qualifications and Skills:
• 12-15 years of work experience as an application developer.
• AWS Certified Cloud Practitioner (or an equivalent cloud certification; level to be confirmed).
• Proficiency in REST API development using frameworks such as Django, Flask, FastAPI, or similar.
• Strong domain expertise in Credit Risk and Counterparty Risk.
• Expert-level proficiency in Python, including experience with PySpark/Spark for data engineering and analytics.
• Hands-on experience with Azure Databricks, including the Medallion Lakehouse architecture.
• Solid understanding of SQL, including joins, unions, stored procedures, and query optimization.
• Familiarity with front-end and back-end development (hands-on experience is a plus).
• In-depth knowledge of CI/CD pipelines using Git, Jenkins, and Azure DevOps.
• Exposure to technical architecture design (preferred).
• Experience creating product specifications, architecture diagrams, and design documents.
• Proven experience working in an Agile environment using tools like JIRA, Confluence, and Zephyr.
• Strong communication skills to clearly articulate complex ideas.
• Collaborative team player with a proactive, self-starter attitude.
• Demonstrated ability to quickly learn new technologies.
• Passion for coding, development, and continuous improvement.

Preferred but Not Mandatory:
• Advanced degree in Finance, Computer Science, or a related discipline.
• Experience with risk modeling and financial analytics.
• Knowledge of deployment, operational support, and monitoring tools.
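The risk measures the role mentions (EPE, PFE) have standard textbook definitions: EPE is the average positive exposure across simulations at each time step, and PFE is a high quantile of that exposure. A minimal pure-Python sketch, assuming simulated mark-to-market paths as input (all names here are illustrative, not taken from the posting):

```python
from statistics import mean

def epe_pfe(paths, quantile=0.95):
    """Compute Expected Positive Exposure (EPE) and Potential Future
    Exposure (PFE) per time step from simulated mark-to-market paths.

    paths: list of simulations, each a list of portfolio values per time step.
    EPE(t) = average over simulations of max(V(t), 0)
    PFE(t) = the given quantile of max(V(t), 0) across simulations
    """
    n_steps = len(paths[0])
    epe, pfe = [], []
    for t in range(n_steps):
        # Positive part of the exposure at time t across all simulations.
        exposures = sorted(max(p[t], 0.0) for p in paths)
        epe.append(mean(exposures))
        # Nearest-rank quantile; a production engine would interpolate.
        idx = min(len(exposures) - 1, int(quantile * len(exposures)))
        pfe.append(exposures[idx])
    return epe, pfe

# Four simulated paths, two time steps.
paths = [[-1.0, 2.0], [3.0, 4.0], [1.0, -2.0], [5.0, 0.0]]
epe, pfe = epe_pfe(paths, quantile=0.95)
```

In practice these aggregates would be precomputed and stored in the silver/gold layers so BI tools and notebooks can query them directly, per the responsibility above.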
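For the "orchestration logic to manage long-running external computations" responsibility, one common pattern is submit-then-poll with a timeout. A hedged sketch with a stubbed engine (the engine API here is invented for illustration; a real XVA engine's interface will differ):

```python
import time

class StubRiskEngine:
    """Stand-in for an external XVA/risk engine; the real API will differ."""
    def __init__(self, polls_until_done=3):
        self._remaining = {}
        self._polls_until_done = polls_until_done

    def submit(self, job_spec):
        job_id = f"job-{len(self._remaining) + 1}"
        self._remaining[job_id] = self._polls_until_done
        return job_id

    def status(self, job_id):
        self._remaining[job_id] -= 1
        return "DONE" if self._remaining[job_id] <= 0 else "RUNNING"

    def result(self, job_id):
        return {"job_id": job_id, "measure": "EPE", "value": 2.25}

def run_job(engine, job_spec, poll_interval=0.0, timeout=10.0):
    """Submit a job to the external engine and poll until done or timed out."""
    job_id = engine.submit(job_spec)
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if engine.status(job_id) == "DONE":
            return engine.result(job_id)
        time.sleep(poll_interval)  # back off between polls
    raise TimeoutError(f"{job_id} did not finish within {timeout}s")

result = run_job(StubRiskEngine(), {"portfolio": "demo"})
```

In a Databricks setting, the same pattern is often wrapped in a workflow task so the orchestrator, not the notebook, owns retries and timeouts.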
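For the SQL qualification (joins, unions, query optimization), a small self-contained illustration using SQLite; the table and column names are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE counterparty (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE exposure (cp_id INTEGER, measure TEXT, value REAL);
    INSERT INTO counterparty VALUES (1, 'Alpha'), (2, 'Beta');
    INSERT INTO exposure VALUES (1, 'EPE', 2.25), (1, 'PFE', 5.0), (2, 'EPE', 1.5);
    -- An index on the join key is a typical first optimization step.
    CREATE INDEX idx_exposure_cp ON exposure (cp_id);
""")
# LEFT JOIN keeps counterparties even when they have no PFE row.
rows = cur.execute("""
    SELECT c.name, e.value
    FROM counterparty AS c
    LEFT JOIN exposure AS e
        ON e.cp_id = c.id AND e.measure = 'PFE'
    ORDER BY c.name
""").fetchall()
conn.close()
```

The LEFT JOIN here returns `('Beta', None)` because Beta has no PFE row; an INNER JOIN would silently drop it, which is the kind of distinction the qualification is testing for.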