

DHI Group, Inc.
API Developer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an API Developer on a 6-month contract, remote or hybrid in the Kansas City area. Key skills include REST, GraphQL, Node.js or Python, Azure, and PostgreSQL. Must be authorized to work in the U.S. without sponsorship.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
400
-
🗓️ - Date
February 17, 2026
🕒 - Duration
6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Azure DevOps #GIT #API (Application Programming Interface) #Docker #REST API #Databricks #Data Lake #Kubernetes #TypeScript #Vault #Azure cloud #DevOps #GraphQL #POSTMAN #Security #Deployment #Swagger #Redis #Databases #Microservices #Documentation #Integration Testing #React #Azure #FastAPI #REST (Representational State Transfer) #Azure Databricks #GitHub #PostgreSQL #Logging #Monitoring #Microsoft Azure #Cloud #Python
Role description
API & Service Layer Developer
Contract Duration: 6 months
Location: Remote or Hybrid (Kansas City area)
Compensation: Hourly (Contract Position)
Work Authorization: Must be legally authorized to work in the U.S. without sponsorship
About the Project
Support the expansion of an enterprise lending platform by building and extending APIs within an existing microservices architecture. You will implement new endpoints and service-layer components using established design patterns that connect a React frontend with an Azure Databricks data lake and other enterprise systems, including Salesforce and loan servicing platforms.
This role focuses on hands-on development and execution. Architectural direction and platform standards are already defined; your responsibility will be to build within those frameworks and deliver production-ready APIs.
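To make the service-layer shape described above concrete, here is a minimal FastAPI sketch of the kind of endpoint this role would build. It is an illustration only, not the team's actual templates: the route, the response model fields, and the `fetch_loan_from_lake` helper are hypothetical, and the real Databricks lookup would follow the platform's established patterns.

```python
from typing import Optional

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Lending Service Layer (sketch)")

class LoanSummary(BaseModel):
    loan_id: str
    principal: float
    status: str

async def fetch_loan_from_lake(loan_id: str) -> Optional[LoanSummary]:
    """Placeholder for the Databricks data lake lookup behind this endpoint."""
    return None  # the real implementation would query Databricks via the team's service templates

@app.get("/api/v1/loans/{loan_id}", response_model=LoanSummary)
async def get_loan(loan_id: str) -> LoanSummary:
    loan = await fetch_loan_from_lake(loan_id)
    if loan is None:
        raise HTTPException(status_code=404, detail="Loan not found")
    return loan
```

Run with `uvicorn app:app` to serve the endpoint and auto-generated OpenAPI docs at `/docs`; the React frontend would consume the same JSON contract.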
Core Responsibilities
• Develop and implement RESTful and GraphQL APIs following established architectural patterns
• Extend existing microservices and replicate service templates to support new business features
• Build service-layer components connecting Azure Databricks with business logic and external systems
• Implement authentication and authorization using predefined Azure AD, RBAC, and multi-factor authentication configurations
• Define and publish API specifications using OpenAPI/Swagger standards
• Apply caching and performance optimizations (<500ms response targets) using Redis and established best practices (see the caching sketch after this list)
• Support event-driven integrations leveraging existing Databricks streaming and messaging patterns
• Participate in CI/CD deployments and containerized application workflows
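A minimal sketch of the Redis read-through caching pattern referenced in the caching bullet above, assuming redis-py's asyncio client; the key scheme, TTL, and fetch callable are illustrative assumptions, not the platform's actual configuration.

```python
import json
from typing import Any, Awaitable, Callable

import redis.asyncio as redis

cache = redis.Redis.from_url("redis://localhost:6379/0", decode_responses=True)

async def cached_fetch(key: str, fetch: Callable[[], Awaitable[Any]], ttl_seconds: int = 60) -> Any:
    """Read-through cache: return the cached value if present, else fetch, store, and return it."""
    hit = await cache.get(key)
    if hit is not None:
        return json.loads(hit)  # cache hit keeps the response well under the latency target
    value = await fetch()       # e.g. a PostgreSQL or Databricks query
    await cache.set(key, json.dumps(value), ex=ttl_seconds)
    return value
```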
Required Technical Skills
• API Development: REST, GraphQL, experience working within existing API gateway and microservices patterns
• Backend Development: Node.js (Express.js) or Python (FastAPI)
• Security: OAuth 2.0, JWT, Azure AD/MSAL, RBAC (see the token-validation sketch after this list)
• Databases: PostgreSQL (query writing and tuning), Redis (caching implementation)
• Azure Cloud: App Services, container-based deployments, API Management
• Integration Patterns: Experience consuming and integrating external REST APIs
• DevOps: Docker, CI/CD (Azure DevOps, GitHub Actions)
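For the Security line above, a hedged sketch of validating an Azure AD-issued JWT with the PyJWT library; the tenant ID and audience are placeholders supplied by the existing Azure AD/MSAL configuration, and role-based checks would follow the platform's predefined RBAC setup.

```python
import jwt  # PyJWT

TENANT_ID = "<tenant-id>"        # placeholder; comes from the existing Azure AD configuration
AUDIENCE = "api://<client-id>"   # placeholder application ID URI
JWKS_URL = f"https://login.microsoftonline.com/{TENANT_ID}/discovery/v2.0/keys"

_jwks_client = jwt.PyJWKClient(JWKS_URL)

def validate_bearer_token(token: str) -> dict:
    """Verify the token's signature, audience, and expiry, then return its claims."""
    signing_key = _jwks_client.get_signing_key_from_jwt(token)
    claims = jwt.decode(token, signing_key.key, algorithms=["RS256"], audience=AUDIENCE)
    # claims.get("roles", []) would feed the predefined RBAC checks
    return claims
```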
Preferred Qualifications
• Experience integrating with Azure Databricks REST APIs
• GraphQL schema implementation experience
• Salesforce API integration (REST, SOAP, bulk APIs)
• TypeScript experience for Node.js backend services
• Exposure to Kubernetes environments (working within existing clusters)
• Experience implementing API rate limiting and throttling using predefined configurations (see the rate-limiting sketch after this list)
• Domain exposure in fintech or loan servicing environments
• Experience working in SOC 2-compliant environments
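As a rough illustration of the rate-limiting item above, a fixed-window limiter sketch built on Redis INCR/EXPIRE; the limit and window values are made up, and the production service would apply the predefined API Management or gateway configuration instead.

```python
import redis.asyncio as redis

limiter = redis.Redis.from_url("redis://localhost:6379/0")

async def allow_request(client_id: str, limit: int = 100, window_seconds: int = 60) -> bool:
    """Allow up to `limit` requests per client in each fixed window; reject the rest."""
    key = f"ratelimit:{client_id}"
    count = await limiter.incr(key)
    if count == 1:
        await limiter.expire(key, window_seconds)  # start the window on the first request
    return count <= limit
```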
Technical Environment
• Backend: Node.js 18+ / Python 3.9+, Express.js or FastAPI
• APIs: REST, GraphQL, OpenAPI/Swagger
• Databases: PostgreSQL, Redis
• Cloud: Microsoft Azure (App Services, Databricks, Key Vault, API Management)
• Authentication: Azure AD, MSAL, OAuth 2.0
• Frontend Awareness: React 18+ with TypeScript
• Tools: Git/GitHub, Docker, Azure DevOps, Postman
• Monitoring: Application Insights, Azure Monitor
Success Criteria
• Production-ready APIs delivered on schedule using established architecture standards
• API performance aligned to <500ms average response time targets
• Complete OpenAPI documentation for frontend and integration teams
• Secure authentication implementation aligned with existing RBAC and MFA configurations
• Test coverage aligned with team standards (unit and integration testing; see the test sketch after this list)
• Deployment through existing CI/CD pipelines with logging and monitoring in place
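To give a sense of the unit and integration coverage criterion above, a minimal pytest sketch against a self-contained FastAPI app; the `/api/v1/health` route is hypothetical, and the real suite would exercise the team's actual endpoints and standards.

```python
from fastapi import FastAPI
from fastapi.testclient import TestClient

app = FastAPI()

@app.get("/api/v1/health")
def health() -> dict:
    return {"status": "ok"}

client = TestClient(app)

def test_health_returns_ok() -> None:
    response = client.get("/api/v1/health")
    assert response.status_code == 200
    assert response.json() == {"status": "ok"}
```

Run with `pytest` from the service root; integration tests would point the same client at deployed endpoints through the existing CI/CD pipelines.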





