

The Cooperative Finance Association
Azure Databricks Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Azure Databricks Data Engineer on a 6-month contract paying $55.00 - $66.00 per hour, with remote or hybrid (Kansas City area) options. It requires expert Azure Databricks skills and data integration experience; agricultural finance or fintech experience is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
528
-
🗓️ - Date
November 20, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Remote
-
🧠 - Skills detailed
#Docker #Data Warehouse #Databricks #Classification #Vault #Programming #Documentation #Microsoft Azure #Scala #ETL (Extract, Transform, Load) #PySpark #Data Ingestion #Data Integrity #Python #Delta Lake #Azure #Storage #GitHub #Data Processing #TypeScript #GraphQL #Compliance #Security #API (Application Programming Interface) #Data Engineering #Data Modeling #Data Governance #Redis #Data Quality #REST (Representational State Transfer) #Azure cloud #SQL (Structured Query Language) #DevOps #React #Spark (Apache Spark) #CLI (Command-Line Interface) #Data Integration #Data Lake #GIT #PostgreSQL #Azure Databricks #Spark SQL #Batch #Azure DevOps #GDPR (General Data Protection Regulation) #Cloud #Azure CLI (Azure Command Line Interface) #BI (Business Intelligence)
Role description
Azure Databricks Data Engineer
Position Overview
Contract Duration: 6 months
Location: Remote or Hybrid (Kansas City area)
Rate: Hourly (contract position)
Work Authorization: Must be legally authorized to work in the US without sponsorship
About the Project
Join CFA’s initiative to extend our industry-renowned lending platform through Azure Databricks-powered data integration. You’ll architect a centralized data lake and integrate disparate systems (Salesforce, loan servicing platforms, document management, DocuSign, Conga) into a single source of truth for agricultural finance operations.
Core Responsibilities
Design and implement Azure Databricks data lake architecture integrating multiple enterprise data sources
Build real-time streaming and batch data processing pipelines for Salesforce, loan servicing, and document management systems (a minimal ingestion sketch follows this list)
Develop data quality, validation, and cleansing processes to ensure data integrity across integrated systems
Create analytics-ready data structures optimized for business intelligence and operational reporting
Implement data governance, security controls, and compliance measures for SOC 2 Type II requirements
Collaborate with API/service layer team to expose unified data through REST and GraphQL endpoints
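For illustration only, here is a minimal sketch of the kind of streaming ingestion pipeline described above, written in PySpark against a Delta Lake bronze table. The storage path, table name, and columns are placeholder assumptions, not CFA's actual systems.

# Hypothetical sketch: paths, table names, and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("salesforce_bronze_ingest").getOrCreate()

# Incrementally pick up change files landed in Blob Storage (e.g., delivered by
# Data Factory or Event Hubs capture) using Databricks Auto Loader.
raw = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/checkpoints/salesforce_loans_bronze/schema")
    .load("abfss://landing@examplestorage.dfs.core.windows.net/salesforce/loans/")
)

# Light validation and cleansing before persisting to the bronze layer.
validated = (
    raw.filter(F.col("Id").isNotNull())  # drop records missing a primary key
    .withColumn("_ingested_at", F.current_timestamp())
)

# Write to a Delta table with a checkpoint; a one-minute trigger keeps latency
# comfortably inside a five-minute target for critical streams.
(
    validated.writeStream.format("delta")
    .option("checkpointLocation", "/checkpoints/salesforce_loans_bronze")
    .trigger(processingTime="1 minute")
    .toTable("bronze.salesforce_loans")
)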
Required Technical Skills
Expert-level Azure Databricks experience with Delta Lake, Spark SQL, and PySpark
Azure Cloud Platform: Data Factory, Event Hubs, Blob Storage, Key Vault, Azure AD integration
Data Engineering: ETL/ELT pipeline design, data modeling (dimensional/star schema; see the sketch after this list), data warehouse patterns
Programming: Python (primary), SQL, Scala (a plus)
Data Integration: Experience with Salesforce APIs, REST/SOAP integrations, document system connectors
Streaming & Batch: Real-time data ingestion patterns, scheduled synchronization strategies
Security & Compliance: Encryption at rest/in transit (AES-256, TLS 1.3), RBAC, data classification
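As a hedged illustration of the dimensional/star-schema modeling listed above, a gold-layer layout in Delta might look like the following; every table and column name here is an assumption made for the example.

# Hypothetical star-schema layout for analytics-ready reporting; names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Dimension: borrowers sourced from Salesforce accounts in the silver layer.
spark.sql("""
    CREATE TABLE IF NOT EXISTS gold.dim_borrower (
        borrower_key BIGINT GENERATED ALWAYS AS IDENTITY,
        salesforce_account_id STRING,
        borrower_name STRING,
        region STRING,
        effective_from TIMESTAMP,
        effective_to TIMESTAMP
    ) USING DELTA
""")

# Fact: loan balances keyed to the dimension; grain is one row per loan per day.
spark.sql("""
    CREATE TABLE IF NOT EXISTS gold.fact_loan_balance (
        loan_id STRING,
        borrower_key BIGINT,
        as_of_date DATE,
        principal_balance DECIMAL(18,2),
        interest_accrued DECIMAL(18,2)
    ) USING DELTA
    PARTITIONED BY (as_of_date)
""")

BI queries then join the fact table to the dimension on borrower_key, which is what makes these structures analytics-ready for operational reporting.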
Preferred Qualifications
Experience integrating loan servicing or financial systems data
Knowledge of document management systems (DocuSign, Conga) and email archival
Familiarity with GraphQL data exposure patterns
Agricultural finance or fintech domain experience
Experience with SOC 2 or similar compliance frameworks
Technical Environment
Cloud: Microsoft Azure (Databricks, Data Factory, Blob Storage, Key Vault)
Tools: Git/GitHub, Azure DevOps/GitHub Actions, Docker, Azure CLI
Data Sources: Salesforce, NLS loan servicing, email archives, DocuSign, Conga
Backend Stack: Node.js/Python APIs, PostgreSQL, Redis
Frontend Stack: React 18+, TypeScript (awareness helpful for API integration)
Success Criteria
Operational data lake with all source systems integrated and synchronized
Data quality processes achieving >95% accuracy and completeness (see the check sketched after this list)
Real-time pipelines with <5 minute latency for critical data streams
Documentation enabling knowledge transfer and long-term maintainability
Security controls meeting SOC 2 Type II requirements
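To make the >95% quality target concrete, a minimal completeness check (again with hypothetical table and column names) could look like:

# Illustrative completeness check against the >95% success criterion.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

loans = spark.table("silver.loans")
total = loans.count()
complete = loans.filter(
    F.col("loan_id").isNotNull()
    & F.col("borrower_key").isNotNull()
    & F.col("principal_balance").isNotNull()
).count()

completeness = complete / total if total else 0.0
print(f"Loan record completeness: {completeness:.2%}")
assert completeness > 0.95, "Completeness below the 95% success criterion"

A check like this would typically run as a scheduled Databricks job and feed the monitoring and documentation called out above.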
How to Apply
Submit your resume to cfacareers@cfafs.com along with your Git portfolio or examples of Azure Databricks projects showing multi-source data integration, real-time and batch pipeline implementations at scale, and any security/compliance work (SOC 2, GDPR, etc.) you want to showcase.
CFA is an Equal Opportunity Employer
Job Type: Contract
Pay: $55.00 - $66.00 per hour
Expected hours: 40 per week
Work Location: Remote






