

Lumicity
Senior Data Architect
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Architect on an 18-month contract paying $110–$140/hour. It requires 7–12+ years in data engineering; expertise in Azure, Databricks, and Snowflake; and experience in regulated environments, particularly financial institutions.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
1120
🗓️ - Date
December 5, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Unknown
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
Dallas, TX
🧠 - Skills detailed
#Automation #Snowflake #Scala #Migration #Snowpipe #Collibra #Kafka (Apache Kafka) #Synapse #Cloud #Azure #SQL (Structured Query Language) #Delta Lake #Databricks #Data Pipeline #Python #dbt (data build tool) #MLflow #AI (Artificial Intelligence) #ETL (Extract, Transform, Load) #ADF (Azure Data Factory) #Azure Databricks #Observability #ML (Machine Learning) #Azure Data Factory #Data Engineering #Data Architecture
Role description
Staff Data Engineer / Data Architect (Contract)
Contract Length: 18-month engagement (extension highly likely)
Pay Rate: $110–$140/hour (W2 or C2C). NO THIRD PARTIES.
About the Project
This engagement is with a major Fortune 100 financial institution undergoing a large-scale data platform modernization and AI/ML enablement initiative.
This is a multi-phase, 18-month program involving the rebuild of core data pipelines, migration into a modern lakehouse architecture, and development of a unified data platform to support credit risk, fraud, real-time decisioning, and enterprise analytics.
We’re seeking a Staff-Level Data Architect to support this effort.
If you’ve architected or led programs at enterprise scale, especially in regulated environments, this is a great opportunity.
What You’ll Lead
• Architecting and delivering large-scale data solutions on Azure, Databricks, and Snowflake
• Designing and implementing Delta Lakehouse frameworks and governance models (see the sketch after this list)
• Leading the development of dbt transformation layers and establishing enterprise standards
• Building high-performance ELT/ETL pipelines with strong observability and cost controls
• Collaborating with risk, credit, fraud, and analytics teams on data needs for ML/AI models
• Supporting platform engineering teams with best practices, CI/CD for data, and automation
• Providing architectural guidance to engineering pods across multiple workstreams
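To give a concrete flavor of the lakehouse work above, here is a minimal PySpark sketch of a bronze-to-silver Delta Lake step; the paths, table names, and columns are illustrative assumptions, not details of this engagement.

```python
from pyspark.sql import SparkSession, functions as F

# All names below (paths, schemas, columns) are illustrative placeholders.
spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

# Bronze: land raw events as-is in an append-only Delta table for auditability.
raw = spark.read.json("/mnt/landing/transactions/")  # assumed landing path
raw.write.format("delta").mode("append").saveAsTable("bronze.transactions")

# Silver: dedupe, enforce types, and stamp lineage metadata.
silver = (
    spark.table("bronze.transactions")
    .dropDuplicates(["transaction_id"])  # assumed business key
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .withColumn("ingested_at", F.current_timestamp())
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.transactions")
```

Bronze stays append-only for auditability; silver applies deduplication and typing before gold-layer aggregates and ML feature tables are built on top.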
Ideal Tech Stack
Core
• Python (scalable, production-grade)
• Advanced SQL + optimization
• dbt (core + cloud; experience defining architecture & standards)
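dbt also connects to the CI/CD item above: dbt-core (1.5+) exposes a Python entry point that a CI job can call. A rough sketch follows; the selector name is a hypothetical example.

```python
from dbt.cli.main import dbtRunner, dbtRunnerResult

# Programmatic dbt invocation, e.g. from a CI pipeline step.
dbt = dbtRunner()

# "staging" is a hypothetical selector; a real CI job would more likely
# target changed models via dbt's state:modified selector.
result: dbtRunnerResult = dbt.invoke(["build", "--select", "staging"])

if not result.success:
    raise SystemExit("dbt build failed")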
Cloud / Platform
• Azure: Data Factory, Functions, Synapse, EventHub
• Databricks: Delta Lake, Unity Catalog, MLflow
• Snowflake: Snowpipe, Streams/Tasks, performance tuning
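To illustrate the Snowflake Streams/Tasks item, the sketch below uses snowflake-connector-python to create a stream on a staging table and a scheduled task that applies its pending changes; every identifier (account, warehouse, tables) is a placeholder.

```python
import snowflake.connector

# Every identifier here (account, user, warehouse, tables) is a placeholder.
conn = snowflake.connector.connect(
    account="<account>",
    user="<user>",
    password="<password>",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()

# Stream: capture row-level changes (CDC) on the staging table.
cur.execute("CREATE STREAM IF NOT EXISTS txn_stream ON TABLE stg_transactions")

# Task: apply pending changes on a schedule, but only when the stream
# actually has data, so the warehouse is not resumed for empty runs.
cur.execute("""
    CREATE TASK IF NOT EXISTS apply_txn_changes
      WAREHOUSE = TRANSFORM_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('TXN_STREAM')
    AS
      INSERT INTO core_transactions
      SELECT * EXCLUDE (METADATA$ACTION, METADATA$ISUPDATE, METADATA$ROW_ID)
      FROM txn_stream
      WHERE METADATA$ACTION = 'INSERT'
""")
cur.execute("ALTER TASK apply_txn_changes RESUME")  # tasks are created suspended
conn.close()
```

Gating the task with SYSTEM$STREAM_HAS_DATA keeps the warehouse suspended when no new rows have arrived, which is one of the cost controls the role calls for.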
Preferred Experience
• Enterprise governance frameworks (Purview, Unity Catalog, Collibra)
• Streaming (Kafka/EventHub; a consumer sketch follows this list)
• Experience supporting real-time risk/fraud/credit decision systems
• Familiarity with model enablement for DS/ML teams
• Prior experience in large transformation programs (12+ months)
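For the streaming and real-time decisioning items above, here is a minimal Azure Event Hubs consumer sketch using the azure-eventhub SDK; the connection string, hub name, and scoring hook are all hypothetical.

```python
from azure.eventhub import EventHubConsumerClient

# Placeholders: connection string, hub name, and the scoring hook are assumptions.
CONN_STR = "<event-hubs-namespace-connection-string>"

def on_event(partition_context, event):
    payload = event.body_as_str()
    # A real decisioning flow would call a fraud/credit scoring service here
    # and route the decision downstream; printing stands in for that step.
    print(f"partition {partition_context.partition_id}: {payload}")
    partition_context.update_checkpoint(event)  # persist consumer progress

client = EventHubConsumerClient.from_connection_string(
    conn_str=CONN_STR,
    consumer_group="$Default",
    eventhub_name="transactions",  # assumed hub name
)

with client:
    # "-1" replays from the earliest available event on each partition.
    client.receive(on_event=on_event, starting_position="-1")
```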
Ideal Background
You’ve operated in environments similar to:
• Tier-1 banks
• Fortune 100 financial institutions
• Large consumer credit, fintech, or payments companies
And you bring:
• 7–12+ years in data engineering or architecture
• Experience leading multi-team initiatives or platform migrations
• Ability to be hands-on while also driving architectural decisions
Pay Rate
$110–$140/hour, depending on experience and specific workstream alignment.






