

Data Integration Lead (Streaming & Warehousing) — Retail Wealth
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Integration Lead (Streaming & Warehousing) in Retail Wealth, offered as a remote contract at a competitive pay rate. Key requirements include 5+ years of data engineering experience, retail wealth domain expertise, and proficiency in AWS, Kafka, and Databricks.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
September 26, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Observability #Data Quality #Spark (Apache Spark) #dbt (data build tool) #S3 (Amazon Simple Storage Service) #Snowflake #Batch #Oracle #AWS (Amazon Web Services) #Databricks #ETL (Extract, Transform, Load) #Metadata #Data Integration #DevSecOps #SQL (Structured Query Language) #PySpark #Migration #Infrastructure as Code (IaC) #Data Pipeline #Leadership #Scala #Kafka (Apache Kafka) #Airflow #Strategy #Spark SQL #Data Governance #Data Engineering #Data Modeling #AWS S3 (Amazon Simple Storage Service)
Role description
Data Integration Lead (Streaming & Warehousing) — Retail Wealth
Location: Remote
Role
This is a hands-on Data Engineering Lead role, building the streaming and batch data pipelines that power our digital self‑service roadmap. You will design robust ETL/ELT on AWS, Kafka, Databricks, and Oracle, and guide the transition to Snowflake. The role blends architecture and delivery, owning data quality, observability, and governance in the Retail Wealth domain.
• Domain Expertise: Deep experience in retail wealth, broker dealers, wirehouses, family offices, custodians, trade order management, and digital self-service platforms (e.g., robo advisors).
• Technical Stack: AWS, Kafka, Databricks, Oracle, Snowflake (future), ETL processes.
• Hands-On Leadership: The integration lead designs and develops APIs and coordinates with architects; the data lead must be strong in ETL/data integration.
Responsibilities
• Design and build streaming & batch pipelines for ingestion, curation, and consumption (real‑time + micro‑batch); a minimal streaming sketch follows this list.
• Engineer scalable ELT/ETL on Databricks (PySpark/Spark SQL), integrating sources including APEX, custodians, broker dealers, and market/reference data.
• Optimize workloads on AWS (S3, Glue, EMR/Databricks, Lakehouse patterns); manage Oracle sources; drive Snowflake migration strategy and execution.
• Enforce data quality (DQ), lineage, metadata, and governance with best‑practice frameworks and tooling.
• Partner with analytics, product, and integration teams to support dashboards, operational reporting, advanced analytics, and ODS.
• Establish DevSecOps for data (versioned transformations, CI/CD, IaC patterns, secrets mgmt).
• Define and track SLAs/SLOs, cost controls, and performance baselines.
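As a concrete illustration of the streaming responsibilities above, here is a minimal PySpark Structured Streaming sketch: Kafka trade events parsed against a schema and appended to a Delta table on Databricks. The broker address, topic, schema fields, checkpoint path, and table name are hypothetical placeholders, not this platform's actual names.

# Minimal sketch: Kafka -> Delta ingestion with PySpark Structured Streaming.
# All names (brokers, topic, schema, paths, tables) are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("trade-event-ingest").getOrCreate()

trade_schema = StructType([
    StructField("account_id", StringType()),
    StructField("symbol", StringType()),
    StructField("quantity", DoubleType()),
    StructField("executed_at", TimestampType()),
])

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder brokers
    .option("subscribe", "trade-events")                 # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers raw bytes; cast the value to string and parse JSON against the schema.
events = (
    raw.select(F.from_json(F.col("value").cast("string"), trade_schema).alias("e"))
       .select("e.*")
)

# Append to a bronze Delta table; the checkpoint tracks progress for fault tolerance.
query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/trade_events/")
    .outputMode("append")
    .toTable("bronze.trade_events")
)

In practice the same parsed stream can also serve the micro-batch path (via trigger settings or foreachBatch), so real-time and batch consumption share one codebase.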
Must-Have Qualifications
• 5+ years (7+ preferred) in data engineering with lead-level ownership delivering production pipelines.
• Retail Wealth expertise: custodians, broker dealers, wirehouses/family offices; order/trade and position/transaction data.
• Hands-on with Kafka (topics, partitions, schema/registry), Databricks (PySpark/Spark SQL), AWS data stack, and Oracle sources.
• Strong SQL/performance tuning; ELT/ETL design patterns; batch orchestration (e.g., Airflow/Databricks Jobs); see the orchestration sketch after this list.
• Practical data governance: lineage, DQ, PII controls, encryption, RBAC, and regulatory awareness (FINRA/SEC).
• Experience planning/executing Snowflake migrations (data modeling, performance, cost/pricing levers).
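Batch orchestration is called out above (Airflow/Databricks Jobs); as one possible shape, here is a minimal Airflow 2.x DAG sketch chaining a nightly ingest step and a curation step. The DAG id, task ids, and spark-submit commands are hypothetical; a real deployment might instead trigger Databricks Jobs through the Databricks provider.

# Minimal Airflow 2.x DAG sketch for nightly batch orchestration.
# DAG id, task ids, and commands are illustrative assumptions only.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="nightly_positions_batch",
    start_date=datetime(2025, 1, 1),
    schedule_interval="0 2 * * *",   # nightly at 02:00 UTC
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_custodian_files",
        # Placeholder command; could instead trigger a Databricks job run.
        bash_command="spark-submit jobs/ingest_custodian_files.py",
    )
    curate = BashOperator(
        task_id="curate_positions",
        bash_command="spark-submit jobs/curate_positions.py",
    )
    ingest >> curate   # curation runs only after ingestion succeeds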
Nice to Have
• Familiarity with Apex Fintech data domains and vendor ecosystems (Orion, Envestnet, Pershing/Schwab).
• Knowledge of DTCC/NSCC, Morningstar data, advisory/UMA/SMA billing & commissions.
• Observability for data (Great Expectations/Deequ, Delta Live Tables), cost optimization, and dbt or equivalent; a hand-rolled data-quality check is sketched after this list.
• Bachelor’s in CS/Engineering/Math/IS (Master’s a plus).
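On the observability point above: frameworks such as Great Expectations, Deequ, or Delta Live Tables expectations manage these checks declaratively, but the underlying idea can be sketched as a hand-rolled PySpark check. Table and column names below are hypothetical.

# Hand-rolled data-quality check in PySpark; illustrative only, with
# hypothetical table and column names. A framework would manage these
# expectations declaratively and record results for lineage/observability.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-positions").getOrCreate()
positions = spark.table("silver.positions")   # hypothetical curated table

# Expectation 1: business keys are never null.
null_keys = positions.filter(
    F.col("account_id").isNull() | F.col("symbol").isNull()
).count()

# Expectation 2: (account_id, symbol, as_of_date) is unique.
dupes = (
    positions.groupBy("account_id", "symbol", "as_of_date")
    .count()
    .filter(F.col("count") > 1)
    .count()
)

if null_keys > 0 or dupes > 0:
    # A production pipeline would emit metrics/alerts and quarantine the
    # offending rows rather than simply failing the run.
    raise ValueError(f"DQ failure: {null_keys} null keys, {dupes} duplicate keys")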