

Links Technology Solutions
Resident Solutions Architect (RSA)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Resident Solutions Architect in New York City, NY, with a long-term contract paying $140-$150/hr. Requires 10+ years in consulting, expertise in Databricks and Spark, cloud proficiency, and a background in financial markets.
Country
United States
Currency
$ USD
Day rate
1120
Date
March 4, 2026
Duration
Unknown
Location
On-site
Contract
Unknown
Security
Unknown
Location detailed
New York, NY
Skills detailed
#Cloud #Automation #GCP (Google Cloud Platform) #Data Engineering #Databricks #Data Pipeline #DevOps #Deployment #ETL (Extract, Transform, Load) #Consulting #Distributed Computing #AWS (Amazon Web Services) #Leadership #Linux #Spark (Apache Spark) #SQL (Structured Query Language) #Azure #Scala
Role description
Job Title: Resident Solutions Architect
Location: MUST be able to work ONSITE in New York City, NY
Experience Level: Senior (10+ Years)
Pay rate: $140/hr - $150/hr
Contract Duration: Long-term contract
About the Role
Our client in the trading industry is seeking a high-caliber Resident Solutions Architect to join their professional services team. This is a dual-threat role requiring the technical depth of a seasoned Data Engineer and the strategic mindset of an elite Consultant. You will act as the primary technical trusted advisor for their strategic enterprise clients, guiding them through the architecture, deployment, and optimization of complex data ecosystems.
As a Resident Architect, you don't just draw diagrams; you bridge the gap between business value and technical execution, ensuring that large-scale data platforms are resilient, performant, and future-proof.
Key Responsibilities
• Strategic Consulting: Leverage 10+ years of consulting experience to lead high-stakes architectural discussions and align data strategies with business outcomes.
• Platform Architecture: Design and deliver production-grade data platforms using the Databricks Lakehouse architecture.
• Performance Optimization: Apply deep knowledge of Spark runtime internals to tune, troubleshoot, and optimize distributed computing workloads for maximum scalability.
• Full-Lifecycle Delivery: Lead the end-to-end development of data pipelines, from ingestion and transformation to MLOps integration and CI/CD automation.
• Cloud Leadership: Navigate complex multi-cloud environments (AWS, Azure, or GCP), providing deep subject matter expertise in at least one primary ecosystem.
Required Qualifications
• Experience: 7+ years in Data Engineering/Analytics and 10+ years in a professional Consulting or client-facing role.
• Technical Depth: Hands-on delivery of 6–8+ major projects built specifically on the Databricks platform.
• Spark Mastery: Expert-level understanding of distributed computing with Spark, including memory management, partitioning, and execution plans.
• Cloud Proficiency: Working knowledge of at least two major cloud providers (AWS, Azure, GCP) with deep expertise in one.
• Certification: Must have completed the Data Engineering Professional certification and all required prerequisite coursework.
• Modern DevOps: Proficiency in CI/CD workflows for data engineering and a working knowledge of MLOps principles.
• Databricks Expert: Up-to-date knowledge across the entire Databricks product suite (Unity Catalog, Delta Live Tables, SQL Warehouse, etc.).
• Experience with application development and Databricks Apps.
• Comfort working in Linux environments.
• Background in financial markets, such as market making, trading, or related domains.
• Able to work independently, prioritize effectively, and proactively identify opportunities for improvement.
• Comfortable operating in a loosely defined scope, where you are expected not only to execute tasks but also to help define priorities and drive work forward.
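To illustrate the kind of partitioning fluency the Spark Mastery bullet describes, here is a minimal plain-Python sketch (no Spark dependency; `hash_partition` is a hypothetical helper written for this posting, and Spark's actual `HashPartitioner` hashes JVM keys rather than using Python's `hash()`):

```python
# Illustrative sketch of Spark-style hash partitioning.
# Records with the same key always land in the same partition, which is
# what lets Spark aggregate per partition without a further shuffle.

def hash_partition(records, num_partitions):
    """Assign each (key, value) record to a partition by hashing its key."""
    partitions = [[] for _ in range(num_partitions)]
    for key, value in records:
        # Non-negative modulo mirrors how a partition index is computed.
        idx = hash(key) % num_partitions
        partitions[idx].append((key, value))
    return partitions

records = [("AAPL", 1), ("MSFT", 2), ("AAPL", 3), ("GOOG", 4)]
parts = hash_partition(records, 2)
```

In interviews for roles like this, candidates are often asked to reason about exactly this behavior: skewed keys piling into one partition, and how repartitioning or salting restores balance.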
#IND1






