Databricks Solution Architect

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Solution Architect based in Seoul, South Korea (Hybrid), on a contract requiring 7+ years of experience, at a competitive pay rate. Key skills include Apache Spark, Databricks, and cloud platforms (AWS, Azure, GCP). Databricks certification is a plus.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
-
🗓️ - Date discovered
August 7, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#Azure #Cloud #Data Lake #Data Lakehouse #Delta Lake #ETL (Extract, Transform, Load) #Talend #Deployment #GCP (Google Cloud Platform) #Spark (Apache Spark) #Scala #AWS (Amazon Web Services) #Batch #Apache Spark #Databricks #HDFS (Hadoop Distributed File System) #Data Engineering
Role description
Resident Solutions Architect – Databricks

Location: Seoul, South Korea (Hybrid)
Experience: 7+ years
Bonus points: Databricks Certified Professional

We’re looking for a brilliant Resident Solutions Architect to help enterprise clients build fast, scalable, and secure data lakehouse solutions using Databricks. You’ll work directly with client teams to design high-performance Spark pipelines, lead technical workshops, troubleshoot production issues, and drive real impact.

What You’ll Do
• Design and deliver end-to-end ETL/ELT pipelines using Spark (batch & streaming)
• Lead workshops, demos, and architectural sessions with clients
• Optimise jobs using AQE, the Spark UI, Delta Lake, and Unity Catalog (a minimal sketch follows this description)
• Migrate legacy platforms (e.g. Talend, HDFS) to Databricks
• Guide CI/CD, MLOps, and secure production deployments
• Troubleshoot and fix issues fast – think data skew, memory errors, executor failures

What You’ll Bring
• 7+ years in data engineering, analytics, and large-scale platform delivery
• Deep hands-on experience with Apache Spark, Databricks, and cloud platforms (AWS, Azure, or GCP)
• Strong understanding of performance tuning, Spark internals, and governance
• Confidence working directly with clients and leading technical sessions
• Bonus points if you’re Databricks Certified (especially the Data Engineer Professional certification)

If you’re ready to lead complex data projects and make an impact from day one, we’d love to hear from you.
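For a flavour of the Spark optimisation work mentioned above, here is a minimal Scala sketch of a batch pipeline with Adaptive Query Execution (AQE) and skew-join handling enabled, writing to a Delta Lake sink. It assumes a Databricks or Delta-enabled Spark 3.x runtime; the paths and column names are illustrative only and are not taken from the client environment.

```scala
import org.apache.spark.sql.SparkSession

object LakehouseEtlSketch {
  def main(args: Array[String]): Unit = {
    // Build a session with AQE and skew-join mitigation enabled.
    val spark = SparkSession.builder()
      .appName("lakehouse-etl-sketch")
      .config("spark.sql.adaptive.enabled", "true")
      .config("spark.sql.adaptive.skewJoin.enabled", "true")
      .getOrCreate()

    // Read a raw batch source, apply a simple cleanup, and persist it as a Delta table.
    val orders = spark.read.parquet("/mnt/raw/orders")   // hypothetical source path
    val cleaned = orders
      .filter("amount > 0")
      .dropDuplicates("order_id")                        // hypothetical column name

    cleaned.write
      .format("delta")
      .mode("overwrite")
      .save("/mnt/curated/orders")                       // hypothetical Delta target

    spark.stop()
  }
}
```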