

Insight International (UK) Ltd
Databricks Solutions Architect
Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Solutions Architect on a 6-month contract at a competitive pay rate. It requires 15+ years in data analytics; expertise in Databricks, PySpark, Python, and AWS; and experience with data migration and automation.
Country
United Kingdom
Currency
£ GBP
-
Day rate
Unknown
-
Date
March 5, 2026
Duration
Unknown
-
Location
Unknown
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
London Area, United Kingdom
-
Skills detailed
#PySpark #Datasets #Data Quality #Data Cleaning #Automation #Cloud #Data Engineering #AWS (Amazon Web Services) #Python #Spark (Apache Spark) #Informatica #Scala #Pandas #Data Architecture #Documentation #Matplotlib #Databases #Databricks #Data Analysis #ETL (Extract, Transform, Load) #Data Integrity #SQL (Structured Query Language)
Role description
Senior-level Data Architect with data analytics experience: Databricks, PySpark, Python, and ETL tools such as Informatica.
This is a key role that requires a senior/lead professional with strong communication skills who is highly proactive in risk and issue management.
Experience and Education Required
15+ years of experience as a Data Analyst / Data Engineer with Databricks on AWS, including expertise in designing and implementing scalable, secure, and cost-efficient data solutions on AWS.
Job Profile:
• Hands-on data analytics experience with Databricks on AWS, PySpark, and Python
• Prior experience migrating a data asset to the cloud using a GenAI automation option is required
• Experience migrating data from on-premises to AWS
• Expertise in developing data models and delivering data-driven insights for business solutions
• Experience in pretraining, fine-tuning, augmenting, and optimizing large language models (LLMs)
• Experience in designing and implementing database solutions and developing PySpark applications to extract, transform, and aggregate data, generating insights
• Data Collection & Integration: Identify, gather, and consolidate data from diverse sources, including internal databases and spreadsheets, ensuring data integrity and relevance.
• Data Cleaning & Transformation: Apply thorough data quality checks, cleaning processes, and transformations using Python (Pandas) and SQL to prepare datasets.
• Automation & Scalability: Develop and maintain scripts that automate repetitive data preparation tasks.
• Autonomy & Proactivity: Operate with minimal supervision, demonstrating initiative in problem-solving, prioritizing tasks, and continuously improving the quality and impact of your work.
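To give candidates a concrete sense of the data cleaning and transformation work described above, here is a minimal Pandas sketch. The column names, sample data, and quality rules are illustrative only, not taken from the role.

```python
import pandas as pd

# Hypothetical raw extract; columns and values are made up for illustration.
raw = pd.DataFrame({
    "customer_id": [1, 2, 2, 3, None],
    "amount": ["10.5", "20", "20", "bad", "7.25"],
})

def clean(df: pd.DataFrame) -> pd.DataFrame:
    # Quality checks: drop rows missing a key, remove exact duplicates,
    # coerce amounts to numeric and drop unparseable values.
    out = df.dropna(subset=["customer_id"]).drop_duplicates()
    out = out.assign(amount=pd.to_numeric(out["amount"], errors="coerce"))
    return out.dropna(subset=["amount"])

cleaned = clean(raw)
print(len(cleaned))  # → 2 (rows surviving the quality checks)
```

The same checks would typically be scripted and scheduled so they run automatically on each new extract, in line with the automation responsibilities listed above.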
Technical Skills:
• 15+ years of experience as a Data Analyst, Data Engineer, or related role, ideally with a bachelor's degree or higher in a relevant field.
• Strong proficiency in Python (Pandas, Scikit-learn, Matplotlib) and SQL, with experience working across various data formats and sources.
• Proven ability to automate data workflows, implement code-based best practices, and maintain documentation to ensure reproducibility and scalability.
Behavioral Skills:
• Ability to manage under tight constraints; highly proactive in risk and issue management.
• Requirement Clarification & Communication: Interact directly with colleagues to clarify objectives and challenge assumptions.
• Documentation & Best Practices: Maintain clear, concise documentation of data workflows, coding standards, and analytical methodologies to support knowledge transfer and scalability.
• Collaboration & Stakeholder Engagement: Work closely with colleagues who provide data, raising questions about data validity, sharing insights, and co-creating solutions that address evolving needs.
• Excellent communication skills for engaging with colleagues, clarifying requirements, and conveying analytical results in a meaningful, non-technical manner.
• Demonstrated critical thinking skills, including the willingness to question assumptions, evaluate data quality, and recommend alternative approaches when necessary.
• A self-directed, resourceful problem-solver who collaborates well with others while confidently managing tasks and priorities independently.






