Experis UK

AI/Data Developer (SC Cleared)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an "AI/Data Developer (SC Cleared)" on a long-term contract at a competitive pay rate. Key skills required include Python, PySpark, data management, and ETL processes. SC Clearance is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
640
🗓️ - Date
October 11, 2025
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Yes
📍 - Location detailed
Greater Bristol Area, United Kingdom
🧠 - Skills detailed
#GCP (Google Cloud Platform) #PySpark #AWS (Amazon Web Services) #Apache Spark #Cloud #Libraries #Datasets #Data Quality #Azure #AI (Artificial Intelligence) #Data Architecture #Storage #Data Processing #Python #Data Management #Pandas #Scala #Spark (Apache Spark) #Data Ingestion #ETL (Extract, Transform, Load) #NumPy #ML (Machine Learning)
Role description
Our client, a high-profile deep-tech organisation, urgently requires an experienced AI/Data Developer to undertake a contract assignment. To be successful, you will have the following experience:
• Extensive AI and data development background
• Experience with Python (including data libraries such as Pandas, NumPy, and PySpark) and Apache Spark (PySpark preferred)
• Strong experience with data management and processing pipelines
• Algorithm development and knowledge of graphs will be beneficial
• SC Clearance is essential
Within this role, you will be responsible for:
• Supporting the development and delivery of AI solutions to a Government customer
• Designing, developing, and maintaining data processing pipelines using Apache Spark
• Implementing ETL/ELT workflows to extract, transform, and load large-scale datasets efficiently
• Developing and optimising Python-based applications for data ingestion
• Collaborating on the development of machine learning models
• Ensuring data quality, integrity, and performance across distributed environments
• Contributing to the design of data architectures, storage strategies, and processing frameworks
• Working with cloud data platforms (e.g., AWS, Azure, or GCP) to deploy scalable solutions
• Monitoring, troubleshooting, and optimising Spark jobs for performance and cost efficiency
• Liaising with the customer and internal stakeholders on a regular basis
This represents an excellent opportunity to secure a long-term contract within a high-profile organisation.
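For context on the kind of pipeline work described above, here is a minimal PySpark ETL sketch. It is an illustration only, not taken from the role: the paths, column names, and the data-quality check are all hypothetical placeholders.

```python
# Minimal sketch of a PySpark ETL job of the kind listed above.
# All paths, column names, and checks are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: ingest a raw CSV dataset (placeholder path).
raw = spark.read.csv("/data/raw/events.csv", header=True, inferSchema=True)

# Transform: drop malformed rows, derive a date column, aggregate per day.
clean = (
    raw.dropna(subset=["event_id", "event_ts"])
       .withColumn("event_date", F.to_date("event_ts"))
)
daily = clean.groupBy("event_date").agg(F.count("event_id").alias("events"))

# Simple data-quality gate: fail fast rather than load an empty dataset.
if clean.first() is None:
    raise ValueError("no valid rows after cleaning; check upstream ingestion")

# Load: write partitioned Parquet for downstream consumers.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "/data/curated/daily_events"
)

spark.stop()
```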