Senior Data Analyst

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Analyst with a contract length of "unknown" and a pay rate of "unknown," located in "unknown." Key skills include Python, SQL, Azure Databricks, and Power BI. Experience in big data and Agile practices is essential.
🌎 - Country
United Kingdom
πŸ’± - Currency
Β£ GBP
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
August 20, 2025
πŸ•’ - Project duration
Unknown
🏝️ - Location type
Unknown
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#PySpark #Cybersecurity #Spark SQL #Spark (Apache Spark) #Documentation #Databricks #Agile #Python #Azure #Azure Databricks #Security #Data Lake #Big Data #SQL (Structured Query Language) #Data Science #Visualization #Automation #Microsoft Power BI #BI (Business Intelligence) #Programming #Data Storage #Datasets #ETL (Extract, Transform, Load) #Data Quality #Data Analysis #Data Pipeline #Storage #DevOps
Role description
Job Description: The role holder will join the Analytics team within the Cryptography programme, which is developing within a Cyber Azure Data Lake. Our Cryptography workspace is managed on the Databricks platform, where we employ a medallion architecture to ensure efficient data storage, accessibility, and processing. Our team works with the Cyber Data Lake teams to land data sources into the Cryptography workspace ready for cleaning, enrichment, aggregation, and further processing towards our gold layer. We work with big datasets and pre-process our data as much as possible before it reaches our presentation layer. The data is then available to support automation as well as dashboards and reports built from our own workspace using Power BI Service.
Key responsibilities
• Performing in-depth analysis of large, complex and dynamic datasets to extract useful insights, identify potential data quality issues and design suitable logic optimised for these datasets
• Developing efficient, reusable and maintainable scripts which collect, clean and validate data from multiple sources to produce datasets which feed into automation, further data analytics and reporting
• Visualising the findings from the data analysis in relevant formats such as charts, graphs and dashboards, and presenting them in a way that is clear and easy to understand for both technical and non-technical audiences
• Collaborating with a wide range of reporting customers to gain a better understanding of the data and the business domain around it, in order to align deliverables to business requirements
• Developing and maintaining documentation of data processes, methodologies and insights
• Writing efficient, reusable and maintainable software by following good software development practices and adopting the team's DevOps practices
• Working with Agile development practices in mind, including test-driven development
Requirements
• Proficiency in data analysis tools such as the Python and SQL programming languages
• Experience working with big data and data science frameworks; experience with PySpark and Spark SQL is strongly desirable
• Experience developing data pipelines and performing data analysis, preferably on the Azure Databricks platform
• Strong data visualisation and presentation skills, e.g. using tools such as Power BI, to present findings to stakeholders with varying technical expertise
• Strong analytical skills with a keen attention to detail and accuracy
• Excellent communication skills and a willingness to collaborate with other team members
• Enthusiasm and willingness to learn more about cybersecurity
• Hands-on experience working in an Agile team
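To illustrate the kind of medallion-style cleaning step the role describes (landing raw bronze data, then validating and normalising it into a silver layer), here is a minimal sketch in plain Python. On Databricks this work would typically use PySpark DataFrames rather than dicts, and the field names (`event_id`, `host`, `cipher`) are hypothetical, not taken from the listing.

```python
def clean_records(bronze_records):
    """Sketch of a bronze -> silver cleaning step: basic data-quality
    checks (drop rows missing the key, deduplicate) plus normalisation.
    Hypothetical schema; on Databricks this would be a PySpark job."""
    silver = []
    seen_ids = set()
    for rec in bronze_records:
        event_id = rec.get("event_id")
        # Data-quality checks: skip rows with a missing key or a duplicate key
        if event_id is None or event_id in seen_ids:
            continue
        seen_ids.add(event_id)
        # Normalisation/enrichment: trim and lowercase hosts, default missing fields
        silver.append({
            "event_id": event_id,
            "host": (rec.get("host") or "").strip().lower(),
            "cipher": rec.get("cipher", "unknown"),
        })
    return silver

bronze = [
    {"event_id": 1, "host": "  WEB01  ", "cipher": "AES-256"},
    {"event_id": 1, "host": "web01", "cipher": "AES-256"},  # duplicate key
    {"event_id": 2, "host": "DB02"},                        # missing cipher
    {"host": "app03", "cipher": "RSA-2048"},                # missing key
]

print(clean_records(bronze))
```

The pre-processed (silver) output is then small and consistent enough to aggregate towards a gold layer and surface in Power BI, which mirrors the "pre-process as much as possible before the presentation layer" approach described above.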