Infinity Quest

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a remote Data Engineer role; the contract length and pay rate are listed as "unknown." Key skills include Azure services, SQL, Python, Spark, CI/CD practices, and data architecture. Experience in cloud migration is desirable.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 6, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#PySpark #Apache Airflow #Data Management #Shell Scripting #Unix #Azure Databricks #Data Architecture #Data Pipeline #GitHub #SQL (Structured Query Language) #Azure DevOps #Cloud #Airflow #Azure #DevOps #Azure ADLS (Azure Data Lake Storage) #Data Engineering #Data Lake #Scripting #Migration #Metadata #Strategy #Azure Data Factory #ADF (Azure Data Factory) #Databricks #Agile #ADLS (Azure Data Lake Storage) #Spark (Apache Spark) #Cloudera #Linux #Storage #Data Strategy #Python
Role description
Minimum Criteria
• Extensive experience with Azure services, including Azure Databricks, Azure Data Lake Storage, and Azure Data Factory.
• Advanced proficiency in SQL, Python, and Spark (PySpark), with a strong focus on performance optimization and distributed processing.
• Proven experience in CI/CD practices using industry-standard tools (e.g., GitHub Actions, Azure DevOps).
• Strong understanding of data architecture principles and cloud-native design patterns.

Essential Criteria
• Demonstrated ability to lead technical delivery, mentor engineering teams, and collaborate with stakeholders to ensure alignment between data solutions and business strategy.
• Proficiency in Linux/Unix environments and shell scripting.
• Deep understanding of source control, testing strategies, and agile development practices.
• Self-motivated, with a strategic mindset and a passion for driving innovation in data engineering.

Desirable Criteria
• Experience delivering data pipelines on Hortonworks/Cloudera on-prem and leading cloud migration initiatives.
• Familiarity with Apache Airflow, data modelling, and metadata management.
• Experience influencing enterprise data strategy and contributing to architectural governance.