

Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (remote, until Feb 2026) paying £42/hour inside IR35. Requires 5+ years in SQL, Python, Azure Data Factory, and data visualization (Tableau). Experience in a global tech company and customer support analytics is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
336
🗓️ - Date discovered
July 31, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Inside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#Data Integration #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Airflow #Data Engineering #Scala #Scripting #Strategy #ADF (Azure Data Factory) #Data Modeling #Data Science #Apache Airflow #Python #Tableau #Visualization #Azure #Data Pipeline #Azure Data Factory
Role description
Job Title: Data Engineer
Location: London (Fully Remote)
Duration: Until Feb 2026
Rate: £42 per hour inside IR35 (negotiable for the right candidate) | 40 hours per week
About the Role
We are seeking a skilled Data Engineer to join a high-impact team within the Central Integrity & Support organization at a leading global tech company. This role focuses on supporting High Touch Support initiatives: helping improve user experience and satisfaction by building robust data pipelines, success metrics, and tools that empower decision-making across multiple support teams.
You’ll collaborate cross-functionally with engineers, data scientists, and product managers to build scalable data solutions and support internal efforts that drive measurable improvements in customer support experiences.
Key Responsibilities
• Design, develop, and maintain scalable and reliable data pipelines using tools such as Azure Data Factory, Airflow, and Python-based frameworks.
• Partner with stakeholders to define, implement, and track success metrics for internal support tools and customer communication systems.
• Create dashboards and data visualizations using tools like Tableau or MicroStrategy to present key insights to stakeholders.
• Collaborate with Data Scientists to ensure metric definitions are implemented correctly, and with Engineers to ensure data is captured and logged accurately.
• Support knowledge management analytics, helping identify inefficiencies in agent search queries and recommend improvements.
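To illustrate the shape of the pipeline work described above, here is a minimal extract/transform/load sketch of the kind a tool such as Azure Data Factory or Airflow would orchestrate. This is a generic illustration, not the team's actual pipeline; the field names (`team`, `resolved_on_first_contact`, `csat`) and the metrics are hypothetical placeholders.

```python
# Hedged sketch: generic ETL stages computing per-team support success metrics.
# All field and metric names are hypothetical, chosen for illustration only.

def extract(records):
    """Simulate pulling raw support-ticket rows from a source system."""
    return [dict(r) for r in records]

def transform(rows):
    """Aggregate per-team metrics: first-contact resolution rate and mean CSAT."""
    teams = {}
    for row in rows:
        t = teams.setdefault(row["team"], {"tickets": 0, "fcr": 0, "csat_sum": 0.0})
        t["tickets"] += 1
        t["fcr"] += int(row["resolved_on_first_contact"])
        t["csat_sum"] += row["csat"]
    return {
        team: {
            "tickets": m["tickets"],
            "fcr_rate": m["fcr"] / m["tickets"],
            "avg_csat": m["csat_sum"] / m["tickets"],
        }
        for team, m in teams.items()
    }

def load(metrics, sink):
    """Write the aggregated metrics to a destination (here, just a dict)."""
    sink.update(metrics)
    return sink

raw = [
    {"team": "billing", "resolved_on_first_contact": True, "csat": 4.5},
    {"team": "billing", "resolved_on_first_contact": False, "csat": 3.0},
    {"team": "tech", "resolved_on_first_contact": True, "csat": 5.0},
]
warehouse = load(transform(extract(raw)), {})
print(warehouse["billing"]["fcr_rate"])  # 0.5
```

In a real deployment each stage would be a scheduled task (an Airflow DAG node or an ADF activity) rather than a direct function call, with the sink being a warehouse table feeding Tableau dashboards.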
Required Skills & Experience
• 5+ years of experience with SQL for complex data querying and transformation.
• 5+ years of hands-on experience with Python for data engineering and scripting.
• Experience in a leading global tech company is essential.
• Strong understanding of data modeling, ETL development, and data integration tools such as Azure Data Factory and Apache Airflow.
• Proficiency in data visualization tools (e.g., Tableau, MicroStrategy).
• Experience building data solutions in customer support, product analytics, or similar operational spaces.
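As a concrete example of the SQL proficiency the requirements above describe, the snippet below runs a grouped aggregation against an in-memory SQLite table. The table and columns are hypothetical and purely illustrative of the querying-and-transformation work involved.

```python
import sqlite3

# Illustrative only: the kind of SQL aggregation this role calls for,
# run against an in-memory SQLite table with hypothetical columns.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tickets (team TEXT, resolved INTEGER, csat REAL)")
conn.executemany("INSERT INTO tickets VALUES (?, ?, ?)", [
    ("billing", 1, 4.5), ("billing", 0, 3.0), ("tech", 1, 5.0),
])

# Per-team ticket counts, first-contact resolution rate, and mean CSAT.
rows = conn.execute("""
    SELECT team,
           COUNT(*)      AS tickets,
           AVG(resolved) AS fcr_rate,
           AVG(csat)     AS avg_csat
    FROM tickets
    GROUP BY team
    ORDER BY team
""").fetchall()
print(rows)  # [('billing', 2, 0.5, 3.75), ('tech', 1, 1.0, 5.0)]
```

The same pattern scales to the complex querying mentioned in the requirements: window functions, CTEs, and joins across source systems before the results land in a visualization layer.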
If you thrive in fast-paced, collaborative environments and want to work at the intersection of data, product, and user experience, we’d love to hear from you.
Apply now or share your CV at urvashi.kandpal@russelltobin.com