Infoplus Technologies UK Limited

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with 5-10 years of experience, focusing on ETL/ELT pipeline design and data analysis. The working arrangement is hybrid and the day rate is not specified. Key skills include SQL, Python, and data warehousing expertise.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
October 2, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Warwick, England, United Kingdom
-
🧠 - Skills detailed
#Compliance #Data Manipulation #Apache Airflow #Data Analysis #Scala #Data Warehouse #DevOps #Python #Redshift #SQL (Structured Query Language) #ADF (Azure Data Factory) #Looker #Data Governance #Data Engineering #AWS (Amazon Web Services) #Pandas #Tableau #Airflow #Version Control #Data Pipeline #Azure #Data Science #BigQuery #Cloud #Visualization #Computer Science #BI (Business Intelligence) #ML (Machine Learning) #Data Quality #GCP (Google Cloud Platform) #Datasets #Microsoft Power BI #Azure Data Factory #Snowflake #Databases #ETL (Extract, Transform, Load) #Libraries #Git
Role description
We are seeking a versatile and detail-oriented Data Engineer & Data Analyst to join our data team. This hybrid role involves designing and maintaining data pipelines, ensuring data quality, and performing in-depth analysis to support business decision-making. The ideal candidate will have strong technical skills in data engineering and a keen analytical mindset to extract actionable insights.

Data Engineering:
• Design, build, and maintain scalable ETL/ELT pipelines using tools like Apache Airflow, Azure Data Factory, or similar.
• Develop and optimize data models and data warehouses (e.g., Snowflake, Redshift, BigQuery).
• Work with structured and unstructured data from various sources (APIs, databases, flat files).
• Ensure data quality, integrity, and governance across systems.
• Collaborate with DevOps and cloud teams to manage data infrastructure (AWS, Azure, GCP).

Data Analysis:
• Analyze large datasets to identify trends, patterns, and insights.
• Create dashboards and reports using tools like Power BI, Tableau, or Looker.
• Work closely with business stakeholders to understand requirements and deliver data-driven solutions.
• Perform ad hoc analysis to support strategic initiatives.
• Communicate findings clearly through visualizations and presentations.

Required Skills & Qualifications:
• Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
• 5 to 10 years of experience in data engineering and/or data analysis roles.
• Proficiency in SQL, Python, and data manipulation libraries (e.g., Pandas).
• Experience with data pipeline tools, cloud platforms, and data warehousing.
• Strong analytical and problem-solving skills.
• Excellent communication and stakeholder management abilities.

Preferred Qualifications:
• Experience with machine learning or predictive analytics.
• Familiarity with CI/CD pipelines and version control (Git).
• Knowledge of data governance, privacy, and compliance standards.
• Certification in a cloud platform (AWS, Azure, GCP) is a plus.