

US3 Consulting
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer based on-site in Manchester. Contract length and pay rate are unknown. It requires strong skills in SQL, Python, Azure cloud platforms, and data pipeline maintenance, with a focus on data quality and governance.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 19, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Manchester, England, United Kingdom
-
🧠 - Skills detailed
#Azure cloud #Synapse #Scala #Data Quality #Azure #Data Engineering #Metadata #Data Governance #Python #Security #Data Pipeline #Cloud #Airflow #Logging #ADF (Azure Data Factory) #Documentation #Deployment #Data Ingestion #Batch #Azure Data Factory #ETL (Extract, Transform, Load) #Data Lineage #SQL (Structured Query Language) #Version Control #Monitoring #Data Analysis
Role description
Exciting Opportunity for Data Engineers
Role Purpose
The Data Engineer is responsible for designing, building, and maintaining reliable, scalable data pipelines and data models that support analytics, reporting, and operational use cases. The role focuses on high-quality data ingestion, transformation, orchestration, and environment management across the data platform, ensuring data is trusted, accessible, and fit for purpose.
Key Responsibilities
Data Pipelines & Integration
• Design, build, and maintain robust data pipelines for ingesting data from source systems (e.g. operational systems, APIs, files, third-party platforms)
• Implement batch and, where required, near-real-time data ingestion patterns
• Ensure pipelines are resilient, performant, and recoverable, with appropriate error handling and logging
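The resilience points above (error handling, logging, recoverability) can be sketched in Python. This is a minimal illustration, not the team's actual code: `fetch_records`, `TransientSourceError`, and the retry counts are hypothetical stand-ins for a real source-system call.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")


class TransientSourceError(Exception):
    """Stand-in for a recoverable source-system failure (e.g. a timeout)."""


def ingest(fetch, max_attempts=3, backoff_seconds=0.0):
    """Fetch records with retries, logging each failure before giving up."""
    for attempt in range(1, max_attempts + 1):
        try:
            records = fetch()
            log.info("ingested %d records on attempt %d", len(records), attempt)
            return records
        except TransientSourceError as exc:
            log.warning("attempt %d failed: %s", attempt, exc)
            if attempt == max_attempts:
                raise  # transient failures become hard failures after retries
            time.sleep(backoff_seconds)


# Hypothetical source: fails once with a transient error, then succeeds.
calls = {"n": 0}

def fetch_records():
    calls["n"] += 1
    if calls["n"] < 2:
        raise TransientSourceError("connection reset")
    return [{"id": 1}, {"id": 2}]
```

Logging every failed attempt (rather than only the final one) is what makes a pipeline recoverable in practice: the run history shows whether a failure was transient or persistent.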
Orchestration & Scheduling
• Define and manage workflow orchestration using scheduling and orchestration tools (e.g. Airflow or equivalent)
• Manage dependencies, retries, alerts, and pipeline monitoring to support reliable data delivery
• Optimise pipeline execution to meet agreed service levels for downstream reporting and analytics
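Dependency management of this kind is what Airflow (or an equivalent orchestrator) provides out of the box; the underlying ordering problem can be illustrated with the standard library alone. Task names here are purely illustrative.

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on (illustrative names).
dependencies = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_sales": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_sales"},
}


def run_order(deps):
    """Return one valid execution order that respects every dependency."""
    return list(TopologicalSorter(deps).static_order())
```

An orchestrator layers scheduling, retries, and alerting on top of exactly this ordering, so independent extracts can also run in parallel.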
Data Modelling & Transformation
• Design and maintain data models to support reporting, analytics, and operational use cases (e.g. ODS, dimensional, or analytical models)
• Apply best practices for data transformation, naming standards, and model documentation
• Collaborate with analysts and stakeholders to ensure models meet business requirements
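As a minimal sketch of the dimensional modelling mentioned above, here is one fact table joined to one dimension, run via SQLite for self-containment. All table and column names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Minimal dimensional sketch: one dimension table, one fact table.
    CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        amount REAL
    );
    INSERT INTO dim_customer VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO fact_sales VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 25.0);
""")


def sales_by_customer(conn):
    """Aggregate the fact table by a dimension attribute."""
    return conn.execute("""
        SELECT d.name, SUM(f.amount)
        FROM fact_sales f
        JOIN dim_customer d USING (customer_key)
        GROUP BY d.name
        ORDER BY d.name
    """).fetchall()
```

The separation of descriptive attributes (the dimension) from measures (the fact) is what keeps reporting queries like this simple for analysts.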
Environments & Platform Management
• Work across development, test, and production environments, ensuring safe and controlled deployment of changes
• Support environment configuration, version control, and CI/CD practices for data engineering workloads
• Contribute to platform stability, performance tuning, and cost-effective use of infrastructure
Data Quality & Governance
• Implement basic data quality checks and validation rules within pipelines
• Support data lineage, metadata, and documentation to improve transparency and trust in data
• Work within established data governance, security, and access control frameworks
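The "basic data quality checks" above might look like small rule functions applied to each batch before it is loaded. The rules and field names here are illustrative, not a prescribed framework.

```python
def check_not_null(records, field):
    """Return rows where a required field is missing or null."""
    return [r for r in records if r.get(field) is None]


def check_unique(records, field):
    """Return values of `field` that appear more than once."""
    seen, dupes = set(), set()
    for r in records:
        value = r.get(field)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)


# Illustrative batch with one null email and one duplicated id.
batch = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "b@x.com"},
]
```

Checks like these typically run inside the pipeline itself, so a failing batch can be quarantined and logged rather than silently loaded.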
Collaboration & Delivery
• Work closely with data analysts, architects, and wider technology teams to deliver end-to-end data solutions
• Participate in planning, estimation, and delivery of data engineering work
• Support incident investigation and resolution related to data pipelines and data availability
Essential
• Strong experience building and maintaining data pipelines in a modern data platform
• Solid understanding of data modelling concepts and patterns
• Experience with workflow orchestration and scheduling tools
• Strong capability in SQL and Python
• Experience with Azure cloud-based data platforms such as Azure Synapse and Azure Data Factory
• Experience working across multiple environments with version control
• Good understanding of data quality, reliability, and operational considerations
• Familiarity with CI/CD approaches for data engineering
Desirable
• Experience supporting analytics and reporting use cases in a production environment
• Exposure to regulated or data-sensitive environments
Please apply with an updated CV if you're available and can do 3 days on-site in Manchester.






