RED Global

AWS Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Data Engineer in Belfast, offering a 12-month contract inside IR35. Requires 4+ years of experience, strong Python and SQL skills, and expertise in ETL processes and Apache Airflow. Hybrid work model.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
December 3, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Belfast, Northern Ireland, United Kingdom
-
🧠 - Skills detailed
#PySpark #Apache Airflow #Data Engineering #Oracle #Data Lake #Data Pipeline #Data Quality #MS SQL (Microsoft SQL Server) #ETL (Extract, Transform, Load) #Python #AWS (Amazon Web Services) #Airflow #SQL (Structured Query Language) #Athena #Documentation #Data Exploration #Spark (Apache Spark)
Role description
Data Engineer (Python/AWS)
Belfast - 3 Days Onsite - 2 Days Remote
12 Month Initial Contract
Inside IR35

We are seeking an experienced Data Engineer to join our client on a long-term assignment in Belfast. This role requires a skilled engineer with strong Python and AWS expertise who can build, optimise, and maintain high-quality data pipelines in a complex environment.

Key Responsibilities
• Design, develop, and maintain robust data pipelines and ETL processes.
• Build and optimise data workflows using Python.
• Manage workflow orchestration with Apache Airflow (MWAA).
• Perform data testing and validation, and produce data quality reports.
• Conduct data exploration and analysis to understand data structures prior to ETL development.
• Collaborate with system owners and stakeholders to gather requirements and deliver solutions.
• Monitor and troubleshoot data pipelines to ensure their reliability and performance.
• Maintain clear documentation of data workflows, processes, and configurations.

Experience & Competencies
• 4+ years' experience as a Data Engineer.
• Strong SQL/PL-SQL skills across MS SQL and Oracle.
• Extensive hands-on experience coding in Python.
• Solid understanding of ETL concepts and data pipeline architecture.
• Knowledge of data lakes and associated architectures.
• Experience with Apache Airflow (MWAA).
• Familiarity with AWS Athena / PySpark (Glue).

For immediate consideration, please send an updated CV and your contact details to jcaria@redglobal.com so we can discuss further, or reach out via LinkedIn.