

Avance Consulting
ETL Developer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Ab Initio Developer on a 6-month contract, offering a pay rate of "$X/hour". Key skills required include 7+ years with Ab Initio GDE, real-time data processing, and message-based architectures. Remote work is available.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date
October 9, 2025
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Glasgow, Scotland, United Kingdom
🧠 - Skills detailed
#Linux #Scala #Business Analysis #Documentation #Shell Scripting #Ab Initio #Deployment #Azure #GCP (Google Cloud Platform) #Data Architecture #Cloud #Scripting #Data Modeling #Kafka (Apache Kafka) #Security #SQL (Structured Query Language) #Data Integration #Data Pipeline #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #Code Reviews #Data Processing #Unit Testing #Data Quality #Unix
Role description
We are seeking a highly skilled Senior Ab Initio Developer with strong expertise in real-time data processing and Ab Initio GDE, Continuous Flows, and MQ/Streaming technologies. The ideal candidate will play a key role in designing, developing, and optimizing data integration solutions that support mission-critical applications and real-time analytics.
____________________________________
Key Responsibilities:
• Design and develop Ab Initio-based ETL solutions, focusing on real-time data flows using Continuous Flows (CF) and Messaging Queues (MQ).
• Collaborate with data architects and business analysts to understand data requirements and translate them into scalable solutions.
• Optimize performance of real-time data pipelines and troubleshoot latency or throughput issues.
• Integrate Ab Initio flows with Kafka, MQ, or other streaming platforms.
• Develop and maintain technical documentation, including data flow diagrams and solution architecture.
• Participate in code reviews, unit testing, and deployment activities.
• Ensure data quality, integrity, and security across all data flows.
• Mentor junior developers and contribute to best practices in Ab Initio development.
____________________________________
Required Skills & Experience:
• 7+ years of hands-on experience with Ab Initio GDE, EME, and Conduct>It.
• 3+ years of experience with Ab Initio Continuous Flows and real-time data processing.
• Strong understanding of message-based architectures (e.g., IBM MQ, Kafka, RabbitMQ).
• Experience with data modeling, data warehousing, and data integration principles.
• Proficiency in Unix/Linux shell scripting and SQL.
• Familiarity with cloud platforms (AWS, Azure, GCP) is a plus.
• Excellent problem-solving and communication skills.
____________________________________