

Altak Group Inc.
ETL Developer
Featured Role | Apply direct with Data Freelance Hub
This role is for an ETL Developer on a 6-month contract, offering a pay rate of "X" per hour. Required skills include 5+ years of Ab Initio ETL development, AWS/Azure integration, strong SQL, and Linux/Unix proficiency. Cloud certifications are preferred.
Country
United States
Currency
$ USD
-
Day rate
480
-
Date
February 18, 2026
Duration
Unknown
-
Location
Unknown
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
United States
-
Skills detailed
#ADLS (Azure Data Lake Storage) #Monitoring #Observability #Redshift #SQL (Structured Query Language) #Metadata #SQL Server #Data Integrity #Lambda (AWS Lambda) #Terraform #Batch #Infrastructure as Code (IaC) #Splunk #ETL (Extract, Transform, Load) #Vault #IAM (Identity and Access Management) #Kafka (Apache Kafka) #Unix #Airflow #Scripting #S3 (Amazon Simple Storage Service) #VPC (Virtual Private Cloud) #AWS (Amazon Web Services) #Prometheus #Version Control #Data Architecture #Cloud #Data Management #ADF (Azure Data Factory) #Grafana #Data Quality #Snowflake #Agile #Synapse #Azure #Data Engineering #Databricks #GIT #Security #Bash #Ab Initio #Data Pipeline #DevOps #Linux #Storage #RDBMS (Relational Database Management System) #Compliance #Spark (Apache Spark) #Oracle #Delta Lake #Python
Role description
Job description:
Role Summary
We're looking for an experienced ETL Developer to build and optimize high-throughput, resilient data pipelines that move and transform data from on-prem sources to cloud destinations. You'll design, develop, and support Ab Initio graphs and plans, integrate with AWS and/or Azure services, and partner with platform, security, and analytics teams to deliver governed, production-grade data at scale.
What You'll Do
• Design & Develop: Build ETL processes using Ab Initio graphs/plans (GDE, Co>Operating System, EME, and Conduct>It) to ingest, transform, and publish data.
• Migrate Pipelines: Move data workflows from on-prem to cloud targets (batch and near-real-time), ensuring restartability, parameterization (PDL), metadata management, and resiliency.
• Cloud Integration: Integrate with AWS (e.g., S3, Redshift, Glue/Glue Catalog, Lambda, EMR, MSK/Kinesis) and/or Azure (e.g., ADLS Gen2, Synapse, Data Factory, Event Hubs).
• Source Systems & CDC: Build connectors/jobs for Oracle, SQL Server, DB2, files, and MQ/Kafka; implement incremental loads/CDC patterns where applicable (a minimal incremental-load sketch follows this list).
• Performance & Reliability: Tune memory/parallelism/partitioning, optimize file and database I/O, and implement robust error handling, alerting, and SLA monitoring.
• Security & Governance: Apply data quality checks (e.g., Ab Initio DQE), lineage/metadata practices, and enterprise security controls (IAM, KMS/Key Vault, tokenization).
• DevOps & CI/CD: Use version control (Git/EME), automated build/deploy, environment promotion, and infrastructure coordination with platform teams.
• Operations: Troubleshoot production issues, perform capacity planning, and deliver clear runbooks.
• Collaboration: Partner with data architects, platform/cloud engineers, and analysts; contribute to standards and best practices.
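Purely for illustration, here is a minimal Python sketch of the incremental-load pattern referenced above: it reads rows newer than a stored watermark from a hypothetical Oracle "orders" table and lands them in S3 as CSV. The table, bucket, DSN, and watermark key are invented placeholders, not part of this role's actual environment.
```python
# Minimal sketch of a watermark-driven incremental extract: pull rows newer than the
# last committed watermark from a source table and land them in S3 as CSV.
# Table, bucket, DSN, and watermark key are illustrative placeholders.
import csv
import io
import json
from datetime import datetime, timezone

import boto3          # AWS SDK for Python
import oracledb       # python-oracledb driver for the Oracle source

S3_BUCKET = "my-landing-bucket"                     # hypothetical landing bucket
WATERMARK_KEY = "control/orders.watermark.json"
SOURCE_SQL = """
    SELECT order_id, customer_id, amount, updated_at
      FROM orders
     WHERE updated_at > :wm
     ORDER BY updated_at
"""

s3 = boto3.client("s3")

def read_watermark() -> str:
    """Return the last successfully loaded timestamp (ISO-8601), or an epoch default."""
    try:
        obj = s3.get_object(Bucket=S3_BUCKET, Key=WATERMARK_KEY)
        return json.loads(obj["Body"].read())["last_updated_at"]
    except s3.exceptions.NoSuchKey:
        return "1970-01-01T00:00:00"

def write_watermark(value: str) -> None:
    s3.put_object(Bucket=S3_BUCKET, Key=WATERMARK_KEY,
                  Body=json.dumps({"last_updated_at": value}))

def run_incremental_load() -> None:
    watermark = read_watermark()
    with oracledb.connect(user="etl_user", password="***", dsn="dbhost/ORCLPDB1") as conn:
        with conn.cursor() as cur:
            cur.execute(SOURCE_SQL, wm=datetime.fromisoformat(watermark))
            rows = cur.fetchall()

    if not rows:
        return  # nothing new since the last run

    # Serialize to CSV in memory and land it under a dated prefix.
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["order_id", "customer_id", "amount", "updated_at"])
    writer.writerows(rows)

    run_ts = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    s3.put_object(Bucket=S3_BUCKET,
                  Key=f"landing/orders/orders_{run_ts}.csv",
                  Body=buf.getvalue().encode("utf-8"))

    # Advance the watermark only after a successful landing, so a failed run
    # can simply be restarted without losing or double-counting rows.
    write_watermark(rows[-1][3].isoformat())

if __name__ == "__main__":
    run_incremental_load()
```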
Required Qualifications
• 5+ years of hands-on ETL development using Ab Initio (GDE, EME, Co>Operating System; Conduct>It scheduling).
• Proven Delivery: Experience with high-volume batch and near-real-time pipelines, including restart/recovery, parameterization, and metadata-driven design.
• Database Expertise: Strong SQL and performance tuning across major RDBMS (Oracle/SQL Server/DB2; Redshift/Synapse/Snowflake a plus).
• Cloud Experience: Production experience integrating Ab Initio with AWS and/or Azure for data landing, processing, and analytics.
• Technical Foundations: Solid Linux/Unix fundamentals and scripting (bash, Python preferred).
• Messaging & Networking: Experience with Kafka/MQ (publish/subscribe), file transfer patterns, and secure networking (VPC/VNet, PrivateLink/Private Endpoints); see the Kafka consumer sketch after this list.
• Data Integrity: Familiarity with data quality, lineage, and compliance for sensitive data (e.g., PHI/PII).
• Problem Solving: Excellent troubleshooting skills and the ability to own solutions end-to-end.
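As a rough illustration of the Kafka/near-real-time side of these qualifications, the sketch below consumes a hypothetical topic with manual offset commits, so records are only acknowledged after the downstream write succeeds (at-least-once delivery). Topic, broker, and group names are placeholders, and the downstream write is stubbed out.
```python
# Minimal sketch of at-least-once consumption from a Kafka topic, committing offsets
# only after the batch has been durably written. All names are placeholders.
import json

from kafka import KafkaConsumer   # kafka-python client

consumer = KafkaConsumer(
    "orders.events",                       # hypothetical topic
    bootstrap_servers=["broker1:9093"],
    group_id="etl-orders-loader",
    security_protocol="SSL",               # assumes TLS to the brokers
    enable_auto_commit=False,              # commit only after a successful write
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

def write_batch(records):
    """Placeholder for the downstream write (e.g., a staging table or S3/ADLS landing)."""
    for rec in records:
        print(rec)

try:
    while True:
        # poll() returns {TopicPartition: [ConsumerRecord, ...]}
        batch = consumer.poll(timeout_ms=1000, max_records=500)
        records = [rec.value for recs in batch.values() for rec in recs]
        if not records:
            continue
        write_batch(records)
        consumer.commit()   # advance offsets only once the batch is persisted
finally:
    consumer.close()
```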
Preferred / "Nice to Have"
• Cloud Certifications: AWS Data Analytics / Solutions Architect, or Azure Data Engineer Associate.
• Modern Data Stack: Experience with Glue/Spark or Synapse Spark for complementary processing; EMR/Databricks exposure a plus.
• Orchestration: Experience with Airflow/ADF alongside Conduct>It; event-driven designs with Lambda/Functions (see the Airflow sketch after this list).
• Infrastructure as Code (IaC): Awareness of Terraform, CloudFormation, or Bicep to collaborate with platform teams.
• Storage Patterns: Experience with Snowflake or Delta Lake patterns on S3/ADLS.
• Observability: Proficiency with CloudWatch, Azure Monitor, Prometheus/Grafana, or Splunk.
• Methodology: Agile/SAFe delivery in regulated environments (healthcare/financial services).
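For the orchestration item above, here is a minimal Airflow 2.x sketch: a daily DAG runs a hypothetical shell wrapper that launches an Ab Initio plan, then a follow-on data-quality check. The wrapper paths and schedule are placeholders; the real plan invocation depends on the Co>Operating System environment and is not specified here.
```python
# Minimal Airflow sketch: a daily DAG that runs a (hypothetical) shell wrapper which
# launches an Ab Initio plan, followed by a downstream data-quality check.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "etl",
    "retries": 2,                              # retry transient failures
    "retry_delay": timedelta(minutes=10),
}

with DAG(
    dag_id="daily_orders_load",
    schedule_interval="0 2 * * *",             # 02:00 daily
    start_date=datetime(2026, 1, 1),
    catchup=False,
    default_args=default_args,
    tags=["etl", "ab-initio"],
) as dag:

    run_plan = BashOperator(
        task_id="run_orders_plan",
        # Hypothetical wrapper that sets the sandbox environment and starts the plan.
        bash_command="/opt/etl/bin/run_orders_plan.ksh {{ ds }}",
    )

    dq_check = BashOperator(
        task_id="data_quality_check",
        bash_command="/opt/etl/bin/check_orders_counts.ksh {{ ds }}",
    )

    run_plan >> dq_check
```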





