

Altak Group Inc.
Ab Initio Developer
Featured Role | Apply direct with Data Freelance Hub
This role is for an Ab Initio Developer; the contract length and pay rate are unspecified. Candidates should have 5+ years of Ab Initio development experience, strong SQL skills, and cloud integration experience with AWS or Azure.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: October 31, 2025
Duration: Unknown
Location: Unknown
Contract: Unknown
Security: Unknown
Location detailed: United States
Skills detailed: #Unix #Monitoring #AWS (Amazon Web Services) #SQL Server #Scripting #Version Control #Terraform #Airflow #Agile #DevOps #GIT #Data Engineering #Lambda (AWS Lambda) #Metadata #Observability #Python #Batch #Linux #Snowflake #VPC (Virtual Private Cloud) #Compliance #Synapse #Databricks #Bash #Infrastructure as Code (IaC) #SQL (Structured Query Language) #Spark (Apache Spark) #Cloud #Security #Azure #Delta Lake #Data Management #Kafka (Apache Kafka) #S3 (Amazon Simple Storage Service) #Grafana #Vault #Data Architecture #Oracle #Ab Initio #Splunk #Data Quality #ADF (Azure Data Factory) #Redshift #ADLS (Azure Data Lake Storage) #ETL (Extract, Transform, Load) #RDBMS (Relational Database Management System) #IAM (Identity and Access Management) #Prometheus
Role description
Role summary
• Design & develop Ab Initio graphs/plans using GDE, Co>Operating System, EME, and Conduct>It to ingest, transform, and publish data.
• Migrate pipelines from on-prem to cloud targets (batch and near-real-time), ensuring restartability, parameterization (PDL), metadata management, and resiliency.
• Integrate with AWS (e.g., S3, Redshift, Glue/Glue Catalog, Lambda, EMR, MSK/Kinesis) and/or Azure (e.g., ADLS Gen2, Synapse, Data Factory, Event Hubs).
• Source systems & CDC: Build connectors/jobs for Oracle, SQL Server, DB2, files, MQ/Kafka; implement incremental loads/CDC patterns where applicable (see the sketch after this list).
• Performance & reliability: Tune memory/parallelism/partitioning, optimize file and database I/O, and implement robust error handling, alerting, and SLA monitoring.
• Security & governance: Apply data quality checks (e.g., Ab Initio DQE), lineage/metadata practices, and enterprise security controls (IAM, KMS/Key Vault, tokenization).
• DevOps & CI/CD: Version control (Git/EME), automated build/deploy, environment promotion, and infrastructure coordination with platform teams.
• Operations: Troubleshoot production issues, perform capacity planning, and deliver clear runbooks.
• Collaboration: Partner with data architects, platform/cloud engineers, and analysts; contribute to standards and best practices.
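
To make the CDC bullet concrete, below is a minimal Python sketch of a watermark-based incremental load, one common pattern behind "incremental loads/CDC." All names (bucket, keys, table, columns, DSN) are hypothetical, and a production Ab Initio implementation would express the same logic as a parameterized graph rather than a script.

```python
"""Watermark-based incremental load: a minimal sketch, assuming a reliable
last-modified column on the source table. All names are hypothetical."""
import json
import os

import boto3      # AWS SDK for Python
import oracledb   # python-oracledb driver

BUCKET = "example-landing-bucket"          # hypothetical bucket
WATERMARK_KEY = "watermarks/orders.json"   # hypothetical state object

s3 = boto3.client("s3")


def read_watermark() -> str:
    """Return the last successfully loaded timestamp, or the epoch."""
    try:
        obj = s3.get_object(Bucket=BUCKET, Key=WATERMARK_KEY)
        return json.load(obj["Body"])["last_ts"]
    except s3.exceptions.NoSuchKey:
        return "1970-01-01 00:00:00"


def incremental_extract() -> None:
    last_ts = read_watermark()
    conn = oracledb.connect(
        user="etl",
        password=os.environ["ETL_PASSWORD"],   # never hard-code secrets
        dsn="db.example.com/orclpdb",          # hypothetical DSN
    )
    with conn.cursor() as cur:
        # Pull only rows changed since the watermark.
        cur.execute(
            "SELECT order_id, status, updated_at FROM orders "
            "WHERE updated_at > TO_TIMESTAMP(:ts, 'YYYY-MM-DD HH24:MI:SS') "
            "ORDER BY updated_at",
            ts=last_ts,
        )
        rows = cur.fetchall()
    if not rows:
        return
    lines = "\n".join(
        json.dumps({"order_id": r[0], "status": r[1], "updated_at": str(r[2])})
        for r in rows
    )
    s3.put_object(Bucket=BUCKET, Key="landing/orders/batch.jsonl",
                  Body=lines.encode())
    # Advance the watermark only after the landing write succeeds.
    new_ts = str(rows[-1][2])[:19]
    s3.put_object(Bucket=BUCKET, Key=WATERMARK_KEY,
                  Body=json.dumps({"last_ts": new_ts}).encode())
```

Advancing the watermark only after the write succeeds is what gives the job the restart/recovery behavior the posting asks for: a failed run is simply re-executed from the old watermark.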
Required qualifications
• 5+ years of hands-on Ab Initio development (GDE, EME, Co>Operating System; Conduct>It scheduling).
• Proven delivery of high-volume batch and near-real-time pipelines, including restart/recovery, parameterization, and metadata-driven design.
• Strong SQL and performance tuning across major RDBMS (Oracle/SQL Server/DB2; Redshift/Synapse/Snowflake a plus).
• Production experience integrating Ab Initio with AWS and/or Azure for data landing, processing, and analytics.
• Solid Linux/Unix fundamentals and scripting (Bash; Python preferred).
• Experience with Kafka/MQ (publish/subscribe), file transfer patterns, and secure networking (VPC/VNet, PrivateLink/Private Endpoints); see the sketch after this list.
• Familiarity with data quality, lineage, and compliance for sensitive data (e.g., PHI/PII).
• Excellent troubleshooting skills and the ability to own solutions end-to-end.
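
As an illustration of the publish/subscribe requirement, here is a minimal sketch using the kafka-python client. The broker address, topic, and consumer group are hypothetical placeholders.

```python
"""Publish/subscribe sketch with kafka-python (pip install kafka-python).
Broker, topic, and consumer group are hypothetical placeholders."""
import json

from kafka import KafkaConsumer, KafkaProducer

BROKERS = "broker.example.com:9092"   # hypothetical broker
TOPIC = "orders.changed"              # hypothetical topic

# Producer side: publish a change event as JSON.
producer = KafkaProducer(
    bootstrap_servers=BROKERS,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"order_id": 42, "status": "SHIPPED"})
producer.flush()

# Consumer side: manual commits give at-least-once processing.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKERS,
    group_id="landing-loader",        # hypothetical group
    auto_offset_reset="earliest",
    enable_auto_commit=False,
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.value)              # stand-in for real processing
    consumer.commit()                 # commit only after work succeeds
```

Disabling auto-commit and committing only after processing yields at-least-once delivery, which pairs naturally with idempotent downstream loads.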
Preferred/"nice to have"
• Cloud certifications (e.g., AWS Data Analytics / Solutions Architect, Azure Data Engineer Associate).
• Experience with Glue/Spark or Synapse Spark for complementary processing; EMR/Databricks exposure a plus.
• Orchestration with Airflow/ADF alongside Conduct>It; event-driven designs with Lambda/Functions (see the DAG sketch after this list).
• IaC awareness (Terraform/CloudFormation/Bicep) to collaborate with platform teams.
• Experience with Snowflake or Delta Lake patterns on S3/ADLS.
• Monitoring/observability (CloudWatch, Azure Monitor, Prometheus/Grafana, Splunk).
• Agile/SAFe delivery in regulated environments (healthcare/financial services).
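
For the Airflow/ADF orchestration item, a minimal Airflow DAG sketch is shown below; it wraps a deployed Ab Initio job in BashOperator tasks. The DAG id, schedule, and script paths are assumptions, and the actual invocation would come from Conduct>It or the site's deployed wrapper scripts.

```python
"""Minimal Airflow DAG sketch wrapping a deployed Ab Initio job.
DAG id, schedule, and script paths are hypothetical placeholders."""
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="orders_daily_load",       # hypothetical DAG id
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",             # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Run the deployed Ab Initio wrapper script (path is hypothetical).
    run_plan = BashOperator(
        task_id="run_ab_initio_plan",
        bash_command="/opt/abinitio/deploy/orders_load.ksh",
    )
    # Simple post-load validation step; {{ ds }} is Airflow's run date.
    validate = BashOperator(
        task_id="validate_counts",
        bash_command="python /opt/etl/validate_counts.py --date {{ ds }}",
    )
    run_plan >> validate
```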






