1 Apex Tech Inc.
Senior ETL Tools Developer – Azure & Enterprise Integration
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior ETL Tools Developer focused on Azure & Enterprise Integration; the contract length and pay rate are unspecified. Key skills include Azure Data Factory, SQL, and integration with Coupa and SAP. A degree and 5+ years of ETL experience are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 7, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Fort Worth, TX
-
🧠 - Skills detailed
#GIT #Spark (Apache Spark) #Databases #Version Control #Storage #ETL (Extract, Transform, Load) #DevOps #Linux #Azure #SQL (Structured Query Language) #Data Governance #Code Reviews #Data Processing #Shell Scripting #JSON (JavaScript Object Notation) #Microsoft Power BI #Scripting #XML (eXtensible Markup Language) #Compliance #Azure Databricks #Data Integration #Data Lake #Azure DevOps #Big Data #BI (Business Intelligence) #Data Engineering #Data Architecture #REST (Representational State Transfer) #Automation #SAP #ADF (Azure Data Factory) #Azure Data Factory #Scala #Data Quality #Apache Spark #Documentation #Azure Blob Storage #Snowflake #Databricks #Logging #Computer Science #Security #DataStage #Microsoft Azure #Data Extraction #Unix
Role description
Job Title: Senior ETL Tools Developer – Azure & Enterprise Integration
Position Summary:
We are seeking a highly skilled and experienced Senior ETL Tools Developer with deep expertise in Microsoft Azure Data Factory (ADF) and modern data integration platforms. The ideal candidate will lead the design, development, and optimization of scalable ETL pipelines and build robust integration interfaces with enterprise platforms such as Coupa and SAP.
Key Responsibilities:
· Design, develop, and maintain robust ETL pipelines using Azure Data Factory, Azure Databricks, and other Azure services.
· Build and manage data integration interfaces between internal systems and external platforms such as Coupa and SAP.
· Collaborate with data architects, analysts, and business stakeholders to understand data requirements and deliver scalable solutions.
· Optimize data workflows for performance, reliability, and cost-efficiency.
· Implement data quality checks, error handling, and logging mechanisms.
· Participate in code reviews, architecture discussions, unit testing, and performance tuning.
· Ensure compliance with data governance, security, and privacy standards.
· Mentor junior developers and contribute to best practices in data engineering.
Required Qualifications:
· Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field.
· 5+ years of experience in ETL/ELT development, with at least 2 years focused on Azure Data Factory.
· Strong proficiency in SQL, Postgres, and data transformation logic.
· Experience with Azure Blob Storage, Azure Functions, and Databricks.
· Hands-on experience integrating with Coupa and SAP platforms.
· Familiarity with CI/CD pipelines and version control tools (e.g., Git, Azure DevOps).
· Experience with Unix/Linux shell scripting for automation and process orchestration.
· Expertise in data extraction, transformation, and loading across various data sources (e.g., relational databases, flat files, XML, JSON).
· Solid understanding of data warehousing concepts, dimensional modeling, and big data processing.
· Excellent problem-solving, communication, and documentation skills.
Preferred / Good-to-Have Qualifications:
· Experience with IBM DataStage development.
· Experience with Power BI, Snowflake, or Apache Spark.
· Knowledge of data lake architecture, data mesh, or data fabric concepts.
· Microsoft Azure certifications (e.g., DP-203: Data Engineering on Microsoft Azure).
· Experience with REST/SOAP APIs and middleware platforms for enterprise integration.