

MTK Technologies
Azure Data Engineer (W2)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Data Engineer (W2) with a contract length of "unknown" and a pay rate of "unknown." Key skills include ETL/ELT development, SQL, Python, and experience with EPIC and Informatica. Remote work is permitted, with local candidates preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 8, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Columbus, OH
-
🧠 - Skills detailed
#Azure #Workday #Programming #Data Modeling #Data Ingestion #Azure Event Hubs #Microsoft Azure #ETL (Extract, Transform, Load) #Data Processing #Data Warehouse #Cloud #Python #Informatica Cloud #SQL (Structured Query Language) #Data Engineering #GIT #Kafka (Apache Kafka) #Data Pipeline #Scripting #Computer Science #Informatica #Data Manipulation #Automation #Security #Version Control #IICS (Informatica Intelligent Cloud Services) #Databases #Data Integrity #Compliance #Scala #Datasets #ADF (Azure Data Factory) #Data Analysis #Data Governance #Azure Data Factory
Role description
Role Overview:
We are seeking a highly skilled and motivated Data Engineer to join our team in a critical, urgent role. This position is key to building and maintaining robust data solutions. The ideal candidate will possess a strong foundation in data warehousing principles and a keen design mindset to handle complex data challenges.
Key Responsibilities:
• Design, build, and maintain scalable and efficient ETL/ELT data pipelines and the data warehouse, ingesting data from internal and external sources (e.g., APIs from Epic, Workday, relational databases, flat files) and ensuring data is clean, accessible, and ready for analysis and model training.
• Collaborate with the Data Analyst and other stakeholders to understand their data requirements and provide them with clean, well-structured datasets.
• Implement data governance, security, and quality controls to ensure data integrity and compliance.
• Automate data ingestion, transformation, and validation processes.
• Work with our broader IT team to ensure seamless integration of data infrastructure with existing systems.
• Contribute to the evaluation and implementation of new data technologies and tools.
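The responsibilities above center on ETL/ELT pipeline work: ingesting raw extracts, cleaning and validating them, and loading them into a warehouse. As a rough illustration only (the file layout, column names, and staging table are hypothetical, not from this posting), a minimal extract-transform-load step in Python might look like:

```python
import csv
import io
import sqlite3

# Hypothetical sketch: ingest a flat-file extract (e.g. a CSV export),
# drop invalid/duplicate rows, and load the result into a warehouse
# staging table. All names here are illustrative assumptions.
RAW_CSV = """patient_id,visit_date,charge
101,2026-01-02,250.00
102,2026-01-03,
101,2026-01-02,250.00
"""

def extract(text):
    """Parse rows from a CSV extract."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop rows with missing charges and de-duplicate exact repeats."""
    seen, clean = set(), []
    for r in rows:
        key = (r["patient_id"], r["visit_date"], r["charge"])
        if r["charge"] and key not in seen:
            seen.add(key)
            clean.append((r["patient_id"], r["visit_date"], float(r["charge"])))
    return clean

def load(rows, conn):
    """Insert cleaned rows into a staging table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS stg_charges "
        "(patient_id TEXT, visit_date TEXT, charge REAL)"
    )
    conn.executemany("INSERT INTO stg_charges VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
clean = transform(extract(RAW_CSV))
load(clean, conn)
print(conn.execute("SELECT COUNT(*) FROM stg_charges").fetchone()[0])  # 1
```

In practice the same extract/transform/load stages would be expressed in an orchestration tool such as Azure Data Factory or Informatica rather than hand-rolled scripts, but the shape of the work is the same.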
Required Skills & Qualifications:
• ETL/ELT Development: Strong experience in designing and building data pipelines using ETL/ELT tools and frameworks.
• SQL: Advanced proficiency in SQL for data manipulation, transformation, and optimization.
• Epic & Informatica: Experience with the Epic EMR and Informatica is required.
• Programming: Strong programming skills in Python (or a similar language) for scripting, automation, and data processing.
• Data Warehousing: Experience with data warehousing concepts and technologies.
• Cloud Computing: Hands-on experience with at least one major cloud platform's data services (e.g., Microsoft Azure Data Factory, Microsoft Fabric, IICS).
• Version Control: Proficiency with Git for code management and collaboration.
• Problem-Solving: Proven ability to troubleshoot and resolve data pipeline issues.
• Data Modeling: Experience with various data modeling techniques (e.g., dimensional modeling).
• Real-time Processing: Familiarity with real-time data streaming technologies (e.g., Kafka, Azure Event Hubs).
• Education: Bachelor's degree in Computer Science, Engineering, or related field.
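Dimensional modeling, listed above, typically means organizing the warehouse into fact and dimension tables (a star schema). A minimal sketch, with hypothetical table and column names, using SQL run through Python's built-in sqlite3:

```python
import sqlite3

# Hypothetical star-schema sketch: one fact table keyed to two
# dimension tables, as in classic dimensional modeling. Table and
# column names are illustrative, not from this posting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT);
CREATE TABLE dim_patient (patient_key INTEGER PRIMARY KEY, mrn TEXT);
CREATE TABLE fact_visit  (
    date_key    INTEGER REFERENCES dim_date(date_key),
    patient_key INTEGER REFERENCES dim_patient(patient_key),
    charge      REAL
);
INSERT INTO dim_date    VALUES (20260102, '2026-01-02');
INSERT INTO dim_patient VALUES (1, 'MRN-101');
INSERT INTO fact_visit  VALUES (20260102, 1, 250.0);
""")

# Analytical queries join the fact table out to its dimensions.
row = conn.execute("""
    SELECT d.full_date, p.mrn, f.charge
    FROM fact_visit f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_patient p ON p.patient_key = f.patient_key
""").fetchone()
print(row)  # ('2026-01-02', 'MRN-101', 250.0)
```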
Main Skill Sets
1. Azure: Experience with Microsoft Azure data services.
2. Fabric: Familiarity or experience with Microsoft Fabric.
3. Informatica Cloud or PowerCenter: Hands-on experience with either Informatica Cloud or PowerCenter.
4. Epic Experience: Highly preferred, but not a mandatory requirement.
5. Work Arrangement: The position allows for remote work, but local candidates are strongly preferred.