

ECCO Select
Implementation Engineer #11167
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Implementation Engineer (Data / Client-Facing) for a 6-month contract, paying "$X/hour". Requires strong experience in Azure Data Factory, Python, advanced SQL, and client-facing skills, with a preference for healthcare data integration experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
560
-
🗓️ - Date
April 22, 2026
🕒 - Duration
6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Integration #Data Pipeline #Azure #Azure Data Factory #Data Engineering #Data Processing #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #ADF (Azure Data Factory) #PostgreSQL #Python #Data Architecture #Compliance #Scala
Role description
Implementation Engineer (Data / Client-Facing)
Overview
We are seeking an Implementation Engineer to support the rollout of a new application within an Azure-based data environment. This role blends strong data engineering expertise with client-facing responsibilities, making it ideal for someone who can build scalable solutions while effectively communicating with stakeholders.
Key Responsibilities
• Support end-to-end implementation of a new application for clients
• Design, build, and maintain scalable data pipelines (ETL processes)
• Work within an existing Azure-based data architecture, primarily using Azure Data Factory
• Collaborate with internal teams and client stakeholders throughout implementation
• Translate technical solutions into clear, client-friendly presentations
• Contribute to broader data engineering and integration initiatives
Required Skills & Experience
• Strong experience with Azure Data Factory (ADF)
• Proficiency in Python for data processing and pipeline development
• Solid background in data engineering, ETL, and pipeline design
• Advanced SQL skills (PostgreSQL preferred)
• Strong client-facing communication and presentation skills
• Understanding of HIPAA compliance and data handling standards
Nice to Have
• Experience with EDI transactions (837/835)
• Prior exposure to healthcare data integrations
• Experience in implementation or client delivery roles
Ideal Candidate Profile
• Data engineer with implementation or client delivery experience
• Comfortable operating in both technical and customer-facing environments
• Able to clearly explain complex data concepts to non-technical stakeholders






