Twine

Freelance Data Engineer – Azure, Remote

⭐ - Featured Role | Apply direct with Data Freelance Hub
This is a Freelance Data Engineer position focused on designing and maintaining data pipelines on Azure. It is a full-time, remote contract lasting more than 6 months and requires 5+ years of experience plus proficiency in SQL, Python, and Azure Data Factory.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
November 7, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Data Modeling #GIT #Version Control #ETL (Extract, Transform, Load) #Azure #SQL (Structured Query Language) #PostgreSQL #ML (Machine Learning) #Synapse #AI (Artificial Intelligence) #BI (Business Intelligence) #Data Engineering #Python #ADF (Azure Data Factory) #Azure Data Factory #Scala #Data Access #Data Quality #Data Pipeline #Azure Synapse Analytics #Databricks
Role description
Join a dynamic team as a Freelance Data Engineer, where you will play a pivotal role in designing, building, and maintaining robust data pipelines and services on the Azure platform. This 100% remote, full-time contract position is ideal for professionals passionate about integrating diverse data sources, optimizing data performance, and enabling analytics and AI-driven decision-making. You will collaborate with a global team to enhance the data platform and support business intelligence initiatives.

Responsibilities
• Design, develop, and maintain scalable data pipelines using Azure Data Factory and related Azure services
• Integrate data from multiple sources, ensuring data quality and consistency
• Develop and optimize ETL processes and data models to support analytics and machine learning workflows
• Implement and manage CI/CD pipelines and version control for data engineering projects
• Collaborate with cross-functional teams to define data requirements and deliver solutions
• Monitor, troubleshoot, and improve data pipeline performance and reliability
• Support the development of APIs for data access and integration
• Document data engineering processes and best practices

Skills and Requirements
• Minimum 5 years of experience in data engineering or a related field
• Proficiency in SQL, PostgreSQL, and Python
• Strong hands-on experience with Azure Data Factory and other Azure data services
• Experience with data modeling, ETL pipeline development, and data validation
• Familiarity with CI/CD pipelines and version control systems (e.g., Git)
• Knowledge of Azure Synapse Analytics, Databricks, or machine learning data pipelines is a plus
• Excellent problem-solving skills and attention to detail
• Strong communication skills and the ability to work effectively in a remote, global team environment
• Availability for full-time contract work

About Twine
Twine is a leading freelance marketplace connecting top freelancers, consultants, and contractors with companies needing creative and tech expertise. Trusted by Fortune 500 companies and innovative startups alike, Twine enables companies to scale their teams globally.

Our Mission
Twine's mission is to empower creators and businesses to thrive in an AI-driven, freelance-first world.