

USM Business Systems
Azure Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Data Engineer in Washington, DC, on an initial 6-month, long-term contract. Key skills include Azure Data Factory, Azure Databricks, DBT, and Python. Experience with Azure cloud data platforms and data pipeline development is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 13, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Washington DC-Baltimore Area
-
🧠 - Skills detailed
#Azure cloud #dbt (data build tool) #ETL (Extract, Transform, Load) #Data Architecture #Azure Databricks #Data Quality #Python #Databricks #ADF (Azure Data Factory) #BI (Business Intelligence) #Data Engineering #Azure Data Factory #Scala #Data Processing #Data Pipeline #Azure #Datasets #Cloud
Role description
Job Title: Azure Data Engineer
Location: Washington, DC 20433 (Preferred Local)
Duration: 6 Months - Long Term
Note: Requires a strong Azure Data Engineer with ADF (Azure Data Factory), Azure Databricks, DBT (Data Build Tool), and Python.
Job Responsibilities:
• Design, develop, and maintain scalable data pipelines using Azure Data Factory (ADF).
• Build and manage data transformation workflows using DBT (Data Build Tool).
• Develop and optimize data processing solutions using Azure Databricks.
• Write efficient data processing scripts using Python.
• Work with large datasets to support data analytics and reporting needs.
• Ensure data quality, reliability, and performance optimization across data pipelines.
• Collaborate with data architects, analysts, and BI teams to deliver data solutions.
• Experience with Azure cloud-based data platforms is preferred.
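To illustrate the kind of Python data-quality work listed above, here is a minimal sketch of a row-validation step such as might sit inside a pipeline stage. The function name, field names, and rules are hypothetical examples, not taken from the posting.

```python
# Illustrative data-quality gate for a pipeline stage.
# The dataset shape and required fields below are hypothetical.

def validate_rows(rows, required_fields=("id", "amount")):
    """Partition `rows` into (valid, rejected).

    A row is valid when every required field is present and non-None;
    rejected rows can be routed to a quarantine table for review.
    """
    valid, rejected = [], []
    for row in rows:
        if all(row.get(field) is not None for field in required_fields):
            valid.append(row)
        else:
            rejected.append(row)
    return valid, rejected


if __name__ == "__main__":
    sample = [
        {"id": 1, "amount": 10.0},
        {"id": 2, "amount": None},   # fails: amount is missing
        {"id": 3, "amount": 5.5},
    ]
    ok, bad = validate_rows(sample)
    print(len(ok), len(bad))  # 2 1
```

In practice this kind of check would run per-partition inside a Databricks job or as a DBT test, but the partition-and-quarantine pattern is the same.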






