

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a W2 contract in the San Francisco Bay Area, requiring one day onsite weekly. Key skills include Azure Data Factory, Synapse Analytics, and SQL, with a focus on building scalable data pipelines.
Country: United States
Currency: $ USD
Day rate: 680
Date discovered: May 22, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: San Francisco, CA
Skills detailed:
#Metadata #Azure Data Factory #ADF (Azure Data Factory) #Automation #GIT #Azure SQL #Data Lake #Data Quality #Python #Visualization #Synapse #SQL (Structured Query Language) #Cloud #Data Management #Version Control #Data Engineering #Data Catalog #Microsoft Power BI #Security #Logic Apps #Documentation #Data Architecture #Data Modeling #ETL (Extract, Transform, Load) #Scala #Storage #Data Pipeline #Data Governance #BI (Business Intelligence) #Azure SQL Data Warehouse #Data Dictionary #Data Warehouse #Observability #Datasets #Anomaly Detection #Data Privacy #Azure #Data Security #Azure Logic Apps
Role description
Are you an Azure Data Engineer in the San Francisco Bay area?
We have a W2 contract (no C2C) for one of our clients located in the heart of San Francisco!
This role requires one day a week onsite (Monday), so interested candidates MUST BE LOCATED within driving distance.
Please let me know if you are actively engaged in the job market!
Milestone Technologies, Inc. is seeking an Azure Data Engineer to support the implementation of its Unified Data Ecosystem, a strategic initiative to modernize data infrastructure using Microsoft cloud technologies. This role will focus on building scalable, automated data pipelines using tools such as Azure Data Factory, Synapse Analytics (formerly Azure SQL Data Warehouse), Azure Data Lake, and Azure Logic Apps to support reporting, analytics, and operational efficiency. The engineer will work closely with IT and business stakeholders to streamline data processes, ensure data quality, and contribute to the design and governance of a secure, reusable, and scalable data platform that supports long-term integration, automation, and decision-making across the organization.
Duties
• Design, develop, and maintain scalable data pipelines using Azure Data Factory, Synapse Analytics, and Data Lake Storage to ingest, transform, and publish structured and unstructured data.
• Automate end-to-end and event-driven workflows using Azure Logic Apps, ensuring real-time or scheduled data synchronization and integration with external systems.
• Conduct data modeling and transformation to support reporting, dashboards, and analytics, including validation and quality checks to ensure consistency and reliability.
• Collaborate with IT and business stakeholders to gather data requirements, define transformation logic, and contribute to documentation such as the data dictionary.
• Support evaluation and enhancement of the Azure-based data architecture, focusing on scalability, reusability, and long-term maintainability.
• Implement and uphold data governance standards, including metadata management, lineage tracking, and data privacy and security controls.
• Optimize datasets for Power BI and promote self-service analytics through clean, reusable data models.
• Monitor, troubleshoot, and tune pipeline performance; establish alerting and observability for operational reliability.
• Provide documentation and facilitate knowledge transfer to IT and business users to ensure sustainability and adoption.
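To illustrate the validation and quality checks the duties above call for, here is a minimal, self-contained Python sketch. The field names (`order_id`, `amount`) are hypothetical examples, not from the posting; in practice, checks like these would run inside an ADF or Synapse transformation step rather than as a standalone script.

```python
# Hypothetical schema: each required field maps to its expected type.
REQUIRED_FIELDS = {"order_id": str, "amount": float}

def validate_row(row: dict) -> list[str]:
    """Return a list of quality issues for one record (empty means clean)."""
    issues = []
    for field, expected in REQUIRED_FIELDS.items():
        value = row.get(field)
        if value is None:
            issues.append(f"missing {field}")
        elif not isinstance(value, expected):
            issues.append(f"{field}: expected {expected.__name__}")
    return issues

def partition(rows: list[dict]):
    """Split incoming records into clean rows and rejects with their issues,
    mirroring the exception-handling step of a data quality framework."""
    clean, rejects = [], []
    for row in rows:
        issues = validate_row(row)
        if issues:
            rejects.append((row, issues))
        else:
            clean.append(row)
    return clean, rejects

rows = [
    {"order_id": "A-1", "amount": 19.99},
    {"order_id": "A-2"},  # missing amount -> routed to rejects
]
clean, rejects = partition(rows)
```

Routing rejects to a quarantine table (rather than silently dropping them) keeps the published datasets consistent while preserving the failed records for review.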
Skills
• Proficient in building scalable data pipelines using Azure Data Factory, Synapse Analytics, Azure Data Lake, and Logic Apps.
• Strong SQL skills with experience in data transformation, validation, and optimization for reporting and analytics.
• Skilled in dimensional modeling and modern data platform design to support dashboarding and analytical use cases.
• Familiar with data quality frameworks, including validation, anomaly detection, and exception handling processes.
• Experience with metadata management, data cataloging, and governance tools such as Azure Purview (or equivalents).
• Familiarity with CI/CD practices and version control systems such as Git.
• Demonstrated ability to translate business needs into technical requirements and collaborate effectively across IT and business teams.
• Strong communication, documentation, and stakeholder engagement skills to support alignment and knowledge sharing.
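The anomaly detection mentioned in the skills list can be as simple as a z-score check over a numeric column. This is a generic sketch, not the client's actual framework; the sample values and the 2.0 threshold are arbitrary choices for illustration.

```python
from statistics import mean, stdev

def anomalies(values: list[float], threshold: float = 2.0) -> list[float]:
    """Flag points more than `threshold` sample standard deviations from the mean."""
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Hypothetical daily row counts for a pipeline; the last day looks wrong.
daily_counts = [10, 11, 9, 10, 12, 10, 11, 100]
flagged = anomalies(daily_counts)  # -> [100]
```

A production pipeline would typically feed such flags into the alerting/observability layer rather than act on them inline.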
Nice to have:
• Data engineering languages such as Python
• Knowledge of data security best practices, including encryption and data masking
• Familiarity with the data visualization tool Power BI
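The data masking named in the security bullet above can be sketched with salted hashing of an identifier. The field choice (email) and salt are hypothetical; a real deployment would pull the salt from a secret store such as Azure Key Vault, not hard-code it.

```python
import hashlib

def mask_email(email: str, salt: str = "demo-salt") -> str:
    """Replace the local part of an email with a salted hash, keeping the
    domain so masked data remains useful for aggregate analytics."""
    local, _, domain = email.partition("@")
    digest = hashlib.sha256((salt + local).encode("utf-8")).hexdigest()[:10]
    return f"{digest}@{domain}"

masked = mask_email("alice@example.com")
```

Because the hash is deterministic for a given salt, the masked value can still serve as a join key across datasets without exposing the original identifier.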