Hyrhub

Cloud Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Cloud Architect; the contract length and pay rate are unknown. Key skills required include Azure Data Factory, ETL/ELT, and Agile/Scrum experience. A Bachelor's or Master's degree in a related field and relevant certifications are necessary.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
November 18, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Boston, MA
-
🧠 - Skills detailed
#Agile #Azure #Migration #Security #ADF (Azure Data Factory) #SQL (Structured Query Language) #Scrum #Data Lake #ETL (Extract, Transform, Load) #Cloud #Compliance #Databricks #Delta Lake #Storage #Scala #Synapse #Automation #Leadership #Data Architecture #Data Governance #Data Quality #Data Engineering #Data Integration #Azure Data Factory #Computer Science #DevOps #Logic Apps #Data Pipeline #SAP
Role description
• A seasoned Azure Data Architect with deep expertise in the Microsoft Fabric platform to design, develop, and govern enterprise-scale analytics and data-platform solutions.
• Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
• The candidate will be responsible for designing, developing, and maintaining scalable, secure, and efficient data integration pipelines using Microsoft Fabric capabilities.
• Develop and manage lakehouse and warehouse architectures (using Fabric Lakehouse/Warehouse, Delta Lake, etc.).
• This role requires strong expertise in Azure Data Factory, Azure Synapse, Azure Data Lake Storage Gen2, and associated Azure data services, along with hands-on experience in ETL/ELT development, performance tuning, and automation.
• Design and develop data pipelines using Azure Data Factory for ingesting, transforming, and orchestrating data from multiple on-premises and cloud sources.
• Implement ETL/ELT processes to move data from various sources (SQL, APIs, files, SAP, etc.) to Azure Data Lake, Azure Synapse, or Databricks (a minimal sketch follows the responsibilities list below).
• Strong problem-solving skills and attention to detail.
• Experience working in Agile/Scrum development teams is preferred.
• Certifications such as Microsoft Certified: Fabric Data Engineer Associate, Azure Solutions Architect, or similar.
• Exposure to Customer Data Platform (MS Customer Insights Data and MS Customer Insights Journey).
Key Responsibilities
• Collaborate with business stakeholders and senior leadership to translate business requirements into a coherent data-platform architecture.
• Define and maintain the data-architecture roadmap, including data-ingestion, transformation, storage, and analytics layers using Microsoft Fabric.
• Design end-to-end data solutions: ingestion pipelines, lakehouses/warehouses, semantic layer, analytics consumption, and real-time capabilities.
• Architect and guide data modelling (conceptual, logical, physical), ensuring consistency, performance, and reusability.
• Oversee migration of existing platforms (on-premises or legacy cloud systems) to a Fabric-centric architecture.
• Work with data-engineering and analytics teams to implement solutions (e.g., Azure Data Factory/SQL/Synapse, Fabric pipelines, OneLake).
• Build and maintain parameterized, reusable ADF pipelines with dynamic configurations (see the second sketch below).
• Integrate ADF with Azure Functions, Logic Apps, and DevOps CI/CD pipelines.
• Ensure data quality, data governance, security, and compliance across Fabric solutions.
• Monitor, tune, and optimise the performance of data pipelines and storage solutions to ensure efficiency, cost-effectiveness, and reliability.
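The following is a minimal PySpark sketch of the kind of ETL/ELT step described above: landing raw files in the lake, applying a light transformation, and persisting the result as a Delta table in a lakehouse. The file path, column names, and table name are hypothetical placeholders, and the snippet assumes a Fabric or Databricks notebook environment where a Spark session is available.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: ingest raw CSV files landed in the lake.
# "Files/landing/orders" is a hypothetical lakehouse path.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("Files/landing/orders/*.csv")
)

# Silver: light cleanup - cast the date column and drop duplicate orders.
clean = (
    raw
    .withColumn("order_date", F.to_date("order_date"))
    .dropDuplicates(["order_id"])
)

# Persist as a Delta table so downstream warehouse and semantic layers can query it.
clean.write.format("delta").mode("overwrite").saveAsTable("silver_orders")
```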
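A parameterized, reusable ADF pipeline is defined once and invoked with different runtime values. The sketch below, using the `azure-identity` and `azure-mgmt-datafactory` SDKs, shows one way such a pipeline might be triggered from Python; the subscription, resource group, factory, pipeline, and parameter names are all illustrative assumptions, not details from this posting.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Authenticate with whatever credential is available (CLI, managed identity, ...).
credential = DefaultAzureCredential()
client = DataFactoryManagementClient(credential, "<subscription-id>")

# Trigger a parameterized pipeline run. Because the source system and load
# date are passed at invocation time, one generic pipeline serves many sources.
run = client.pipelines.create_run(
    resource_group_name="rg-data-platform",  # hypothetical names throughout
    factory_name="adf-enterprise",
    pipeline_name="pl_ingest_generic",
    parameters={"sourceSystem": "sap", "loadDate": "2025-11-18"},
)
print(f"Started pipeline run: {run.run_id}")
```

The same call is what a DevOps CI/CD stage or a Logic App would make when orchestrating scheduled or event-driven loads.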