

TalentBurst, an Inc 5000 company
Senior Data Architect
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Architect in Boston, MA (Hybrid) on a 4+ month contract. Pay rate is competitive. Requires 8-10 years in data architecture, 3-5 years with Azure lakehouses, and expertise in Azure Databricks, Python, and SQL.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 13, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Boston, MA
-
🧠 - Skills detailed
#Azure Data Factory #Terraform #Python #Data Engineering #Semantic Models #Infrastructure as Code (IaC) #Documentation #Automation #Monitoring #Security #Cybersecurity #Strategy #Databricks #Continuous Deployment #ADLS (Azure Data Lake Storage) #SQL (Structured Query Language) #Disaster Recovery #Network Security #Cloud #Azure Databricks #ADF (Azure Data Factory) #BI (Business Intelligence) #Data Quality #Vault #Data Lake #ETL (Extract, Transform, Load) #Data Lakehouse #Microsoft Power BI #Data Management #Synapse #Batch #Data Architecture #Libraries #Data Security #Delta Lake #Deployment #Kafka (Apache Kafka) #Azure #Metadata
Role description
Senior Data Architect
Boston, MA (Hybrid)
4+ months contract (possible extension)
Details:
The client is seeking a Senior Data Architect to design, implement, and operationalize a data lakehouse architecture within OSA's Azure Government Community Cloud (GCC) tenancies. You will establish the platform, governance, ingestion, transformation, and consumption layers, and enable OSA's data engineers for ongoing operations. You will work with developers, cybersecurity engineers, and other OSA staff to review the environment and implement best practices.
OSA serves as the chief accountability entity for the Massachusetts state government and its residents. OSA conducts audits of state entities and contractors to assess their performance and recommend improvements to enhance the effectiveness of government operations. In addition to ensuring that tax dollars are spent wisely, OSA's audits, reports, and investigations have also improved the performance of state government. OSA has offices in Boston, Marlborough, Chicopee, and Brockton.
Key Responsibilities
1. Architecture & Platform
• Define and establish the target reference architecture for an enterprise lakehouse, with Azure Databricks as the foundational platform.
• Design and implement ADLS Gen2 structure and Medallion (Bronze/Silver/Gold) model using Delta Lake.
• Establish Unity Catalog, the metastore, and semantic layers.
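The zoning above can be sketched as a path convention. This is a minimal, hypothetical illustration: the storage account name (`osadatalake`) and the one-container-per-zone layout are assumptions for the sketch, not OSA's actual design.

```python
# Hypothetical ADLS Gen2 path convention for a Medallion (Bronze/Silver/Gold)
# lakehouse; account, container, and domain names are illustrative only.
ZONES = ("bronze", "silver", "gold")

def delta_table_path(zone: str, domain: str, table: str,
                     account: str = "osadatalake") -> str:
    """Build an abfss:// URI for a Delta table in a given Medallion zone."""
    if zone not in ZONES:
        raise ValueError(f"unknown zone: {zone}")
    # One ADLS Gen2 container per zone keeps RBAC boundaries simple.
    return f"abfss://{zone}@{account}.dfs.core.windows.net/{domain}/{table}"

print(delta_table_path("bronze", "audits", "engagements_raw"))
# abfss://bronze@osadatalake.dfs.core.windows.net/audits/engagements_raw
```

A consistent convention like this lets governance tooling (Unity Catalog external locations, Purview scans) be configured per zone rather than per table.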
2. Governance & Security
• Implement Microsoft Purview for catalog, lineage, and metadata management.
• Design, align, and integrate data security policies (RBAC, RLS/CLS, masking) and PII handling.
• Integrate Key Vault, private endpoints, and network security baselines.
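The masking requirement above can be illustrated with a standalone rule in the spirit of a Unity Catalog column mask. The role name (`pii_reader`) and the masking format are assumptions for the sketch, not OSA policy.

```python
# Illustrative column-masking rule; in Databricks this logic would live in a
# Unity Catalog masking function attached to the column. Role names assumed.
def mask_ssn(value: str, caller_roles: set[str]) -> str:
    """Return the raw SSN only for privileged roles; mask it for everyone else."""
    if "pii_reader" in caller_roles:
        return value
    # Preserve the last four digits so records stay matchable by analysts.
    return "***-**-" + value[-4:]

print(mask_ssn("123-45-6789", {"auditor"}))     # ***-**-6789
print(mask_ssn("123-45-6789", {"pii_reader"}))  # 123-45-6789
```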
3. Data Engineering Enablement
• Build batch/streaming ingestion pipelines (ADF/Synapse/Databricks Workflows, Event Hubs/Kafka).
• Implement change data capture (CDC) patterns and schema evolution handling with quarantine controls.
• Support automation of data reliability assessment.
• Create reusable transformation libraries, standards, and pipelines for Bronze→Silver→Gold.
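The CDC-with-quarantine pattern above can be sketched in plain Python. In production this would be a Delta Lake MERGE with an enforced schema contract; the column set and last-writer-wins rule here are assumptions for illustration.

```python
# Minimal sketch of a CDC upsert with a schema-drift quarantine; a real
# pipeline would use Delta Lake MERGE and a schema registry or contract.
EXPECTED_COLUMNS = {"id", "name", "updated_at"}

def apply_cdc(target: dict, changes: list[dict], quarantine: list[dict]) -> dict:
    """Upsert change records keyed by id; divert unexpected schemas to quarantine."""
    for row in changes:
        if set(row) != EXPECTED_COLUMNS:
            quarantine.append(row)   # hold drifted rows for review, don't fail
            continue
        target[row["id"]] = row      # last-writer-wins upsert
    return target

target, quarantine = {}, []
apply_cdc(target, [
    {"id": 1, "name": "Ada", "updated_at": "2026-01-01"},
    {"id": 1, "name": "Ada L.", "updated_at": "2026-01-02"},
    {"id": 2, "name": "Grace", "extra": True},   # schema drift
], quarantine)
print(len(target), len(quarantine))  # 1 1
```

Quarantining instead of failing keeps the pipeline flowing while drifted records wait for a reviewed schema evolution.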
4. Operations & Reliability
• Establish continuous integration/continuous deployment (CI/CD) pipelines, infrastructure-as-code (IaC) orchestration (Terraform or Bicep), and a segregated environment strategy (dev/test/prod).
• Define service level agreements (SLAs) and service level objectives (SLOs) for data reliability; configure monitoring & alerting.
• Deliver cost optimization, performance tuning (OPTIMIZE, Z-ORDER), and backup and disaster recovery strategies.
5. Delivery & Handover
• Document architecture, patterns, and runbooks; conduct training for data engineers.
• Collaborate with security, networking, and BI teams; drive stakeholder alignment.
Required Skills & Experience
• 8-10 years in data architecture/engineering, with 3-5 years building lakehouses in Azure. Experience operating within a GCC environment is highly desirable.
• Hands-on with Azure Databricks (Delta Lake, Unity Catalog, Jobs, SQL).
• Strong with ADLS Gen2, Azure Data Factory Pipelines, Event Hubs/Kafka, Purview, Key Vault.
• Expertise in Delta Lake, schema evolution, SCDs, CDC, streaming (Structured Streaming).
• Proficient in Python and SQL; experience with CI/CD and IaC (Terraform or Bicep).
• Security-first mindset: RBAC, RLS/CLS, PII governance, private networking.
• Excellent communication, documentation, and stakeholder management.
• Experience engaging business users to align business objectives with technical considerations in Azure.
Preferred Qualifications
• Microsoft Certified: Azure Data Engineer Associate or Databricks Certified Data Engineer Professional.
• Experience integrating Power BI semantic models with Delta tables.
• Familiarity with data quality frameworks (e.g., Great Expectations, Delta Live Tables expectations).
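The expectation-style checks referenced above follow a common shape: a named rule that returns a pass/fail result with a count of offenders. This standalone sketch mimics that shape; the column name and result format are illustrative, not any framework's actual API.

```python
# Tiny data-quality expectation in the spirit of "expect column values to not
# be null"; this is an illustrative standalone check, not a framework call.
def expect_not_null(rows: list[dict], column: str) -> dict:
    """Return a pass/fail result with the count of offending rows."""
    failures = [r for r in rows if r.get(column) is None]
    return {"success": not failures, "unexpected_count": len(failures)}

rows = [{"audit_id": 1}, {"audit_id": None}, {"audit_id": 3}]
print(expect_not_null(rows, "audit_id"))
# {'success': False, 'unexpected_count': 1}
```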
#TB_EN
#ZR





