

Sr Data Architect
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr Data Architect working remotely, with an immediate start, an unspecified contract length, and a competitive pay rate. Key requirements include 5+ years in data architecture, expertise in Snowflake, and knowledge of compliance frameworks.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
July 23, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Boston, MA
🧠 - Skills detailed
#Data Management #Infrastructure as Code (IaC) #Data Architecture #Databricks #Security #GitHub #Scala #DevOps #Leadership #Data Analysis #Delta Lake #Data Engineering #Airflow #Data Science #Computer Science #Cloud #Python #AWS (Amazon Web Services) #Data Pipeline #Data Integrity #Spark (Apache Spark) #Terraform #Azure #Data Privacy #ETL (Extract, Transform, Load) #Tableau #Visualization #Data Governance #Redshift #Snowflake #Programming #Monitoring #Java #Jenkins #Metadata #Compliance #Statistics #GCP (Google Cloud Platform)
Role description
Position: Senior Data Architect
Location: Remote (Client location: Boston, MA)
Client: Accenture (through our prime vendor)
Start Date: Immediate
Visa Requirement: Must be legally authorized to work in the U.S.
Background Check: Candidates must be able to clear standard background, criminal, and CORI checks (CJIS certification required)
Key Responsibilities
Data Architecture & Engineering
• Design and optimize cloud-native data platforms, including Snowflake, Redshift, and BigQuery
• Architect and manage Lakehouse environments such as Databricks and Delta Lake
• Develop scalable and secure Spark-based data pipelines
• Implement robust ETL orchestration using Airflow
• Define and enforce metadata management and data governance frameworks
• Ensure data integrity through validation, monitoring, and quality controls
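The validation and quality-control work described above can be sketched as a minimal row-level check. This is an illustrative example only; the schema, field names, and thresholds are hypothetical, not taken from the posting.

```python
from dataclasses import dataclass

@dataclass
class QualityReport:
    """Summary of a batch validation pass."""
    total: int
    failed: int

    @property
    def pass_rate(self) -> float:
        # Avoid division by zero on empty batches.
        return 1.0 if self.total == 0 else (self.total - self.failed) / self.total

def validate_rows(rows, required_fields, not_null=()):
    """Count rows missing required fields or carrying nulls in protected columns."""
    failed = 0
    for row in rows:
        missing = any(f not in row for f in required_fields)
        nulls = any(row.get(f) is None for f in not_null)
        if missing or nulls:
            failed += 1
    return QualityReport(total=len(rows), failed=failed)

# Hypothetical sample batch: one null value and one missing field.
rows = [
    {"record_id": 1, "visit_date": "2025-07-01"},
    {"record_id": 2, "visit_date": None},   # null in a protected column
    {"visit_date": "2025-07-03"},           # missing required field
]
report = validate_rows(rows, required_fields=("record_id", "visit_date"),
                       not_null=("visit_date",))
```

In a production pipeline, a report like this would typically feed monitoring dashboards or gate downstream ETL tasks rather than be computed ad hoc.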
Technical Leadership
• Uphold data privacy and compliance standards (HIPAA, FISMA, FedRAMP)
• Translate complex business requirements into scalable data solutions
• Troubleshoot performance bottlenecks and optimize data workflows
• Mentor team members and contribute to solution architecture decisions
Collaboration & Communication
• Partner with cross-functional teams including data analysts, developers, and business stakeholders
• Deliver technical insights and architectural guidance to both technical and non-technical teams
• Stay up to date with evolving technologies and recommend improvements accordingly
Required Qualifications
• Bachelor’s or Master’s degree in Computer Science, Data Science, Statistics, or a related field
• 5+ years of experience in data architecture or data engineering
• Hands-on expertise in Snowflake (performance tuning, micro-partitioning, and security)
• Strong programming skills (Python, Java, or Scala)
• Experience with cloud platforms such as AWS, Azure, or GCP
• Familiarity with DevOps/IaC tools (e.g., Terraform, GitHub Actions, Jenkins)
• Proven track record in designing secure, scalable data platforms
Preferred Qualifications
• Experience with public health data analytics or government data systems
• Familiarity with public sector compliance frameworks (HIPAA, CJIS, FedRAMP)
• Experience with Tableau or other data visualization tools
• Strong understanding of metadata management and data governance frameworks
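As a rough illustration of the metadata-management and governance skills listed above, the sketch below keeps a small dataset catalog with owners and sensitivity classifications. All dataset, column, and classification names are hypothetical and serve only to show the shape of such a registry.

```python
# In-memory catalog mapping dataset name -> governance metadata.
catalog = {}

def register_dataset(name, owner, classification, columns):
    """Record ownership, sensitivity classification, and column types."""
    if classification not in {"public", "internal", "restricted"}:
        raise ValueError(f"unknown classification: {classification}")
    catalog[name] = {
        "owner": owner,
        "classification": classification,
        "columns": dict(columns),
    }

def restricted_datasets():
    """Datasets whose classification calls for extra access controls."""
    return sorted(name for name, meta in catalog.items()
                  if meta["classification"] == "restricted")

# Hypothetical entries.
register_dataset("claims_raw", owner="data-eng", classification="restricted",
                 columns={"claim_id": "string", "amount": "decimal"})
register_dataset("zip_lookup", owner="data-eng", classification="public",
                 columns={"zip": "string", "state": "string"})
```

Real deployments would back this with a metadata store or catalog service rather than a dictionary, but the core idea, every dataset carrying an owner and a classification that drives access policy, is the same.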