Hope Tech

Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Architect in San Francisco, CA, with a contract length of over 6 months and a pay rate of $120,967.18 - $145,680.91 per year. Key skills include AWS expertise, Python proficiency, and data modeling.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
662
-
🗓️ - Date
February 28, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
San Francisco, CA 94114
-
🧠 - Skills detailed
#Computer Science #Security #Data Processing #Data Governance #Terraform #Automation #Data Engineering #Data Architecture #Data Manipulation #ETL (Extract, Transform, Load) #Data Vault #Lambda (AWS Lambda) #Cloud #Infrastructure as Code (IaC) #API (Application Programming Interface) #Data Modeling #Scala #Schema Design #AWS (Amazon Web Services) #AWS S3 (Amazon Simple Storage Service) #Redshift #Airflow #SQL (Structured Query Language) #ML (Machine Learning) #S3 (Amazon Simple Storage Service) #Strategy #Kafka (Apache Kafka) #Vault #Python #Data Science
Role description
Job Title: Data Architect / Data Platform Architect
Location: San Francisco, CA 94105 (Required: Onsite 5 days a week)
Strictly W2. No C2C. Note: Do not apply if you're looking for C2C.

Full Job Description

We are looking for a seasoned Data Architect to own the technical vision for our data ecosystem. In this role, you will architect a best-in-class Data Platform on AWS and contribute hands-on code using Python. This role requires a strategic thinker who can design high-level data models while also being comfortable writing production-grade code alongside the engineering team. This position is 100% onsite in San Francisco. You will work directly with stakeholders across the business to translate complex requirements into scalable, reliable data solutions.

Responsibilities

Architect & Design: Design and implement scalable data architectures and ETL pipelines on AWS (S3, Redshift, Glue, EMR, Kinesis, Airflow).
Platform Development: Build and optimize the core Data Platform, focusing on data governance, security, and schema design.
Hands-on Coding: Write production-level Python code for data processing, API development, and automation.
Strategy: Define data standards and best practices for data modeling, ingestion, and lifecycle management.
Collaboration: Work closely with data scientists and analysts to ensure data availability for analytics and machine learning initiatives.
Performance: Troubleshoot performance bottlenecks and optimize query performance for scalability.

Qualifications

Experience: 7+ years in data engineering or data architecture, with at least 2 years in a lead or architect role.
Cloud Expertise: Deep experience with the modern AWS data stack (Glue, Redshift, S3, Lambda, Step Functions, or EMR).
Coding: Expert-level proficiency in Python for data manipulation and application development. Strong SQL skills required.
Platform Mindset: Proven experience building reusable data platform components (not just one-off pipelines).
Data Modeling: Strong knowledge of Kimball, Inmon, or Data Vault methodologies.

Nice to Have

Experience with streaming technologies (Kafka, MSK, or Kinesis).
Familiarity with Infrastructure as Code (Terraform or CloudFormation).
Bachelor's degree in Computer Science, Engineering, or related field.

Pay: $120,967.18 - $145,680.91 per year
Work Location: In person