ALOIS Solutions

Lead Data Platform Architect – AWS / Snowflake

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Platform Architect – AWS / Snowflake, offering a competitive W2 pay rate. The contract is remote and requires 15+ years of experience in Data Engineering, preferably in insurance, with expertise in AWS, SQL, Python, and real-time streaming.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
560
🗓️ - Date
February 11, 2026
🕒 - Duration
Unknown
🏝️ - Location
Remote
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#DataOps #Python #Leadership #Vault #VPC (Virtual Private Cloud) #AWS S3 (Amazon Simple Storage Service) #Athena #PySpark #Redshift #DevOps #Spark (Apache Spark) #Data Security #Data Vault #Data Architecture #Data Engineering #S3 (Amazon Simple Storage Service) #Splunk #Observability #Scala #Security #ETL (Extract, Transform, Load) #AWS Glue #Talend #Monitoring #Metadata #Code Reviews #ML (Machine Learning) #SQL (Structured Query Language) #AWS (Amazon Web Services) #Data Pipeline #Data Science #Compliance #IAM (Identity and Access Management) #Data Modeling #Kafka (Apache Kafka) #dbt (data build tool) #Cloud #Strategy #Snowflake
Role description
🚀 Hiring: Full Stack Senior Data Engineer (Principal-Level)
📍 Location: Remote (United States)
💼 Experience: 15+ Years (Must Have)
🏢 Industry: Insurance Preferred
💰 Competitive Compensation (W2 Only)

We are looking for a visionary Senior Full Stack Data Engineer to lead the architecture and evolution of a next-generation cloud-native data platform. This is a high-impact, hands-on leadership role for someone who thrives on designing scalable, secure, and high-performance data ecosystems. If you’ve built petabyte-scale data platforms and enjoy driving both architecture and execution, this role is for you.

🔥 What You’ll Do

🏗 Platform Strategy & Technical Leadership
• Define and drive the architectural roadmap for end-to-end data pipelines
• Lead best practices across scalability, reliability, and security
• Mentor engineers, conduct code reviews, and accelerate project velocity
• Partner with Data Scientists, ML Engineers, and business stakeholders to deliver production-ready solutions

🔄 Data Pipeline Engineering
• Build high-volume ingestion & transformation pipelines (dbt Core, AWS Glue, Talend)
• Design modular, reusable workflows (Dagster / Talend)
• Implement real-time streaming pipelines (Kinesis / Confluent / Kafka)
• Ensure data security, anonymization, and regulatory compliance
• Apply strong DataOps and DevOps fundamentals

🧊 Lakehouse & Ecosystem Optimization
• Implement the Iceberg open table format & manage schema evolution
• Optimize Snowflake, Redshift, and Athena for performance & cost efficiency
• Integrate observability & monitoring (Splunk)
• Manage metadata & the AWS Glue Catalog

🎯 What We’re Looking For
• 15+ years of progressive Data Engineering / Data Architecture experience
• Deep expertise in AWS (S3, IAM, VPC, Redshift, Athena)
• Strong SQL, Python & PySpark
• Experience building petabyte-scale cloud-native data platforms
• Strong understanding of data modeling (Dimensional, Data Vault)
• Experience with Iceberg, dbt Core, Talend, Snowflake
• Real-time streaming experience (Kinesis / Kafka / Confluent)
• Proven technical leadership & mentorship experience
• Insurance domain experience is a plus

💡 Why Join?
• Architect and lead a modern lakehouse platform
• High visibility and ownership
• Work with cutting-edge open table formats & streaming technologies
• Drive impact across Data Science & Business teams