Cyber Sphere

Data Architect | Hybrid (2 days onsite) in Chicago, IL or Battle Creek, MI | Locals Only

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Architect on a contract basis, hybrid (2 days onsite in Chicago, IL or Battle Creek, MI). Requires 8+ years in Data Engineering, FMCG experience, AWS, Snowflake, and dbt expertise, and strong SQL/Python skills.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
Unknown
-
πŸ—“οΈ - Date
December 17, 2025
πŸ•’ - Duration
Unknown
-
🏝️ - Location
Hybrid
-
πŸ“„ - Contract
Unknown
-
πŸ”’ - Security
Unknown
-
πŸ“ - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Documentation #Deployment #Data Science #Data Pipeline #Data Modeling #Leadership #ML (Machine Learning) #Snowflake #Agile #SQL (Structured Query Language) #Data Governance #Git #Data Analysis #.Net #Data Architecture #Cloud #Data Engineering #dbt (data build tool) #Security #Python #Automation #S3 (Amazon Simple Storage Service) #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #IAM (Identity and Access Management) #Batch #Data Vault #Redshift #BI (Business Intelligence) #Lambda (AWS Lambda) #Compliance #Migration #Version Control #Monitoring #Scala #Airflow
Role description
Title: Data Architect
Location: Hybrid (2 days onsite) in Chicago, IL or Battle Creek, MI; locals only
Duration: Contract
Note: Working experience as a Data Architect with an FMCG (Fast-Moving Consumer Goods) background is required.

We are seeking a highly skilled Architect / Senior Data Engineer to design, build, and optimize our modern data ecosystem. The ideal candidate will have deep experience with AWS cloud services, Snowflake, and dbt, along with a strong understanding of scalable data architecture, ETL/ELT development, and data modeling best practices.

Key Responsibilities
• Architect, design, and implement scalable, reliable, and secure data solutions using AWS, Snowflake, and dbt.
• Develop end-to-end data pipelines (batch and streaming) to support analytics, machine learning, and business intelligence needs.
• Lead the modernization and migration of legacy data systems to cloud-native architectures.
• Define and enforce data engineering best practices, including coding standards, CI/CD, testing, and monitoring.
• Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and translate them into technical solutions.
• Optimize Snowflake performance through query tuning, warehouse sizing, and cost management.
• Establish and maintain data governance, security, and compliance standards across the data platform.
• Mentor and guide junior data engineers, providing technical leadership and direction.

Required Skills & Qualifications
• 8+ years of experience in Data Engineering, with at least 3 years in a cloud-native data environment.
• Hands-on expertise in AWS services such as S3, Glue, Lambda, Step Functions, Redshift, and IAM.
• Strong experience with Snowflake: data modeling, warehouse design, performance optimization, and cost governance.
• Proven experience with dbt (data build tool): model development, documentation, and deployment automation.
• Proficiency in SQL, Python, and ETL/ELT pipeline development.
• Experience with CI/CD pipelines, version control (Git), and workflow orchestration tools (Airflow, Dagster, Prefect, etc.).
• Familiarity with data governance and security best practices, including role-based access control and data masking.
• Strong understanding of data modeling techniques (Kimball, Data Vault, etc.) and data architecture principles.

Preferred Qualifications
• AWS certification (e.g., AWS Certified Data Analytics - Specialty, or Solutions Architect).
• Strong communication and collaboration skills, with a track record of working in agile environments.

Regards,
Sai Srikar
770-456-5690
Email: sai@cysphere.net