

Dminds Solutions Inc.
Snowflake Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Architect in Chicago, IL, on a full-time contract for over 6 months, offering competitive pay. Requires 8+ years in Data Engineering, 3+ years as a Snowflake Architect, and insurance domain experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 14, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Data Warehouse #Security #Oracle #Data Quality #Azure #Data Migration #Snowflake #Matillion #Alation #Cloud #Data Processing #Data Governance #Spark (Apache Spark) #GCP (Google Cloud Platform) #SQL (Structured Query Language) #Version Control #Informatica #Migration #Compliance #Data Engineering #Data Pipeline #dbt (data build tool) #Fivetran #Data Modeling #Teradata #ETL (Extract, Transform, Load) #Collibra #Python #DevOps #AWS (Amazon Web Services) #Talend #Scala #Data Architecture
Role description
Job Title: Snowflake Architect
Location: Chicago, IL (Onsite)
Domain: Insurance
Job Type: Full-time / Contract
Job Summary:
We are seeking a highly skilled Snowflake Architect with proven experience in the insurance domain to lead our data architecture and cloud transformation initiatives. The ideal candidate will have deep expertise in Snowflake Data Cloud, cloud data warehousing, data modeling, and ETL/ELT architecture. The role requires working onsite in Chicago, IL, collaborating with business stakeholders, data engineers, and analysts.
Key Responsibilities:
• Design and implement scalable Snowflake architecture to support business data needs and analytics.
• Lead data migration efforts from legacy systems (Teradata, Oracle, etc.) to Snowflake.
• Collaborate with business stakeholders to understand data and reporting requirements, especially within the insurance sector (e.g., policy, claims, underwriting).
• Optimize data pipelines and ELT/ETL processes using tools such as Informatica, dbt, Matillion, or custom scripts.
• Implement security models, RBAC, and data governance best practices in Snowflake.
• Work closely with data engineers and analysts to ensure high data quality, performance, and accessibility.
• Establish and promote architectural standards, best practices, and reusable frameworks.
• Troubleshoot performance issues and recommend Snowflake performance tuning strategies.
• Stay current on Snowflake releases, features, and industry trends, especially in insurance analytics.
Required Qualifications:
• 8+ years of experience in Data Engineering/Data Architecture.
• 3+ years of hands-on experience as a Snowflake Architect.
• Strong experience with data modeling (dimensional, normalized) and data warehouse design.
• Hands-on experience with ELT/ETL development using tools such as Informatica, Talend, dbt, and Fivetran.
• Experience in cloud platforms (AWS, Azure, or GCP); Snowflake on AWS is preferred.
• Strong knowledge of SQL, performance tuning, views, and stored procedures.
• Demonstrated experience working in the insurance domain; knowledge of claims, policies, customers, premiums, actuarial, and regulatory data is a big plus.
• Experience with CI/CD pipelines, version control, and DevOps practices in data engineering.
• Strong communication skills and ability to present technical ideas to business users.
Preferred Qualifications:
• Snowflake certification (e.g., SnowPro Advanced Architect).
• Experience with data governance tools like Collibra, Alation, or Informatica EDC.
• Experience in regulatory reporting and compliance in the insurance sector.
• Familiarity with Python, Spark, or other data processing tools.
Work Environment:
• This is a 100% onsite role based in Chicago, IL.
• Must be comfortable working in a fast-paced, collaborative environment.






