Euclid Innovations

Senior Snowflake Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Snowflake Engineer in Fort Mill, SC, for a contract of 12+ months, offering competitive pay. Key skills include Snowflake, Python ETL, Apache Airflow, and AWS services. Requires 3+ years in Snowflake and 5+ years in ETL development.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
🗓️ - Date
March 27, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Fort Mill, SC
-
🧠 - Skills detailed
#Clustering #Data Pipeline #Monitoring #Airflow #Data Quality #Scala #AWS Lambda #Data Governance #Compliance #SnowSQL #Logging #EC2 #AWS (Amazon Web Services) #Cloud #Data Warehouse #Lambda (AWS Lambda) #Schema Design #PySpark #Redshift #Pandas #IAM (Identity and Access Management) #Spark (Apache Spark) #S3 (Amazon Simple Storage Service) #BI (Business Intelligence) #Amazon Redshift #Security #API (Application Programming Interface) #GIT #Snowflake #Data Modeling #Version Control #Data Engineering #SnowPipe #Python #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Apache Airflow
Role description
Hello! Hope you are doing well! This is Rahul from Euclid Innovations. Please find the JD below and let me know if you are interested.

Senior Snowflake Engineer
Fort Mill, SC | 12+ Months

We are seeking an experienced Snowflake Developer with strong expertise in Python-based ETL development, Apache Airflow orchestration, and AWS cloud services. The ideal candidate will design, develop, and optimize scalable data pipelines and data warehouse solutions using Snowflake in a cloud-native environment.

Key Responsibilities:
• Design, develop, and maintain scalable data warehouse solutions in Snowflake.
• Develop robust ETL/ELT pipelines using Python.
• Build and manage workflow orchestration using Apache Airflow.
• Integrate Snowflake with various AWS services, including Amazon S3, AWS Lambda, and Amazon Redshift where applicable.
• Optimize Snowflake performance, including clustering, partitioning, query tuning, and warehouse sizing.
• Implement data quality checks, logging, monitoring, and error-handling frameworks.
• Work closely with Data Engineers, BI teams, and business stakeholders to gather requirements and translate them into technical solutions.
• Ensure best practices around data governance, security, and compliance.

Required Skills & Qualifications:
• Strong hands-on experience with Snowflake (schema design, SnowSQL, Snowpipe, Streams & Tasks).
• 3+ years in Snowflake and 5+ years of experience in ETL development.
• Advanced proficiency in Python (Pandas, PySpark, API integrations).
• Experience building DAGs and scheduling pipelines using Apache Airflow.
• Solid experience with AWS services (S3, EC2, Lambda, IAM, CloudWatch).
• Strong SQL expertise and data modeling experience (Star/Snowflake schema).
• Experience with CI/CD pipelines and version control (Git).
• Knowledge of performance tuning and cost optimization strategies in Snowflake.

Thanks & Regards,
Rahul Yalamanchili | Technical Recruiter
Euclid Innovations Inc.
15720 Brixham Hill Avenue, Suite 201, Charlotte, NC 28277
Email: rahul.yalamanchili@euclidinnovations.com | Mobile: 980-246-0555
www.euclidinnovations.com

Euclid Innovations is an Equal Opportunity Employer. We do not discriminate against any applicant or employee based on race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, or any other legally protected characteristic. At Euclid Innovations, we embrace individuals of all abilities and strive to ensure that our hiring and interview processes are accessible and accommodating to meet the needs of all applicants.