

Cortex Consultants LLC
ETL Developer - 10-11 Years Experience Only
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an ETL Developer with 10-11 years of experience, focusing on Snowflake and AWS and based in Windsor, CT. It is a contract position with a strong emphasis on data engineering, SQL, Python, and AWS services; Snowflake certification is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 28, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Windsor, CT
-
🧠 - Skills detailed
#Programming #BI (Business Intelligence) #ETL (Extract, Transform, Load) #Cloud #Snowflake #Lambda (AWS Lambda) #SQL (Structured Query Language) #Data Governance #Apache Airflow #Scripting #Data Integration #Deployment #Security #Scala #S3 (Amazon Simple Storage Service) #Redshift #Data Processing #Data Architecture #Agile #Data Pipeline #Visualization #Tableau #GIT #Computer Science #Documentation #Scrum #Microsoft Power BI #Data Engineering #Data Quality #Version Control #Python #Data Modeling #Airflow #AWS (Amazon Web Services) #Data Ingestion #Databases
Role description
Position : ETL Developer (Snowflake & AWS)
Location : Windsor, CT
Job Type : Contract
Role Overview:
We are seeking a skilled ETL Developer with strong experience in Snowflake and AWS to design, develop, and maintain scalable data pipelines and ETL processes. The ideal candidate will have a deep understanding of cloud-based data warehousing, data integration, and transformation techniques, and will play a key role in enabling data-driven decision-making across the organization.
Key Responsibilities:
• Design, develop, and optimize ETL workflows and data pipelines using Snowflake and AWS services (a minimal illustrative sketch follows this list).
• Implement data ingestion from various sources including APIs, databases, and flat files.
• Ensure data quality, integrity, and consistency across all ETL processes.
• Collaborate with data architects, analysts, and business stakeholders to understand data requirements.
• Monitor and troubleshoot ETL jobs and performance issues.
• Automate data workflows and implement CI/CD practices for data pipeline deployment.
• Maintain documentation for ETL processes, data models, and data flow diagrams.
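For context, here is a minimal Python sketch of the kind of pipeline step these responsibilities describe: land a file in S3, then load it into Snowflake with COPY INTO. Every name (bucket, stage, table, warehouse, environment variables) is an illustrative placeholder and an assumption on our part, not a detail of this role.

```python
# Minimal sketch of one ETL step: stage a file in S3, then load it into
# Snowflake with COPY INTO. All names below are hypothetical placeholders.
import os

import boto3
import snowflake.connector


def load_orders_to_snowflake() -> None:
    # Upload the day's extract to S3 (the landing zone for raw data).
    s3 = boto3.client("s3")
    s3.upload_file("orders.csv", "example-etl-bucket", "landing/orders.csv")

    # Connect to Snowflake; credentials come from the environment, never code.
    conn = snowflake.connector.connect(
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # COPY INTO reads from an external stage pointing at the S3 bucket;
        # "@landing_stage" is assumed to be configured ahead of time.
        cur.execute(
            """
            COPY INTO raw.orders
            FROM @landing_stage/orders.csv
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
            """
        )
    finally:
        conn.close()


if __name__ == "__main__":
    load_orders_to_snowflake()
```

In practice the stage, file format, and credentials would be defined once by an administrator, and secrets would live in environment variables or a secrets manager rather than in code.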
Required Skills & Qualifications:
• Bachelor’s degree in Computer Science, Information Systems, or a related field.
• 9+ years of experience in ETL development and data engineering.
• Hands-on experience with Snowflake including data modeling, performance tuning, and SQL scripting.
• Proficiency in AWS services such as S3, Lambda, Glue, Redshift, and CloudWatch.
• Strong programming skills in Python or Scala for data processing.
• Experience with orchestration tools like Apache Airflow or AWS Step Functions (see the Airflow sketch after this list).
• Familiarity with version control systems (e.g., Git) and CI/CD pipelines.
• Excellent problem-solving and communication skills.
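As a point of reference for the orchestration requirement above, here is a minimal Apache Airflow sketch wiring extract, load, and validate steps into a daily pipeline. Task IDs, bodies, and the schedule are assumed placeholders; this illustrates DAG structure, not this team's actual workflow.

```python
# Minimal Apache Airflow sketch of an extract -> load -> validate pipeline.
# All task bodies and identifiers are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    ...  # pull from source APIs, databases, or flat files into S3


def load():
    ...  # COPY INTO Snowflake, as in the earlier sketch


def validate():
    ...  # row counts, null checks, and other data-quality assertions


with DAG(
    dag_id="orders_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # `schedule_interval` on Airflow versions before 2.4
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_validate = PythonOperator(task_id="validate", python_callable=validate)

    # Dependencies express the pipeline's execution order.
    t_extract >> t_load >> t_validate
```

AWS Step Functions expresses the same ordering as a state machine definition rather than Python code; either tool satisfies this requirement.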
Preferred Qualifications:
• Snowflake certification.
• Experience with data visualization tools (e.g., Tableau, Power BI).
• Knowledge of data governance and security best practices.
• Experience in Agile/Scrum development environments.