Snowflake Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Junior Data Engineer/Snowflake Developer on a 6-12 month remote contract, offering competitive pay. Key skills include Snowflake, AWS, Python, and complex SQL. Requires a Bachelor's degree and 4+ years in IT, with 1+ year in Snowflake.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
August 7, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#SQL (Structured Query Language) #SnowPipe #DataStage #Data Quality #Data Mart #SQL Queries #Data Ingestion #Database Performance #SnowSQL #Talend #Deployment #Data Analysis #Data Integration #Python #AWS (Amazon Web Services) #Teradata #Migration #Data Engineering #ETL (Extract, Transform, Load) #Informatica #Programming #Scala #Data Extraction #Snowflake #Computer Science #Automation
Role description
Job Role: Jr Data Engineer / Snowflake Developer
Location: Remote
Duration: 6-12 Months Contract

Job Summary:
We are seeking a highly motivated Junior Data Engineer / Junior Snowflake Developer with a strong foundation in data engineering concepts and hands-on experience in Snowflake, AWS, and Python. The ideal candidate will have a good grasp of ETL pipelines, complex SQL development, and Snowflake's advanced features. You will be a key player in designing, developing, and maintaining scalable data solutions that enable reliable and efficient analytics.

Key Responsibilities:
• Develop and maintain scalable ETL pipelines from various source systems to staging and data marts.
• Work with Snowflake utilities, SnowSQL, and SnowPipe to develop stored procedures and handle data ingestion and processing tasks.
• Leverage advanced Snowflake features such as Resource Monitors, Role-Based Access Control (RBAC), Zero-Copy Cloning, Virtual Warehouses, Streams, and Tasks.
• Write, optimize, and debug complex SQL queries for data extraction, transformation, and loading.
• Perform database performance tuning to support fast and reliable reporting.
• Work closely with senior engineers and stakeholders to analyze, migrate, cleanse, and validate data for analytics and reporting purposes.
• Support CI/CD pipeline automation and manage deployments using industry-standard tools.
• Perform data quality checks including mismatch identification, validation, import/export, and transformation using ETL tools such as Informatica, Talend, Teradata, or DataStage.
• Collaborate across teams to ensure efficient data integration and availability in Snowflake for downstream systems.

Required Qualifications:
• Bachelor's degree or foreign equivalent in Computer Science, Information Technology, or a related field.
• 4+ years of total experience in Information Technology.
• 1+ year of hands-on experience with Snowflake development, including SnowSQL, SnowPipe, stored procedures, and performance tuning.
• 3+ years of experience with AWS services and Python programming.
• Strong proficiency in writing and understanding complex SQL queries.
• Knowledge of CI/CD tools and automation best practices.
• Solid experience in data analysis, data validation, cleansing, and migration using ETL tools (e.g., Informatica, Talend, Teradata, DataStage).