Vedan Technologies

Snowflake Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Data Engineer in Boston, MA, on a contract of unspecified duration. The position offers a competitive pay rate and requires strong experience in Snowflake, SQL, ETL/ELT tools, and cloud platforms (AWS/Azure/GCP).
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
April 7, 2026
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Boston, MA
🧠 - Skills detailed
#Programming #Data Modeling #Data Analysis #Data Quality #Data Engineering #GIT #Security #Snowflake #GCP (Google Cloud Platform) #API (Application Programming Interface) #Scala #Azure #Matillion #Version Control #dbt (data build tool) #Data Warehouse #Data Pipeline #Informatica #Airflow #Talend #ETL (Extract, Transform, Load) #Data Integration #AWS (Amazon Web Services) #Cloud #Java #Python #JSON (JavaScript Object Notation) #SQL (Structured Query Language)
Role description
Job Title: Snowflake Data Engineer
Location: Boston, MA (local candidates only)
Contract Role

Job Summary:
We are seeking a skilled Snowflake Data Engineer to design, build, and optimize scalable data pipelines and data warehouse solutions using Snowflake. The ideal candidate will have strong experience in cloud data platforms, ETL/ELT processes, and data modeling, with a focus on delivering high-quality, reliable data solutions.

Key Responsibilities:
• Design, develop, and maintain data pipelines using Snowflake
• Implement ETL/ELT workflows for ingesting data from multiple sources
• Optimize Snowflake performance, including query tuning and cost management
• Build and maintain data models (star/snowflake schemas)
• Work with structured and semi-structured data such as JSON and Parquet (see the Python sketch after this list)
• Integrate Snowflake with cloud platforms such as AWS, Azure, or GCP
• Ensure data quality, integrity, and security best practices
• Collaborate with data analysts, data scientists, and business stakeholders
• Automate workflows using orchestration tools such as Airflow (see the DAG sketch after this list)
• Monitor and troubleshoot data pipelines and system issues

Required Skills & Qualifications:
• Strong experience with Snowflake Data Warehouse
• Advanced proficiency in SQL
• Experience with ETL/ELT tools (e.g., Informatica, Talend, dbt, Matillion)
• Hands-on experience with cloud platforms (AWS / Azure / GCP)
• Programming knowledge in Python, Scala, or Java
• Understanding of data modeling techniques
• Experience with data integration and API-based ingestion
• Familiarity with CI/CD pipelines and version control (Git)
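
To give candidates a flavor of the semi-structured data work listed above, here is a minimal sketch using the snowflake-connector-python package to land a JSON document in a VARIANT column and query it with Snowflake's path notation. The connection parameters and the RAW_EVENTS table are illustrative assumptions, not details from this role.

```python
# Minimal sketch: landing JSON into Snowflake and querying it as semi-structured
# data. Assumes the snowflake-connector-python package; all names below (account,
# database, table) are hypothetical placeholders, not details from this posting.
import json

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder Snowflake account identifier
    user="my_user",
    password="my_password",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()

    # VARIANT columns hold semi-structured data (JSON, Avro, Parquet, etc.).
    cur.execute("CREATE TABLE IF NOT EXISTS RAW_EVENTS (payload VARIANT)")

    # Insert one JSON document; PARSE_JSON converts the string into VARIANT.
    event = {"user_id": 42, "action": "login", "meta": {"ip": "10.0.0.1"}}
    cur.execute(
        "INSERT INTO RAW_EVENTS SELECT PARSE_JSON(%s)",
        (json.dumps(event),),
    )

    # Path notation drills into the VARIANT; :: casts give typed output.
    cur.execute(
        """
        SELECT payload:user_id::INT    AS user_id,
               payload:action::STRING  AS action,
               payload:meta.ip::STRING AS ip
        FROM RAW_EVENTS
        """
    )
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
```

In production the per-row INSERT would typically be replaced by a bulk COPY INTO from a stage; the INSERT here just keeps the sketch self-contained.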
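
And for the orchestration bullet, a minimal Airflow DAG sketch using the Snowflake provider package. The dag_id, connection ID, schedule, and SQL are all hypothetical, and the `schedule` parameter assumes Airflow 2.4 or later.

```python
# Minimal sketch: one scheduled ELT step that runs entirely inside Snowflake.
# Assumes apache-airflow >= 2.4 with apache-airflow-providers-snowflake installed;
# the connection "snowflake_default" and all table names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="snowflake_daily_elt",
    start_date=datetime(2026, 1, 1),
    schedule="@daily",  # run once per day
    catchup=False,
) as dag:
    load_events = SnowflakeOperator(
        task_id="load_daily_events",
        snowflake_conn_id="snowflake_default",
        # ELT style: the transform executes in the warehouse, so Airflow
        # workers only submit SQL and track task state.
        sql="""
            INSERT INTO ANALYTICS.CURATED.EVENTS
            SELECT payload:user_id::INT,
                   payload:action::STRING,
                   CURRENT_TIMESTAMP()
            FROM ANALYTICS.RAW.RAW_EVENTS
        """,
    )
```

Keeping the transform in SQL this way is also what makes the cost-management responsibility tractable: warehouse size and query profile, not the orchestrator, determine spend.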