

Azure Snowflake Data Engineer | Boston, MA (Onsite) | Contract
Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Snowflake Data Engineer in Boston, MA, on a 6-12+ month contract. Requires 8+ years in data engineering, 4-6+ years with Snowflake and Azure, advanced SQL, and Python skills. Hybrid work model.
Country
United States
Currency
$ USD
-
Day rate
-
Date discovered
July 16, 2025
Project duration
More than 6 months
-
Location type
Hybrid
-
Contract type
W2 Contractor
-
Security clearance
Unknown
-
Location detailed
Boston, MA
-
Skills detailed
#Computer Science #Datasets #Data Processing #Delta Lake #ADF (Azure Data Factory) #Azure #ETL (Extract, Transform, Load) #Databricks #Data Pipeline #Scala #Azure Blob Storage #Data Manipulation #SQL (Structured Query Language) #Synapse #Data Ingestion #Storage #Security #Azure Data Factory #Automation #dbt (data build tool) #Snowflake #Data Lake #Airflow #Azure Data Platforms #Batch #Apache Airflow #Data Engineering #Scripting #SnowPipe #Data Analysis #Python
Role description
Job Title: Azure Snowflake Data Engineer
Location: Boston, MA | Hybrid (4 days per week onsite)
Duration: 6-12+ Months Contract | W2/C2C
Job Summary: We are looking for a highly skilled Azure Snowflake Data Engineer to join our data engineering team on a contract basis. This role will focus on building robust, scalable data solutions using Snowflake and Azure, enabling real-time and batch data processing pipelines. The ideal candidate will be well-versed in modern data engineering practices and Snowflake-specific capabilities such as Time Travel, Zero Copy Cloning, and Snowpipe.
Key Responsibilities
• Design and implement end-to-end data pipelines using Azure Data Services and Snowflake.
• Develop scalable ELT/ETL frameworks and manage large datasets efficiently.
• Implement Snowpipe for continuous data ingestion and streaming.
• Leverage Time Travel and Zero Copy Cloning features for data versioning, recovery, and testing.
• Optimize query performance and storage usage within Snowflake.
• Collaborate with data analysts, engineers, and business stakeholders to understand data requirements.
• Ensure data reliability, governance, and security across platforms.
• Monitor, troubleshoot, and maintain existing data workflows and infrastructure.
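For candidates less familiar with the Snowflake-specific features named above, the following is a minimal sketch of the SQL they correspond to, built as strings in Python. The table, stage, and pipe names (`orders`, `raw_stage`, `orders_pipe`) are hypothetical placeholders, and the file format is an assumption.

```python
# Sketch of the Snowflake statements behind the responsibilities above.
# Object names (orders, orders_dev, raw_stage, orders_pipe) are hypothetical.

def time_travel_query(table: str, offset_seconds: int) -> str:
    """Query a table as it existed offset_seconds ago (Snowflake Time Travel)."""
    return f"SELECT * FROM {table} AT(OFFSET => -{offset_seconds})"

def zero_copy_clone(source: str, target: str) -> str:
    """Create a zero-copy clone: a metadata-only copy, no storage duplicated."""
    return f"CREATE TABLE {target} CLONE {source}"

def snowpipe_ddl(pipe: str, table: str, stage: str) -> str:
    """Define a Snowpipe for continuous ingestion from an external stage."""
    return (
        f"CREATE PIPE {pipe} AUTO_INGEST = TRUE AS "
        f"COPY INTO {table} FROM @{stage} FILE_FORMAT = (TYPE = 'PARQUET')"
    )

print(time_travel_query("orders", 3600))
print(zero_copy_clone("orders", "orders_dev"))
print(snowpipe_ddl("orders_pipe", "orders", "raw_stage"))
```

Time Travel enables point-in-time recovery and testing against historical data; cloning gives instant dev/test copies; Snowpipe covers the continuous-ingestion responsibility.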
Required Skills
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 8+ years of experience in data engineering or analytics roles.
• 4-6+ years of hands-on experience with Snowflake and Azure data platforms.
• Snowflake: Advanced expertise including Time Travel, Zero Copy Cloning, and Snowpipe.
• Azure Data Services: Azure Data Factory, Azure Synapse, Azure Data Lake, Azure Blob Storage.
• Strong proficiency in SQL for data manipulation and optimization.
• Experience with Python for scripting and automation.
• Solid understanding of data warehousing, data lakes, and distributed systems.
Nice To Have
• Experience with dbt, Apache Airflow, or similar orchestration tools.
• Familiarity with CI/CD in data engineering workflows.
• Knowledge of Delta Lake or Databricks is a plus.