

The Brixton Group
Snowflake Data Engineer - Snowpark
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Data Engineer - Snowpark, lasting 9+ months, with a pay rate of $80-100/hr W2. It requires 7+ years of data engineering experience, 3+ years with Snowflake, and 2+ years with Snowpark (Python). The role is 100% remote but limited to U.S. candidates in the Eastern or Central time zones.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
800
-
🗓️ - Date
March 3, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Florham Park, NJ
-
🧠 - Skills detailed
#GCP (Google Cloud Platform) #ETL (Extract, Transform, Load) #Azure #Data Quality #SQL (Structured Query Language) #Snowflake #Data Engineering #Scala #AWS (Amazon Web Services) #Python #Snowpark #SnowPipe #Consulting #Airflow #Data Pipeline #Datasets #Cloud
Role description
Duration: 9+ months
Pay: $80-100/hr W2
Location: 100% REMOTE - East Coast U.S. only (Must reside in Eastern or Central Time Zone)
We are seeking a Senior Data Engineer with hands-on Snowpark (Python) experience to build and optimize scalable data pipelines within Snowflake. This role will support pharma/life sciences client initiatives and requires strong technical execution in a consulting environment.
Requirements:
• 7+ years of experience as a Data Engineer.
• 3+ years of hands-on Snowflake production experience.
• 2+ years of production experience with Snowpark (Python), specifically Snowpark in Snowflake, not general Python alone.
• Experience implementing ingestion workflows using Snowpipe.
• Strong SQL skills in Snowflake.
• Experience building ELT pipelines directly in Snowflake (using Tasks, Streams, or Snowpipe).
• Experience in client-facing or consulting environments.
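The Snowpipe / Streams / Tasks pattern named in the requirements can be sketched as follows. This is an illustrative outline only, not part of the posting: it composes the three Snowflake DDL statements from Python, and every object name (SALES_PIPE, RAW_SALES, RAW_STAGE, etc.) is hypothetical.

```python
# Illustrative sketch of the ELT-in-Snowflake pattern from the requirements:
# Snowpipe for continuous ingestion, a Stream for change capture, and a Task
# for scheduled transformation. All object names are hypothetical.

def snowpipe_ddl(pipe: str, table: str, stage: str) -> str:
    # Snowpipe continuously copies newly staged files into a landing table
    return (
        f"CREATE PIPE {pipe} AUTO_INGEST = TRUE AS "
        f"COPY INTO {table} FROM @{stage} FILE_FORMAT = (TYPE = 'JSON');"
    )

def stream_ddl(stream: str, table: str) -> str:
    # A Stream tracks rows landed by the pipe (change data capture)
    return f"CREATE STREAM {stream} ON TABLE {table};"

def task_ddl(task: str, warehouse: str, schedule: str, sql: str) -> str:
    # A Task runs the transformation on a schedule, completing the ELT loop
    return (
        f"CREATE TASK {task} WAREHOUSE = {warehouse} "
        f"SCHEDULE = '{schedule}' AS {sql}"
    )

ddl = [
    snowpipe_ddl("SALES_PIPE", "RAW_SALES", "RAW_STAGE"),
    stream_ddl("RAW_SALES_STREAM", "RAW_SALES"),
    task_ddl(
        "LOAD_DAILY_SALES", "XFORM_WH", "5 MINUTE",
        "INSERT INTO DAILY_SALES SELECT * FROM RAW_SALES_STREAM;",
    ),
]
print("\n".join(ddl))
```

In a real engagement these statements would be executed and versioned against the client's Snowflake account; the Task body would typically be Snowpark (Python) logic rather than a bare INSERT.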
Preferred:
• Pharma / Life Sciences industry experience.
• Experience working with analytics or commercial datasets.
• Cloud platform experience (AWS, Azure, or GCP).
• Experience with orchestration tools (Airflow, etc.).
Responsibilities:
• Develop scalable ELT pipelines in Snowflake.
• Build data transformation logic using Snowpark (Python).
• Implement and manage ingestion workflows using Snowpipe.
• Optimize Snowflake warehouse performance and query tuning.
• Ensure data quality and reliability.
• Collaborate directly with client stakeholders and project teams.
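As one illustration of the data-quality responsibility above, a minimal row-level validation gate might look like the following. This is a plain-Python sketch with hypothetical field names; in practice the same checks would usually run inside Snowflake via Snowpark or SQL.

```python
# Minimal sketch of a data-quality gate: quarantine rows with a missing key
# or a negative amount before they reach a reporting table. Field names
# ("sale_id", "amount") are hypothetical, not from the posting.

def validate_rows(rows: list[dict]) -> tuple[list[dict], list[dict]]:
    good, bad = [], []
    for row in rows:
        if row.get("sale_id") is None or row.get("amount", 0) < 0:
            bad.append(row)   # quarantine for review
        else:
            good.append(row)  # safe to load downstream
    return good, bad

rows = [
    {"sale_id": 1, "amount": 120.0},
    {"sale_id": None, "amount": 50.0},   # missing key -> quarantined
    {"sale_id": 3, "amount": -5.0},      # negative amount -> quarantined
]
good, bad = validate_rows(rows)
print(len(good), len(bad))  # 1 good row, 2 quarantined
```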
26-00261





