GuruSchools LLC

Sr Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr Data Engineer in New York City, on a contract basis with potential for extension. Key skills include PySpark, Snowflake, and modern data warehousing. Requires 5+ years of experience and advanced SQL proficiency. Immediate start preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 3, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, NY
-
🧠 - Skills detailed
#Data Mart #ETL (Extract, Transform, Load) #Azure #React #Data Processing #AWS (Amazon Web Services) #Data Security #PySpark #Data Modeling #AI (Artificial Intelligence) #Snowflake #Python #Compliance #Data Engineering #Scala #Data Pipeline #Security #Data Quality #Cloud #GCP (Google Cloud Platform) #Spark (Apache Spark) #SQL (Structured Query Language)
Role description
Position Title: Sr Data Engineer

Role: Senior Data Engineer
Location: New York City (Onsite)
Work Arrangement: Onsite
Duration: Contract, potential for extension

Key Highlights / Must-Haves
• Hands-on experience with PySpark
• Experience with Snowflake Cortex AI
• Strong knowledge of modern data warehousing practices
• Recent experience with React.js highly advantageous

Role Summary
The client is seeking a Senior Python and Snowflake Data Engineer to design, build, and optimize scalable data pipelines, implement Snowflake data models, and deliver end-to-end data engineering solutions. The role involves collaborating with cross-functional teams to improve data quality, performance, and the secure delivery of data products in complex enterprise environments.
Note: The client has an urgent need and is seeking candidates who can start immediately.

Key Responsibilities
• Design, build, and deploy high-performance data pipelines and backend services
• Implement and optimize Snowflake data models and data marts
• Deliver end-to-end data engineering solutions, including ingestion, processing, and delivery
• Optimize performance, scalability, and cost of data workflows
• Collaborate with cross-functional teams to support complex business requirements
• Support front-end applications with React.js where applicable (preferred)

Required Skills & Expertise
• 5+ years in data engineering or backend development
• Strong hands-on experience with Python, PySpark, and Snowflake
• Proven expertise in Snowflake data modeling, performance tuning, and optimization
• Experience with Snowflake Cortex AI or similar AI-enabled data platform capabilities
• Solid understanding of modern data warehousing concepts (dimensional modeling, ELT/ETL, optimization strategies)
• Advanced SQL skills for designing scalable data models
• Experience delivering cloud-based end-to-end data engineering solutions (AWS, Azure, GCP)
• Working knowledge of React.js for supporting data-driven front-end applications
• Strong understanding of distributed data processing frameworks
• Knowledge of data security, governance, and compliance best practices
• Proven ability to collaborate effectively in complex enterprise environments