Saransh Inc

Lead Data Engineer (Snowflake, Airflow and Graph RAG) - W2 Role

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Engineer (Snowflake, Airflow, and Graph RAG) on a 12-month W2 contract, located in New York, NY / Westport, CT. Key skills include proficiency in Snowflake, Airflow, and Graph Data Structures, with a focus on AI integration.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 24, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Westport, CT
-
🧠 - Skills detailed
#Data Engineering #Data Catalog #Compliance #Data Management #AWS (Amazon Web Services) #AI (Artificial Intelligence) #Data Integration #Data Ingestion #Data Lineage #Snowflake #Knowledge Graph #Data Architecture #Scala #Data Quality #Security #Neo4J #Logical Data Model #ETL (Extract, Transform, Load) #Metadata #Airflow
Role description
Role: Lead Data Engineer (Snowflake, Airflow and Graph RAG)
Location: New York, NY / Westport, CT (On-site from Day 1)
Job Type: W2 Contract
Project Duration: 12 months

Requirements

Technical Expertise:
• Strong proficiency in Snowflake and Airflow (mandatory).
• Deep knowledge of GraphRAG and graph data structures.
• Strong background in data architecture and design, with experience in market data sources.

AI and Integration Experience:
• Experience integrating heavily AI-driven architectures.
• Ability to share knowledge with project and client teams on Cypher, graph, and RAG traversals.

Additional Skills:
• AWS experience is a plus.

Responsibilities

Data Integration and Management:
• Identify and integrate structured, unstructured, and semi-structured data sources.
• Implement RBAC and develop logical data models, ensuring compliance and data lineage.
• Document business transformations and establish data quality rules.
• Create a comprehensive data catalog and register data assets with metadata.

Design Implementation:
• Translate and implement the architectural blueprint for the Data Factory.
• Ensure robust data management and quality to support automated investment processes and AI-driven analytics.
• Implement a Neo4j-powered knowledge graph and ensure investment decision traceability.

Advanced Data Solutions:
• Develop and maintain a multi-dimensional asset ontology with temporal versioning.
• Incorporate GraphRAG to transform unstructured documents into queryable graph entities.

Security and Access Control:
• Implement property-based access control and oversee real-time data ingestion pipelines.

Data Architecture and Management:
• Oversee scalable data warehousing solutions using Snowflake.
• Develop frameworks for data quality and governance, including metadata registration and sensitivity/security considerations.

Note: Visa-independent candidates are preferred.
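For candidates unfamiliar with the GraphRAG work described above, a minimal sketch of the retrieval step may help: traverse a knowledge graph around a query entity and collect the triples an LLM would receive as context. This is an illustrative toy (the graph, entities, and relations below are invented for the example, not taken from the role), using a plain Python adjacency structure rather than Neo4j/Cypher.

```python
from collections import deque

# Toy knowledge graph: entity -> list of (relation, neighbor) edges.
# All entities and relations here are hypothetical illustrations.
GRAPH = {
    "AAPL": [("issued_by", "Apple Inc."), ("member_of", "S&P 500")],
    "Apple Inc.": [("headquartered_in", "Cupertino")],
    "S&P 500": [("tracked_by", "SPY")],
    "Cupertino": [],
    "SPY": [],
}

def graph_retrieve(start, max_hops=2):
    """Breadth-first traversal collecting (subject, relation, object)
    triples within max_hops of the start entity -- the retrieval step
    a GraphRAG pipeline would serialize into an LLM prompt as context."""
    seen = {start}
    frontier = deque([(start, 0)])
    triples = []
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue  # do not expand edges beyond the hop budget
        for relation, neighbor in GRAPH.get(node, []):
            triples.append((node, relation, neighbor))
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, depth + 1))
    return triples
```

A 2-hop retrieval from "AAPL" returns four triples, including the second-hop fact ("Apple Inc.", "headquartered_in", "Cupertino"); in a production setting the same traversal would be a bounded Cypher path query against Neo4j.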