Lead Data Engineer (Hybrid in Boston, MA)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Engineer with a hybrid arrangement in Boston, MA, offering a contract of unspecified duration and competitive pay. It requires 3+ years of Snowflake experience, advanced SQL and Python skills, and expertise in data modeling.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
πŸ—“οΈ - Date discovered
September 13, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
πŸ“ - Location detailed
Boston, MA
🧠 - Skills detailed
#Snowflake #Data Engineering #Azure #Normalization #Agile #Dataflow #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Data Modeling #Data Pipeline #Physical Data Model #Python
Role description
Lead Data Engineer
This position is a hybrid work arrangement, requiring a minimum of 2-3 days per week in the Boston, MA office. The Lead Data Engineer will drive the transformation of our firm's data infrastructure, primarily using the Snowflake data environment along with the Azure data stack. You will partner with product owners, data owners, project managers, business users, data engineers, and infrastructure engineers to deliver complete end-to-end solutions. You will enjoy working in an evolving, fast-paced environment and bring a work style marked by high energy, flexibility, quick learning, and collaboration.
The ideal candidate has:
• A successful track record delivering data-platform projects of significant scale and complexity, including modern data platform implementations on Snowflake; 3+ years of Snowflake-specific experience required
• Expertise in developing, presenting, and implementing data models aligned with the needs of business applications
• Confidence and skill in building and communicating data-flow schematics understandable to both business and technical teams
• Perseverance, empathy, a "give-and-take" attitude, and respect for the inputs and contributions of others
What you will do:
• Implement and support Snowflake-based data pipelines with source data originating across a variety of technical environments
• Analyze business and technical requirements as the basis for delivering sophisticated conceptual, logical, and physical data models
• Prepare data-flow schematics at multiple levels of detail, indicating the pros and cons of alternative design patterns
• Design and build data validations, transformations, normalizations, reports and extracts, and integration processes using the Snowflake, Dagster, and Azure data stacks
• Create and implement CI/CD pipelines
• Organize work and adhere to thorough work tracking using Agile techniques
• Coordinate and contribute to the technical work of a team of 4-6 resources, including contractors
What you bring:
• Advanced SQL and Python knowledge
• Proficiency in Snowflake and Dagster, plus similar capabilities in the Azure data environment
• In-depth experience with data modeling tools and knowledge of sophisticated data modeling techniques and approaches
• A highly collaborative attitude and work style, welcoming the inputs of others
• Confidence and skill in documenting work items and explaining deliverables to enhance overall team productivity
• Capability to balance the demands of scope, schedule, and budget
• Demonstrable experience as a technical leader
Education Preferred:
• Bachelor's degree
Experience:
• 10-12 years