

Senior Data Engineer with Snowflake and Databricks.
⭐ Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with Snowflake and Databricks in Seattle, WA, for 9-12 months. Requires 12+ years of experience, 2+ years with Snowflake and Databricks, strong Python and SQL skills, and expertise in data architecture and security.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: September 30, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Seattle, WA
Skills detailed: #Snowflake #Security #Batch #Data Quality #Data Architecture #Data Pipeline #Data Security #Automation #Scala #Databricks #Data Engineering #Python #SQL (Structured Query Language) #Programming #ML (Machine Learning) #AI (Artificial Intelligence) #Microservices #NoSQL #Data Science #Data Modeling #Cloud
Role description
Role: Senior Data Engineer with Snowflake and Databricks.
Location: Seattle, WA
Duration: 9-12 months
Rates: DOE
Job Responsibilities:
• Develop data engineering solutions: design and build data pipelines that support the specific data needs of Product, business, and Data Science teams (a brief illustrative sketch follows this list).
• Design and develop data architectures across on-premises, cloud, and hybrid platforms.
• Collaborate with cross-functional teams (including product, security, privacy, operations, and partners) to understand data needs, define and deliver features, model data using relational and NoSQL systems, and develop pipelines that are scalable, repeatable, secure, and aligned with data quality standards.
• Drive automation and optimization of data workflows using AI/ML techniques.
• Mentor other team members as they build their data engineering skill sets.
• Assist team management in defining projects, including helping to estimate, plan, and scope work.
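For illustration only, here is a minimal sketch of the kind of batch pipeline described above, assuming a Databricks-style PySpark environment. The source path, quality rules, and target table name are hypothetical placeholders, not details from this posting.

```python
# Minimal batch-pipeline sketch (PySpark on Databricks); all paths and names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_batch_pipeline").getOrCreate()

# Ingest a hypothetical raw batch extract.
raw = spark.read.format("json").load("/mnt/raw/orders/2025-09-30/")

# Basic data-quality gate: drop rows missing a key and rows with non-positive amounts.
clean = (
    raw.dropna(subset=["order_id", "customer_id"])
       .filter(F.col("order_amount") > 0)
       .withColumn("ingested_at", F.current_timestamp())
)

# Land the curated batch as a Delta table for downstream Product / Data Science use.
clean.write.format("delta").mode("append").saveAsTable("analytics.curated_orders")
```

The same pattern would extend to event-driven ingestion via Spark Structured Streaming or to a Snowflake target via a Snowflake connector; those variants are omitted here.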
Work Experience:
• 12+ years of experience designing and building scalable data systems, including both batch and event-driven data pipelines.
• 2+ years of experience with Snowflake and Databricks.
• Strong programming skills in Python and SQL, with experience building data tooling and APIs (see the sketch after this list).
• Solid understanding of distributed systems, data modeling, and microservices architecture.
• Familiarity with data security best practices, including encryption, access control, and privacy-forward design principles.
• Passion for solving complex problems, learning new technologies, and enabling AI innovation through robust data engineering.
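Likewise, a hedged sketch of the Python-plus-SQL tooling skills listed above: a simple freshness check run against Snowflake with the snowflake-connector-python package. The account, credentials, warehouse, schema, and table are all hypothetical placeholders.

```python
# Tiny data-quality tooling sketch using the Snowflake Python connector.
# Connection parameters and the table name are placeholders, not values from this posting.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="CURATED",
)

try:
    cur = conn.cursor()
    # Freshness check: fail loudly if nothing landed in the last 24 hours.
    cur.execute(
        "SELECT COUNT(*) FROM CURATED_ORDERS "
        "WHERE INGESTED_AT >= DATEADD(day, -1, CURRENT_TIMESTAMP())"
    )
    (recent_rows,) = cur.fetchone()
    if recent_rows == 0:
        raise RuntimeError("Freshness check failed: no rows ingested in the last 24 hours")
    print(f"Freshness check passed: {recent_rows} recent rows")
finally:
    conn.close()
```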