Sr. Snowflake Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. Snowflake Data Engineer on a contract basis in Basildon, UK (Hybrid), requiring 10+ years of experience. Key skills include Snowflake, AWS, SQL, Python, data migration, and data warehousing.
🌎 - Country
United Kingdom
💱 - Currency
Β£ GBP
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
September 10, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Basildon, England, United Kingdom
-
🧠 - Skills detailed
#Airflow #Snowflake #Oracle #AWS (Amazon Web Services) #Lambda (AWS Lambda) #Python #Data Pipeline #Data Migration #StreamSets #Datasets #ETL (Extract, Transform, Load) #Data Quality #Migration #Data Modeling #Data Warehouse #Cloud #S3 (Amazon Simple Storage Service) #Storage #Redshift #SQL (Structured Query Language) #Data Analysis #Computer Science #dbt (data build tool) #RDBMS (Relational Database Management System) #Data Integration #Scala #Data Engineering
Role description
Role: Senior Snowflake Data Engineer
Location: Basildon, UK (Hybrid)
Hiring type: Contract
Experience: 10+ years

Responsibilities:
• Design, develop, and maintain robust data pipelines and ETL processes using Snowflake on AWS.
• Implement data warehousing solutions, ensuring efficient storage, retrieval, and transformation of large datasets.
• Collaborate with data analysts, data scientists, and other stakeholders to define and fulfil data requirements.
• Optimize the performance and scalability of the Snowflake data warehouse, ensuring high availability and reliability.
• Develop and maintain data integration solutions, ensuring seamless data flow between various sources and Snowflake.
• Monitor, troubleshoot, and resolve data pipeline issues, ensuring data quality and integrity.
• Stay up to date with the latest trends and best practices in data engineering and cloud technologies, including cloud services such as AWS.

Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field.
• 5+ years of experience in data engineering, with a strong focus on Snowflake and AWS.
• Proficiency in SQL, Python, and ETL tools (StreamSets, dbt, etc.).
• Hands-on experience with Oracle RDBMS.
• Experience migrating data to Snowflake.
• Experience with AWS services such as S3, Lambda, Redshift, and Glue.
• Strong understanding of data warehousing concepts and data modeling.
• Excellent problem-solving and communication skills, with a focus on delivering high-quality solutions.
• Understanding of, or hands-on experience with, orchestration solutions such as Airflow.
• Deep knowledge of key non-functional requirements such as availability, scalability, operability, and maintainability.
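For candidates wondering what the pipeline and data-quality responsibilities above look like in practice, here is a minimal, hedged sketch of an extract/transform/load step in Python. It is illustrative only: sqlite3 stands in for the Snowflake and Oracle systems the role actually targets, and the table and column names (`raw_orders`, `clean_orders`, `amount_gbp`) are invented for the example.

```python
import sqlite3

def extract(conn):
    """Pull raw order rows from the source table."""
    return conn.execute("SELECT id, amount FROM raw_orders").fetchall()

def transform(rows):
    """Data-quality step: drop invalid rows and convert pence to pounds."""
    return [(oid, amount / 100) for oid, amount in rows
            if amount is not None and amount >= 0]

def load(conn, rows):
    """Write cleaned rows into the warehouse-style target table."""
    conn.executemany("INSERT INTO clean_orders (id, amount_gbp) VALUES (?, ?)", rows)
    conn.commit()

# In-memory stand-in for the source system and the warehouse
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount INTEGER)")
conn.execute("CREATE TABLE clean_orders (id INTEGER, amount_gbp REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(1, 1999), (2, None), (3, 450)])

load(conn, transform(extract(conn)))
result = conn.execute("SELECT id, amount_gbp FROM clean_orders ORDER BY id").fetchall()
print(result)  # only the two valid rows survive the quality filter
```

In a real engagement the extract would read from Oracle or S3, the load would use Snowflake's bulk-loading paths rather than row inserts, and a scheduler such as Airflow would orchestrate the steps; the structure of the three functions is the part that carries over.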