

PRACYVA
Senior Data Engineer - Snowflake
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer - Snowflake; the contract length and pay rate are unspecified. Key skills required include Snowflake, SQL, Python, and data integration with Oracle. Experience in data architecture and ETL/ELT processes is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date
October 7, 2025
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Version Control #Data Governance #SQLAlchemy #Monitoring #GIT #SQL (Structured Query Language) #Documentation #Security #Data Pipeline #Data Integration #Data Science #SnowSQL #Data Lake #Data Vault #ML (Machine Learning) #Snowflake #Deployment #Data Architecture #Libraries #SQL Queries #ETL (Extract, Transform, Load) #Normalization #Pandas #Data Lifecycle #Oracle #Databases #Vault #Batch #Complex Queries #Data Engineering #Dimensional Modelling #Informatica #Scala #SnowPipe #Python #Data Wrangling #Model Deployment #Datasets #Data Quality
Role description
Data Architecture & Platform Development
• Design, architect, and implement scalable data solutions using the Snowflake platform, including data warehousing, data lakes, and hybrid architectures
• Develop and maintain robust data pipelines that integrate Oracle databases with Snowflake, ensuring efficient data movement and transformation
• Optimize Snowflake features including materialized views, streams, tasks, and resource monitors to maximize performance and cost-efficiency
• Establish and maintain data integration patterns between Oracle sources and Snowflake using appropriate tools (Informatica) and methodologies (change data capture (CDC), batch processing, real-time streaming), as sketched below
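For illustration, here is a minimal sketch of the streams-and-tasks CDC pattern using the Snowflake Python connector. The connection parameters, the ETL_WH warehouse, and the STG_ORDERS / DIM_ORDERS tables are hypothetical placeholders, not details of this role's environment.

import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",    # assumed warehouse name
    database="ANALYTICS",  # assumed database
    schema="STAGING",      # assumed schema
)
cur = conn.cursor()

# The stream records inserts/updates/deletes landed in the staging table
# (e.g. by Informatica or Snowpipe) since it was last consumed.
cur.execute("CREATE STREAM IF NOT EXISTS STG_ORDERS_STREAM ON TABLE STG_ORDERS")

# The task merges captured changes into the target on a schedule;
# SYSTEM$STREAM_HAS_DATA skips runs when there is nothing to process.
cur.execute("""
CREATE TASK IF NOT EXISTS MERGE_ORDERS_TASK
  WAREHOUSE = ETL_WH
  SCHEDULE = '15 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('STG_ORDERS_STREAM')
AS
  MERGE INTO DIM_ORDERS d
  USING STG_ORDERS_STREAM s ON d.ORDER_ID = s.ORDER_ID
  WHEN MATCHED THEN UPDATE SET d.STATUS = s.STATUS, d.UPDATED_AT = s.UPDATED_AT
  WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS, UPDATED_AT)
    VALUES (s.ORDER_ID, s.STATUS, s.UPDATED_AT)
""")

# Tasks are created suspended; resuming starts the schedule.
cur.execute("ALTER TASK MERGE_ORDERS_TASK RESUME")
conn.close()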
Data Engineering & Pipeline Development
• Collaborate with cross-functional teams including data scientists, analysts, business stakeholders, and product managers to understand and document data requirements
• Design and develop efficient, reliable ETL/ELT data pipelines ensuring data quality, integrity, and security throughout the entire data lifecycle
• Implement data validation, monitoring, and alerting mechanisms to proactively identify and resolve data quality issues (a minimal validation sketch follows this list)
• Build automated data workflows and orchestration processes to minimize manual intervention and ensure timely data availability
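A minimal sketch of the post-load validation gate mentioned above, assuming the snowflake-connector-python cursor API; the table argument and the ORDER_ID business key are hypothetical. In an orchestrated pipeline, the raised exception fails the run and triggers the scheduler's alerting (email, Slack, and so on).

def run_quality_checks(conn, table: str) -> None:
    """Raise if a freshly loaded table is empty or has NULL business keys.

    conn is an open snowflake.connector connection.
    """
    cur = conn.cursor()

    # Check 1: the load actually produced rows.
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    row_count = cur.fetchone()[0]
    if row_count == 0:
        raise ValueError(f"{table}: load produced 0 rows")

    # Check 2: the business key is never NULL.
    cur.execute(f"SELECT COUNT(*) FROM {table} WHERE ORDER_ID IS NULL")
    null_keys = cur.fetchone()[0]
    if null_keys:
        raise ValueError(f"{table}: {null_keys} rows with NULL ORDER_ID")

    print(f"{table}: {row_count} rows, all quality checks passed")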
Technical Expertise & Best Practices
• Demonstrate strong expertise in data modelling including dimensional modelling, data vault, and normalization techniques appropriate for both OLTP and OLAP systems
• Perform data wrangling, cleansing, and transformation to prepare data for analytical and operational use cases
• Conduct performance tuning and optimization of SQL queries, stored procedures, and data pipelines in both Oracle and Snowflake environments
• Implement data governance, access controls, and security best practices including role-based access control (RBAC), data masking, and encryption
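As an illustration of the masking and RBAC point above, a minimal sketch using a Snowflake masking policy; the PII_READER role and the CUSTOMERS.EMAIL column are hypothetical placeholders.

MASKING_DDL = """
CREATE MASKING POLICY IF NOT EXISTS EMAIL_MASK AS (val STRING)
RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_READER') THEN val  -- privileged role sees clear text
    ELSE '***MASKED***'                             -- everyone else sees a token
  END
"""

def apply_email_masking(conn) -> None:
    # conn: an open snowflake.connector connection with policy-admin rights
    cur = conn.cursor()
    cur.execute(MASKING_DDL)
    # Attach the policy to the column; enforcement then follows RBAC:
    # only sessions running as PII_READER see unmasked values.
    cur.execute(
        "ALTER TABLE CUSTOMERS MODIFY COLUMN EMAIL "
        "SET MASKING POLICY EMAIL_MASK"
    )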
Analytics & Advanced Applications
• Support data analytics initiatives by creating optimized data models and aggregations that enable efficient reporting and analysis
• Collaborate with data science teams to prepare and structure data for machine learning workflows, including feature engineering and model deployment pipelines (a pandas-based sketch follows this list)
• Develop and maintain documentation for data models, pipeline architectures, data dictionaries, and operational procedures
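A minimal sketch of staging features for an ML workflow with the connector's pandas helpers (fetch_pandas_all and write_pandas, which require the connector's pandas extras); the FACT_ORDERS source and the derived features are hypothetical.

from snowflake.connector.pandas_tools import write_pandas

def build_order_features(conn) -> None:
    # conn: an open snowflake.connector connection
    cur = conn.cursor()
    cur.execute("SELECT CUSTOMER_ID, ORDER_TS, AMOUNT FROM FACT_ORDERS")
    df = cur.fetch_pandas_all()  # result set as a pandas DataFrame

    # Simple per-customer aggregates as illustrative features.
    features = (
        df.groupby("CUSTOMER_ID")
          .agg(ORDER_COUNT=("AMOUNT", "size"),
               TOTAL_SPEND=("AMOUNT", "sum"),
               LAST_ORDER_TS=("ORDER_TS", "max"))
          .reset_index()
    )

    # write_pandas bulk-loads the DataFrame through an internal stage.
    write_pandas(conn, features, "CUSTOMER_ORDER_FEATURES",
                 auto_create_table=True)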
Technical Skills
• Expert-level proficiency in SQL with demonstrated experience writing complex queries, optimizing performance, and working with large datasets
• Strong hands-on experience with Snowflake platform including SnowSQL, Snowpipe, Streams, Tasks, and Time Travel features
• Proficiency in Python for data engineering tasks (bonus: experience with libraries such as pandas, SQLAlchemy, Snowflake Connector; a short SQLAlchemy sketch follows this list)
• Familiarity with version control systems (Git) and CI/CD practices for data pipelines
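A minimal sketch of the pandas-plus-SQLAlchemy route, assuming the snowflake-sqlalchemy package; every connection value and the FACT_ORDERS table are placeholders.

import os

import pandas as pd
from snowflake.sqlalchemy import URL
from sqlalchemy import create_engine

engine = create_engine(URL(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    database="ANALYTICS",  # assumed database
    schema="MARTS",        # assumed schema
    warehouse="ETL_WH",    # assumed warehouse
))

# pandas reads straight from the engine, keeping ad-hoc analysis and
# pipeline code on one connection configuration.
daily_revenue = pd.read_sql(
    "SELECT ORDER_DATE, SUM(AMOUNT) AS REVENUE "
    "FROM FACT_ORDERS GROUP BY ORDER_DATE",
    engine,
)
print(daily_revenue.head())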
Soft Skills
• Strong analytical and problem-solving abilities with attention to detail
• Excellent communication skills with the ability to translate technical concepts for non-technical stakeholders
• Ability to work collaboratively in cross-functional teams and manage multiple priorities
• Self-motivated with a continuous learning mindset to stay current with emerging technologies