

PRIMUS Global Solutions (PRIMUS UK & Europe)
Snowflake Developer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer – Snowflake Developer on a contract basis, remote (UK-based), offering a competitive pay rate. It requires 5–6 years of experience and strong skills in Snowflake, Python, Unix scripting, and cloud platforms (Azure/AWS/GCP).
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date
October 15, 2025
🕒 - Duration
Unknown
🏝️ - Location
Remote
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#Data Architecture #Scala #Data Science #Programming #Data Quality #SQL (Structured Query Language) #Deployment #Data Engineering #GCP (Google Cloud Platform) #ETL (Extract, Transform, Load) #SnowPipe #Metadata #Automation #AWS (Amazon Web Services) #Data Modeling #Scripting #Snowflake #Data Pipeline #Agile #Cloud #Unix #React #SnowSQL #Azure #Python #Data Integration #BI (Business Intelligence)
Role description
🚀 Hiring: Data Engineer – Snowflake Developer | Remote (London / UK)
We are seeking an experienced Data Engineer (Snowflake Developer) with strong expertise in Python, Unix scripting, and Cloud Data Platforms (Azure / AWS / GCP). The ideal candidate will design, build, and maintain scalable data pipelines and implement best practices in Snowflake development, data modeling, and ETL/ELT workflows.
🔹 Role: Data Engineer – Snowflake Developer
🔹 Location: London (Remote – UK Based)
🔹 Experience: 5–6+ years
🔹 Mode: Contract
🧠 Key Responsibilities
• Develop and maintain Snowflake SQL scripts, stored procedures, and data pipelines.
• Utilize Snowflake features such as Snowpipe, SnowSQL, Tasks, Streams, Time Travel, the query optimizer, and metadata management (see the sketch after this list).
• Perform ETL/ELT development using Python, Unix scripts, and Cloud tools.
• Design and implement data architecture and data models for enterprise analytics.
• Collaborate with data science, BI, and analytics teams to build custom data integrations.
• Conduct SQL query optimization, testing, and troubleshooting for performance.
• Manage data flows, ensure data quality, and enable seamless data sharing across systems.
• Work within Agile frameworks, contributing to design, development, and deployment cycles.
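For illustration only (not part of the job specification): a minimal sketch of the Streams + Tasks change-capture pattern named above, written against the snowflake-connector-python client. The account details, warehouse, and the raw.orders / analytics.orders objects are hypothetical placeholders, not details taken from this role.

```python
# Hypothetical sketch: deploy a Streams + Tasks merge pipeline via
# snowflake-connector-python. All names and credentials are placeholders.
import os
import snowflake.connector

DDL_STATEMENTS = [
    # Stream captures inserts/updates on the raw table since its last offset.
    "CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders",
    # Task wakes every 5 minutes and only runs when the stream has data.
    """
    CREATE OR REPLACE TASK raw.merge_orders_task
      WAREHOUSE = etl_wh
      SCHEDULE = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('raw.orders_stream')
    AS
      MERGE INTO analytics.orders AS t
      USING raw.orders_stream AS s
        ON t.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET t.status = s.status, t.updated_at = s.updated_at
      WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
        VALUES (s.order_id, s.status, s.updated_at)
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK raw.merge_orders_task RESUME",
]

def main() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ETL_WH",
        database="ANALYTICS_DB",
        role="ETL_ROLE",
    )
    try:
        cur = conn.cursor()
        for ddl in DDL_STATEMENTS:
            cur.execute(ddl)  # run each statement in order
    finally:
        conn.close()

if __name__ == "__main__":
    main()
```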
💻 Technical Skills Required
• Strong hands-on experience with Snowflake Development.
• Expertise in Snowflake SQL, data modeling, and ETL/ELT.
• Experience with Python, Unix scripting, and Cloud Data Platforms (Azure / AWS / GCP) — a minimal Python load sketch follows this list.
• Proficiency in data warehousing concepts, performance tuning, and automation.
• Familiarity with object-oriented programming, design patterns, and Agile methodologies.
• Exposure to ReactJS (added advantage).
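Illustrative only, assuming the same snowflake-connector-python client as above: a compact Python load step (PUT to an internal stage, then COPY INTO) of the kind such ETL/ELT pipelines typically automate from a Unix shell script or scheduler. The stage @orders_stage, target table raw.orders, and file path are invented for the example.

```python
# Hypothetical sketch: stage a local CSV and load it into Snowflake.
# Paths, stage, and table names are placeholders.
import os
import snowflake.connector

LOCAL_FILE = "/data/exports/orders.csv"  # hypothetical export path

def load_orders() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ETL_WH",
        database="ANALYTICS_DB",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # PUT uploads and compresses the file into an internal named stage.
        cur.execute(f"PUT file://{LOCAL_FILE} @orders_stage AUTO_COMPRESS=TRUE")
        # COPY INTO performs the set-based load from the stage.
        cur.execute(
            """
            COPY INTO raw.orders
            FROM @orders_stage
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"')
            ON_ERROR = 'ABORT_STATEMENT'
            """
        )
        # Basic data-quality check: fail loudly if nothing landed.
        cur.execute("SELECT COUNT(*) FROM raw.orders")
        if cur.fetchone()[0] == 0:
            raise RuntimeError("Load produced no rows in raw.orders")
    finally:
        conn.close()

if __name__ == "__main__":
    load_orders()
```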
✨ Must-Have Competencies
• 5–6 years of proven Data Engineering / Snowflake Developer experience.
• Strong analytical and problem-solving skills.
• Excellent communication and collaboration abilities.
• Ability to work independently in a remote, client-facing environment.
📢 Keywords / Hashtags
#DataEngineer #SnowflakeDeveloper #PythonDeveloper #SnowflakeSQL #ETLDeveloper #DataModeling #CloudDataEngineer #AzureDataEngineer #AWSEngineer #GCP #Unix #DataWarehouse #DataArchitecture #ELT #ETL #SnowPipe #SnowflakeJobs #RemoteJobs #UKJobs #LondonTech #HiringNow #ContractRole