

Data Engineer – (Snowflake/Tableau/Airflow/DBT/GCP/Python/IAC)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This is a remote, UK-based Contract Data Engineer role focused on Snowflake, Airflow, and Tableau. Key skills include Snowflake, dbt, Python, and Infrastructure as Code. Experience in ETL and data quality testing is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
-
🗓️ - Date discovered
August 2, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Remote
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
United Kingdom
-
🧠 - Skills detailed
#GIT #Infrastructure as Code (IaC) #SQL (Structured Query Language) #PySpark #Tableau #C++ #GCP (Google Cloud Platform) #SnowPipe #Terraform #Automation #Airflow #Python #Scala #ETL (Extract, Transform, Load) #Metadata #Data Engineering #Data Analysis #dbt (data build tool) #Data Quality #Spark (Apache Spark) #Snowflake
Role description
I am hiring for the following role:
Job Title: Data Engineer – Snowflake | Airflow | Tableau
Location: UK Remote
Job Description:
We’re seeking a Contract Data Engineer to help modernize our data infrastructure and establish scalable, reliable data practices. You will be instrumental in enhancing platform performance, embedding engineering best practices, and enabling future growth.
Key Responsibilities:
• Upgrade Snowflake connectivity and implement modern authentication and secrets management
• Audit and refactor legacy ETL/Python jobs for reliability and maintainability
• Design and implement data quality tests (dbt tests) and alerting systems
• Optimize Airflow DAG schedules and compute resource allocation (an illustrative sketch of both follows this list)
• Document architecture and deliver handover sessions with data analysts
• Implement and scale DBT workflows to enforce data-as-code principles
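For context on the dbt-test and DAG-scheduling bullets above, here is a minimal sketch of how such a job might look in Airflow. It is an assumption-laden illustration, not the client's actual pipeline: the DAG id, dbt project path, schedule, and alert address are all hypothetical.

```python
# Minimal illustrative sketch (not from the listing): a daily Airflow DAG that
# builds dbt models, runs dbt tests against Snowflake, and emails on failure.
# The DAG id, project path, schedule, and alert address are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",
    "retries": 1,
    "retry_delay": timedelta(minutes=10),
    "email": ["data-alerts@example.com"],  # hypothetical alert address
    "email_on_failure": True,              # simple failure alerting
}

with DAG(
    dag_id="dbt_quality_checks",           # hypothetical DAG id
    schedule_interval="0 6 * * *",         # off-peak daily run
    start_date=datetime(2025, 1, 1),
    catchup=False,
    max_active_runs=1,                     # cap concurrent compute usage
    default_args=default_args,
) as dag:
    # Build models first, then run dbt's schema/data tests.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/project --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/project --target prod",
    )
    dbt_run >> dbt_test
```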
Key Skills Required:
• Snowflake, DBT, SQL Optimization
• Airflow (GCP Composer), Python 3, Git-based CI/CD
• Infrastructure as Code (Terraform/Pulumi)
• Tableau extract automation & governance (see the sketch after this list)
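As a rough illustration of the Tableau extract automation skill, the sketch below queues an extract refresh through the tableauserverclient library. It is not taken from the role; the server URL, site, personal access token, and datasource name are placeholders.

```python
# Minimal illustrative sketch (not from the listing): queueing a Tableau extract
# refresh via the tableauserverclient REST client. The server URL, site, token,
# and datasource name are hypothetical placeholders.
import tableauserverclient as TSC

auth = TSC.PersonalAccessTokenAuth(
    token_name="extract-refresh-bot",       # hypothetical PAT name
    personal_access_token="REDACTED",
    site_id="analytics",                    # hypothetical site
)
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Find the published datasource by name, then queue an async refresh job.
    opts = TSC.RequestOptions()
    opts.filter.add(TSC.Filter(
        TSC.RequestOptions.Field.Name,
        TSC.RequestOptions.Operator.Equals,
        "sales_extract",                    # hypothetical datasource name
    ))
    datasources, _ = server.datasources.get(opts)
    job = server.datasources.refresh(datasources[0])
    print(f"Queued extract refresh job {job.id}")
```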
Nice-to-Have Skills:
• Streaming ingestion (Pub/Sub → Snowpipe); see the sketch at the end of this listing
• PySpark performance tuning
• Experience with Metadata/Lineage tools (DataHub, OpenMetadata)
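Finally, a minimal sketch of the producer side of the Pub/Sub → Snowpipe nice-to-have, assuming the google-cloud-pubsub client. The downstream Snowpipe configuration is out of scope here, and every name shown is hypothetical.

```python
# Minimal illustrative sketch (not from the listing): the producer side of a
# Pub/Sub -> Snowpipe flow, publishing JSON events that a downstream
# Snowpipe / Snowpipe Streaming loader would ingest into Snowflake.
# The project, topic, and payload shape are hypothetical.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "order-events")  # hypothetical

event = {"order_id": 12345, "status": "shipped"}  # example payload
future = publisher.publish(
    topic_path,
    data=json.dumps(event).encode("utf-8"),  # Pub/Sub payloads are bytes
    source="orders-service",                 # optional string attribute
)
print(f"Published message {future.result()}")  # .result() returns the message id
```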