Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Architect on a contract basis in London, offering a hybrid working model. Key skills include Snowflake, SQL, Python, and cloud platforms (AWS, Azure, GCP). Proven experience in data architecture and governance is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
-
🗓️ - Date discovered
June 3, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Airflow #Synapse #AWS (Amazon Web Services) #Kafka (Apache Kafka) #Looker #Data Orchestration #S3 (Amazon Simple Storage Service) #Cloud #ML (Machine Learning) #Visualization #Delta Lake #Redshift #Data Modeling #Automation #BI (Business Intelligence) #Observability #Data Lake #Scripting #AI (Artificial Intelligence) #ETL (Extract, Transform, Load) #Snowflake #Version Control #BigQuery #SQL (Structured Query Language) #Git #Python #Compliance #Data Governance #Lambda (AWS Lambda) #Data Quality #Tableau #Scala #Data Engineering #Metadata #Data Architecture #Azure #Data Management #Data Science #dbt (data build tool) #Microsoft Power BI #Security #GCP (Google Cloud Platform)
Role description
Location: London
Contract Type: Contract
Working Model: Hybrid

Overview:
We are seeking an experienced Data Architect to join our growing data team and lead the design and implementation of scalable, secure, and high-performance data solutions. You'll play a key role in architecting modern data platforms using Snowflake, SQL, Python, and leading cloud technologies to support advanced analytics, reporting, and machine learning initiatives across the business.

Key Responsibilities:
• Design and maintain end-to-end data architectures, data models, and pipelines in Snowflake and cloud platforms (AWS, Azure, or GCP).
• Develop and optimize scalable ELT/ETL processes using SQL and Python (see the sketches after this description).
• Define data governance, metadata management, and security best practices.
• Collaborate with data engineers, analysts, product managers, and stakeholders to understand data needs and translate them into robust architectural solutions.
• Oversee data quality, lineage, and observability initiatives.
• Recommend and implement performance tuning for large-scale data sets.
• Ensure platform scalability, cost-efficiency, and system reliability.

Required Skills & Experience:
• Proven experience as a Data Architect or Senior Data Engineer working on cloud-native data platforms.
• Strong hands-on experience with Snowflake: data modeling, performance tuning, security configuration, and data sharing.
• Proficiency in SQL for complex querying, optimization, and stored procedures.
• Strong Python skills for data transformation, scripting, and automation.
• Experience with cloud platforms such as AWS (e.g., S3, Redshift, Lambda), Azure (e.g., Data Factory, Synapse), or GCP (e.g., BigQuery, Cloud Functions).
• Familiarity with data orchestration tools (e.g., Airflow, dbt) and version control (Git).
• Solid understanding of data governance, security, and compliance frameworks.

Nice to Have:
• Experience with data lake architectures (Delta Lake, Lakehouse).
• Familiarity with BI/visualization tools (Tableau, Power BI, Looker).
• Knowledge of streaming data tools (Kafka, Kinesis).
• Background in supporting ML/AI pipelines or data science environments.
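As an illustration of the SQL-plus-Python ELT work the responsibilities describe, here is a minimal sketch of a single ELT step against Snowflake using the snowflake-connector-python package. It is not part of the posting: the credential environment variables, the warehouse and database names, the @raw_stage stage, and the RAW_EVENTS/DAILY_EVENTS tables are all hypothetical placeholders.

```python
# Minimal ELT sketch: load raw JSON into Snowflake, then transform in SQL.
# All object names below (TRANSFORM_WH, ANALYTICS, RAW_EVENTS, DAILY_EVENTS,
# @raw_stage) are hypothetical, chosen only for illustration.
import os

import snowflake.connector


def run_elt_step() -> None:
    # Credentials are read from the environment rather than hard-coded,
    # in line with the security practices the role calls for.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",
        database="ANALYTICS",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        # E/L: copy staged JSON files into a table with a single VARIANT
        # column v (schema-on-read), assuming @raw_stage already exists.
        cur.execute(
            "COPY INTO RAW_EVENTS FROM @raw_stage "
            "FILE_FORMAT = (TYPE = 'JSON')"
        )
        # T: transform inside the warehouse with plain SQL -- the defining
        # trait of ELT, as opposed to transforming before the load.
        cur.execute(
            """
            CREATE OR REPLACE TABLE DAILY_EVENTS AS
            SELECT DATE_TRUNC('day', v:ts::timestamp) AS day,
                   v:event::string                    AS event,
                   COUNT(*)                           AS n
            FROM RAW_EVENTS
            GROUP BY 1, 2
            """
        )
    finally:
        conn.close()


if __name__ == "__main__":
    run_elt_step()
```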
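A step like this would typically be scheduled by an orchestrator such as Airflow, which the posting lists among the data orchestration tools. Below is a minimal Airflow 2.x sketch; the DAG id, the schedule, and the elt_job module wrapping run_elt_step above are assumptions, not part of the role description.

```python
# Minimal Airflow DAG sketch scheduling the hypothetical ELT step above.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

from elt_job import run_elt_step  # hypothetical module holding the sketch above

with DAG(
    dag_id="daily_events_elt",        # hypothetical DAG id
    start_date=datetime(2025, 6, 1),
    schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="run_elt_step", python_callable=run_elt_step)
```

In practice the transform layer might live in dbt models triggered from the DAG instead, but a single PythonOperator keeps the sketch self-contained.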