

LTIMindtree
Senior Azure Data Factory Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Azure Data Factory Engineer with a contract length of "X months" at a pay rate of "$X/hour." Key skills include advanced SQL, Python, PowerShell, and experience with Azure Data Factory and Snowflake. Certifications in Azure or Snowflake are preferred.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 20, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Azure Data Factory #Migration #Compliance #Data Engineering #ADLS (Azure Data Lake Storage) #Data Ingestion #Data Processing #Data Cleansing #Storage #ADF (Azure Data Factory) #Clustering #Terraform #XML (eXtensible Markup Language) #YAML (YAML Ain't Markup Language) #Data Quality #Scala #Logging #Azure SQL #Kubernetes #DevOps #Physical Data Model #Cloud #SQL Queries #Automation #JSON (JavaScript Object Notation) #Metadata #Azure Function Apps #SQL (Structured Query Language) #Security #Data Lineage #Data Management #Indexing #ETL (Extract, Transform, Load) #Datasets #Azure #Docker #Databricks #Deployment #Infrastructure as Code (IaC) #AI (Artificial Intelligence) #Python #Data Migration #Azure DevOps #SharePoint #Databases #Snowflake
Role description
Role: Sr Azure Data Factory Engineer
• Data Engineering & Processing
• Design and implement scalable data ingestion pipelines from diverse sources, including APIs, SharePoint, on-premises systems, and file-based sources
• Perform data cleansing, validation, and transformation to produce high-quality, reliable datasets
• Develop and maintain data migration and archival strategies, ensuring accuracy, integrity, and compliance
• Build and optimise logical and physical data models
• Handle diverse data structures and formats, including BAK, MDF, CSV, JSON, XML, and Parquet
• Automation & Orchestration
• Automate ingestion and processing workflows using Python, PowerShell, and orchestration tools
• Apply an automation-first mindset, including experience integrating AI agents for workflow automation
• Cloud Data Platforms (Azure)
• Build and maintain solutions using Azure Data Factory, Blob Storage, ADLS, and Azure SQL
• Implement data lineage, cataloguing, and governance capabilities
• Data Quality, Security & Compliance
• Oversee data quality frameworks, ensuring accuracy, consistency, and integrity
• Implement audit logging, data lineage, and compliance practices
• Maintain strong security and governance controls
• Performance & Scalability
• Optimise pipelines, SQL queries, and storage layers
• Troubleshoot performance issues across compute and storage
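The cleansing and validation duties above can be sketched in plain Python; the field names, casting rules, and quarantine behaviour here are illustrative assumptions, not taken from the posting:

```python
from datetime import datetime

# Hypothetical raw feed: strings straight off a file-based source
RAW = [
    {"id": "1", "amount": "42.50", "ts": "2026-01-20"},
    {"id": "2", "amount": "n/a",   "ts": "2026-01-21"},  # bad amount -> quarantined
    {"id": "1", "amount": "42.50", "ts": "2026-01-20"},  # exact duplicate -> dropped
]

def cleanse(rows):
    """Type-cast, validate, and de-duplicate raw records."""
    seen, clean, rejects = set(), [], []
    for row in rows:
        try:
            rec = (int(row["id"]),
                   float(row["amount"]),
                   datetime.strptime(row["ts"], "%Y-%m-%d").date())
        except ValueError:
            rejects.append(row)   # quarantine invalid records for review
            continue
        if rec in seen:
            continue              # drop exact duplicates
        seen.add(rec)
        clean.append(rec)
    return clean, rejects

clean, rejects = cleanse(RAW)
```

In a real pipeline the rejects would land in a quarantine zone (e.g. a separate ADLS container) rather than an in-memory list, so failed records remain auditable.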
Must-Have Skills (Data-focused Essentials)
• Strong experience in ingestion, migration, archival, pipeline automation, and large-scale data processing
• Advanced SQL expertise, including stored procedures, indexing, and performance tuning
• Hands-on experience with Python and PowerShell for ETL/ELT and automation
• Experience with Azure Data Factory, ADLS, Blob Storage, and Azure SQL
• Expertise in data modelling (logical and physical), quality frameworks, and optimisation
• Ability to work with structured, semi-structured, and unstructured data formats
• Strong knowledge of audit logging, lineage, cataloguing, metadata management, and security
• Demonstrated automation mindset, including use of AI agents
• Hands-on experience with, or a strong understanding of, modern cloud data warehousing, including Snowflake fundamentals such as virtual warehouses, micro-partitioning, query optimisation, and role-based access control
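The kind of indexing and tuning work the SQL bullet describes can be demonstrated in miniature with SQLite's query planner (the table and index names are made up for illustration; Azure SQL exposes the same idea through execution plans):

```python
import sqlite3

# Toy table standing in for a pipeline's target store
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, user_id INTEGER, ts TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i, i % 100, "2026-01-20") for i in range(1000)],
)

query = "SELECT * FROM events WHERE user_id = 7"

# Without an index the planner must scan every row
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]

conn.execute("CREATE INDEX idx_events_user ON events(user_id)")

# With the index the planner seeks directly to the matching rows
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]

print(before)  # a SCAN over the whole table
print(after)   # a SEARCH using idx_events_user
```

Reading the plan before and after an index change is the basic loop behind the performance-tuning expectation in this role.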
Good-to-Have Skills (Infrastructure & Advanced Platforms)
• Infrastructure & DevOps
• Experience with Terraform, Azure DevOps, YAML pipelines, and cloud automation
• Exposure to Azure Function Apps, serverless compute, and orchestration
• Understanding of infrastructure as code and cloud deployment patterns
• Exposure to Docker and Kubernetes
• Advanced Data Platforms & Tools
• Deep or hands-on exposure to Snowflake, including:
• Creating and managing Snowflake objects (databases, schemas, roles)
• Using Snowpipe for automated ingestion
• Performance tuning using clustering, caching, and micro-partitioning
• Understanding Snowflake cost optimisation and storage/compute separation
• Experience with Databricks for advanced data engineering workflows
• Familiarity with Denodo or Microsoft Purview
• Certifications
• Azure or Snowflake certifications are strong advantages
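A minimal sketch of the "Azure DevOps YAML pipelines + Terraform" combination mentioned above, using only plain script steps; the `infra/` directory and branch name are assumptions for illustration:

```yaml
# Illustrative Azure DevOps pipeline: plan Terraform changes on every
# push, apply them only from the main branch.
trigger:
  branches:
    include: [main]

pool:
  vmImage: ubuntu-latest

steps:
  - script: |
      terraform -chdir=infra init
      terraform -chdir=infra plan -out=tfplan
    displayName: Plan infrastructure changes
  - script: terraform -chdir=infra apply -auto-approve tfplan
    displayName: Apply plan
    condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
```

Real setups would typically add remote state configuration and a service connection for Azure credentials rather than running with local state.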
