

Ampstek
Senior Data Engineer (Python Enterprise Developer)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer (Python Enterprise Developer) on an Inside IR35 contract (length unspecified), based in London, UK (hybrid, 3 days onsite/week). It requires 8+ years of experience with Python, SQL, AWS, and Azure; energy trading domain expertise is preferred. Pay rate is "competitive."
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 3, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Git #ETL #DataPipeline #AzureDevOps #SQLQueries #Automation #Pandas #AWSS3 #S3 #Jenkins #BigData #Agile #Scripting #Databricks #Libraries #SQL #Leadership #Azure #Deployment #Programming #ApacheAirflow #Airflow #Scala #Spark #NumPy #DataEngineering #AWSLambda #Python #AWS #Cloud #DevOps #Scrum #PostgreSQL #DataProcessing #AzureDatabricks
Role description
About Us:
AmpsTek – a global technology leader since 2013 – is transforming how businesses approach technology and staffing solutions. Founded by seasoned technology leaders across the UK, Europe, APAC, North America, and LATAM, and with registered offices in 30+ countries, we deliver exceptional service, scalable solutions, and measurable impact.
With a portfolio of 200+ clients and millions of users across web and mobile platforms, we empower businesses to innovate, grow, and succeed.
Join our team and be part of a dynamic, growth-oriented organization that values talent, creativity, and results.
Role: Senior Data Engineer (Python Enterprise Developer)
Location: London, UK (Hybrid, 3 days onsite/week)
Contract: Inside IR35
Must-have skills:
- Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow
- Preferred domain: energy trading
Minimum relevant experience: 8+ years
Detailed Job Description:
• 8+ years of experience in Python scripting.
• Exposure to Python packages such as NumPy, pandas, Polars, Beautiful Soup, Selenium, and Requests.
• Experience building native Python ETL pipelines for scraping and processing.
• Experience processing big data is good to have.
• Proficient in SQL programming and PostgreSQL; Spark knowledge is good to have.
• Knowledge of DevOps CI/CD tools such as Azure Pipelines, Jenkins, and Git.
• Experience working with AWS services (S3, Lambda, etc.) and Azure Databricks.
• Experience delivering projects with Agile and Scrum methodologies.
• Able to coordinate with teams across multiple locations and time zones.
• Strong interpersonal and communication skills, with the ability to lead a team and keep it motivated.
• Core stack: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow.
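As a rough illustration of the "native Python ETL pipeline" skill the requirements describe, the sketch below shows a minimal transform step with pandas: normalising raw records, coercing bad values, and aggregating. The record shape and field names are invented for the example and are not from this posting.

```python
import pandas as pd

def transform(raw: list[dict]) -> pd.DataFrame:
    """Normalise raw trade records and aggregate volume per commodity.

    Illustrative only: the `commodity`/`volume` schema is a hypothetical
    stand-in for whatever the real scraped feed contains.
    """
    df = pd.DataFrame(raw)
    # Coerce non-numeric volumes to NaN, then drop those rows.
    df["volume"] = pd.to_numeric(df["volume"], errors="coerce")
    df = df.dropna(subset=["volume"])
    # Aggregate: total volume per commodity.
    return df.groupby("commodity", as_index=False)["volume"].sum()

trades = [
    {"commodity": "power", "volume": "100"},
    {"commodity": "gas", "volume": "50"},
    {"commodity": "power", "volume": "bad"},  # dropped during cleaning
]
result = transform(trades)
print(result)
```

In a production pipeline of the kind the role describes, the extract step would typically use Requests/Beautiful Soup and the load step would write to PostgreSQL or S3.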
Key Responsibilities:
• Lead the design, development, and optimization of scalable data pipelines, data processing frameworks, and enterprise-grade Python applications.
• Provide technical leadership in Python development, including best practices for automation, data processing, and integration workflows.
• Work extensively with Python and data-centric libraries such as NumPy, pandas, BeautifulSoup, Selenium, pdfplumber, and Requests.
• Architect and optimize SQL queries and database solutions, with strong proficiency in PostgreSQL and other relational data stores.
• Integrate DevOps practices using CI/CD pipelines, Jenkins, and Git, and ensure smooth deployment of data solutions.
• Develop and manage cloud-based data workflows on platforms such as AWS (S3) and Azure Databricks.
• Drive agile delivery by collaborating with cross-functional teams under Agile/Scrum methodologies.
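The "load" side of the responsibilities above (relational stores, upserts) can be sketched with an idempotent insert. This example uses sqlite3 purely as a self-contained stand-in for PostgreSQL; the `ON CONFLICT ... DO UPDATE` upsert syntax shown is shared by both. The table and column names are hypothetical.

```python
import sqlite3

def load(rows: list[tuple[str, float]]) -> sqlite3.Connection:
    """Upsert aggregated rows into a relational store.

    sqlite3 in-memory DB stands in for PostgreSQL here; in production the
    connection would come from psycopg or similar.
    """
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE volumes (commodity TEXT PRIMARY KEY, volume REAL)")
    # Idempotent load: re-running with a newer value updates in place.
    conn.executemany(
        "INSERT INTO volumes (commodity, volume) VALUES (?, ?) "
        "ON CONFLICT(commodity) DO UPDATE SET volume = excluded.volume",
        rows,
    )
    conn.commit()
    return conn

conn = load([("power", 100.0), ("gas", 50.0), ("power", 120.0)])
rows = conn.execute(
    "SELECT commodity, volume FROM volumes ORDER BY commodity"
).fetchall()
print(rows)
```

Making the load step idempotent like this is what lets an orchestrator such as Apache Airflow safely retry a failed task without duplicating data.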





