

Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a 6-month contract, offering a pay rate of "X" per hour. Candidates need a Bachelor’s degree and 2 years of relevant experience, along with proficiency in Python, SQL, data modeling, and ETL tools.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
-
🗓️ - Date discovered
July 26, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Fixed Term
🔒 - Security clearance
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Data Engineering #Data Pipeline #Statistics #MySQL #PostgreSQL #Programming #Hadoop #AWS (Amazon Web Services) #Agile #Compliance #Data Modeling #Data Governance #ArangoDB #Big Data #Data Quality #Kafka (Apache Kafka) #R #Tableau #REST (Representational State Transfer) #API (Application Programming Interface) #Bash #Docker #Java #ETL (Extract, Transform, Load) #Graph Databases #SQL (Structured Query Language) #Data Extraction #NoSQL #Data Wrangling #Computer Science #Mathematics #MIS Systems (Management Information Systems) #Scrum #Data Mapping #Security #Cloud #ERWin #Batch #Python #Spark (Apache Spark) #Luigi #Scala #Scripting #Data Analysis #MongoDB #Data Cleansing #SaaS (Software as a Service) #Azure #Data Lake #Data Integration #Informatica #BI (Business Intelligence) #SSIS (SQL Server Integration Services) #Neo4J #Airflow #Linux #Puppet #Databases #Talend
Role description
Role Details
• Data Engineers will focus on the design and build-out of data models, codification of business rules, mapping of data sources (structured and unstructured) to the data models, engineering of scalable ETL pipelines, development of data quality solutions, and continuous evaluation of technologies to enhance the capabilities of the Data Engineer team and the broader Innovation group (see the business-rule sketch after this list)
• Minimum years of experience: 2, preferably as a data engineer, business systems analyst, data analyst, or in a similar role
• Minimum degree required: Bachelor's degree in one of the following: Accounting, Finance, Economics, Management Information Systems, Computer Science, Business Administration, Statistics, Mathematics, Regulatory Compliance, Science, Technology, Engineering, Mathematics, and/or another business field of study
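To make the business-rule codification and data quality duties above concrete, here is a minimal sketch in Python; the record fields, rule names, and thresholds are illustrative assumptions, not taken from the role.

```python
from dataclasses import dataclass
from typing import Callable

# A business rule codified as a named predicate over a record (a dict of field values).
@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]

# Hypothetical rules for an invoicing feed; field names and constraints are illustrative.
RULES = [
    Rule("amount_is_positive", lambda r: r.get("amount", 0) > 0),
    Rule("currency_is_three_letter_code",
         lambda r: isinstance(r.get("currency"), str) and len(r["currency"]) == 3),
    Rule("customer_id_present", lambda r: bool(r.get("customer_id"))),
]

def run_quality_checks(records: list[dict]) -> dict[str, int]:
    """Return the number of failing records per rule (a simple data quality report)."""
    failures = {rule.name: 0 for rule in RULES}
    for record in records:
        for rule in RULES:
            if not rule.check(record):
                failures[rule.name] += 1
    return failures

if __name__ == "__main__":
    sample = [
        {"amount": 120.0, "currency": "GBP", "customer_id": "C-001"},
        {"amount": -5.0, "currency": "GB", "customer_id": ""},
    ]
    print(run_quality_checks(sample))  # each rule fails once on the second record
```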
Technical skills required
• Object-oriented/object function scripting languages: Python, R, C/C++, Java, Scala, etc.
• Relational SQL, distributed SQL, and NoSQL databases:
• MSSQL, PostgreSQL, MySQL, etc. (relational)
• MemSQL, CrateDB, etc. (distributed SQL)
• MongoDB, Cassandra, etc. (NoSQL)
• Neo4j, AllegroGraph, ArangoDB, etc. (graph)
• Big data tools such as Hadoop, Spark, Kafka, etc.
• Data modeling tools such as ERWin, Enterprise Architect, Visio, etc.
• Data integration tools such as SSIS, Informatica, SnapLogic, etc.
• Data pipeline and workflow management tools such as Azkaban, Luigi, Airflow, etc. (see the Airflow sketch after this list)
• Business intelligence tools such as Tableau, PowerBI, Zoomdata, Pentaho, etc.
• Cloud technologies such as SaaS, IaaS, and PaaS within Azure, AWS, or Google Cloud
• Linux, and comfortable with Bash scripting
• Docker and Puppet
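As one illustration of the workflow management tools listed above, here is a minimal Airflow DAG sketch, assuming Airflow 2.4+; the DAG name, schedule, and task bodies are placeholders rather than anything specified by the role.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder callables standing in for real extract/transform/load logic.
def extract():
    print("pull source data")

def transform():
    print("apply business rules")

def load():
    print("write to the warehouse")

# A three-step daily ETL pipeline expressed as a DAG (Airflow 2.4+ syntax).
with DAG(
    dag_id="example_etl",            # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Dependency order: extract -> transform -> load.
    t_extract >> t_transform >> t_load
```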
Job Description
• Working knowledge of Python and experience with data extraction, data cleansing, and data wrangling
• Working knowledge of SQL and experience with relational databases
• Experience in codification of business rules and analytics in one of the programming languages listed above
• Experience working with business teams to capture and define data models and data flows to enable downstream analytics
• Experience with data modeling, data mapping, data governance, and the processes and technologies commonly used in this space
• Experience with data integration tools (e.g. Talend, SnapLogic, Informatica) and data warehousing/data lake tools
• Experience with systems development life cycles such as Agile and Scrum methodologies; and
• Demonstrated experience in API-based data acquisition and management (see the sketch after this list)
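For the API-based data acquisition point, a minimal sketch of paginated REST extraction with the requests library might look like the following; the endpoint URL, authentication scheme, and page parameters are hypothetical.

```python
import requests

BASE_URL = "https://api.example.com/v1/transactions"  # hypothetical endpoint
PAGE_SIZE = 100

def fetch_all(api_token: str) -> list[dict]:
    """Pull every page from a paginated REST endpoint and return the combined records."""
    headers = {"Authorization": f"Bearer {api_token}"}
    records: list[dict] = []
    page = 1
    while True:
        resp = requests.get(
            BASE_URL,
            headers=headers,
            params={"page": page, "per_page": PAGE_SIZE},
            timeout=30,
        )
        resp.raise_for_status()  # fail loudly on HTTP errors
        batch = resp.json()      # assumes the endpoint returns a JSON list of records
        if not batch:
            break                # an empty page means there is nothing left to fetch
        records.extend(batch)
        page += 1
    return records
```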
Skills Preferred
• Has built enterprise data pipelines and can craft code in SQL, Python, and/or R
• Has built batch data pipelines with relational and columnar database engines as well as Hadoop or Spark, and understands their respective strengths and weaknesses
• Ability to build scalable and performant data models
• Possesses strong computer science fundamentals: data structures, algorithms, programming languages, distributed systems, and information retrieval
• Experience with agile development processes
• Execution focused: knows how to get things done
• Possesses a keen analytical mind with attention to detail and accuracy
• Excellent verbal and written communication skills, with the ability to present technical and non-technical information to various audiences
• Excellent organization and prioritization skills, with a strong ability to multitask and switch focus as necessary to meet deadlines and/or changes in priorities
• Experience working with large data sets and deriving insights from data using various BI and data analytics tools
• Ability to think outside of the box to solve complex business problems
• Understanding of the security requirements for handling data both in motion and at rest, such as communication protocols, encryption, authentication, and authorization
• Understanding of graph databases and graph modeling (see the sketch below)
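As an illustration of the graph modeling point above, a minimal sketch with the official Neo4j Python driver could look like this; the connection details, node labels, and properties are hypothetical, and the code assumes the neo4j 5.x driver.

```python
from neo4j import GraphDatabase

# Hypothetical connection details; assumes a local Neo4j instance and the 5.x driver.
URI = "bolt://localhost:7687"
AUTH = ("neo4j", "password")

def add_ownership(tx, company: str, subsidiary: str, stake: float):
    """Model companies as nodes and ownership as an OWNS relationship with a stake property."""
    tx.run(
        "MERGE (a:Company {name: $company}) "
        "MERGE (b:Company {name: $subsidiary}) "
        "MERGE (a)-[r:OWNS]->(b) SET r.stake = $stake",
        company=company, subsidiary=subsidiary, stake=stake,
    )

driver = GraphDatabase.driver(URI, auth=AUTH)
with driver.session() as session:
    session.execute_write(add_ownership, "Acme Holdings", "Acme Analytics Ltd", 0.8)
driver.close()
```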