

Hexaware Technologies
Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with a contract length of "unknown" and a pay rate of "unknown," located in "unknown." Requires 10+ years in Java and Python, AWS expertise, and experience in financial services data systems.
Country: United Kingdom
Currency: £ GBP
Day rate: Unknown
Date: April 24, 2026
Duration: Unknown
Location: Unknown
Contract: Unknown
Security: Unknown
Location detailed: London Area, United Kingdom
Skills detailed: #Data Pipeline #Data Quality #AWS (Amazon Web Services) #AWS Glue #Data Engineering #dbt (data build tool) #Snowflake #Lambda (AWS Lambda) #Athena #Data Management #Monitoring #Apache Iceberg #Quality Assurance #Apache Airflow #Airflow #Python #Programming #Java #PostgreSQL #Cloud #Spark (Apache Spark) #Metadata #RDBMS (Relational Database Management System)
Role description
Join a dynamic team shaping the future of financial services by building and maintaining modern data platforms. This role involves collaborating with business and technology partners to deliver impactful solutions, leveraging cutting-edge tools and technologies. You will contribute to the architectural design, development, and maintenance of data platforms, ensuring performance, stability, and alignment with business needs.
Key Responsibilities:
• Architect and Develop: Contribute to the platform's architectural design and build integration, modeling, data persistence, and analytical systems.
• Data Pipelines: Implement, maintain, and test robust data pipelines.
• Metadata Management: Develop and manage metadata processes and tools.
• Performance Monitoring: Ensure the stability and performance of data pipelines.
• Data Quality: Implement tools for data curation, metadata management, and quality assurance.
• Collaboration: Engage with business and technology teams to align the platform with organizational goals.
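For orientation only (this sketch is not part of the role description): the pipeline and data-quality responsibilities above could look roughly like the following minimal Python example. All function names (`extract`, `validate`, `load`) and the validation rule are hypothetical stand-ins; a real platform at this scale would use the orchestration and warehouse tools listed under Preferred Technical Skills (Airflow, Spark, Snowflake, etc.).

```python
# Hypothetical sketch of one pipeline step with a data-quality gate.
# Names and the validation rule are illustrative, not from the posting.

def extract():
    # Stand-in for reading records from a source system.
    return [{"trade_id": 1, "amount": 100.0},
            {"trade_id": 2, "amount": -5.0}]

def validate(rows):
    # Data-quality rule (illustrative): amounts must be non-negative.
    # Failing rows are quarantined rather than silently dropped.
    good = [r for r in rows if r["amount"] >= 0]
    quarantined = [r for r in rows if r["amount"] < 0]
    return good, quarantined

def load(rows):
    # Stand-in for writing validated records to the warehouse;
    # returns the number of rows loaded.
    return len(rows)

good, quarantined = validate(extract())
loaded = load(good)
```

In practice each of these steps would be a task in an orchestrator such as Airflow, with monitoring on the quarantine volume, which ties the "Data Pipelines", "Data Quality", and "Performance Monitoring" bullets together.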
Preferred Technical Skills:
• Programming: 10+ years of experience in Java and Python development.
• Cloud Expertise: Strong understanding of AWS services (e.g., Lambda, Step Functions, ECS).
• Data Platforms: Hands-on experience with Snowflake and data stack technologies like Apache Iceberg and Spark.
• Workflow Orchestration: Exposure to tools like Apache Airflow, Prefect, Dagster, or dbt.
• Data Services: Familiarity with AWS Glue, Lake Formation, EMR, EventBridge, Athena, and similar services.
• Metadata Tools: Experience with tools like Amundsen, Atlas, DataHub, OpenDataDiscovery, or Marquez.
• RDBMS: Knowledge of PostgreSQL is a plus.
• Industry Experience: Proven experience building enterprise-wide data and analytics systems, preferably in financial services or asset management.






