Hexaware Technologies

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a contract length of "unknown" and a pay rate of "unknown," located in the London Area, United Kingdom. It requires 10+ years of Java and Python development experience, AWS expertise, and a background in financial services data systems.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 24, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Data Pipeline #Data Quality #AWS (Amazon Web Services) #AWS Glue #Data Engineering #dbt (data build tool) #Snowflake #Lambda (AWS Lambda) #Athena #Data Management #Monitoring #Apache Iceberg #Quality Assurance #Apache Airflow #Airflow #Python #Programming #Java #PostgreSQL #Cloud #Spark (Apache Spark) #Metadata #RDBMS (Relational Database Management System)
Role description
Join a dynamic team shaping the future of financial services by building and maintaining modern data platforms. This role involves collaborating with business and technology partners to deliver impactful solutions, leveraging cutting-edge tools and technologies. You will contribute to the architectural design, development, and maintenance of data platforms, ensuring performance, stability, and alignment with business needs.

Key Responsibilities:
• Architect and Develop: Contribute to the platform's architectural design and build integration, modeling, data persistence, and analytical systems.
• Data Pipelines: Implement, maintain, and test robust data pipelines.
• Metadata Management: Develop and manage metadata processes and tools.
• Performance Monitoring: Ensure the stability and performance of data pipelines.
• Data Quality: Implement tools for data curation, metadata management, and quality assurance.
• Collaboration: Engage with business and technology teams to align the platform with organizational goals.

Preferred Technical Skills:
• Programming: 10+ years of experience in Java and Python development.
• Cloud Expertise: Strong understanding of AWS services (e.g., Lambda, Step Functions, ECS).
• Data Platforms: Hands-on experience with Snowflake and data stack technologies such as Apache Iceberg and Spark.
• Workflow Orchestration: Exposure to tools such as Apache Airflow, Prefect, Dagster, or dbt.
• Data Services: Familiarity with AWS Glue, Lake Formation, EMR, EventBridge, Athena, and similar services.
• Metadata Tools: Experience with tools such as Amundsen, Atlas, DataHub, OpenDataDiscovery, or Marquez.
• RDBMS: Knowledge of PostgreSQL is a plus.
• Industry Experience: Proven experience building enterprise-wide data and analytics systems, preferably in financial services or asset management.
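To give candidates a flavour of the pipeline and data-quality work described above, here is a minimal, tool-agnostic Python sketch (not from the posting; the function and field names are hypothetical) of a pipeline stage that gates records on required fields before loading:

```python
"""Minimal sketch of a data-quality gate in a pipeline stage.

A stand-in for the curation/quality-assurance tooling the role describes;
in practice this logic would live in an Airflow/Prefect/Dagster task or a
dbt test rather than a standalone script.
"""

def quality_gate(records, required_fields):
    """Split records into (clean, rejected) based on required, non-null fields."""
    clean, rejected = [], []
    for record in records:
        if all(record.get(field) is not None for field in required_fields):
            clean.append(record)
        else:
            rejected.append(record)
    return clean, rejected


# Example: one well-formed trade record, one missing its amount.
rows = [
    {"trade_id": "T-1001", "amount": 250.0, "currency": "GBP"},
    {"trade_id": "T-1002", "amount": None, "currency": "GBP"},
]
clean, rejected = quality_gate(rows, ["trade_id", "amount", "currency"])
# clean holds only T-1001; T-1002 is routed aside for monitoring/repair.
```

Routing rejects aside (rather than failing the whole run) is one common design choice; it keeps the pipeline stable while surfacing quality issues for monitoring.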