Wipro

AWS GEN AI Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AWS GEN AI Developer with a contract length of 1 year (extendable) and a pay rate of "competitive compensation." Key skills include expert programming in Python and Scala, experience with Big Data technologies, and familiarity with AWS.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 17, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Fixed Term
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Agile #Databricks #Programming #Big Data #Monitoring #Spring Boot #MongoDB #Automation #Azure #Airflow #Hadoop #Couchbase #NoSQL #Scala #SQL (Structured Query Language) #PostgreSQL #HBase #Java #Oracle #Data Accuracy #Snowflake #AI (Artificial Intelligence) #Python #Cloud #Deployment #Data Orchestration #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Data Processing #API (Application Programming Interface)
Role description
Wipro is seeking a skilled AWS GEN AI Developer. Are you passionate about building scalable data systems and driving automation? Join our dynamic team as a Senior Data Systems Engineer and help shape the future of data infrastructure and intelligent automation.

What You'll Do
• Design, develop, and maintain scalable and reliable data systems.
• Build and optimize ETL/ELT pipelines for high-volume data processing.
• Code, test, and debug data applications using Python and Scala.
• Implement self-service, self-healing, and automation frameworks.
• Configure and maintain monitoring tools to ensure system health.
• Triage and resolve incidents, analyze logs, and drive root cause analysis.
• Manage environment provisioning and deployment workflows.

What You Bring
Core Technical Skills:
• Expert-level programming in Python and Scala
• Strong experience with Big Data technologies: Spark, Flink, Hadoop
• Proficiency in SQL (Oracle, PostgreSQL) and NoSQL (MongoDB, Couchbase)
• Hands-on experience with cloud platforms: AWS, Azure
• Familiarity with data orchestration tools: Airflow, Prefect
• API development using Spring Boot and Core Java (J2EE, multithreading)
• Knowledge of AI and Generative AI applications
• Bonus: Experience with Fenergo, Snowflake, Databricks, and CI/CD pipelines

Soft Skills
• Excellent communication and stakeholder management
• Analytical mindset with a focus on data accuracy and integrity
• Agile team player with cross-functional collaboration experience

Why Join Us?
• Work on cutting-edge data infrastructure projects
• Collaborate with top-tier engineering and AI teams
• Flexible work environment and competitive compensation
• Opportunities for growth in AI, cloud, and automation domains

Please note this will be a fixed-term employment (1 year, extendable), and visa sponsorship/transfer is not provided for this role.