IPrime Info Solutions Inc.

Cloud Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Cloud Engineer (AWS Bedrock Engineer) with a contract length of "unknown" and a pay rate of "unknown." Key skills include Java, AWS Bedrock, microservices, and AI/ML integration. Experience with cloud-native architectures is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 1, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Seattle, WA
-
🧠 - Skills detailed
#Langchain #Observability #Hugging Face #Monitoring #Data Science #Scala #GraphQL #GitHub #Cloud #Security #Spring Boot #SQS (Simple Queue Service) #Java #AWS (Amazon Web Services) #Lambda (AWS Lambda) #Logging #Kafka (Apache Kafka) #Microservices #S3 (Amazon Simple Storage Service) #IAM (Identity and Access Management) #ML (Machine Learning) #Programming #Jenkins #Terraform #Deployment #Kubernetes #API (Application Programming Interface) #SNS (Simple Notification Service) #Model Deployment #AI (Artificial Intelligence) #Prometheus #NLP (Natural Language Processing) #EC2 #DynamoDB #SageMaker #Docker #GitLab
Role description
About the Role
We are seeking a highly skilled AWS Bedrock Engineer with strong Java expertise to design, develop, and deploy Generative AI solutions leveraging Amazon Bedrock and AWS cloud services. The ideal candidate will have hands-on experience in Java-based backend development, cloud-native architectures, and AI/LLM integrations.

Job Description
The AWS Bedrock Engineer will be responsible for designing and developing scalable AI-driven applications using AWS Bedrock and related services.

Responsibilities
• Design and develop scalable AI-driven applications using AWS Bedrock and related services.
• Build and integrate Java-based microservices to support AI/ML workloads.
• Work with Generative AI models (LLMs, RAG, embeddings, fine-tuning) using Bedrock APIs (see the illustrative sketch at the end of this description).
• Implement RESTful and GraphQL APIs to integrate AI services with enterprise applications.
• Ensure security, scalability, and performance optimization for AI-powered systems.
• Collaborate with data scientists, ML engineers, and architects to productionize AI models.
• Automate deployment pipelines using Terraform/CloudFormation and CI/CD tools.
• Implement monitoring, logging, and observability using CloudWatch, Prometheus, or OpenTelemetry.

Qualifications
• Strong programming experience in Java (8/11/17), Spring Boot, and microservices.
• Hands-on experience with AWS Bedrock (LLMs, embeddings, RAG, model deployment).
• Solid knowledge of AWS services: Lambda, S3, EC2, API Gateway, SageMaker, DynamoDB, CloudWatch, IAM.
• Experience with LLM/NLP frameworks (Hugging Face, LangChain, OpenAI APIs).
• Strong understanding of RESTful APIs, gRPC, and event-driven architectures (Kafka, SNS, SQS).
• Experience with Docker, Kubernetes (EKS), and Terraform/CloudFormation.
• Familiarity with CI/CD pipelines (Jenkins, GitHub Actions, GitLab CI/CD, or AWS CodePipeline).
• Knowledge of secure API design, JWT/OAuth2, and RBAC.

Pay Range and Compensation Package
Contract (W2).

Equal Opportunity Statement
We are committed to diversity and inclusivity.
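For candidates unfamiliar with calling Bedrock from Java, the sketch below shows roughly what the "Bedrock APIs" responsibility involves. It is a minimal, illustrative example only (not part of the employer's requirements), assuming the AWS SDK for Java v2 bedrockruntime module is on the classpath and credentials/region come from the default provider chain; the model ID and JSON request schema are placeholders that vary by model provider.

```java
// Minimal sketch: invoking a foundation model on Amazon Bedrock from Java.
// Assumes AWS SDK for Java v2 (bedrockruntime) and default credential/region resolution.
import software.amazon.awssdk.core.SdkBytes;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.bedrockruntime.BedrockRuntimeClient;
import software.amazon.awssdk.services.bedrockruntime.model.InvokeModelRequest;
import software.amazon.awssdk.services.bedrockruntime.model.InvokeModelResponse;

public class BedrockInvokeSketch {
    public static void main(String[] args) {
        try (BedrockRuntimeClient bedrock = BedrockRuntimeClient.builder()
                .region(Region.US_WEST_2) // assumed region; adjust as needed
                .build()) {

            // Hypothetical model ID and request body; real schemas differ by provider.
            String modelId = "anthropic.claude-3-haiku-20240307-v1:0";
            String requestBody = """
                {
                  "anthropic_version": "bedrock-2023-05-31",
                  "max_tokens": 256,
                  "messages": [{"role": "user", "content": "Summarize AWS Bedrock in one sentence."}]
                }
                """;

            InvokeModelRequest request = InvokeModelRequest.builder()
                    .modelId(modelId)
                    .contentType("application/json")
                    .accept("application/json")
                    .body(SdkBytes.fromUtf8String(requestBody))
                    .build();

            InvokeModelResponse response = bedrock.invokeModel(request);

            // Raw JSON response; a production microservice would parse this and
            // expose it through a REST or GraphQL endpoint rather than print it.
            System.out.println(response.body().asUtf8String());
        }
    }
}
```

In a Spring Boot microservice of the kind described above, this call would typically sit behind a service class, with retries, IAM-scoped credentials, and CloudWatch/OpenTelemetry instrumentation around it.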