

Lead Gen AI Engineer - Remote - W2 Only
Featured Role | Apply direct with Data Freelance Hub
This role is for a Lead Gen AI Engineer, remote for 5 months (possible extension); the pay rate is not stated. Key skills required include AWS Bedrock, LLMs, RAG, and Python or Node.js. Experience in high-volume document processing and cloud architecture is essential.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 25, 2025
Project duration: 3 to 6 months
Location type: Remote
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Data Extraction #ML (Machine Learning) #AWS (Amazon Web Services) #ChatGPT #AI (Artificial Intelligence) #Observability #Scala #Aurora #Monitoring #SQS (Simple Queue Service) #ML Ops (Machine Learning Operations) #Data Automation #Base #Compliance #Aurora RDS #Python #GitHub #Data Privacy #AWS Lambda #RDS (Amazon Relational Database Service) #Debugging #ETL (Extract, Transform, Load) #Security #Automation #PostgreSQL #Cloud #Lambda (AWS Lambda)
Role description
Position: Lead Gen AI Engineer
Project duration: 5 months with possible extension
Location: Remote (US)
Must have skills: AWS Bedrock, LLMs, RAG, Python or Node.js
Responsibilities
• Deliver production-grade, end-to-end automation workflows for AI-driven mortgage document OCR, data extraction, and due diligence, using AWS Bedrock, AWS Textract, and supporting cloud services
• Develop, integrate, and optimize GenAI/RAG solutions using Bedrock Prompt Engineering & Management, BDA/Blueprint/Knowledge Base, chunking strategies, and LLM customization
• Engineer and orchestrate scalable cloud pipelines using Python and/or Node.js: Lambda functions (and Layers), Step Functions, SQS, EventBridge, Aurora RDS, etc.
• Implement massive-scale document processing (target: up to 2 million docs/hour), focusing on high reliability, low latency, and robust error handling
• Design and deploy efficient, observable, and testable workflows: instrument monitoring, error tracking, and metrics for system health and business KPIs
• Collaborate tightly with product owners, technical leads, and peer developers to accelerate new workflow onboarding, parallelize delivery, and maintain momentum while safeguarding quality
• Apply best practices for code quality, infrastructure-as-code, security, compliance, and data privacy
• Use GitHub Copilot, ChatGPT Sandbox, and other approved AI tools to improve productivity and accelerate secure code development
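To give candidates a feel for the "robust error handling" expected in the SQS-driven pipelines above, here is a minimal Python sketch of a Lambda-style batch handler that reports partial-batch failures so SQS redelivers only the records that failed. The message shape (`s3_key`) and `process_document` body are hypothetical illustrations, not the client's actual code; real processing would call Textract/Bedrock.

```python
import json


def process_document(doc):
    """Placeholder for the real OCR/extraction step (Textract, Bedrock).

    Here we only validate the hypothetical message shape.
    """
    if "s3_key" not in doc:
        raise ValueError("message missing s3_key")


def handler(event, context=None):
    """SQS batch handler sketch using the partial-batch-response pattern.

    Returning failed message IDs in `batchItemFailures` tells SQS to
    redeliver only those records, instead of retrying the whole batch.
    """
    failures = []
    for record in event.get("Records", []):
        try:
            doc = json.loads(record["body"])
            process_document(doc)
        except Exception:
            failures.append({"itemIdentifier": record["messageId"]})
    return {"batchItemFailures": failures}
```

At 2 million docs/hour, this pattern matters: retrying an entire batch for one bad document multiplies load, while partial-batch responses isolate the failure.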
Must-Have Skills & Experience
• Production experience with AWS Bedrock, including:
  • Bedrock Data Automation (BDA)
  • BDA Blueprint
  • Bedrock Knowledge Base
  • Bedrock Agents
  • Bedrock Prompt Engineering and Management
• Hands-on expertise with LLMs, RAG architectures, and document "chunking" for retrieval/extraction
• Strong Python or Node.js development skills for cloud serverless pipelines
• Proven track record building, deploying, and maintaining high-throughput, enterprise-grade solutions using:
  • AWS Lambda Functions & Lambda Layers
  • Step Functions
  • SQS
  • EventBridge
  • Aurora RDS Serverless (PostgreSQL)
  • Amazon Textract (OCR/Document AI)
• Direct experience delivering real production solutions (not just prototypes/POCs) in high-volume, high-velocity workflows
• Strong knowledge of cloud architecture patterns, security, performance optimization, observability, and cost control for serverless, distributed workloads
• Excellent problem solving, debugging, and incident triage skills
• Effective collaboration and communication in a remote, fast-moving, cross-functional environment
• Adaptability and resilience to maintain velocity and quality under tight timelines and shifting priorities
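The "chunking" requirement above refers to splitting documents into retrieval-sized pieces for RAG. A minimal sliding-window sketch in Python follows; the window size and overlap are illustrative assumptions, and production chunking typically splits on sentence or layout boundaries instead.

```python
def chunk_text(text, chunk_size=500, overlap=50):
    """Split text into overlapping fixed-size character windows.

    Overlap keeps context that straddles a chunk boundary retrievable
    from at least one chunk. Sizes here are illustrative defaults.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap
    # Stop once the remaining tail is already covered by the previous chunk.
    return [text[i:i + chunk_size]
            for i in range(0, max(len(text) - overlap, 1), step)]
```

Each chunk would then be embedded and indexed (e.g. into a Bedrock Knowledge Base) so the LLM retrieves only relevant passages at query time.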
Nice-to-Have Skills
• Knowledge of mortgage and/or real estate document workflows, compliance, or data models
• Experience optimizing or parallelizing workflow orchestration for massive-scale document processing
• Familiarity with AI/ML Ops, monitoring tools, and enterprise governance and compliance in cloud environments
• Advanced prompt engineering for GenAI/LLMs in document automation scenarios
• Experience leveraging GitHub Copilot or comparable AI-powered coding tools in secure, enterprise settings