

Tek Leaders Inc
Senior AI Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior AI Data Engineer on a long-term, 100% remote contract; the pay rate is not disclosed. It requires expertise in AWS, data pipelines, CI/CD, and data governance, with a focus on AI applications.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 10, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
California, United States
-
🧠 - Skills detailed
#Data Quality #Datasets #Triggers #AWS (Amazon Web Services) #Data Governance #Cloud #Compliance #Batch #Scala #Data Pipeline #Databases #SaaS (Software as a Service) #AI (Artificial Intelligence) #Data Access #Data Engineering #Deployment #ETL (Extract, Transform, Load) #Observability #Indexing #Classification
Role description
Senior AI Data Engineer
Location: 100% Remote
Duration: Long Term
Job Description:
• Design and build scalable data pipelines for AI agents across cloud platforms
• Create and maintain agent‑ready data models, schemas, and data contracts
• Build and operate vector data pipelines (data prep, chunking, embeddings, indexing, re‑indexing)
• Integrate structured, semi‑structured, and unstructured data sources for agent consumption
• Develop MCP (Model Context Protocol) data adapters/connectors for databases, APIs, SaaS, files, and streams
• Define standard MCP request/response schemas and transformation logic
• Integrate MCPs with the MCP gateway (auth, routing, throttling, observability)
• Build CI/CD pipelines for MCP build, test, deployment, and rollback
• Implement CI/CD pipelines for data pipelines, datasets, and vector stores
• Automate environment promotion (dev/test/prod) for data assets
• Embed data quality checks (schema validation, freshness, completeness) into pipelines
• Design and operate real‑time streaming pipelines (event ingestion, enrichment, aggregation)
• Enable event‑driven data triggers for AI agents
• Build batch + streaming hybrid architectures for historical and real‑time context
• Develop and maintain certified data connectors for Low‑Code / No‑Code platforms
• Standardize enterprise data models for reuse by agents and citizen developers
• Manage secure data access using RBAC, managed identities, secrets, and tokenization
• Monitor data quality, drift, and freshness impacting agent behavior
• Implement data observability and lineage tracking across pipelines and MCPs
• Enforce data governance, classification, and compliance controls
• Optimize data performance, latency, and cost for agent workloads
• Experience building the above using AWS cloud services and open-source tooling
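As an illustration of the "embed data quality checks" responsibility above, the kind of schema-validation and freshness check that gets wired into a pipeline might look like this minimal sketch; the field names and contract are hypothetical, not from the posting:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical data contract: expected fields and their types
EXPECTED_SCHEMA = {"id": int, "text": str, "updated_at": str}

def check_record(record: dict, max_age_hours: int = 24) -> list[str]:
    """Return a list of data-quality violations for one record."""
    problems = []
    # Schema validation: every contracted field present with the right type
    for field, ftype in EXPECTED_SCHEMA.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], ftype):
            problems.append(f"bad type for {field}: {type(record[field]).__name__}")
    # Freshness: the record must have been updated within the allowed window
    if isinstance(record.get("updated_at"), str):
        age = datetime.now(timezone.utc) - datetime.fromisoformat(record["updated_at"])
        if age > timedelta(hours=max_age_hours):
            problems.append("stale record")
    return problems
```

In a production pipeline a check like this would typically be run per batch or per stream window, with violations routed to a dead-letter queue or surfaced through the observability layer rather than silently dropped.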
Thanks & Regards,
Sanju
TekLeaders Inc
5151 Headquarters Dr. Suite 105
Plano TX 75024
Email: sanju@tekleaders.com
www.tekleaders.com
