

Optomi
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a contract length of "unknown", offering a day rate of $600 USD, located in the United States. Requires 5+ years of software development experience; proficiency in Python, AWS, and Terraform; and familiarity with AI/ML concepts.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
🗓️ - Date
May 6, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Version Control #Data Catalog #AI (Artificial Intelligence) #Collibra #Data Marketplace #Scala #ML (Machine Learning) #GitHub #OpenSearch #DynamoDB #API (Application Programming Interface) #Elasticsearch #Pytest #S3 (Amazon Simple Storage Service) #Data Engineering #VPC (Virtual Private Cloud) #Data Pipeline #Documentation #Terraform #Infrastructure as Code (IaC) #Python #AWS (Amazon Web Services) #Observability #IAM (Identity and Access Management) #Kafka (Apache Kafka) #Security #Cloud #Lambda (AWS Lambda)
Role description
Overview:
• We are seeking a Senior Software Engineer – Data Marketplace to help build a modern, enterprise-scale data platform that makes data easy to discover, understand, and consume. This role focuses on developing scalable services, search and catalog capabilities, and AI-powered data experiences.
• This is a hands-on individual contributor role working across cloud infrastructure, data platforms, and advanced search/ML systems. You’ll collaborate with cross-functional teams to deliver high-impact solutions that improve how users find and use data.
Qualifications
• 5+ years of professional software development experience
• Strong proficiency in Python (including testing frameworks like pytest/unittest)
• Experience building and deploying applications on AWS
• Hands-on experience with Terraform for Infrastructure as Code
• Experience with Amazon OpenSearch or Elasticsearch, including search/query capabilities
• Familiarity with AI/ML concepts such as embeddings, semantic search, or model integration
• Experience working with serverless and cloud-native architectures (e.g., Lambda, API Gateway)
• Strong understanding of data pipelines and distributed systems
• Experience with version control (e.g., GitHub) and documentation tools
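To make the Python testing expectation above concrete, here is a minimal pytest-style sketch. The helper function `normalize_record` is invented purely for illustration and is not part of any named codebase.

```python
# Minimal pytest-style sketch; `normalize_record` is a hypothetical helper
# invented for illustration, not part of any real project.

def normalize_record(record: dict) -> dict:
    """Lower-case keys and strip whitespace from string values."""
    return {
        key.lower(): value.strip() if isinstance(value, str) else value
        for key, value in record.items()
    }

def test_normalize_record():
    # pytest discovers functions named test_* automatically.
    raw = {"Name": "  Ada ", "Age": 36}
    assert normalize_record(raw) == {"name": "ada".replace("ada", "Ada"), "age": 36}
```

Running `pytest` in the containing directory would collect and execute `test_normalize_record` with no extra configuration.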
Nice to Haves
• Experience with vector search, RAG (Retrieval-Augmented Generation), or LLM-based systems
• Exposure to data catalog or governance platforms (e.g., Collibra)
• Experience with event-driven architectures (e.g., Kafka)
• Familiarity with additional AWS services such as Lake Formation
• Experience working in large-scale data platform or marketplace environments
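At its core, the RAG pattern mentioned above concatenates retrieved passages into the prompt sent to an LLM. A hedged sketch of that assembly step, with retrieval and the model call out of scope (`passages` stands in for vector-search results):

```python
# Minimal RAG prompt-assembly sketch. Retrieval and the LLM call are out of
# scope; `passages` stands in for results returned by a vector search.

def build_rag_prompt(question: str, passages: list[str]) -> str:
    """Concatenate numbered context passages and the user question into one prompt."""
    context = "\n\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return (
        "Answer using only the context below.\n\n"
        f"{context}\n\n"
        f"Question: {question}"
    )

prompt = build_rag_prompt(
    "What format is the sales data in?",
    ["The sales dataset is stored as Parquet in S3."],
)
```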
Responsibilities
• Design, develop, and maintain Python-based services, APIs, and data pipelines
• Build and manage AWS infrastructure using Terraform
• Develop and enhance search and discovery capabilities using OpenSearch
• Implement AI/ML-powered features, including semantic search and data retrieval
• Collaborate with engineering, data, and security teams to deliver end-to-end solutions
• Work across AWS services such as Lambda, API Gateway, Glue, S3, DynamoDB, IAM, and VPC
• Improve system reliability through testing, CI/CD, and observability best practices
• Contribute to architectural decisions and technical standards
• Mentor junior engineers and share best practices
• Solve complex problems with ambiguous requirements and propose scalable solutions
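The semantic-search responsibility above typically pairs an embedding model with OpenSearch's k-NN query type. A hedged sketch of building such a query body follows; the index and field names are assumptions, and the embedding vector would normally come from a model rather than being hard-coded.

```python
# Sketch: build an OpenSearch k-NN query body for semantic search.
# The field name "title_embedding" and index "datasets" are illustrative
# assumptions, not names from this role's actual platform.

def build_knn_query(vector: list[float], k: int = 5,
                    field: str = "title_embedding") -> dict:
    """Return an OpenSearch k-NN plugin query body for the given embedding."""
    return {
        "size": k,
        "query": {"knn": {field: {"vector": vector, "k": k}}},
    }

# With the opensearch-py client, the body would be submitted as:
#   client.search(index="datasets", body=build_knn_query(embedding))
query = build_knn_query([0.1, 0.2, 0.3], k=3)
```

Keeping the query-building step as a pure function like this makes it easy to unit-test without a running cluster.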






