

Myticas Consulting
Senior Python Data Engineer – Conversational Analytics (34485)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Python Data Engineer specializing in Conversational Analytics, offering a 40-hour/week remote contract. Required skills include AWS, Python, SQL, and experience with conversational agents. Must work in AZ or EST time zones.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 22, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#ML (Machine Learning) #SageMaker #Logging #Redshift #Cloud #Automation #AI (Artificial Intelligence) #Programming #Terraform #Security #Monitoring #Data Quality #Data Pipeline #Deployment #SQL (Structured Query Language) #Scala #API (Application Programming Interface) #Python #Athena #Lambda (AWS Lambda) #DevOps #Data Engineering #AWS (Amazon Web Services) #S3 (Amazon Simple Storage Service) #ETL (Extract, Transform, Load)
Role description
Senior Data Engineer – Conversational Analytics (AWS/Python)
Location: Remote
Contract Schedule: 40 hours/week | Must work in AZ or EST time zones
About The Role
Myticas is seeking a senior-level Data Engineer with strong expertise in AWS, Python, and Conversational Analytics to help build secure, scalable, and resilient cloud data solutions. In this role, you will develop knowledgebases and conversational agents that allow business users to interact with data through natural language. You will own the end-to-end development of data and AI pipelines in AWS—ensuring reliability, performance, and automation across environments.
This role is ideal for someone who enjoys building modern data platforms, contributing to AI-driven solutions, and partnering with cross-functional teams to democratize data and improve decision-making.
What You’ll Do
• Design, build, and optimize scalable data pipelines on AWS
• Implement security, monitoring, and logging best practices across data and AI pipelines
• Develop and maintain knowledgebases and conversational AI agents, integrating structured and unstructured data sources
• Build and support machine learning workflows in SageMaker (training, deployment, monitoring)
• Utilize AWS Bedrock to develop and deploy conversational AI solutions
• Ensure data quality, lineage, and reliability across data and LLM systems
• Build and maintain CI/CD pipelines enabling automated and secure deployments
• Automate pipeline orchestration using AWS services such as Step Functions and Lambda
• Partner with analytics, product, and business stakeholders to deliver data solutions that improve decision-making and accessibility
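The orchestration and data-quality responsibilities above can be sketched as a small Lambda-style validation handler that a Step Functions workflow might invoke before loading conversation records downstream. This is a minimal illustration only; the field names, event shape, and routing convention are assumptions, not details from the posting:

```python
import json

# Illustrative schema: fields every conversation record is assumed to carry.
REQUIRED_FIELDS = {"conversation_id", "timestamp", "utterance"}

def handler(event, context):
    """Lambda-style entry point: validate a batch of conversation records.

    Splits the batch into valid and invalid records so that a Step Functions
    Choice state could route failures to a quarantine branch instead of
    letting bad rows reach the warehouse.
    """
    records = event.get("records", [])
    valid, invalid = [], []
    for record in records:
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            invalid.append({"record": record, "missing": sorted(missing)})
        else:
            valid.append(record)
    return {
        "statusCode": 200 if not invalid else 207,
        "body": json.dumps({"valid": len(valid), "invalid": len(invalid)}),
        "valid_records": valid,
        "quarantine": invalid,
    }
```

Because the handler is plain Python, it can be unit-tested locally by calling it with a sample event before it is ever deployed behind Lambda.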
Required Skills & Experience
• 2+ years of hands-on AWS experience, including S3, Athena, Lambda, Step Functions, Redshift Serverless, SageMaker, and Bedrock
• Strong Python programming skills (data engineering, automation, API integration)
• Experience building knowledgebases or conversational agents (or equivalent)
• Strong SQL skills and experience with large-scale data environments
• Solid understanding of ETL/ELT orchestration, optimization, and cloud data workflows
Preferred Experience
• LLM integration into business workflows
• Lakehouse frameworks (Iceberg, Delta)
• DevOps/Infrastructure-as-Code (Terraform or CloudFormation)
• Marketing analytics or customer intelligence experience