

Aptonet Inc
GenAI Data Automation
⭐ - Featured Role | Apply direct with Data Freelance Hub
This is a 6-month contract role for a GenAI Data Automation Engineer with a pay rate of "$XX/hour". It requires expertise in AWS and Azure, Generative AI frameworks, and 2+ years of data engineering experience. U.S. citizenship is mandatory.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
400
-
🗓️ - Date
April 1, 2026
⏳ - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Washington, DC
-
🧠 - Skills detailed
#Indexing #S3 (Amazon Simple Storage Service) #Kafka (Apache Kafka) #REST (Representational State Transfer) #AI (Artificial Intelligence) #AWS (Amazon Web Services) #Hugging Face #SQL Server #Deployment #CRM (Customer Relationship Management) #DevOps #GitHub #RDS (Amazon Relational Database Service) #Bash #Computer Science #Azure DevOps #Azure SQL #IAM (Identity and Access Management) #BI (Business Intelligence) #SSIS (SQL Server Integration Services) #Spark (Apache Spark) #Metadata #Data Engineering #Jenkins #Cloud #Compliance #Batch #ML (Machine Learning) #DynamoDB #Python #Data Quality #Apache Spark #Data Pipeline #ETL (Extract, Transform, Load) #Anomaly Detection #Security #VPC (Virtual Private Cloud) #Model Deployment #Lambda (AWS Lambda) #Data Automation #CLI (Command-Line Interface) #Scripting #Scala #Langchain #Programming #API (Application Programming Interface) #Agile #Monitoring #Automation #REST API #SQL (Structured Query Language) #Firewalls #OpenSearch #Azure
Role description
The GenAI Data Automation Engineer is responsible for designing, developing, and maintaining AI-driven data pipelines and automation solutions across hybrid AWS and Azure environments. This role focuses on integrating Generative AI capabilities into scalable data systems to support analytics, reporting, and enterprise platforms. The position requires hands-on development, problem-solving, and collaboration to deliver mission-critical solutions in a federal environment.
Key Responsibilities
• Design and maintain data pipelines in AWS using S3, RDS/SQL Server, Glue, Lambda, EMR, DynamoDB, and Step Functions
• Develop ETL/ELT processes to move data across systems, including DynamoDB to SQL Server and AWS to Azure SQL integrations
• Integrate AWS Connect and NICE inContact CRM data into enterprise data pipelines for analytics and reporting
• Build and enhance ingestion pipelines using Apache Spark, Flume, and Kafka for real-time and batch processing into Solr and AWS OpenSearch
• Leverage Generative AI frameworks (AWS Bedrock, Amazon Q, Azure OpenAI, Hugging Face, LangChain) to:
  • Generate embeddings and vector data from unstructured sources (see the sketch after this list)
  • Automate data quality, metadata tagging, and lineage tracking
  • Enhance ETL processes with LLM-based transformations and anomaly detection
  • Develop conversational BI interfaces for natural language querying
  • Build AI-powered copilots for monitoring and troubleshooting pipelines
• Implement SQL Server optimization including stored procedures, indexing, query tuning, and execution plan analysis
• Apply CI/CD practices using GitHub, Jenkins, or Azure DevOps
• Ensure security and compliance using IAM, KMS encryption, VPC isolation, RBAC, and firewalls
• Support Agile/DevOps processes with sprint-based delivery
• Collaborate with cross-functional teams and communicate technical solutions clearly
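To make the embeddings bullet concrete: the following is a minimal Python sketch, not an implementation prescribed by this posting, of generating vectors from an unstructured S3 document with LangChain and Amazon Bedrock. It assumes the boto3 and langchain-aws packages are available; the bucket, key, and model ID are hypothetical placeholders.

```python
import boto3
from langchain_aws import BedrockEmbeddings

s3 = boto3.client("s3")

# Pull an unstructured document from S3 (bucket and key are hypothetical).
obj = s3.get_object(Bucket="example-raw-docs", Key="notes/meeting.txt")
text = obj["Body"].read().decode("utf-8")

# Split into fixed-size chunks so each embedding covers a bounded span of text.
chunks = [text[i:i + 1000] for i in range(0, len(text), 1000)]

# Generate one vector per chunk with a Bedrock embedding model.
embedder = BedrockEmbeddings(model_id="amazon.titan-embed-text-v2:0")
vectors = embedder.embed_documents(chunks)

# Each (chunk, vector) pair is now ready to index into a vector store.
for chunk, vector in zip(chunks, vectors):
    print(len(vector), chunk[:40])
```

In this role, the resulting chunk/vector pairs would feed a search index such as AWS OpenSearch, per the ingestion bullet above.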
Required Technical Skills
• AWS services: S3, RDS/SQL Server, Glue, Lambda, EMR, DynamoDB, Step Functions
• Azure services, including Azure SQL and Azure OpenAI
• Generative AI frameworks: AWS Bedrock, Amazon Q, Azure OpenAI, Hugging Face, LangChain
• Programming and scripting: Python, SQL, SSIS, Spark, Bash, PowerShell (see the sketch after this list)
• Data engineering tools: Apache Spark, Flume, Kafka, Solr, AWS OpenSearch
• REST API integration within data pipelines
• CI/CD tools: GitHub, Jenkins, Azure DevOps
• Cloud CLI tools (AWS/Azure)
• SQL performance tuning and optimization
• GenAI Ops, including model deployment, monitoring, retraining, and lifecycle management
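As a rough illustration of the Python and AWS expectations, here is a minimal sketch of the DynamoDB-to-SQL Server hop named in the responsibilities. It assumes boto3 and pyodbc; the table name, DSN, credentials, and column names are hypothetical placeholders, and a production pipeline would more likely lean on DynamoDB Streams or Glue at scale.

```python
import boto3
import pyodbc

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("example-contacts")  # hypothetical source table

# DSN, user, and password are placeholders for a real SQL Server connection.
conn = pyodbc.connect("DSN=ExampleSqlServer;UID=etl_user;PWD=change-me")
cursor = conn.cursor()

# Scan the table in pages, following LastEvaluatedKey until exhausted.
kwargs = {}
while True:
    page = table.scan(**kwargs)
    for item in page["Items"]:
        # Parameterized MERGE keeps the load idempotent on reruns.
        cursor.execute(
            "MERGE dbo.Contacts AS t "
            "USING (SELECT ? AS Id, ? AS Name) AS s ON t.Id = s.Id "
            "WHEN MATCHED THEN UPDATE SET t.Name = s.Name "
            "WHEN NOT MATCHED THEN INSERT (Id, Name) VALUES (s.Id, s.Name);",
            item["id"], item.get("name", ""),
        )
    if "LastEvaluatedKey" not in page:
        break
    kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]

conn.commit()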
Qualifications & Experience
• Bachelor's degree in Computer Science or a related field
• 2+ years of experience in data engineering and automation
• Hands-on experience with Generative AI and LLM frameworks
• Experience working in AWS and/or Azure environments
• Strong troubleshooting and performance optimization skills
• Familiarity with Agile/DevOps methodologies
• Strong communication and presentation skills
• U.S. citizenship required
• Ability to obtain a Public Trust clearance
About the Team / Company
Leidos is a Fortune 500® technology, engineering, and science solutions provider supporting defense, intelligence, civil, and health markets. The Civil Group focuses on modernizing government operations through AI/ML-driven data solutions, partnering with agencies such as FAA, DOE, DOJ, NASA, and TSA to deliver mission-critical systems.