

Project Manager USA, Inc. (DBA PM America)
Unified Application Programming Interface (UAPI) and Artificial Intelligence (AI) Specialists
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for Unified Application Programming Interface (UAPI) and Artificial Intelligence (AI) Specialists, offering a contract length of "X months" at a pay rate of "$Y/hour". Key skills required include API development, AI/ML integration, and experience with legacy systems.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 9, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Washington, DC
-
🧠 - Skills detailed
#Deep Learning #BI (Business Intelligence) #Datadog #Agile #React #Batch #Data Warehouse #SageMaker #AI (Artificial Intelligence) #Delta Lake #POSTMAN #Data Access #JavaScript #Model Deployment #PyTorch #Prometheus #Security #Databases #GDPR (General Data Protection Regulation) #JSON (JavaScript Object Notation) #Monitoring #BigQuery #Databricks #Spark (Apache Spark) #Data Lake #AWS (Amazon Web Services) #Deployment #GraphQL #TensorFlow #Data Processing #MLflow #Microservices #Data Governance #Redshift #Java #dbt (data build tool) #C# #Compliance #Cloud #Snowflake #SQL (Structured Query Language) #Python #HTML (Hypertext Markup Language) #TypeScript #GCP (Google Cloud Platform) #Angular #Azure #DevOps #ML (Machine Learning) #Documentation #NoSQL #Kubernetes #Code Reviews #Scala #FastAPI #Kafka (Apache Kafka) #NLP (Natural Language Processing) #Data Pipeline #Programming #Airflow #ETL (Extract, Transform, Load) #Data Engineering #Regression #Apache Airflow #REST (Representational State Transfer) #Docker #API (Application Programming Interface) #Data Integrity
Role description
Perform the following tasks:
Ø Create, modernize, and maintain a suite of APIs that support critical federal customer functions.
v These APIs create a single layer for data access to support the future technology vision of the Agency.
v Data sources span several legacy systems, modern databases, and emerging Artificial Intelligence (AI) driven platforms.
Ø Provide technical teams to support the development, integration, maintenance, and optimization of APIs across the Agency’s Information Technology (IT) environment.
Ø Support the delivery of scalable APIs that align with the domain data models and provide clean, consistent, and timely data to the Agency’s business applications.
Ø Ensure compliance with enterprise architectural standards (Representational State Transfer (REST), GraphQL, or gRPC); API design best practices, including consistent Uniform Resource Locator (URL) naming, versioning, Hypertext Transfer Protocol (HTTP) methods, and standard status codes; authentication and authorization standards (OAuth, JSON Web Token (JWT)); and support for downstream analytics, monitoring (Datadog, New Relic, Prometheus), and service integration.
Ø Define and document UAPI specifications.
Ø Discover and validate data sources by decomposing legacy source systems.
Ø Develop, maintain, and enhance APIs.
Ø Map and document relevant data elements.
Ø Ensure consistency between discovered elements and object design.
Ø Create, document, and validate objects through testing.
Ø Ensure backward compatibility and data integrity across legacy systems.
Ø Support analysis of legacy Common Business-Oriented Language (COBOL)-based services and batch processing integrations.
Ø Develop and document the Agency’s microservices.
Ø Validate and test services to ensure operational readiness.
Ø Conduct API readiness reviews for consumer onboarding.
Ø Identify additional command codes that the API can replace, working with the AI team to expand mappings.
Ø Manage versioning of APIs, ensuring updates and deprecations follow governance procedures.
Ø Provide AI-driven API development to support software development tasks (code generation, code reviews, documentation, error detection, and testing).
Ø Build APIs that support Artificial Intelligence / Machine Learning (AI/ML) workflows using Automated Language Conversion (ALC) and COBOL-based logic (for example, converting COBOL into a modern language such as Java, C#, Python, or TypeScript).
Ø Provide model deployment, inference, and feedback loops via API endpoints.
Ø Support coordination, risk management, and reporting to Agency stakeholders, including schedule and milestone tracking.
Ø Ensure alignment with agency IT governance.
Ø Ensure that automated unit and integration tests are created and maintained using frameworks such as Postman/Newman for automated regression testing.
Ø Ensure that synthetic monitoring and alerts are in place to proactively monitor API health and latency.
Ø Deliver APIs that provide the ability to replace legacy command codes.
Ø Deliver APIs that provide the ability to integrate with modern databases.
Ø Deliver APIs that provide the ability to integrate with external flat files.
Ø Deliver APIs that provide the ability to access data whose original source was Agency flat files.
Ø Deliver modernized APIs that replace legacy command codes with secure, validated, and scalable data access.
Ø Reduce reliance on outdated systems, enhance data accessibility, and streamline onboarding for consumer services.
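Several of the tasks above name concrete API standards: versioned URLs, standard HTTP status codes, and bearer-token (OAuth/JWT) authentication. As a minimal sketch of how those pieces combine, here is a plain-Python request dispatcher; the `/v1/claims` resource, the token value, and all identifiers are invented for illustration and do not come from the posting:

```python
# Hypothetical sketch of the API standards named above: versioned URL
# prefix, bearer-token auth, and standard HTTP status codes.
VALID_TOKEN = "example-jwt"  # stand-in for a verified JWT

CLAIMS = {"42": {"id": "42", "status": "open"}}  # toy data store

def handle(method, path, auth):
    """Return (status_code, body) for a request, REST-style."""
    # 401 Unauthorized when the bearer token is missing or invalid.
    if auth != f"Bearer {VALID_TOKEN}":
        return 401, {"error": "unauthorized"}
    # Versioned URL prefix: /v1/... (404 for unknown versions).
    if not path.startswith("/v1/"):
        return 404, {"error": "unknown version or resource"}
    parts = path.strip("/").split("/")  # e.g. ["v1", "claims", "42"]
    if len(parts) == 3 and parts[1] == "claims":
        # 405 Method Not Allowed for unsupported verbs on the resource.
        if method != "GET":
            return 405, {"error": "method not allowed"}
        claim = CLAIMS.get(parts[2])
        return (200, claim) if claim else (404, {"error": "not found"})
    return 404, {"error": "unknown resource"}
```

In practice the same rules would be enforced by a framework such as FastAPI plus an API gateway, but the status-code and versioning conventions are framework-independent.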
Positions:
Job Title: Full Stack Developer:
Ø 3–8+ years in full-stack development with a strong background in both frontend and backend, plus experience leading projects or mentoring team members.
Ø Proficient in frontend frameworks (React, Angular, Vue) and backend technologies (Node.js, Python, Java, Spark, etc.), with strong HTML/CSS/JavaScript skills and REST/GraphQL API design. Experience with OpenShift, Kubernetes, and Databricks.
Ø Able to design scalable, maintainable web applications with knowledge of databases (SQL/NoSQL), caching, authentication, CI/CD, and cloud infrastructure (AWS, GCP, Azure).
Ø Strong problem solving, communication, and collaboration abilities, with a focus on code quality, documentation, and agile practices.
Ø Experience with microservices, containerization (Docker/K8s), testing frameworks, DevOps, or specialization in performance optimization, security, or mobile/web hybrid apps.
Job Title: Data Engineer:
Ø 3–5+ years in data engineering or backend data systems, with experience leading data pipeline projects and mentoring junior engineers.
Ø Proficient in building and optimizing ETL/ELT pipelines using tools like Apache Airflow, Spark, dbt, Kafka, Databricks, or Flink.
Ø Strong skills in SQL, Python or Scala, and working with data warehouses (Snowflake, BigQuery, Redshift) and data lakes (e.g., Delta Lake, Lakehouse architecture).
Ø Familiar with cloud platforms (AWS, GCP, Azure), containerization (Docker/Kubernetes), and orchestration, with experience in CI/CD and data versioning (e.g., DVC).
Ø Experience with real-time data processing, cataloging, data governance, privacy/compliance (GDPR, HIPAA), and cross-functional collaboration with analytics, ML, or BI teams.
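The ETL/ELT pipeline work described above follows the standard extract-transform-load shape. A toy sketch in plain Python with invented field names; a real pipeline would run as an Apache Airflow DAG or a Spark/dbt job and load into a warehouse such as Snowflake, BigQuery, or Redshift:

```python
import csv
import io

# Toy extract-transform-load (ETL) sketch. RAW stands in for a source
# extract; the id/amount schema is invented for illustration.
RAW = "id,amount\n1,10.5\n2,\n3,7.25\n"  # note the missing amount in row 2

def extract(text):
    """Extract: parse the raw source into row dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing amounts and cast types
    (a basic data-quality step)."""
    return [{"id": int(r["id"]), "amount": float(r["amount"])}
            for r in rows if r["amount"]]

def load(rows):
    """Load: stand-in for a warehouse write, keyed by id."""
    return {r["id"]: r["amount"] for r in rows}

warehouse = load(transform(extract(RAW)))
```

Orchestration tools like Airflow add scheduling, retries, and dependency management around exactly these three stages.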
Job Title: AI Engineer:
Ø 3–5+ years in software/ML engineering with a strong track record of leading end-to-end AI/ML projects and mentoring junior team members.
Ø Proficient in Python, deep learning (PyTorch/TensorFlow), Natural Language Processing (NLP)/LLMs, MLOps (e.g., MLflow, Docker, cloud platforms), and data pipelines.
Ø Hands-on experience deploying models to production using tools like SageMaker, Vertex AI, or custom APIs (FastAPI, TorchServe).
Ø Strong communication, cross-functional collaboration, critical thinking, and the ability to translate business needs into AI solutions.
Ø Experience with LLMs, generative AI, RAG, open-source contributions, or domain expertise in finance, healthcare, or recommendation systems.
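Model deployment with inference and feedback loops, as called for in the task list, typically exposes predict and feedback operations behind API endpoints. A minimal framework-free sketch with invented weights; a production deployment would serve this via SageMaker, Vertex AI, or a FastAPI/TorchServe service:

```python
import math

# Hypothetical sketch of inference plus a feedback hook. The weights and
# the two-feature input shape are invented for illustration.
WEIGHTS = [0.8, -0.4]
BIAS = 0.1
FEEDBACK = []  # collected (features, label) pairs for later retraining

def predict(features):
    """Inference: logistic score over a linear model, as an endpoint
    handler would compute it."""
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    score = 1 / (1 + math.exp(-z))  # sigmoid squashes z into (0, 1)
    return {"score": round(score, 4), "label": int(score >= 0.5)}

def feedback(features, true_label):
    """Feedback loop: record ground truth so the model can be retrained."""
    FEEDBACK.append((features, true_label))
    return len(FEEDBACK)  # count of collected feedback records
```

The same two-function shape (predict, feedback) maps directly onto two HTTP endpoints when wrapped in a web framework.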






