Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a long-term remote contract, offering a competitive pay rate. Key requirements include 5+ years of API development experience, proficiency in Node.js, FastAPI, GraphQL, and AWS Lambda, and experience with CI/CD pipelines and Databricks.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
720
-
🗓️ - Date discovered
August 16, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Remote
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Cloud #Hugging Face #Metadata #Databricks #DevOps #DynamoDB #GitHub #Golang #Kubernetes #Data Science #Observability #Data Mapping #PostgreSQL #Databases #AWS (Amazon Web Services) #Microservices #Langchain #MongoDB #Data Access #Recommender Systems #AWS Lambda #FastAPI #ML (Machine Learning) #Grafana #Monitoring #Scala #Python #Data Processing #Deployment #Prometheus #HTML (Hypertext Markup Language) #API (Application Programming Interface) #Pytest #Lambda (AWS Lambda) #React #POSTMAN #Data Engineering #Jenkins #GraphQL #Kafka (Apache Kafka) #AI (Artificial Intelligence) #Debugging #SQL (Structured Query Language) #NoSQL
Role description
We are recruiting a highly skilled Senior API Engineer to join our engineering team on a remote, long-term contract, designing, developing, and maintaining scalable RESTful and GraphQL APIs within a modern microservices architecture. The role works with cutting-edge technologies including Node.js, FastAPI, AWS Lambda, EKS, GraphQL (Ariadne), and API Gateway, as well as CI/CD pipelines and advanced data systems such as Databricks. As a key contributor, you will collaborate across front-end, back-end, data science, and DevOps teams to build robust, secure, and highly performant APIs that power mission-critical applications and AI-powered workflows.
Responsibilities:
• Design, develop, and deploy scalable RESTful and GraphQL APIs using Node.js (Express/Nest.js), FastAPI (Python), Ariadne, and Golang (see the sketch after the requirements list).
• Implement APIs as part of a microservices architecture with a focus on API Gateway and Apigee management.
• Integrate with Databricks to expose and manage OLAP workloads for large-scale data analytics.
• Architect APIs using metadata-driven design principles, supporting dynamic authentication and flexible data exposure.
• Deploy services using AWS Lambda, Kubernetes (EKS), and AWS Bedrock to support serverless, scalable microservices.
• Implement and manage authentication and authorization using OAuth2, JWT, and API Gateway.
• Design and optimize data access with MongoDB, PostgreSQL, and DynamoDB.
• Build and maintain CI/CD pipelines using AWS CodePipeline, CloudFormation, and Jenkins/GitHub Actions.
• Monitor API performance and reliability using Prometheus, Grafana, and the ELK Stack.
• Work on AI/ML API integrations, including recommender systems, LLM-based agents, and AI-powered image/video processing workflows.
• Collaborate closely with front-end developers (React, HTML, JS), data scientists, and platform teams to ensure smooth integrations and deployments.
Requirements:
• 5+ years of hands-on experience with API development using RESTful services and GraphQL.
• Strong proficiency in:
  • Node.js (Express/Nest.js)
  • FastAPI (Python)
  • GraphQL development and architecture
• Experience with AWS Lambda, EKS (Kubernetes), and AWS Bedrock.
• Deep understanding of authentication/authorization frameworks: OAuth2, JWT, API Gateway.
• Proven experience building metadata-driven API frameworks with dynamic auth models.
• Solid knowledge of Databricks and OLAP workloads for data analytics via APIs.
• Hands-on experience with both SQL and NoSQL databases (MongoDB, PostgreSQL, DynamoDB).
• CI/CD expertise using tools like AWS CodePipeline, CloudFormation, Jenkins, and GitHub Actions.
• Proficiency in API testing and debugging with tools like Postman, Newman, Jest, Mocha, and PyTest.
• Strong monitoring and observability experience with Grafana, Prometheus, and the ELK stack.
• Experience with GraphQL Federation and Apollo Server.
• Familiarity with AI/ML APIs, LLM-based chatbots, and platforms like OpenAI, Hugging Face, and LangChain.
• Background in streaming technologies such as Kafka, WebSockets, or Golang-based streaming services.
• Exposure to AI-powered data mapping, recommendation engines, and real-time data processing APIs.
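For candidates gauging the stack, here is a minimal sketch (not the employer's actual codebase) of the FastAPI/Ariadne/JWT portion of the responsibilities above: a FastAPI app that mounts an Ariadne GraphQL endpoint and gates a REST route behind a JWT bearer check. The secret, the status field, and the /health route are hypothetical placeholders.

```python
"""Illustrative sketch only: FastAPI + Ariadne GraphQL with a JWT bearer check."""
import jwt  # PyJWT
from fastapi import Depends, FastAPI, HTTPException
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer
from ariadne import QueryType, gql, make_executable_schema
from ariadne.asgi import GraphQL

JWT_SECRET = "change-me"  # placeholder; a real service would use a secret store
bearer = HTTPBearer()


def require_jwt(creds: HTTPAuthorizationCredentials = Depends(bearer)) -> dict:
    """Validate the bearer token and return its claims, or reject with 401."""
    try:
        return jwt.decode(creds.credentials, JWT_SECRET, algorithms=["HS256"])
    except jwt.PyJWTError:
        raise HTTPException(status_code=401, detail="Invalid or expired token")


# GraphQL schema and resolver (Ariadne); a single illustrative field
type_defs = gql("""
    type Query {
        status: String!
    }
""")
query = QueryType()


@query.field("status")
def resolve_status(_, info):
    return "ok"


schema = make_executable_schema(type_defs, query)

app = FastAPI()


@app.get("/health")
def health(claims: dict = Depends(require_jwt)):
    """REST endpoint gated by the same JWT dependency."""
    return {"status": "ok", "sub": claims.get("sub")}


# Mount the Ariadne ASGI app alongside the REST routes
app.mount("/graphql", GraphQL(schema, debug=False))
```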
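And a hedged example of the API testing skills listed above, using PyTest and FastAPI's TestClient. It assumes the sketch is saved as app_main.py and that tokens are HS256-signed with the same placeholder secret.

```python
"""Illustrative PyTest sketch for the API above (app_main.py is an assumed filename)."""
import jwt
from fastapi.testclient import TestClient

from app_main import JWT_SECRET, app

client = TestClient(app)


def make_token(sub: str = "test-user") -> str:
    """Sign a short test token with the placeholder secret."""
    return jwt.encode({"sub": sub}, JWT_SECRET, algorithm="HS256")


def test_health_requires_token():
    # HTTPBearer rejects requests with no Authorization header
    assert client.get("/health").status_code == 403


def test_health_with_token():
    resp = client.get("/health", headers={"Authorization": f"Bearer {make_token()}"})
    assert resp.status_code == 200
    assert resp.json()["status"] == "ok"


def test_graphql_status_query():
    # Trailing slash targets the mounted GraphQL sub-application
    resp = client.post("/graphql/", json={"query": "{ status }"})
    assert resp.status_code == 200
    assert resp.json()["data"]["status"] == "ok"
```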