Backend Engineer with InfluxDB, OpenSearch

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Backend Engineer with InfluxDB and OpenSearch, on a contract basis in Snoqualmie, WA. Requires 7+ years of backend development, API frameworks experience, and proficiency in SQL, data ingestion, and telemetry data processing.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
September 10, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
On-site
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Snoqualmie, WA
-
🧠 - Skills detailed
#OpenSearch #Compliance #MongoDB #Security #Flask #Observability #Monitoring #REST API #Kafka (Apache Kafka) #Splunk #DynamoDB #API (Application Programming Interface) #Data Pipeline #Python #Data Ingestion #Elasticsearch #JSON (JavaScript Object Notation) #ETL (Extract, Transform, Load) #DevOps #Data Modeling #SQL (Structured Query Language) #Docker #NoSQL #PostgreSQL #MySQL #FastAPI #Visualization #Business Analysis #REST (Representational State Transfer)
Role description
Job Title: Backend Engineer with InfluxDB, OpenSearch
Location: Snoqualmie, WA (on-site)
Engagement: Contract
Mandatory Skills: Data ingestion, API development, backend engineering, InfluxDB, OpenSearch

Job Description:
We are looking for a strong Backend Engineer to design and implement robust data pipelines that ingest logs, metrics, and telemetry data from observability tools such as Splunk, InfluxDB, and OpenSearch. This data must be processed, normalized, and persisted in appropriate backend data stores (SQL/NoSQL) for downstream use. The engineer will also develop performant, secure RESTful APIs to support frontend visualizations and dashboards.

🛠️ Key Responsibilities
• Design and implement data ingestion pipelines to pull data from:
  ◦ Splunk (via REST API or SDKs)
  ◦ InfluxDB (using Flux/InfluxQL)
  ◦ OpenSearch (via query DSL or API)
• Normalize, transform, and insert collected data into backend systems such as:
  ◦ PostgreSQL / MySQL
  ◦ MongoDB / DynamoDB / TimescaleDB (optional, based on use case)
• Build RESTful APIs to expose processed data to the frontend for:
  ◦ Dashboards
  ◦ Alerts/health indicators
  ◦ Metrics visualizations
• Implement data retention and archival logic as needed for compliance or performance
• Work with DevOps to integrate pipelines into CI/CD and containerized environments (Docker/Kubernetes)
• Implement basic observability (logs, metrics, alerts) for the APIs and pipelines
• Collaborate closely with frontend developers and business analysts to shape data contracts and endpoint requirements

💼 Required Skills & Experience
• 7+ years of backend development experience with Python, Node.js, or Go
• Hands-on experience with API development frameworks (e.g., FastAPI, Flask, Express, or Gin)
• Experience integrating with Splunk, InfluxDB, and/or OpenSearch
• Strong grasp of query languages such as:
  ◦ SPL (Splunk)
  ◦ Flux or InfluxQL (InfluxDB)
  ◦ Elasticsearch DSL (OpenSearch)
• Proficiency in SQL and data modeling
• Experience with JSON, REST, OAuth, JWT, and API security best practices
• Experience building services that process high-velocity telemetry or monitoring data
• Solid understanding of asynchronous processing (Celery, Kafka, etc.)
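The "normalize, transform, and insert" step in the responsibilities above can be sketched in Python. The per-source field names below (`_time`/`_value`/`_measurement` for InfluxDB Flux output, `@timestamp` for OpenSearch documents, `metric_name`/`value` for Splunk results) are assumptions based on each tool's common output shape, not requirements stated in the posting:

```python
# Sketch: map raw events from several observability sources into one common
# record shape before persisting to a SQL/NoSQL store. Field names per
# source are illustrative assumptions, not a fixed contract.

def normalize_record(source: str, raw: dict) -> dict:
    """Normalize a raw event from Splunk, InfluxDB, or OpenSearch."""
    if source == "splunk":
        # Splunk search results commonly carry _time; other fields vary by query.
        ts, metric = raw["_time"], raw.get("metric_name", "unknown")
        value = float(raw.get("value", 0.0))
    elif source == "influxdb":
        # Flux query results expose _time / _value / _measurement columns.
        ts, metric, value = raw["_time"], raw["_measurement"], float(raw["_value"])
    elif source == "opensearch":
        # OpenSearch documents typically use @timestamp; metric/value are assumed.
        ts, metric, value = raw["@timestamp"], raw["metric"], float(raw["value"])
    else:
        raise ValueError(f"unknown source: {source}")
    return {"source": source, "metric": metric, "value": value, "timestamp": ts}
```

Records in this common shape can then be batch-inserted into PostgreSQL (or a NoSQL store) and served to dashboards through a REST layer such as FastAPI.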