W3Global

Vector Data Pipeline Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Vector Data Pipeline Engineer with solid experience in Vector (Datadog's data ingestion and observability pipeline tool) and Kubernetes. Contract length and pay rate are unspecified. Remote work is allowed. Key skills include Snowflake, Splunk, and ETL.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 29, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Bellevue, WA
-
🧠 - Skills detailed
#Data Pipeline #Datadog #Data Ingestion #Snowflake #Kafka (Apache Kafka) #Kubernetes #Observability #Splunk #ETL (Extract, Transform, Load)
Role description
Position: Vector Data Pipeline Engineer
Location: Bellevue, WA (remote OK)
Job description:
• Solid experience with Vector, Datadog's data ingestion and observability pipeline tool.
• Solid understanding of how to collect, transform, and route logs, metrics, and traces from multiple sources into systems like Snowflake, Splunk, ADX, or Kafka.
• Strong experience implementing Vector on Kubernetes, configuring Vector and Vector agents, defining transforms and sinks, managing schema and pipeline performance, and integrating it within observability or data streaming architectures.
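
To illustrate the kind of work described above, here is a minimal sketch of a Vector pipeline configuration: a Kubernetes log source, a transform, and a Kafka sink. The broker address, topic name, and component names are placeholder assumptions, not details from the posting.

```yaml
# Hypothetical Vector pipeline sketch (vector.yaml).
# Collects Kubernetes pod logs, parses them, and routes them to Kafka.
sources:
  k8s_logs:
    type: kubernetes_logs        # tails pod logs and enriches with k8s metadata

transforms:
  parse_json:
    type: remap                  # VRL-based transform
    inputs: ["k8s_logs"]
    source: |
      # Parse the raw message as JSON when possible; keep the event otherwise.
      parsed, err = parse_json(.message)
      if err == null {
        . = merge(., object!(parsed))
      }

sinks:
  kafka_out:
    type: kafka
    inputs: ["parse_json"]
    bootstrap_servers: "kafka:9092"   # placeholder broker address
    topic: "observability-logs"       # placeholder topic
    encoding:
      codec: json
```

A comparable setup would swap the sink for Splunk (`splunk_hec_logs`) or another supported destination, which is the source-to-sink routing the role emphasizes.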