

Prudent Technologies and Consulting, Inc.
Senior Security Data Engineer (SIEM Data Pipeline)_Remote(W2-Only)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Security Data Engineer (SIEM Data Pipeline) focused on orchestrating complex security telemetry data flows. The contract runs 12 months or more on a W2-only basis; the pay rate is not disclosed. Key skills include 10+ years in cybersecurity, 5+ years with Cribl or Vector, and proficiency in JavaScript or Python.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 9, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Pipeline #Cybersecurity #Data Ingestion #Data Engineering #Security #Splunk #ETL (Extract, Transform, Load) #JavaScript #NiFi (Apache NiFi) #Datadog #Python #Scripting
Role description
Senior Security Data Engineer (SIEM Data Pipeline) - Remote
Contract: 12 Months+
We are seeking two Senior Data Engineers to lead efforts in orchestrating and transforming complex security telemetry data flows. These individuals will be responsible for high-level architecture, governance, and the secure and reliable movement of data between systems, particularly for legacy and non-standard log sources. More than 100 existing and new data sources specific to cybersecurity workloads are in scope. The work will be performed on one or more data ingestion pipelines (Cribl, Vector, NiFi); an illustrative sketch of this kind of transform follows the requirements list below.
Experience Required:
• 10+ years of experience working in cybersecurity
• 5+ years of experience with Cribl, Vector, Datadog, Splunk, or other data pipeline platforms
• 5+ years of experience with JavaScript, Python, or another scripting language
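As a brief, hedged illustration of the kind of scripting the requirements point to, the Python sketch below normalizes a hypothetical legacy log line into a structured event of the sort a Cribl, Vector, or NiFi pipeline stage might forward to a SIEM. The log format, field names, and the `legacy_app` source tag are assumptions made for this example only; they are not taken from the posting.

```python
# Minimal sketch: normalize a non-standard, legacy log line into a structured
# event before forwarding it to the SIEM. The legacy format and field names
# below are assumptions for illustration, not part of the role description.
import json
import re
from datetime import datetime, timezone

# Hypothetical legacy format: "<timestamp> <host> <severity> msg=<free text>"
LEGACY_PATTERN = re.compile(
    r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<host>\S+) (?P<severity>\w+) msg=(?P<msg>.*)$"
)

def normalize_legacy_event(raw_line: str) -> dict | None:
    """Parse one legacy log line into a flat dict; return None if it doesn't match."""
    match = LEGACY_PATTERN.match(raw_line.strip())
    if match is None:
        # In a real pipeline, unparsed lines would be routed to a dead-letter path.
        return None
    ts = datetime.strptime(match["ts"], "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)
    return {
        "timestamp": ts.isoformat(),
        "host": match["host"],
        "severity": match["severity"].lower(),
        "message": match["msg"],
        "source": "legacy_app",  # assumed source tag for downstream routing
    }

if __name__ == "__main__":
    sample = "2026-02-09 12:34:56 web-01 ERROR msg=failed login for user admin"
    print(json.dumps(normalize_legacy_event(sample), indent=2))
```

In practice, a pipeline platform would run this logic inside its own transform or function stage rather than as a standalone script; the standalone form here just keeps the example self-contained and runnable.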