TechnoSphere, Inc.

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer focused on cybersecurity, requiring 10+ years in the field and 5+ years with Cribl, Vector, or Splunk. Remote work, competitive pay rate. Key skills include data engineering, scripting (Python, JavaScript), and ETL processes.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
πŸ—“οΈ - Date
January 28, 2026
🕒 - Duration
Unknown
🏝️ - Location
Remote
📄 - Contract
Unknown
🔒 - Security
Unknown
πŸ“ - Location detailed
United States
🧠 - Skills detailed
#Apache NiFi #Kafka (Apache Kafka) #Python #Snowflake #Security #Strategy #Splunk #Scripting #JavaScript #Cybersecurity #Datadog #NiFi (Apache NiFi) #ETL (Extract, Transform, Load) #XML (eXtensible Markup Language) #JSON (JavaScript Object Notation) #Logging #Normalization #Monitoring #Data Transformations #Data Engineering #Storage #Groovy #Data Integration #Scala #Anomaly Detection #Data Pipeline #Observability
Role description
Title: Cybersecurity Data Engineer (SIEM Data Pipeline)
Location: Remote
Description:
• Lead the architecture, design, and implementation of scalable, modular, and reusable data flow pipelines using Cribl, Apache NiFi, Vector, and other open-source platforms, ensuring consistent ingestion strategies across a complex, multi-source telemetry environment.
• Develop platform-agnostic ingestion frameworks and template-driven architectures to enable reusable ingestion patterns, supporting a variety of input types (e.g., syslog, Kafka, HTTP, Event Hubs, Blob Storage) and output destinations (e.g., Snowflake, Splunk, ADX, Log Analytics, Anvilogic).
• Spearhead the creation and adoption of a schema normalization strategy leveraging the Open Cybersecurity Schema Framework (OCSF), including field mapping, transformation templates, and schema validation logic, all designed to be portable across ingestion platforms.
• Design and implement custom data transformations and enrichments using scripting languages such as Groovy, Python, or JavaScript, while enforcing robust governance and security controls (SSL/TLS, client authentication, input validation, logging).
• Collaborate with observability and platform teams to integrate pipeline-level health monitoring, transformation failure logging, and anomaly detection mechanisms.
• Oversee and validate data integration efforts, ensuring high-fidelity delivery into downstream analytics platforms and data stores with minimal data loss, duplication, or transformation drift.
• Lead technical working sessions to evaluate and recommend best-fit technologies, tools, and practices for managing structured and unstructured security telemetry data at scale.
• Implement data transformation logic, including filtering, enrichment, dynamic routing, and format conversions (e.g., JSON ↔ CSV, XML, Logfmt), to prepare data from 100+ sources for downstream analytics platforms (a minimal sketch follows below).
Experience Required:
1. 10+ years of experience working in cybersecurity (mandatory)
2. 5+ years of experience with Cribl, Vector, Datadog, Splunk, or other data pipeline platforms
3. 5+ years of experience with JavaScript, Python, or another scripting language
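To make the transformation work above concrete, here is a minimal Python sketch of a filter, normalize, enrich, and JSON-to-CSV step in the spirit of the role description. The OCSF-style target field names, the FIELD_MAP, and the sample events are hypothetical illustrations, not details of any actual pipeline.

```python
import csv
import io
import json

# Hypothetical OCSF-style field map: raw vendor field -> normalized field.
# Real mappings would come from the OCSF schema and per-source templates.
FIELD_MAP = {
    "src_ip": "src_endpoint.ip",
    "dst_ip": "dst_endpoint.ip",
    "event_time": "time",
    "action": "activity_name",
}

def normalize(event: dict) -> dict:
    """Rename mapped fields to their normalized names; drop the rest."""
    return {FIELD_MAP[k]: v for k, v in event.items() if k in FIELD_MAP}

def enrich(event: dict) -> dict:
    """Stand-in for enrichment lookups (geo-IP, asset tags, etc.)."""
    event["metadata.pipeline"] = "siem-ingest-sketch"
    return event

def to_csv(events: list[dict]) -> str:
    """Format-convert normalized JSON records to CSV for a downstream store."""
    if not events:
        return ""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=sorted(events[0]))
    writer.writeheader()
    writer.writerows(events)
    return buf.getvalue()

if __name__ == "__main__":
    raw = json.loads("""[
        {"src_ip": "10.0.0.5", "dst_ip": "10.0.0.9",
         "event_time": "2026-01-28T12:00:00Z", "action": "deny", "noise": "x"},
        {"src_ip": "10.0.0.7", "dst_ip": "10.0.0.9",
         "event_time": "2026-01-28T12:00:05Z", "action": "allow"}
    ]""")
    # Filter (keep denies only), normalize, and enrich, then emit CSV:
    # the JSON -> CSV conversion step called out in the role description.
    denies = [enrich(normalize(e)) for e in raw if e.get("action") == "deny"]
    print(to_csv(denies))
```

In a Cribl or NiFi deployment, the mapping and routing logic would typically live in pipeline configuration rather than standalone code; the sketch only shows the shape of the transformation.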