

Security Data Pipelines Developer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Security Data Pipelines Developer on a contract-to-hire basis (6–12 months); the pay rate is not disclosed. Remote work is available for U.S.-based candidates. Key skills include Cribl Stream, SPL2, Python, and JavaScript, with experience in security data transformation and compliance required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
May 30, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Contract-to-Hire
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Security #ETL #Grafana #Monitoring #Splunk #Observability #Documentation #JavaScript #Metadata #Python #DataQuality #DataPipeline #DataGovernance #Logstash #Dynatrace #DataEnrichment #Compliance
Role description
Job Title: Security Data Pipelines & Observability Developer
Location: Remote (U.S.-based only)
Type: Contract-to-Hire (6–12 Months)
We’re hiring a Security Data Pipelines Developer to build and optimize security event data pipelines using Cribl Stream, SPL2, and integrations across observability platforms like Splunk, Grafana, and Dynatrace.
This role will be instrumental in transforming raw event data into actionable, compliant, and enriched streams that power our security operations and audit processes.
What You’ll Do
• Build and optimize data pipelines using Cribl Stream based on audit findings and security compliance requirements.
• Transform, enrich, route, and tag security event data to meet internal standards and regulatory frameworks.
• Integrate pipelines with Splunk, Grafana, Dynatrace, Logstash, and other observability platforms.
• Work closely with security operations, audit, and compliance teams to improve data quality and usability.
• Ensure pipelines support real-time monitoring, incident response, and audit traceability.
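As a rough illustration of the transform/enrich/route/tag work described above, here is a minimal Python sketch of the kind of logic typically expressed inside a Cribl Stream pipeline. The field names, compliance-tag mapping, and destinations are hypothetical, not from the posting.

```python
# Hypothetical sketch of security event enrichment, tagging, and routing.
# In practice this logic would live in Cribl Stream pipeline functions;
# field names, tag mappings, and destinations below are assumptions.
from datetime import datetime, timezone

# Assumed mapping from event category to regulatory tags.
COMPLIANCE_TAGS = {"auth": ["PCI-DSS"], "access": ["SOX"]}

def enrich_and_route(event: dict) -> dict:
    """Return a copy of a raw event with metadata tags, an ingest
    timestamp, and a routing destination attached."""
    enriched = dict(event)
    enriched["tags"] = COMPLIANCE_TAGS.get(event.get("category", ""), [])
    enriched["ingested_at"] = datetime.now(timezone.utc).isoformat()
    # Route high-severity events to the SIEM; everything else to archive.
    enriched["destination"] = (
        "splunk" if event.get("severity", 0) >= 7 else "archive"
    )
    return enriched
```

A real pipeline would apply the same pattern declaratively (e.g. Eval and Route functions in Cribl Stream) rather than in hand-written code.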
Must-Have Experience
• 3–5 years of hands-on experience with Cribl Stream, SPL2 (Splunk's Search Processing Language 2), Python, and JavaScript.
• Experience building pipelines for security event data transformation and compliance.
• Proven ability to integrate with observability tools: Splunk, Grafana, Dynatrace, Logstash.
• Deep understanding of metadata enrichment, event routing, and pipeline performance optimization.
• Familiarity with security frameworks and data governance requirements.
Nice-to-Haves
• Exposure to CI/CD pipelines for managing data infrastructure.
• Experience working with audit teams or supporting data compliance initiatives.
Deliverables
• Fully functioning, compliant, and well-documented security data pipelines.
• Seamless integration of security data into observability platforms.
• Documentation and handoff support for ongoing monitoring and response workflows.