

Sr. Data Engineer - (Cyber Team)
⭐ Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. Data Engineer on the Cyber Team, lasting 12 months at a rate of $55 on C2C. Candidates need 14+ years of experience, cybersecurity expertise, and familiarity with mobility clients. Onsite work is required in Bellevue or Overland Park.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: August 29, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Overland Park, KS
Skills detailed: #ETL (Extract, Transform, Load) #Data Governance #XML (eXtensible Markup Language) #Snowflake #Storage #JavaScript #NiFi (Apache NiFi) #Groovy #Documentation #Libraries #Logging #Cybersecurity #Security #Observability #Anomaly Detection #Monitoring #Data Integration #Kafka (Apache Kafka) #Compliance #Apache NiFi #Data Transformations #Python #Metadata #Normalization #JSON (JavaScript Object Notation) #Scripting #Strategy #Scala #Splunk #Data Security #Data Engineering #Data Ingestion
Role description
I have 8 open roles for Data Engineer.
These roles are 4 days onsite, so please send me local candidates. Non-local candidates are also acceptable, but they should be ready to relocate to one of the locations below; no last-minute surprises.
This is a cybersecurity role, so data security is a top priority. Look for candidates coming from mobility clients such as T-Mobile, Verizon, AT&T, etc.
Job Description
Sr. Data Engineer - (Cyber Team)
Duration: 12 months
Rate: $55 on C2C
Experience: 14+ years
Location: Bellevue HQ or Overland Park, onsite 4 days a week
• Location 1: Bellevue
• Location 2: Overland Park, Kansas
Overview
We are seeking eight Senior Data Engineers to lead efforts in orchestrating and transforming complex security telemetry data flows. These individuals will be responsible for high-level architecture, governance, and secure, reliable movement of data between systems, particularly for legacy and non-standard log sources. More than 100 existing and new data sources specific to cybersecurity workloads are in scope. These tasks will be performed on one or more data ingestion pipelines (Cribl, Vector, NiFi).
Project/Initiative: SIEM Modernization
Work Required
• Lead the architecture, design, and implementation of scalable, modular, and reusable data flow pipelines using Cribl, Apache NiFi, Vector, and other open-source platforms, ensuring consistent ingestion strategies across a complex, multi-source telemetry environment.
• Develop platform-agnostic ingestion frameworks and template-driven architectures to enable reusable ingestion patterns, supporting a variety of input types (e.g., syslog, Kafka, HTTP, Event Hubs, Blob Storage) and output destinations (e.g., Snowflake, Splunk, ADX, Log Analytics, Anvilogic).
• Spearhead the creation and adoption of a schema normalization strategy, leveraging the Open Cybersecurity Schema Framework (OCSF), including field mapping, transformation templates, and schema validation logic, designed to be portable across ingestion platforms.
• Design and implement custom data transformations and enrichments using scripting languages such as Groovy, Python, or JavaScript, while enforcing robust governance and security controls (SSL/TLS, client authentication, input validation, logging).
• Ensure full end-to-end traceability and lineage of data across the ingestion, transformation, and storage lifecycle, including metadata tagging, correlation IDs, and change tracking for forensic and audit readiness.
• Collaborate with observability and platform teams to integrate pipeline-level health monitoring, transformation failure logging, and anomaly detection mechanisms.
• Oversee and validate data integration efforts, ensuring high-fidelity delivery into downstream analytics platforms and data stores, with minimal data loss, duplication, or transformation drift.
• Lead technical working sessions to evaluate and recommend best-fit technologies, tools, and practices for managing structured and unstructured security telemetry data at scale.
• Implement data transformation logic including filtering, enrichment, dynamic routing, and format conversions (e.g., JSON to CSV, XML, or Logfmt) to prepare data for downstream analytics platforms, across 100+ data sources.
• Contribute to and maintain a centralized documentation repository, including ingestion patterns, transformation libraries, naming standards, schema definitions, data governance procedures, and platform-specific integration details.
• Coordinate with security, analytics, and platform teams to understand use cases and ensure pipeline logic supports threat detection, compliance, and data analytics requirements.
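The schema-normalization and format-conversion work above can be sketched roughly as follows in Python. This is a minimal illustration only: the field names and the mapping table are hypothetical, not actual OCSF class definitions, and a production pipeline would carry this logic inside Cribl, NiFi, or Vector rather than standalone code.

```python
import csv
import io
import json

# Hypothetical mapping from raw vendor log fields to normalized,
# OCSF-style field names. Real OCSF classes define far more attributes.
FIELD_MAP = {
    "src_ip": "src_endpoint.ip",
    "dst_ip": "dst_endpoint.ip",
    "ts": "time",
    "msg": "message",
}

def normalize_event(raw: dict) -> dict:
    """Rename known fields; preserve unknown ones under 'unmapped'
    so no telemetry is silently dropped during normalization."""
    out, unmapped = {}, {}
    for key, value in raw.items():
        if key in FIELD_MAP:
            out[FIELD_MAP[key]] = value
        else:
            unmapped[key] = value
    if unmapped:
        out["unmapped"] = unmapped
    return out

def to_csv(events: list[dict], columns: list[str]) -> str:
    """Flatten normalized events into CSV for a downstream store,
    ignoring fields outside the requested column set."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=columns, extrasaction="ignore")
    writer.writeheader()
    for event in events:
        writer.writerow(event)
    return buf.getvalue()

# Example: one raw JSON log record normalized and converted to CSV.
raw = json.loads(
    '{"src_ip": "10.0.0.5", "dst_ip": "10.0.0.9", "ts": 1724900000, "vendor_code": 7}'
)
event = normalize_event(raw)
print(to_csv([event], ["time", "src_endpoint.ip", "dst_endpoint.ip"]))
```

Keeping the mapping in a declarative table is what makes the pattern portable across ingestion platforms: the same table can be rendered as a Cribl pipeline function, a NiFi processor configuration, or a Vector transform.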
Skills: data, engineer, security, analytics, platforms, cyber