

Sr. Data Engineer - Onsite (Overland Park, Kansas)
Featured Role | Apply direct with Data Freelance Hub
Country
United States
Currency
$ USD
Day rate
-
Date discovered
September 9, 2025
Project duration
More than 6 months
Location type
Unknown
Contract type
Unknown
Security clearance
Unknown
Location detailed
Overland Park, KS
Skills detailed
#Data Ingestion #Kafka (Apache Kafka) #Scala #ETL (Extract, Transform, Load) #Storage #Apache NiFi #Snowflake #Security #Data Engineering #Splunk #NiFi (Apache NiFi)
Role description
Sr. Data Engineer
Duration: 12 Months
Location: Bellevue HQ or Overland Park; onsite 4 days a week
Project/Initiative: SIEM Modernization
We are seeking eight Senior Data Engineers to lead efforts in orchestrating and transforming complex security telemetry data flows. These individuals will be responsible for high-level architecture, governance, and ensuring secure and reliable movement of data between systems, particularly for legacy and non-standard log sources. More than 100 data sources, both existing and new, specific to cybersecurity workloads are in scope. These tasks will be performed on one or more data ingestion pipelines (Cribl, Vector, NiFi).
Work Required
β’ Lead the architecture, design, and implementation of scalable, modular, and reusable data flow pipelines using Cribl, Apache NiFi, Vector, and other open-source platforms, ensuring consistent ingestion strategies across a complex, multi-source telemetry environment.
β’ Develop platform-agnostic ingestion frameworks and template-driven architectures to enable reusable ingestion patterns, supporting a variety of input types (e.g., syslog, Kafka, HTTP, Event Hubs, Blob Storage) and output destinations (e.g., Snowflake, Splunk, ADX, Log Analytics, Anvilogic).
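The template-driven ingestion pattern described in the bullets above can be sketched as a small registry that maps input types and output destinations to handlers, then composes them into a pipeline. This is an illustrative sketch only: the source and sink names, parse rules, and function names are assumptions for the example, not the API of Cribl, NiFi, Vector, or any other platform named in the role.

```python
# Minimal sketch of a template-driven ingestion framework (illustrative only):
# a pipeline is declared as a template naming a source format and a sink, and
# the framework composes registered parse/write steps around a shared
# normalization stage. All names here are hypothetical, not a vendor API.
from dataclasses import dataclass
from typing import Callable

# Registries mapping input types and output destinations to handlers.
PARSERS: dict[str, Callable[[str], dict]] = {}
WRITERS: dict[str, Callable[[dict], str]] = {}

def parser(name: str):
    def register(fn):
        PARSERS[name] = fn
        return fn
    return register

def writer(name: str):
    def register(fn):
        WRITERS[name] = fn
        return fn
    return register

@parser("syslog")
def parse_syslog(line: str) -> dict:
    # Toy parse of "<pri> host message" into structured fields.
    pri, host, msg = line.split(" ", 2)
    return {"priority": pri.strip("<>"), "host": host, "message": msg}

@writer("snowflake")
def write_snowflake(event: dict) -> str:
    # Stand-in for a real sink: render the event as a row string.
    return f"ROW({event['host']}, {event['message']})"

@dataclass(frozen=True)
class PipelineTemplate:
    source: str       # e.g. "syslog", "kafka", "http"
    destination: str  # e.g. "snowflake", "splunk"

def build_pipeline(tpl: PipelineTemplate) -> Callable[[str], str]:
    # Look up the registered handlers once, return a reusable pipeline.
    parse, write = PARSERS[tpl.source], WRITERS[tpl.destination]
    def run(raw: str) -> str:
        event = parse(raw)
        event["message"] = event["message"].strip()  # shared normalization
        return write(event)
    return run

pipeline = build_pipeline(PipelineTemplate("syslog", "snowflake"))
print(pipeline("<13> fw01 blocked outbound tcp/445 "))
```

New source types or destinations are added by registering one function, so the same template shape covers all 100+ sources; real platforms express the same idea through reusable pipeline configurations rather than Python code.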