Sr. Data Engineer - (Cyber Team)

โญ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. Data Engineer on the Cyber Team, lasting 12 months at a rate of $55 on C2C. Candidates need 14+ years of experience, cybersecurity expertise, and experience with mobility clients. On-site work is required in Bellevue or Overland Park.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
๐Ÿ—“๏ธ - Date discovered
August 29, 2025
🕒 - Project duration
More than 6 months
-
๐Ÿ๏ธ - Location type
On-site
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
๐Ÿ“ - Location detailed
Overland Park, KS
-
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #Data Governance #XML (eXtensible Markup Language) #Snowflake #Storage #JavaScript #NiFi (Apache NiFi) #Groovy #Documentation #Libraries #Logging #Cybersecurity #Security #Observability #Anomaly Detection #Monitoring #Data Integration #Kafka (Apache Kafka) #Compliance #Apache NiFi #Data Transformations #Python #Metadata #Normalization #JSON (JavaScript Object Notation) #Scripting #Strategy #Scala #Splunk #Data Security #Data Engineering #Data Ingestion
Role description
I have 8 open roles for Data Engineer. These roles are 4 days on-site, so please send me local candidates. Non-local candidates are also acceptable, but the candidate should be ready to relocate to one of the locations below – no last-minute surprises. This is a cybersecurity role, so data security is especially important. Look for candidates coming from mobility clients such as T-Mobile, Verizon, AT&T, etc.

Job Description
Sr. Data Engineer - (Cyber Team)
Duration: 12 months
Rate: $55 on C2C
Experience: 14+ years
Location: Bellevue HQ or Overland Park – on-site 4 days a week
• Location 1 (Bellevue)
• Location 2 (Overland Park, Kansas)

Overview
We are seeking eight Senior Data Engineers to lead efforts in orchestrating and transforming complex security telemetry data flows. These individuals will be responsible for high-level architecture, governance, and the secure, reliable movement of data between systems, particularly for legacy and non-standard log sources. More than 100 data sources, both existing and new and specific to cybersecurity workloads, are in scope. These tasks will be performed on one or more data ingestion pipelines (Cribl, Vector, NiFi).

Project/Initiative: SIEM Modernization

Work Required
• Lead the architecture, design, and implementation of scalable, modular, and reusable data flow pipelines using Cribl, Apache NiFi, Vector, and other open-source platforms, ensuring consistent ingestion strategies across a complex, multi-source telemetry environment.
• Develop platform-agnostic ingestion frameworks and template-driven architectures to enable reusable ingestion patterns, supporting a variety of input types (e.g., syslog, Kafka, HTTP, Event Hubs, Blob Storage) and output destinations (e.g., Snowflake, Splunk, ADX, Log Analytics, Anvilogic).
• Spearhead the creation and adoption of a schema normalization strategy leveraging the Open Cybersecurity Schema Framework (OCSF), including field mapping, transformation templates, and schema validation logic, designed to be portable across ingestion platforms.
• Design and implement custom data transformations and enrichments using scripting languages such as Groovy, Python, or JavaScript, while enforcing robust governance and security controls (SSL/TLS, client authentication, input validation, logging).
• Ensure full end-to-end traceability and lineage of data across the ingestion, transformation, and storage lifecycle, including metadata tagging, correlation IDs, and change tracking for forensic and audit readiness.
• Collaborate with observability and platform teams to integrate pipeline-level health monitoring, transformation failure logging, and anomaly detection mechanisms.
• Oversee and validate data integration efforts, ensuring high-fidelity delivery into downstream analytics platforms and data stores with minimal data loss, duplication, or transformation drift.
• Lead technical working sessions to evaluate and recommend best-fit technologies, tools, and practices for managing structured and unstructured security telemetry data at scale.
• Implement data transformation logic including filtering, enrichment, dynamic routing, and format conversions (e.g., JSON ↔ CSV, XML, Logfmt) to prepare data for downstream analytics platforms (100+ sources of data).
• Contribute to and maintain a centralized documentation repository, including ingestion patterns, transformation libraries, naming standards, schema definitions, data governance procedures, and platform-specific integration details.
• Coordinate with security, analytics, and platform teams to understand use cases and ensure pipeline logic supports threat detection, compliance, and data analytics requirements.

Skills: data, engineer, security, analytics, platforms, cyber
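To give candidates a flavor of the transformation work described above (schema field mapping, correlation IDs for lineage, and JSON → CSV conversion), here is a minimal Python sketch. The field names and mapping are hypothetical illustrations, not actual OCSF class attributes, and a production pipeline would perform this inside Cribl, NiFi, or Vector rather than standalone scripts:

```python
import csv
import io
import json
import uuid

# Hypothetical mapping from raw syslog-style keys to normalized,
# OCSF-inspired field names (real OCSF classes define far more attributes).
FIELD_MAP = {
    "src_ip": "src_endpoint_ip",
    "dst_ip": "dst_endpoint_ip",
    "msg": "message",
    "sev": "severity",
}

def normalize(raw: dict) -> dict:
    """Map raw event fields to the target schema and tag lineage metadata."""
    event = {dst: raw[src] for src, dst in FIELD_MAP.items() if src in raw}
    # A correlation ID supports end-to-end traceability across pipeline hops.
    event["correlation_id"] = str(uuid.uuid4())
    return event

def to_csv(events: list[dict]) -> str:
    """Convert a batch of normalized JSON events to CSV for a downstream store."""
    columns = sorted({key for ev in events for key in ev})
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=columns)
    writer.writeheader()
    writer.writerows(events)
    return buf.getvalue()

raw_line = '{"src_ip": "10.0.0.5", "dst_ip": "10.0.0.9", "sev": 3, "msg": "denied"}'
normalized = normalize(json.loads(raw_line))
print(to_csv([normalized]))
```

In the real platforms this same logic would live in a Cribl pipeline function, a NiFi ExecuteScript/JoltTransform processor, or a Vector remap transform; the sketch only shows the shape of the mapping and lineage-tagging steps.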