Jobs via Dice

Data Modeler with Data Security

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Data Modeler with Data Security, offering a contract in Chicago, IL. It requires expertise in data architecture, security datasets, and governance. Preferred certifications include CISSP or PMP. A minimum of seven years of relevant experience is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 2, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Vulnerability Management #Data Analysis #Monitoring #Data Security #Consulting #Data Strategy #Normalization #Data Architecture #Tableau #Data Lake #MDM (Master Data Management) #Leadership #Cloud #Security #Computer Science #Data Quality #Migration #Strategy #Datasets
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, iFusion Inc., is seeking the following. Apply via Dice today!

Location: Chicago, IL (Hybrid)

Duties:
This contract role is accountable for architecting and operationalizing the design phase of the client's Unified Vulnerability Management (UVM) and Security Data Lake initiative. The role will define the canonical data model, data quality and normalization patterns (e.g., Bronze/Silver/Gold), and the integration approach across scanners, asset sources, and remediation workflows to deliver a single, normalized view of the vulnerability landscape. The contractor partners with Security Architecture, Vulnerability Management, AppSec, and Enterprise Data teams to evaluate platform options (e.g., Axonius vs. a broader data platform), establish vendor-agnostic data portability, and ready the program for 2026 delivery stages. This is an individual-contributor role providing guidance on directing, evaluating, developing, implementing, communicating, operating, monitoring, and maintaining the department-wide Information Security data strategy.

• Lead the Decision/Design Phase: Produce the target-state architecture, integration patterns, and a delivery roadmap aligned to the phases outlined in the UVM + Security Data Lake business case.
• Canonical Data Model & Normalization: Define the vulnerability/asset/exposure canonical model and the Bronze/Silver/Gold (medallion) approach for ingestion, conformance, and consumption.
• Data Quality & Governance: Establish data quality rules, metrics, and SLAs across sources; define controls for lineage, cataloging, and business definitions.
• Tooling Strategy (vendor-agnostic posture): Evaluate the role of Axonius vs. alternative/adjacent data services; document integration points and trade-offs.
• Inter-Tool Mapping & Ingestion: Design and prototype ingestion/mapping for key systems and define the normalization schema and harmonized identifiers.
• Prioritization Methodology: Specify the scoring framework that fuses external severity with internal business-risk factors.
• Workflow & Ticketing Integration: Define integration patterns to ServiceNow for ticket creation, assignment, and status telemetry.
• Standards Alignment: Ensure architecture and data flows comply with the CNA Vulnerability Management Standard and Security Architecture guidance.
• Reporting & Consumption: Specify flexible reporting for technical and leadership stakeholders; define certified semantic layers and downstream access.
• Vendor & Platform Due Diligence: Contribute to market scans and structured evaluations with explicit success criteria and migration considerations.
• Knowledge Transfer: Create a runbook and handoff plan to operations and engineering teams for the 2026 delivery stages.

Skills:
• Data Architecture & Modeling: Expert in canonical modeling, medallion/Delta patterns, data contracts, and MDM/ER techniques.
• Security Data Domain: Working fluency with vulnerability/asset/security datasets and how they drive CTEM outcomes.
• Integration & Pipelines: Hands-on with ingestion frameworks and schema evolution for scanners and asset sources.
• Governance & Quality: Proven ability to define data quality rules, metrics, lineage, cataloging, and common definitions.
• Workflow/ITSM: Familiarity with ServiceNow data models and ticket orchestration.
• Communication & Influence: Ability to align diverse stakeholders around a single architecture and phased plan.

Education:
• Bachelor's degree in Computer Science or a related discipline, or equivalent work experience.
• Typically a minimum of seven years of technical experience in data analytics, security vulnerability analysis, remediation management, data architecture, or security data strategy.
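To illustrate the Bronze/Silver/Gold (medallion) normalization and harmonized-identifier work described in the duties above, here is a minimal sketch in Python. All scanner names, field names, and the severity mapping are hypothetical assumptions for illustration, not the client's actual schema or any vendor's API:

```python
# Hypothetical raw findings as two different scanners might report them (Bronze layer).
# Scanner and field names are illustrative only.
bronze = [
    {"scanner": "scannerA", "host": "web01", "cve": "CVE-2024-0001", "sev": "HIGH"},
    {"scanner": "scannerB", "hostname": "web01", "vuln_id": "CVE-2024-0001", "severity": 8.1},
    {"scanner": "scannerA", "host": "db01", "cve": "CVE-2024-0002", "sev": "MEDIUM"},
]

# Assumed mapping from categorical severities to a numeric scale.
SEV_MAP = {"LOW": 3.0, "MEDIUM": 5.0, "HIGH": 8.0, "CRITICAL": 9.5}

def to_silver(record):
    """Conform one raw finding to a canonical schema with a harmonized identifier."""
    asset = record.get("host") or record.get("hostname")
    vuln = record.get("cve") or record.get("vuln_id")
    raw_sev = record.get("sev") or record.get("severity")
    score = SEV_MAP[raw_sev] if isinstance(raw_sev, str) else float(raw_sev)
    return {"asset_id": asset, "vuln_id": vuln, "severity": score,
            "source": record["scanner"], "finding_key": f"{asset}:{vuln}"}

# Silver layer: every record conformed to one schema.
silver = [to_silver(r) for r in bronze]

# Gold layer: one deduplicated row per (asset, vulnerability), keeping the highest severity.
gold = {}
for rec in silver:
    key = rec["finding_key"]
    if key not in gold or rec["severity"] > gold[key]["severity"]:
        gold[key] = rec

print(len(gold))  # → 2 unique findings from 3 raw records
```

The harmonized `finding_key` is what lets findings for the same asset/vulnerability pair be reconciled across scanners, which is the core of the "single, normalized view" the role is asked to deliver.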
Preferred:
• Experience in consulting or technical account management
• CISSP, CCSP, PMP, Network+, and/or Security+
• Experience with Axonius, ArmorCode, Tenable One, Brinqa, Kenna Security, or similar asset-aggregation platforms
• Familiarity with vulnerability and remediation management data analysis
• Cloud data platform experience (e.g., BigQuery, Power BI, Tableau)
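The prioritization methodology in the duties calls for a scoring framework that fuses external severity with internal business-risk factors. A minimal sketch of one such framework, with entirely assumed weights and factors (the actual methodology is for the contractor to specify):

```python
def priority_score(cvss, asset_criticality, internet_exposed, exploit_available,
                   w_sev=0.5, w_crit=0.3, w_ctx=0.2):
    """Fuse external severity (CVSS, 0-10) with internal business-risk factors.

    asset_criticality: assumed 1-5 business-criticality tier.
    Weights and context factors are illustrative assumptions, not a mandated framework.
    """
    # Contextual exposure: flat bonuses for internet exposure and known exploits.
    context = (5.0 if internet_exposed else 0.0) + (5.0 if exploit_available else 0.0)
    # Map the 1-5 criticality tier onto the same 0-10 scale as CVSS.
    crit = asset_criticality * 2.0
    return round(w_sev * cvss + w_crit * crit + w_ctx * context, 2)

# An internet-facing, business-critical asset with a known exploit outranks
# the same CVSS score on a low-criticality internal asset.
print(priority_score(9.8, 5, True, True))    # → 9.9
print(priority_score(9.8, 1, False, False))  # → 5.5
```

Keeping the weights as parameters makes the framework tunable per business unit without changing the scoring code, which fits the vendor-agnostic posture the posting emphasizes.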