

Data Platforms Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Platforms Engineer on a contract of unspecified length, offering a pay rate of $100-112/hr. Required skills include data engineering, ETL, and cloud data lakes. Experience with physical security data and data visualization tools is highly desirable.
Country
United States
Currency
$ USD
Day rate
$896
Date discovered
July 2, 2025
Project duration
Unknown
Location type
Unknown
Contract type
Unknown
Security clearance
Unknown
Location detailed
United States
Skills detailed
#Azure #Spark (Apache Spark) #AWS (Amazon Web Services) #Data Pipeline #Data Privacy #BI (Business Intelligence) #Computer Science #Data Engineering #Airflow #Documentation #GDPR (General Data Protection Regulation) #Python #ETL (Extract, Transform, Load) #Data Governance #Data Integration #Metadata #Data Extraction #Compliance #Looker #SQL (Structured Query Language) #Visualization #Data Lake #Tableau #Cloud #Microsoft Power BI #GCP (Google Cloud Platform) #SaaS (Software as a Service) #Security #Data Enrichment
Role description
Enterprise Security Technology Data Platforms Engineer
Key Responsibilities
Data Source Identification & Assessment:
• Identify, document, and assess data sources from various physical security systems (e.g., access control, video management, visitor management, incident management). Evaluate data structures, volume, quality, and integration options (APIs, direct DB access, file exports).
Data Extraction & Ingestion:
• Design, develop, and implement robust data pipelines for extracting and ingesting data from diverse security systems into the data lake, ensuring secure and efficient data transfer.
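To make the shape of this work concrete, here is a minimal Python sketch of one such pipeline: it pulls events from a hypothetical access-control REST API and lands the raw JSON in an S3 raw zone. The endpoint, token handling, response shape, and bucket name are illustrative assumptions, not details from this posting.

```python
import json
from datetime import datetime, timezone

import boto3     # AWS SDK; the posting mentions AWS/Azure/GCP data lakes generally
import requests

API_URL = "https://example-acs.invalid/api/v1/events"  # hypothetical access-control API
API_TOKEN = "..."                                      # in practice, fetched from a secrets manager
BUCKET = "security-data-lake-raw"                      # hypothetical raw-zone bucket

def extract_events(since_iso: str) -> list[dict]:
    """Pull access-control events newer than `since_iso` from the SaaS API."""
    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        params={"since": since_iso},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["events"]  # response shape is an assumption

def land_raw(events: list[dict]) -> str:
    """Write the raw payload to the lake's raw zone, partitioned by date."""
    now = datetime.now(timezone.utc)
    key = f"raw/access_control/dt={now:%Y-%m-%d}/events_{now:%H%M%S}.json"
    boto3.client("s3").put_object(
        Bucket=BUCKET,
        Key=key,
        Body=json.dumps(events).encode("utf-8"),
    )
    return key

if __name__ == "__main__":
    key = land_raw(extract_events("2025-07-01T00:00:00Z"))
    print(f"landed s3://{BUCKET}/{key}")
```

Landing raw payloads unmodified, before any transformation, keeps the lake replayable if mapping logic changes later.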
Data Transformation & Harmonization:
• Clean, normalize, and standardize disparate data formats. Harmonize data models to enable unified querying and analysis across multiple systems.
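As a rough illustration of harmonization, the sketch below maps two invented source formats, an access-control export and a visitor-management export, onto one unified event model; every field name here is an assumption.

```python
from datetime import datetime, timezone

# Unified event model (illustrative; real target schemas vary by organization).
UNIFIED_FIELDS = ("event_time", "source_system", "person_id", "location", "event_type")

def from_access_control(raw: dict) -> dict:
    """Map a hypothetical access-control record onto the unified model."""
    return {
        "event_time": datetime.fromtimestamp(raw["ts_epoch"], tz=timezone.utc).isoformat(),
        "source_system": "access_control",
        "person_id": raw["badge_holder_id"],
        "location": raw["door_name"],
        "event_type": raw["result"].lower(),  # e.g., "granted" / "denied"
    }

def from_visitor_mgmt(raw: dict) -> dict:
    """Map a hypothetical visitor-management record onto the same model."""
    return {
        "event_time": raw["checkin_time"],  # already ISO 8601 in this invented source
        "source_system": "visitor_management",
        "person_id": raw["visitor_email"],
        "location": raw["site"],
        "event_type": "visitor_checkin",
    }
```

Once both sources share one schema, "who was where, when" queries can span systems without per-system SQL.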
Data Enrichment:
• Integrate and enrich security data with contextual information from internal and external sources (e.g., HR data, organizational hierarchies, location metadata).
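A minimal enrichment sketch, assuming a pandas workflow and an HR roster extract; the column names are invented for illustration.

```python
import pandas as pd

def enrich(events: pd.DataFrame, hr_roster: pd.DataFrame) -> pd.DataFrame:
    """Attach department and home-site metadata from an HR extract to each event."""
    return events.merge(
        hr_roster[["person_id", "department", "home_site"]],
        on="person_id",
        how="left",  # keep events with no HR match (visitors, contractors)
    )
```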
Data Lake Integration:
• Ensure seamless integration of processed and enriched data into our existing data lake architecture. Maintain documentation on data schemas, lineage, and quality metrics.
Documentation & Knowledge Transfer:
• Produce comprehensive documentation for all developed pipelines, transformation logic, and data models. Deliver knowledge transfer sessions to internal teams for ongoing maintenance and enhancement.
Data Visualization & Reporting:
• Collaborate with security program stakeholders to design and develop dashboards and visualizations, identifying key metrics and creating intuitive, actionable reports.
Data Privacy & Security:
• Adhere to data privacy regulations (e.g., GDPR, CCPA) and our internal data governance policies. Implement data minimization, anonymization, and robust access controls.
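One common pattern for the minimization and anonymization called for here, sketched with an illustrative key and field names: direct identifiers are replaced by a keyed hash (pseudonymization), so analysts can still count and join events per person without seeing the underlying badge ID or email.

```python
import hashlib
import hmac

PSEUDONYM_KEY = b"rotate-me-via-a-secrets-manager"  # illustrative only; never hard-code

def pseudonymize(person_id: str) -> str:
    """Replace a direct identifier with a keyed hash; the key stays server-side."""
    return hmac.new(PSEUDONYM_KEY, person_id.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

def minimize(event: dict) -> dict:
    """Forward only the fields downstream dashboards actually need."""
    return {
        "event_time": event["event_time"],
        "person_id": pseudonymize(event["person_id"]),
        "location": event["location"],
        "event_type": event["event_type"],
    }
```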
Qualifications
• Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience).
• 5+ years of experience in data engineering, with a focus on ETL, data integration, or analytics platforms.
• Experience working with physical security or operational technology data sources is highly desirable.
• Experience pulling data from SaaS systems via APIs.
• Familiarity with the Genetec Security Center platform is desirable.
• Strong proficiency in building data pipelines and ETL workflows (e.g., using Python, SQL, Spark, Airflow); a minimal Airflow sketch follows this list.
• Hands-on experience with cloud data lake environments (e.g., AWS, Azure, GCP).
• Proven ability to design and implement data models for complex, heterogeneous data sources.
• Strong understanding of data privacy, security, and compliance best practices.
• Experience with data visualization tools (e.g., Tableau, Power BI, Looker) is a plus.
• Excellent documentation, communication, and collaboration skills.
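For orientation, here is a minimal Airflow DAG sketch (assuming Airflow 2.4+) chaining extract, transform, and load steps on an hourly schedule; the task bodies are stubs and all names are illustrative.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    ...  # pull events from the source API (see the extraction sketch above)

def transform(**_):
    ...  # harmonize records into the unified event model

def load(**_):
    ...  # write curated records to the data lake

with DAG(
    dag_id="physical_security_events",
    start_date=datetime(2025, 7, 1),
    schedule="@hourly",  # cadence is an assumption; it depends on the source system
    catchup=False,
) as dag:
    (
        PythonOperator(task_id="extract", python_callable=extract)
        >> PythonOperator(task_id="transform", python_callable=transform)
        >> PythonOperator(task_id="load", python_callable=load)
    )
```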
Pay Rate Range:
• $100-112/hr.