Lead Data Engineer with Data Security

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Engineer with Data Security, lasting 18 months, based in Glendale, CA (Hybrid). Pay is $70-$80/hour C2C. Requires expertise in Snowflake, Tableau, and Fivetran, plus compliance with GDPR, CCPA, and HIPAA regulations.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$560-$640 (derived from the $70-$80/hour rate)
🗓️ - Date discovered
April 22, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Glendale, CA
🧠 - Skills detailed
#Apache Spark #Compliance #Airflow #Security #Computer Science #Datasets #Storage #Databricks #Cloud #Delta Lake #ACID (Atomicity, Consistency, Isolation, Durability) #Data Integration #Data Processing #Fivetran #Spark (Apache Spark) #Data Orchestration #Visualization #Batch #ETL (Extract, Transform, Load) #Data Encryption #Kafka (Apache Kafka) #Data Privacy #Data Security #Data Pipeline #Scala #GDPR (General Data Protection Regulation) #Snowflake #Tableau #Data Engineering
Role description

Job Title: Sr. Data Security Engineer

Location: Glendale, CA (Hybrid – 2 days onsite/week, must live within 30 miles; up to 4 days onsite if required)

Duration: 18 Months

Interview Mode: 2 Rounds via Video

Pay Rate: $70-$80/hour C2C (All-Inclusive)

Job Description:

The client is seeking a highly skilled Sr. Data Security Engineer to join their Data Engineering team. This role will be instrumental in securing data infrastructure and pipelines across a modern data ecosystem. The ideal candidate will have hands-on experience with key data tools and a strong background in end-to-end data security and governance.

Key Responsibilities:

   • Implement and maintain data security practices across data platforms including Snowflake, Tableau, Fivetran, and Immuta.

   • Manage the complete data security lifecycle: data creation, ingestion, transformation, storage, and access control.

   • Ensure compliance with internal security frameworks and external regulatory requirements (GDPR, CCPA, HIPAA).

   • Collaborate with data engineering and platform teams to secure ETL/ELT processes.

   • Design and enforce user access controls, data masking, and encryption mechanisms across tools and platforms.

   • Utilize Airflow for orchestration of secure, scheduled data workflows.

   • Implement real-time data streaming and processing solutions using Kafka and Apache Spark.

   • Build and scale secure, performant data pipelines and analytics platforms using Databricks and Delta Lake.

   • Conduct regular security audits and vulnerability assessments of data systems.
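
The masking and access-control responsibilities above can be sketched in plain Python. This is a minimal illustration of the concept only, not tied to any specific platform; the role names, permission table, and masking rule are hypothetical (in practice, Snowflake or Immuta enforce this with declarative policies rather than application code):

```python
# Hypothetical role-to-permission mapping for illustration only.
ROLE_CAN_SEE_PII = {"security_admin": True, "analyst": False}

def mask_email(email: str) -> str:
    """Redact the local part of an email address, keeping the domain."""
    local, _, domain = email.partition("@")
    return local[0] + "***@" + domain if domain else "***"

def read_record(record: dict, role: str) -> dict:
    """Return a copy of the record, masking PII for unprivileged roles."""
    if ROLE_CAN_SEE_PII.get(role, False):
        return dict(record)
    masked = dict(record)
    masked["email"] = mask_email(record["email"])
    return masked

row = {"id": 7, "email": "jane.doe@example.com"}
print(read_record(row, "analyst"))         # email is masked
print(read_record(row, "security_admin"))  # email is visible
```

The same pattern (check the caller's role, then redact sensitive columns before returning data) underlies dynamic data masking in the platforms listed below.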

Required Technical Skills:

Data Security Tools Experience:

   • Snowflake: Strong experience with cloud data warehousing, data encryption, access controls, and role-based permissions.

   • Tableau: Familiarity with secure data visualization, implementing row-level security, and controlling access to sensitive dashboards/data sources.

   • Fivetran: Knowledge of secure data integration pipelines, encryption-in-transit, and secure connectors.

   • Immuta: Hands-on experience with data policy enforcement, dynamic data masking, and automated access controls.
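
Tableau's row-level security and Immuta's automated access controls both reduce to the same idea: filter rows by the requesting user's entitlements before the data is served. A toy sketch in Python, with a made-up user/region entitlement table:

```python
# Hypothetical entitlement table: which regions each user may query.
ENTITLEMENTS = {"alice": {"US"}, "bob": {"US", "EU"}}

SALES = [
    {"region": "US", "amount": 100},
    {"region": "EU", "amount": 250},
]

def rows_for(user: str, table: list) -> list:
    """Apply a row-level security filter: keep only rows the user is entitled to see."""
    allowed = ENTITLEMENTS.get(user, set())
    return [row for row in table if row["region"] in allowed]

print(rows_for("alice", SALES))  # only the US row
print(rows_for("bob", SALES))    # both rows
```

In production this filter lives in the platform (a Tableau user filter or an Immuta policy), so it cannot be bypassed by individual dashboards or queries.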

End-to-End Data Security Ownership:

   • Proven track record of owning data security across the full lifecycle, from ingestion to consumption.

   • Strong understanding of compliance, governance, and audit controls in enterprise data platforms.

Tech Stack:

   • Airflow: Experience building and scheduling DAGs, with a focus on data orchestration security.

   • Spark: Experience with batch and streaming data processing using Apache Spark in secure environments.

   • Databricks: Hands-on experience building scalable, secure pipelines using Databricks on top of Spark.

   • Delta Lake: Familiarity with ACID-compliant storage and managing secure, large-scale datasets.

   • Kafka: Real-time data streaming experience with Kafka, including secure stream processing and high-throughput data flow.
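
Secure stream processing with Kafka typically layers TLS for transport plus message-level integrity checks. The integrity piece can be illustrated with Python's standard hmac module; the shared key and message format here are invented for the sketch (real deployments pull keys from a secret manager):

```python
import hashlib
import hmac
import json

SECRET_KEY = b"example-shared-secret"  # hypothetical; load from a secret manager in practice

def sign(payload: dict) -> bytes:
    """Compute an HMAC-SHA256 tag over the canonically serialized message."""
    body = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(SECRET_KEY, body, hashlib.sha256).digest()

def verify(payload: dict, tag: bytes) -> bool:
    """Constant-time check that the message was not tampered with in transit."""
    return hmac.compare_digest(sign(payload), tag)

msg = {"event": "user_update", "user_id": 42}
tag = sign(msg)
assert verify(msg, tag)          # untouched message verifies
msg["user_id"] = 99              # tampering in transit...
assert not verify(msg, tag)      # ...invalidates the tag
```

A producer would attach the tag to each record and consumers would reject records whose tag fails to verify, giving end-to-end integrity on top of the broker's transport security.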

Preferred Qualifications:

   • Experience in large-scale data environments in the entertainment or media industry.

   • Familiarity with data privacy regulations (GDPR, HIPAA, CCPA).

   • Excellent collaboration and communication skills.

   • Bachelor’s or Master’s degree in Computer Science, Information Security, or a related field.