Tech Observer

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in Seattle, WA, requiring 4-5 days in-office. The contract involves designing data architecture with Databricks and Unity Catalog, managing ETL pipelines, and implementing data governance. Key skills include data security, compliance, and machine learning.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
520
-
πŸ—“οΈ - Date
March 14, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Seattle, WA
-
🧠 - Skills detailed
#Metadata #Data Governance #Strategy #Data Pipeline #Databricks #Scala #"ETL (Extract, Transform, Load)" #Data Security #Compliance #Data Architecture #Batch #Data Lineage #Business Analysis #Collibra #Data Management #Data Science #Security #Data Engineering #ML (Machine Learning) #Data Integrity
Role description
Data Platform Engineer with Databricks and Unity Catalog

Location: Seattle, WA (local candidates only). Working 4-5 days per week in the office is a must.

• Design and implement scalable data architecture using technologies such as Databricks, Unity Catalog, Privacera, and Collibra.
• Develop, optimize, and manage ETL data pipelines within Databricks, ensuring high levels of data integrity, reliability, and performance.
• Design and maintain robust data models and schemas while integrating governance frameworks through Unity Catalog and Collibra.
• Implement data security, access control, and regulatory compliance using capabilities provided by Databricks and Privacera.
• Establish and enforce a comprehensive data governance strategy, including metadata management, data lineage, quality standards, and governance policies.
• Operationalize machine learning models in both batch and real-time data pipelines while ensuring alignment with governance and compliance frameworks.
• Collaborate with cross-functional teams, including data scientists, engineers, and business analysts, to translate complex business requirements into scalable and efficient data solutions.