

VeeAR Projects Inc.
Data Engineering Leader
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineering Leader; the contract length and pay rate are unspecified. Key skills include experience with Snowflake, ETL processes, and data warehousing. Strong communication with engineering teams is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 4, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Oakland, CA
-
🧠 - Skills detailed
#Microsoft Power BI #Databricks #Scrum #BigQuery #Databases #Data Engineering #Snowflake #Redshift #Informatica #Big Data #Data Warehouse #BI (Business Intelligence) #Visualization #ETL (Extract, Transform, Load)
Role description
Core technical expectations:
• Strong data ecosystem background: experience leading data products or data warehousing initiatives.
• Solid understanding of schemas, databases, and ETL (Extract, Transform, Load) processes; no hands-on coding expected, but must be able to engage deeply with engineers and "know what they are talking about."
• Primary data warehouse platform: Snowflake.
• Broader big data background acceptable if not purely Snowflake (e.g., Databricks, BigQuery, Redshift).
• Current ETL tool: Informatica; hands-on Informatica expertise is not required.
• Analytics/visualization not in scope; Power BI knowledge is a nice-to-have, not mandatory.
Role and scope:
• Titles vary across industry: product owner (PO), project manager, scrum master, TPM; at PG&E, similar roles may be labeled "product managers."
• Not seeking a pure scrum master or a typical external-facing PO who only writes requirements.
• Role must interface with both business and engineers; expected to work directly with engineering teams.
• Focus is on the enterprise data platform and data engineering; not an analytics/visualization role.
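For illustration only (not a requirement of the posting): the ETL pattern the expectations above name can be sketched in a few lines of Python. All table and field names here are hypothetical; a real pipeline would use a warehouse connector (e.g. Snowflake's) and an orchestration tool rather than in-memory lists.

```python
# Minimal ETL sketch: extract raw records, transform them to the
# target schema, and load them into a (stand-in) warehouse table.
# Hypothetical names throughout -- illustrative only.

def extract(source_rows):
    """Extract: pull raw records from a source system."""
    return list(source_rows)

def transform(rows):
    """Transform: clean and reshape records to match the target schema,
    dropping records with missing amounts."""
    return [
        {"customer_id": r["id"], "amount_usd": round(r["amount"], 2)}
        for r in rows
        if r.get("amount") is not None
    ]

def load(rows, target_table):
    """Load: append transformed records to the target table."""
    target_table.extend(rows)
    return len(rows)

source = [{"id": 1, "amount": 19.999}, {"id": 2, "amount": None}]
warehouse_table = []
loaded = load(transform(extract(source)), warehouse_table)
```

The point of the sketch is the vocabulary, not the code: a leader in this role is expected to reason about where schema decisions, data-quality filtering, and load semantics live in a pipeline like this, even without writing it.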






