

Data Engineer or Data Product Owner
Featured Role | Apply direct with Data Freelance Hub
This is a contract-to-hire Data Engineer or Data Product Owner role in Foster City, CA (Hybrid - 3 days onsite). It requires 3+ years of ETL development experience, plus SQL, Python, and BI tools such as Tableau. Familiarity with SAP financial data is preferred.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
July 31, 2025
Project duration
Unknown
Location type
Hybrid
Contract type
Unknown
Security clearance
Unknown
Location detailed
Foster City, CA
Skills detailed
#Data Warehouse #Databricks #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Scripting #Data Engineering #Tableau #Data Integrity #SAP #Visualization #BI (Business Intelligence) #Python #Requirements Gathering #Data Pipeline
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Ab Ovo Inc, is seeking the following. Apply via Dice today!
Our direct pharmaceutical client is looking for a Data Engineer or Data Product Owner in Foster City, CA (Hybrid - 3 days onsite).
This is a contract-to-hire opportunity.
Top 3 Required Skill Sets:
• Experience with ETL development and data pipeline orchestration (e.g., Databricks Workflows).
• Experience with data querying languages (e.g., SQL) and scripting languages (e.g., Python).
• Experience with BI visualization tools (e.g., Tableau); see the illustrative sketch after this list.
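For illustration only (this sketch is not part of the client's posting), the snippet below shows the kind of minimal extract-transform-load step these skills describe: a small hypothetical dataset is transformed in Python, loaded into an in-memory SQLite table standing in for a warehouse, and read back with SQL. All table and column names are made up for the example.

```python
# Minimal ETL sketch (illustrative only): hypothetical invoice records are
# extracted, aggregated in Python, loaded into SQLite, and queried with SQL.
import sqlite3

import pandas as pd

# Extract: a real pipeline would pull from a source system (e.g., an SAP export);
# a tiny hand-built DataFrame stands in for that source here.
raw = pd.DataFrame(
    {
        "invoice_id": [1, 2, 3, 4],
        "cost_center": ["R&D", "R&D", "Ops", "Ops"],
        "amount_usd": [1200.0, 800.0, 430.0, 150.0],
    }
)

# Transform: aggregate spend per cost center.
summary = raw.groupby("cost_center", as_index=False)["amount_usd"].sum()

# Load: write the result to a warehouse-style table (in-memory SQLite here).
conn = sqlite3.connect(":memory:")
summary.to_sql("spend_by_cost_center", conn, index=False)

# Downstream consumers (e.g., a Tableau data source) could read it back with SQL.
print(pd.read_sql("SELECT * FROM spend_by_cost_center ORDER BY amount_usd DESC", conn))
```

In practice, an orchestrator such as Databricks Workflows would schedule steps like these as separate tasks rather than one script, but the extract-transform-load shape is the same.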
Job Description:
• You will play an active role in maturing RGA's data and analytics capabilities by bridging technical excellence with an understanding of business processes and needs.
• You will own responsibilities end to end, including requirements gathering, architecture, development, and change management.
• To ensure data integrity, you will prioritize architectural efficiency and adopt development standards that give end users a consistent experience.
• As you reduce the operational overhead of data pipelines, you will use the additional bandwidth to experiment with new data and analytical solutions, prioritizing value added to the team.
Required Years of Experience:
• 3+ years of work experience where the primary responsibility involves working with data pipelines.
• Experience with ETL development and data pipeline orchestration (e.g., Databricks Workflows).
• Experience with data querying languages (e.g., SQL) and scripting languages (e.g., Python).
• Experience with BI visualization tools (e.g., Tableau).
• Familiarity with financial data on SAP.
• Nice to have: Experience with database or data warehouse internals.