

RCI-GILD-16412 Data Product Owner / Data Engineer (Financial Data) - Foster City, CA (Hybrid)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Product Owner/Data Engineer in Foster City, CA (Hybrid). Contract length is unspecified; the listed day rate is $480. Requires 4+ years of experience with data pipelines and expertise in ETL, SQL, Python, and BI tools such as Tableau.
Country: United States
Currency: $ USD
Day rate: $480
Date discovered: July 29, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: San Mateo, CA
Skills detailed: #Data Science #SAP #Requirements Gathering #BI (Business Intelligence) #Databricks #Statistics #Data Pipeline #Scripting #ETL (Extract, Transform, Load) #Computer Science #Data Warehouse #Python #Visualization #Data Engineering #Data Integrity #Mathematics #Tableau #SQL (Structured Query Language)
Role description
Job Title: Data Product Owner or Data Engineer
Location: Foster City, CA (3 days a week onsite, 2 days remote)
Unique Selling Point of this role: Join an emerging and dynamic team with broad purview over the company's financial data. Potential opportunity for FTE conversion.
Job Description:
• As part of the Risk Analytics group within Risk Governance and Audit (RGA), this role owns the delivery of financial data products that are crucial to internal audit processes.
• You will play an active role in maturing RGA's data and analytics capabilities by bridging technical excellence with an understanding of business processes and needs.
• You will own responsibilities end to end, including requirements gathering, architecture, development, and change management.
• To ensure data integrity, you will prioritize architectural efficiency and adopt development standards that give end users a consistent experience.
• As you reduce the operational overhead of data pipelines, you will use the additional bandwidth to experiment with new data and analytical solutions, prioritizing added value to the team.
Required Years of Experience: 4+ years of work experience where the primary responsibility involves working with data pipelines (or 3+ years of experience with a Master's).
Top 3 Required Skill Sets:
Experience with ETL development and data pipeline orchestration (e.g., Databricks Workflows); a minimal, illustrative sketch follows this list.
Expertise with data querying languages (e.g., SQL) and scripting languages (e.g., Python).
End-to-end expertise with BI visualization tools (e.g., Tableau).
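To make the first two skill sets concrete, here is a minimal ETL sketch, assuming PySpark on Databricks; the table names (raw_finance.gl_entries, curated_finance.gl_monthly) and columns are hypothetical and not taken from the posting.

    # Minimal illustrative ETL step: extract a raw table, transform it with SQL,
    # and load a curated table that a BI tool such as Tableau could read.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("gl-monthly-rollup").getOrCreate()

    # Extract: read the (hypothetical) raw general-ledger entries
    raw = spark.read.table("raw_finance.gl_entries")
    raw.createOrReplaceTempView("gl_entries")

    # Transform: aggregate monthly totals per cost center using Spark SQL
    monthly = spark.sql("""
        SELECT cost_center,
               date_trunc('month', posting_date) AS posting_month,
               SUM(amount_usd)                   AS total_amount_usd
        FROM gl_entries
        GROUP BY cost_center, date_trunc('month', posting_date)
    """)

    # Load: overwrite the curated table consumed by downstream dashboards
    monthly.write.mode("overwrite").saveAsTable("curated_finance.gl_monthly")

In a Databricks Workflow, a step like this would typically run as a scheduled task, with the orchestration layer handling retries and dependencies between tasks.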
Top 3 Nice to Have Skill Sets:
Familiarity with financial data on SAP.
2+ years of work experience with internal financial data.
Experience with database or data warehouse internals.
Required Degree or Certification: Bachelor's degree in Computer Science, Computer Engineering, Data Science, Statistics, Mathematics, or a related field.
Any Disqualifiers? Candidates must have at least some proficiency in Python and SQL.