Data Analyst

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Ops Lead Contractor based in the SF Bay Area, with an unspecified contract duration and pay rate. It requires 5-7 years of experience in data operations and proficiency in AWS Iceberg, Snowflake, and ETL/ELT pipeline development.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
August 29, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
San Francisco Bay Area
🧠 - Skills detailed
#Scala #ETL (Extract, Transform, Load) #Databases #Airflow #Data Governance #Microsoft Power BI #AWS (Amazon Web Services) #Data Pipeline #Snowflake #Data Quality #Data Engineering #Python #BI (Business Intelligence) #Data Analysis #Documentation #Lean
Role description
Day 1 onsite role, ideally based in the SF Bay Area. Experience level: 5-7 years.

Reviews, evaluates, and designs company databases. Identifies data sources, constructs data decomposition diagrams, provides data flow diagrams, and documents the process. Writes code for database quality, including stored procedures. May require a bachelor's degree in a related area and 0-2 years of experience in the field or in a related area. Has knowledge of commonly used concepts, practices, and procedures within a particular field. Reports to a project leader or manager.

With the strategic move to AWS Iceberg and Snowflake, Data Ops engineers capture, transform, and curate clickstream data for both early ad hoc analysis and robust production reporting. Collaboration is core to our culture: we work closely with UX Research and Service Design teams to blend quantitative and qualitative insights, enhancing digital user experiences.

Position Overview

We are seeking a Data Ops Lead Contractor to optimize and mature our data operations for CX Analytics digital data assets. This role balances hands-on technical delivery with strong communication and collaboration. Your primary focus is ensuring the proper documentation, definition, and ongoing refinement of the data models that underpin our key business measures and KPIs. You'll partner with CX Analytics team members, UX Research, Service Design, engineering, and analytics teams to translate business needs and research findings into effective, trusted data assets, while also overseeing the development, quality, and flow of web analytics data pipelines from AWS Iceberg through to Snowflake and BI tools.
Key Responsibilities
• Lead the documentation, definition, and governance of data models that support product- and business-critical KPIs
• Work with CX Analytics team members to gather and translate analytics requirements into robust, actionable data products
• Collaborate extensively with UX Research and Service Design teams to align data models and insights, integrating both qualitative and quantitative journey data into analytics assets
• Guide and support the development of scalable ETL/ELT pipelines (in Airflow/Python/Snowflake procedures) that move and transform web clickstream data
• Promote discovery and ad hoc analysis within Iceberg environments and ensure seamless reporting enablement from Snowflake and other BI tools
• Champion clear communication, knowledge sharing, and documentation practices that expand access to meaningful analytics for all team members
• Foster a "data as a product" culture that emphasizes structure, rigor, innovation, transparency, and the highest standards of data governance and quality

Experience / Typical Qualifications
• 5-7 years of experience in data operations, data analytics, data engineering, or related roles within large-scale digital or customer analytics environments
• Demonstrated ability to create structure and rigor in documenting data models, managing analytic KPIs, and governing data assets
• Strong communication and collaboration skills, with a proven track record of effective partnership across business, product, technical, and UX Research teams
• Experience designing and governing enterprise-scale data products and analytic assets (experience with AWS Iceberg, Snowflake, and Power BI a plus)
• Proficiency in ETL/ELT pipeline development, with the ability to guide, standardize, and improve technical workflows and data quality best practices
• Proven ability to enable ad hoc and self-service analysis by documenting clear, reusable data assets
• Exceptional problem-solving and requirements-gathering skills, translating ambiguous business needs into structured data models
• Able to work effectively as a collaborative, high-impact resource within a lean, cross-functional team
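To give a flavor of the pipeline work described above, here is a minimal sketch (illustrative only, not from the posting) of the kind of data-quality transform an Airflow task might run when curating raw clickstream events between Iceberg and Snowflake. The event field names (`session_id`, `page`, `ts`) and the specific rules are hypothetical assumptions:

```python
# Hypothetical clickstream curation step: drop malformed events and
# normalize fields. In practice this logic would sit inside an Airflow
# task reading from Iceberg and writing to Snowflake; here it is plain
# Python so the data-quality rules are easy to see.
from datetime import datetime, timezone

def curate_events(raw_events):
    """Filter out events missing required fields and normalize the rest."""
    curated = []
    for event in raw_events:
        # Data-quality rule (assumed): session_id and page are required.
        if not event.get("session_id") or not event.get("page"):
            continue
        # Normalize: lowercase/trim the page path, convert the epoch
        # timestamp to an ISO-8601 UTC string.
        ts = datetime.fromtimestamp(event["ts"], tz=timezone.utc)
        curated.append({
            "session_id": event["session_id"],
            "page": event["page"].strip().lower(),
            "event_time": ts.isoformat(),
        })
    return curated

raw = [
    {"session_id": "s1", "page": " /Home ", "ts": 1756425600},
    {"session_id": "", "page": "/pricing", "ts": 1756425660},  # dropped: no session_id
]
print(curate_events(raw))
```

A real implementation would also log or quarantine the dropped rows rather than silently discarding them, so data-quality metrics stay visible downstream.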