LanceSoft, Inc.

Senior Data Analyst

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Analyst with a 10-month contract, paying $60.00/hr to $80.00/hr. Located in Burbank, CA (hybrid), it requires 7-10 years of experience in data modeling, SQL, and analytics platforms like Snowflake and Databricks.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
640
-
🗓️ - Date
February 13, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Burbank, CA
-
🧠 - Skills detailed
#Data Processing #Predictive Modeling #Stories #AWS Glue #Data Lineage #Datasets #Agile #Semantic Models #Documentation #Automation #Data Profiling #Data Vault #Data Modeling #Snowflake #Databricks #SQL (Structured Query Language) #Informatica #dbt (data build tool) #AI (Artificial Intelligence) #Forecasting #Cloud #AWS (Amazon Web Services) #Data Quality #Vault #ETL (Extract, Transform, Load) #Data Analysis #Data Governance #Data Architecture #ML (Machine Learning) #Metadata
Role description
Pay Rate: $60.00/hr to $80.00/hr on W2
Duration: 10 Months
Location: Burbank, CA - Hybrid
• We are looking for Data Analysts who are proactive, agile thinkers who thrive on tackling complex, user-centric challenges.
• Highly supportive team with diverse skill sets and backgrounds.
Job Responsibilities / Typical Day in the Role
Lead Data Analysis for Application & Platform Alignment
• Translate product features and user stories into well-defined data model requirements that support application workflows and downstream analytics.
• Partner with Application Designers and Engineers to profile, assess, and validate source data, ensuring it meets both functional and non-functional requirements.
• Collaborate with the Senior Data Architect to align application data models with canonical and semantic models across domains.
Ensure Long-Term Analytical & AI Enablement
• Design data structures and pipelines that serve both operational application needs and future analytical/AI use cases.
• Anticipate and define data capture, transformation, and enrichment requirements to support predictive modeling, forecasting, and advanced analytics.
• Recommend optimizations that improve data quality, timeliness, and completeness for decision-making.
Governance, Quality, and Documentation
• Partner with enterprise data governance teams to apply metadata, lineage, and access control standards.
• Define and execute data validation, profiling, and reconciliation processes to ensure trusted results.
• Maintain documentation of data definitions, mapping specifications, and lineage diagrams for both applications and analytical datasets.
Cross-Pod Collaboration and Mentorship
• Lead cross-pod workshops to resolve semantic conflicts, promote reusable data assets, and ensure consistent application of standards.
• Mentor junior analysts and support teams in data discovery, mapping, and quality assessment best practices.
• Represent the Platform Pod’s data perspective in architecture boards, product councils, and design reviews.
Must Have Skills / Requirements
• Proven expertise in data modeling techniques (relational, dimensional, wide-table for ML, data vault) and mapping business processes to data structures (7-10+ years of experience)
• Strong proficiency in SQL, data profiling, and transformation tools (e.g., dbt, Informatica, AWS Glue) (7-10+ years of experience)
• Familiarity with distributed data processing and analytics platforms (e.g., Snowflake, Databricks, AWS-native analytics stack) (7-10+ years of experience)
• Experience translating product features and user stories into well-defined data model requirements (7-10+ years of experience)
Technology Requirements:
1. Proven expertise in data modeling techniques (relational, dimensional, wide-table for ML, data vault) and mapping business processes to data structures.
2. Strong proficiency in SQL, data profiling, and transformation tools (e.g., dbt, Informatica, AWS Glue).
3. Experience in data quality frameworks, validation automation, and reconciliation methods.
4. Familiarity with distributed data processing and analytics platforms (e.g., Snowflake, Databricks, AWS-native analytics stack).
5. Demonstrated ability to align cross-team requirements into unified, reusable, and governed data solutions.
6. Experience translating product features and user stories into well-defined data model requirements.
Hybrid schedule – 3 days on-site.
Years experience:
• 7+ years of experience as a Data Analyst, Data Modeler, or similar role in enterprise-scale, cloud-based environments.