

Committed Coaches
Data Analyst
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Analyst on a contract basis, paying $25.00 - $35.00 per hour, with expected hours of 20 – 40 per week. Key skills include SQL, Looker Studio, and BI tools; Python experience is a plus.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
280
-
🗓️ - Date
November 21, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Remote
-
🧠 - Skills detailed
#Data Engineering #Data Analysis #Python #Looker #Complex Queries #ETL (Extract, Transform, Load) #Redshift #Airflow #AWS (Amazon Web Services) #Data Warehouse #Leadership #Data Quality #Documentation #Data Exploration #Cloud #S3 (Amazon Simple Storage Service) #Data Cleaning #Strategy #ML (Machine Learning) #SQL (Structured Query Language) #Automation #Data Manipulation #BI (Business Intelligence) #dbt (data build tool) #Datasets
Role description
About the Role
We’re looking for a Data Analyst who loves turning messy, scattered data into clear business insights. You’ll build and maintain dashboards, analyze trends, support our product and operations teams, and help drive data-backed decision-making across the company.
This role is ideal for someone who is strong in SQL, comfortable in modern BI tools (Looker Studio, Superset), and excited about owning the full life cycle of data exploration—from understanding requirements to delivering polished insights. Bonus points if you have light Python or pipeline experience and want to grow those skills.
What You’ll Do
Analytics & Insights
Translate business questions into structured analysis and clear recommendations.
Extract insights from large datasets to support product, operations, marketing, and leadership.
Build ad-hoc analyses and recurring reports that drive decisions and highlight trends.
Dashboards & Reporting
Own the creation and maintenance of dashboards in Looker Studio, Superset, and similar BI tools.
Improve existing dashboards (performance, usability, data quality).
Track KPIs and ensure stakeholders have the data they need, in the format they need.
Data Ownership & Collaboration
Take full ownership of analysis tasks: gather context, reach out to stakeholders, and drive clarity.
Stay productive and unblock yourself by proactively asking questions and seeking information.
Participate in sprint planning and weekly syncs to stay aligned with the engineering team.
Work primarily within the collaboration window of 8:30 AM – 5:00 PM PST for cross-functional alignment.
Data Infrastructure (Bonus: Not Required)
Use Python to support data cleaning, automation, and small transformation workflows.
Work with engineering to maintain or explore ETL pipelines (Airflow is a plus).
Partner with data engineering on dataset structure, warehouse improvements, and Redshift optimization.
What We’re Looking For
Required Skills
Fluent in SQL: able to write complex queries, debug joins, and optimize performance.
Strong experience with BI tools such as Looker Studio and Superset.
Ability to turn ambiguous requirements into concrete analyses.
Strong analytical thinking and an instinct for measuring business impact.
Excellent English communication skills—you can explain data clearly and concisely.
Comfortable working fast, iterating, and delivering value in short cycles.
Nice-to-Have Skills (Not Required)
Python (for data manipulation, automation, or light ETL).
Experience with Airflow or similar workflow orchestration tools.
Experience with AWS Redshift or other cloud data warehouses.
Familiarity with dbt, S3, or modern data stacks.
How We Work (Team Philosophy)
Ownership: You drive your work forward. You gather context, find missing information, and solve problems without waiting for instructions.
Speed With Care: Fast iteration compounds value, but we prioritize avoiding high-impact production issues.
Stay Unblocked: If you’re stuck, you proactively ask questions, seek help, or move to the next task.
Collaboration: We help each other succeed and stay aligned through weekly sprints, async updates, and clear communication.
Balanced Autonomy: We value responsiveness during working hours, but trust you to manage your schedule and deliver great results.
Typical Responsibilities in This Role
Analyze product usage patterns and customer data to uncover trends.
Build dashboards for revenue, funnel metrics, operations, retention, or performance tracking.
Partner with product managers, engineers, customer teams, and executives to guide decisions.
Improve data definitions, metric consistency, and documentation.
Validate data quality and report anomalies quickly.
Package insights into clear narratives that lead to action.
Why Join Us
High-impact role: your analyses directly shape product and business strategy.
Collaborative engineering culture with strong ownership and autonomy.
Opportunity to expand into data engineering, analytics engineering, or machine learning if interested.
Work closely with senior leadership and shape how we use data company-wide.
Job Type: Contract
Pay: $25.00 - $35.00 per hour
Expected hours: 20 – 40 per week
Work Location: Remote
