

Harvey Nash
Data Analyst (Engineering Data Analyst)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Analyst (Engineering Data Analyst) in Mountain View, CA, for 6 months at a pay rate of "X". Requires 3-5 years of experience, strong SQL and Python skills, and familiarity with JIRA and Git.
Country
United States
Currency
$ USD
Day rate
464
Date
May 2, 2026
Duration
More than 6 months
Location
On-site
Contract
Unknown
Security
Unknown
Location detailed
Mountain View, CA
Skills detailed
#Deployment #Looker #Databases #Project Management #R #Computer Science #Data Integration #Workday #dbt (data build tool) #Python #Data Pipeline #Jira #Microsoft Power BI #Documentation #Tableau #Leadership #GIT #Visualization #Automation #SQL (Structured Query Language) #Database Management #BI (Business Intelligence) #SharePoint #Data Accuracy #Datasets #Data Analysis #Airflow #ETL (Extract, Transform, Load) #Data Manipulation
Role description
Data Analyst 3 (Engineering Data Analyst)
Mountain View, CA (Fully onsite)
6 Months
KEY RESPONSIBILITIES/REQUIREMENTS:
• We are seeking a Data Analyst contractor to support the Core Engineering organization. In this embedded role, you will partner directly with engineering leadership to build and maintain a comprehensive engineering intelligence platform spanning delivery metrics, quality indicators, and team health analytics across our global development centers in the US, Bangalore, and Warsaw.
• The ideal candidate combines strong technical skills (SQL, Python, Looker) with analytical rigor and clear communication.
• You will work across multiple data sources, including JIRA, HR systems, Git, and CI/CD pipelines, to surface actionable insights that drive operational decisions and team effectiveness.
Key Responsibilities
Engineering Metrics & Dashboards
• Design, build, and maintain dashboards for sprint velocity, cycle time, release frequency, and deployment success
• Create automated reporting pipelines using Python to reduce manual data gathering
• Establish standardized metrics definitions across US, Bangalore, and Warsaw teams
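As a rough illustration of the metric computations these dashboards rely on, the sketch below derives sprint velocity and average cycle time from a hypothetical list of issue records (the field names and values are invented; a real pipeline would pull them from the JIRA API):

```python
from datetime import date

# Hypothetical issue records; a real pipeline would pull these from the JIRA API.
issues = [
    {"key": "ENG-101", "points": 5, "started": date(2026, 4, 1), "done": date(2026, 4, 4)},
    {"key": "ENG-102", "points": 3, "started": date(2026, 4, 2), "done": date(2026, 4, 9)},
    {"key": "ENG-103", "points": 8, "started": date(2026, 4, 3), "done": date(2026, 4, 8)},
]

# Sprint velocity: total story points completed in the sprint.
velocity = sum(i["points"] for i in issues)

# Cycle time: mean days from work started to work done.
cycle_days = [(i["done"] - i["started"]).days for i in issues]
avg_cycle_time = sum(cycle_days) / len(cycle_days)

print(velocity, avg_cycle_time)  # 16 5.0
```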
Quality Analytics
• Track and visualize bug rates, test coverage, incident response times, and technical debt trends
• Build early warning systems to identify quality issues before they impact delivery
• Partner with engineering leads to define quality benchmarks and improvement targets
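One simple form such an early-warning check could take (purely illustrative; the data and thresholds are assumptions, not the employer's actual rules) is flagging the latest weekly bug count when it exceeds a multiple of the trailing average:

```python
# Hypothetical weekly bug counts; a real system would query the issue tracker.
weekly_bugs = [4, 5, 3, 6, 4, 5, 12]

def flag_spike(series, window=4, factor=2.0):
    """Flag the latest value if it exceeds factor x the mean
    of the preceding `window` values."""
    if len(series) <= window:
        return False
    baseline = sum(series[-window - 1:-1]) / window
    return series[-1] > factor * baseline

print(flag_spike(weekly_bugs))  # True: 12 > 2.0 x mean(3, 6, 4, 5)
```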
Team Health & Capacity Planning
• Develop capacity planning models and utilization dashboards
• Analyze hiring pipeline data to support workforce planning decisions
• Monitor attrition patterns and provide insights to support retention efforts
Data Integration & Automation
• Connect and normalize data from JIRA, HR systems (Workday), Git repositories, and CI/CD tools
• Build reliable ETL processes to ensure data freshness and accuracy
• Document data sources, transformations, and metric calculations
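To illustrate the normalization step, the sketch below joins hypothetical Git commits to JIRA issues by the issue key mentioned in each commit message (all names and records are invented; real extracts would come from the JIRA and Git APIs):

```python
# Hypothetical extracts; real data would come from the JIRA and Git APIs.
jira_issues = [
    {"key": "ENG-101", "team": "Bangalore", "status": "Done"},
    {"key": "ENG-102", "team": "Warsaw", "status": "In Progress"},
]
git_commits = [
    {"sha": "a1b2c3", "message": "ENG-101 fix parser"},
    {"sha": "d4e5f6", "message": "ENG-101 add tests"},
    {"sha": "0f9e8d", "message": "chore: bump deps"},
]

def commits_per_issue(issues, commits):
    """Count commits whose message mentions each issue key."""
    counts = {i["key"]: 0 for i in issues}
    for c in commits:
        for key in counts:
            if key in c["message"]:
                counts[key] += 1
    return counts

print(commits_per_issue(jira_issues, git_commits))  # {'ENG-101': 2, 'ENG-102': 0}
```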
Stakeholder Communication
• Deliver weekly/monthly reports to engineering leadership
• Translate complex data findings into clear, actionable recommendations
• Support quarterly business reviews with relevant engineering metrics
Education and Years of Experience:
• Bachelor's degree or higher in an applicable field
• 3-5 years of experience in data analytics, business intelligence, or a related field
Qualifications (Required)
• 3-5 years of experience in data analytics, business intelligence, or a related field
• Strong SQL skills with experience querying complex, multi-source datasets
• Proficiency in Python or R for data manipulation, analysis, and automation
• Hands-on experience with BI/visualization tools (Tableau, Power BI, or similar)
• Familiarity with engineering workflows and tools (JIRA, Git, CI/CD concepts)
• Ability to work independently and manage multiple priorities in a fast-paced environment
• Excellent communication skills: can translate data into clear insights for technical and non-technical audiences
(Preferred)
• Experience with Looker (LookML knowledge a plus)
• Experience with engineering metrics (velocity, cycle time, DORA metrics)
• Exposure to HR/people analytics (capacity planning, attrition analysis)
• Familiarity with data pipeline tools (dbt, Airflow, or similar)
• Experience working with distributed/global teams across multiple timezones
• Background in ad tech, media, or high-growth technology companies
Location & Availability
• US-based with ability to work Pacific timezone hours
• Available for occasional overlap calls with Bangalore (morning) and Warsaw (afternoon) teams
• Full-time availability (40 hours/week) for 6+ month engagement
Culture Fit
• Operational Excellence: systematic approach to problem-solving; attention to detail and data accuracy
• Self-Direction: proactively identifies gaps and opportunities without waiting to be asked
• Global Mindset: comfortable collaborating asynchronously with distributed teams across timezones
• Clear Communication: explains complex analysis simply; writes documentation others can follow
• Continuous Improvement: iterates on dashboards and processes based on user feedback
Top Skills:
• Strong SQL skills with experience querying complex, multi-source datasets
• Proficiency in Python or R for data manipulation, analysis, and automation
• Hands-on experience with BI/visualization tools (Tableau, Power BI, or similar)
• Familiarity with engineering workflows and tools (JIRA, Git, CI/CD concepts)
Beeline Summary:
• The main function of the Data Analyst is to provide business intelligence support by delivering both repeatable and ad hoc reports (charts, graphs, tables, etc.) that enable informed business decisions.
Job Responsibilities:
• Analyzes performance of process activities, identifies problem areas, and presents findings in clear, concise charts, graphs, tables, or summaries.
• Establishes standardized methods of recording, tracking, and reporting on activity.
• Designs, implements, automates, and maintains large-scale enterprise data ETL processes.
• Modifies existing databases and database management systems and/or directs programmers and analysts to make changes.
Skills:
• Ability to work as part of a team, as well as work independently or with minimal direction.
• Excellent written, presentation, and verbal communication skills.
• Demonstrated knowledge of one or more key information service standards such as SDLC, ITIL, QA/testing, Project Management, Six Sigma, etc.
• Strong PC skills, including knowledge of Microsoft SharePoint.
Education/Experience:
• Bachelor's degree in a technical field such as computer science, computer engineering, or a related field required.
• 5-7 years of experience required.
• Process certification such as Six Sigma, CBPP, BPM, ISO 20000, ITIL, or CMMI.






