Novia Infotech

Engineering Data Analyst

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an "Engineering Data Analyst" on a 6-month contract, based onsite in Mountain View, CA. It requires a Bachelor's degree, 3-5 years in data analytics, strong SQL, Python/R proficiency, and experience with BI tools.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 5, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
1099 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Mountain View, CA
-
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #Deployment #Airflow #Looker #Workday #Data Pipeline #Data Analysis #Data Manipulation #Documentation #Data Integration #Python #Automation #R #GIT #dbt (data build tool) #Datasets #Data Accuracy #Jira #Tableau #Leadership #SQL (Structured Query Language) #Visualization #Microsoft Power BI #BI (Business Intelligence)
Role description
Engineering Data Analyst

Work Location: 645 Clyde Avenue, Mountain View, CA, USA
Work Schedule: Fully onsite
Length of Assignment: 6 months

Education and Years of Experience:
1. Bachelor's degree or higher in an applicable field
2. 3-5 years of experience in data analytics, business intelligence, or a related field

Top Skills:
• Strong SQL skills with experience querying complex, multi-source datasets
• Proficiency in Python or R for data manipulation, analysis, and automation
• Hands-on experience with BI/visualization tools (Tableau, Power BI, or similar)
• Familiarity with engineering workflows and tools (JIRA, Git, CI/CD concepts)

KEY RESPONSIBILITIES/REQUIREMENTS:
We are seeking a Data Analyst contractor to support the Core Engineering organization. In this embedded role, you will partner directly with engineering leadership to build and maintain a comprehensive engineering intelligence platform spanning delivery metrics, quality indicators, and team health analytics across our global development centers in the US, Bangalore, and Warsaw.

The ideal candidate combines strong technical skills (SQL, Python, Looker) with analytical rigor and clear communication. You will work across multiple data sources, including JIRA, HR systems, Git, and CI/CD pipelines, to surface actionable insights that drive operational decisions and team effectiveness.

Key Responsibilities

1. Engineering Metrics & Dashboards
• Design, build, and maintain dashboards for sprint velocity, cycle time, release frequency, and deployment success
• Create automated reporting pipelines using Python to reduce manual data gathering
• Establish standardized metric definitions across US, Bangalore, and Warsaw teams

2. Quality Analytics
• Track and visualize bug rates, test coverage, incident response times, and technical debt trends
• Build early warning systems to identify quality issues before they impact delivery
• Partner with engineering leads to define quality benchmarks and improvement targets

3. Team Health & Capacity Planning
• Develop capacity planning models and utilization dashboards
• Analyze hiring pipeline data to support workforce planning decisions
• Monitor attrition patterns and provide insights to support retention efforts

4. Data Integration & Automation
• Connect and normalize data from JIRA, HR systems (Workday), Git repositories, and CI/CD tools
• Build reliable ETL processes to ensure data freshness and accuracy
• Document data sources, transformations, and metric calculations

5. Stakeholder Communication
• Deliver weekly/monthly reports to engineering leadership
• Translate complex data findings into clear, actionable recommendations
• Support quarterly business reviews with relevant engineering metrics

Qualifications

Required:
• 3-5 years of experience in data analytics, business intelligence, or a related field
• Strong SQL skills with experience querying complex, multi-source datasets
• Proficiency in Python or R for data manipulation, analysis, and automation
• Hands-on experience with BI/visualization tools (Tableau, Power BI, or similar)
• Familiarity with engineering workflows and tools (JIRA, Git, CI/CD concepts)
• Ability to work independently and manage multiple priorities in a fast-paced environment
• Excellent communication skills: can translate data into clear insights for technical and non-technical audiences

Preferred:
• Experience with Looker (LookML knowledge a plus)
• Experience with engineering metrics (velocity, cycle time, DORA metrics)
• Exposure to HR/people analytics (capacity planning, attrition analysis)
• Familiarity with data pipeline tools (dbt, Airflow, or similar)
• Experience working with distributed/global teams across multiple timezones
• Background in ad tech, media, or high-growth technology companies

Location & Availability
• US-based with ability to work Pacific timezone hours
• Available for occasional overlap calls with Bangalore (morning) and Warsaw (afternoon) teams
• Full-time availability (40 hours/week) for a 6+ month engagement

Culture Fit
• Operational Excellence – Systematic approach to problem-solving; attention to detail and data accuracy
• Self-Direction – Proactively identifies gaps and opportunities without waiting to be asked
• Global Mindset – Comfortable collaborating asynchronously with distributed teams across timezones
• Clear Communication – Explains complex analysis simply; writes documentation others can follow
• Continuous Improvement – Iterates on dashboards and processes based on user feedback