The Brixton Group

Senior Quantitative Analyst

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Quantitative Analyst on a 6+ month, 100% remote contract. Key skills include Python, Jupyter, SQL, Monte Carlo simulation, and cloud cost modeling (AWS, Azure, GCP). Experience in FP&A and data validation is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
April 7, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
San Jose, CA
-
🧠 - Skills detailed
#Plotly #Data Engineering #Forecasting #Matplotlib #Visualization #Pandas #Datasets #Storage #Azure #Version Control #dbt (data build tool) #Data Pipeline #NumPy #Airflow #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #Data Extraction #Cloud #Jupyter #Python #GCP (Google Cloud Platform) #SQL (Structured Query Language)
Role description
Duration: 6+ months
Location: 100% REMOTE

Requirements:
• Strong Python + Jupyter Notebook experience (heavy Pandas usage)
• Experience converting complex Excel models into Python (formula tracing, validation)
• Hands-on Monte Carlo simulation (P10/P50/P90, distributions, scenario modeling)
• Experience with cloud cost modeling (AWS, Azure, GCP: compute, storage, networking)
• Strong SQL for data extraction and analysis
• Experience building lightweight data pipelines (APIs, files, DB queries)
• FP&A-style forecasting, variance analysis, and driver-based modeling
• Experience with data validation, auditability, and versioning of model runs
• Ability to explain outputs and variance drivers to non-technical stakeholders

Key Responsibilities:
• Rebuild the Excel-based cloud cost model in Python (Jupyter notebooks)
• Create automated data pipelines and clean Pandas datasets for modeling
• Build a parameterized forecasting engine across cloud cost drivers
• Implement Monte Carlo simulations for probabilistic forecasting
• Develop variance analysis (actual vs. forecast, forecast vs. forecast)
• Deliver sensitivity analysis, scenario modeling, and driver ranking
• Build notebook-based visualizations (waterfalls, fan charts, etc.)
• Ensure full auditability and version control of model inputs/outputs
• Partner with FinOps, FP&A, Data Engineering, and Infrastructure teams

Nice to Have:
• Experience in FinOps, cloud economics, or cost modeling
• Familiarity with Airflow, Prefect, dbt, or other scheduling tools
• Experience with Plotly, Matplotlib, or Bokeh
• Exposure to PyMC, NumPyro, or other probabilistic modeling tools

26-00383
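For candidates gauging the Monte Carlo requirement, the kind of P10/P50/P90 probabilistic cost forecast described above can be sketched in a few lines of Pandas-era Python. The driver names, distributions, and rates below are illustrative assumptions, not figures from this posting:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # number of simulation runs

# Hypothetical monthly cloud cost drivers (all values assumed for illustration):
compute_hours = rng.normal(loc=50_000, scale=5_000, size=N)          # usage uncertainty
rate_per_hour = rng.triangular(left=0.08, mode=0.10, right=0.15, size=N)  # $/hr scenario range
storage_tb = rng.lognormal(mean=np.log(200), sigma=0.2, size=N)      # right-skewed growth
storage_rate = 23.0                                                  # $/TB-month, held flat

# Combine drivers into a simulated monthly cost distribution
monthly_cost = compute_hours * rate_per_hour + storage_tb * storage_rate

# Percentile summary: P10 (low case), P50 (median), P90 (high case)
p10, p50, p90 = np.percentile(monthly_cost, [10, 50, 90])
print(f"P10: ${p10:,.0f}  P50: ${p50:,.0f}  P90: ${p90:,.0f}")
```

In a real engagement each driver's distribution would be parameterized from validated historical data rather than hard-coded, so model runs stay auditable and reproducible (the fixed seed above is one small step in that direction).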