Great Value Hiring

Data Scientist (Kaggle Grandmaster)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Scientist (Kaggle Grandmaster) on a contract of more than 6 months, offering $56/hr. Key skills include Python, machine learning, and statistical methods. The role is fully remote and requires 3–5+ years of data science experience.
🌎 - Country
United Kingdom
💱 - Currency
$ USD
💰 - Day rate
$448
🗓️ - Date
December 4, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#Python #ML (Machine Learning) #Spark (Apache Spark) #Data Science #SQL (Structured Query Language) #BigQuery #Scala #Programming #Forecasting #Pandas #Leadership #Deployment #ETL (Extract, Transform, Load) #Snowflake #Documentation #Datasets #Data Analysis #Big Data #AI (Artificial Intelligence) #NLP (Natural Language Processing) #NumPy
Role description
Data Scientist (Kaggle Grandmaster) [$56/hr]

As an independent member of a leading organization's (Mercor) referral program, we are posting this role to find a highly skilled Data Scientist with a Kaggle Grandmaster profile. In this role, you will transform complex datasets into actionable insights, high-performing models, and scalable analytical workflows. You will work closely with researchers and engineers to design rigorous experiments, build advanced statistical and ML models, and develop data-driven frameworks to support product and research decisions.

What You'll Do
• Analyze large, complex datasets to uncover patterns, develop insights, and inform modeling direction
• Build predictive models, statistical analyses, and machine learning pipelines across tabular, time-series, NLP, or multimodal data
• Design and implement robust validation strategies, experiment frameworks, and analytical methodologies
• Develop automated data workflows, feature pipelines, and reproducible research environments
• Conduct exploratory data analysis (EDA), hypothesis testing, and model-driven investigations to support research and product teams
• Translate modeling outcomes into clear recommendations for engineering, product, and leadership teams
• Collaborate with ML engineers to productionize models and ensure data workflows operate reliably at scale
• Present findings through well-structured dashboards, reports, and documentation

Qualifications
• Kaggle Competitions Grandmaster or comparable achievement: top-tier rankings, multiple medals, or exceptional competition performance
• 3–5+ years of experience in data science or applied analytics
• Strong proficiency in Python and data tools (Pandas, NumPy, Polars, scikit-learn, etc.)
• Experience building ML models end-to-end: feature engineering, training, evaluation, and deployment
• Solid understanding of statistical methods, experiment design, and causal or quasi-experimental analysis
• Familiarity with modern data stacks: SQL, distributed datasets, dashboards, and experiment tracking tools
• Excellent communication skills with the ability to clearly present analytical insights

Nice to Have
• Strong contributions across multiple Kaggle tracks (Notebooks, Datasets, Discussions, Code)
• Experience in an AI lab, fintech, product analytics, or ML-focused organization
• Knowledge of LLMs, embeddings, and modern ML techniques for text, images, and multimodal data
• Experience working with big data ecosystems (Spark, Ray, Snowflake, BigQuery, etc.)
• Familiarity with statistical modeling frameworks such as Bayesian methods or probabilistic programming

Why Join
• Gain exposure to cutting-edge AI research workflows, collaborating closely with data scientists, ML engineers, and research leaders shaping next-generation analytical systems
• Work on high-impact data science challenges while experimenting with advanced modeling strategies, new analytical methods, and competition-grade validation techniques
• Collaborate with world-class AI labs and technical teams operating at the frontier of forecasting, experimentation, tabular ML, and multimodal analytics
• Flexible engagement options (30–40 hrs/week or full-time), ideal for data scientists eager to apply Kaggle-level problem-solving to real-world, production analytics
• Fully remote and globally flexible work structure, optimized for deep analytical work, async collaboration, and high-output research