Integrated Resources, Inc ( IRI )

Data Analytics

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Scientist 5 - Data Analytics, 100% remote (based out of Beaverton, OR), for 8+ months at a competitive pay rate. It requires expert-level SQL plus Python, Apache Spark, AWS, and Databricks skills, and 10+ years of relevant experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
πŸ—“οΈ - Date
October 14, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Beaverton, OR
-
🧠 - Skills detailed
#Data Analysis #Libraries #SQL (Structured Query Language) #Data Architecture #Data Ingestion #Data Governance #Scala #Data Science #Data Quality #Documentation #SQL Queries #Integration Testing #Big Data #Apache Spark #Data Integrity #AWS (Amazon Web Services) #Datasets #Python #Data Mining #Automation #Strategy #Unit Testing #Data Processing #Spark (Apache Spark) #Metadata #Databricks #Visualization #ETL (Extract, Transform, Load)
Role description
Job Title: Data Scientist 5 - Data Analytics
Location: Beaverton, OR (100% Remote)
Duration: 8+ Months
Project Description:
• Client's Marketplace Coverage Correction Factors (MCCF) product is a data science solution designed to estimate total marketplace sales at a detailed product level, particularly in areas where Client does not have direct access to retailer point-of-sale (POS) data.
• The MCCF product leverages advanced modeling to "gross up" known sales data from mapped accounts and predict sales for unmapped accounts, helping Client gain a comprehensive view of marketplace performance (a simplified sketch of this "gross up" idea follows the description below).
• This project is essential for supporting business decision-making and optimizing Client's marketplace strategy.
Job Description:
• Designs, develops, and programs methods, processes, and systems to consolidate and analyze structured/unstructured, diverse "big data" sources to generate actionable insights and solutions for client services and product enhancement.
• Builds "products" for analysis.
• Interacts with product and service teams to identify questions and issues for data analysis and experiments.
• Develops and codes software programs, algorithms, and automated processes to cleanse, integrate, and evaluate large datasets from multiple disparate sources.
• Identifies meaningful insights from large data and metadata sources; interprets and communicates insights and findings from analysis and experiments to product, service, and business managers.
• Leads the accomplishment of key goals across consumer and commercial analytics functions.
• Works with key stakeholders to understand requirements, develop sustainable data solutions, and provide insights and recommendations.
• Documents and communicates systems and analytics changes to the business, translating complex functionality into business-relevant language.
• Validates key performance indicators and builds queries to quantitatively measure business performance.
• Communicates with cross-functional teams to understand the business cause of data anomalies and outliers.
• Develops data governance standards, from data ingestion to product dictionaries and documentation.
• Develops SQL queries and data visualizations to fulfill ad-hoc analysis requests and ongoing reporting needs, leveraging standard query syntax.
• Organizes and transforms information into comprehensible structures.
• Uses data to predict trends and perform statistical analysis.
• Uses data mining to extract information from datasets and identify correlations and patterns.
• Monitors data quality and removes corrupt data.
• Evaluates and utilizes new technologies, tools, and frameworks centered around high-volume data processing.
• Improves existing processes through automation and efficient workflows; builds and delivers scalable data and analytics solutions.
• Works independently and takes initiative to identify, explore, and solve problems; designs and builds innovative data and analytics solutions to support key decisions.
• Supports standard methodologies in reporting and analysis, such as data integrity, unit testing, data quality control, system integration testing, modeling, validation, and documentation.
• Independently supports end-to-end analysis to advise product strategy, data architecture, and reporting decisions.
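For context, the "gross up" idea described above can be illustrated with a minimal PySpark sketch. Everything here is a hypothetical assumption for illustration only: the column names (product_id, account_id, units_sold, is_mapped, coverage_rate), the toy data, and the fixed coverage rates are not Client's actual MCCF model, schema, or data.

```python
# Illustrative sketch only: gross up mapped-account POS sales with an
# assumed coverage correction factor. Not Client's actual MCCF model.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("mccf_sketch").getOrCreate()

# Toy POS data: only "mapped" accounts report point-of-sale units.
pos = spark.createDataFrame(
    [
        ("prod-001", "acct-A", 120, True),
        ("prod-001", "acct-B", 80, True),
        ("prod-002", "acct-A", 50, True),
    ],
    ["product_id", "account_id", "units_sold", "is_mapped"],
)

# Hypothetical coverage rates per product: the share of the marketplace
# that mapped accounts are assumed to represent.
coverage = spark.createDataFrame(
    [("prod-001", 0.40), ("prod-002", 0.25)],
    ["product_id", "coverage_rate"],
)

# Correction factor = 1 / coverage_rate;
# estimated total = mapped units * correction factor.
estimate = (
    pos.where(F.col("is_mapped"))
    .groupBy("product_id")
    .agg(F.sum("units_sold").alias("mapped_units"))
    .join(coverage, "product_id")
    .withColumn("correction_factor", F.lit(1.0) / F.col("coverage_rate"))
    .withColumn(
        "estimated_total_units",
        F.round(F.col("mapped_units") * F.col("correction_factor")),
    )
)

estimate.show()
# prod-001: 200 mapped units / 0.40 coverage -> ~500 estimated total units
```

In the real product the correction factors would come from the advanced modeling the description mentions rather than fixed coverage rates; the sketch only shows the arithmetic of grossing up known sales from mapped accounts.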
Requirements:
• Expert-level SQL skills (must have)
• Python (standard libraries)
• Apache Spark
• AWS
• Databricks
Qualifications:
• Typically requires a Bachelor's degree and a minimum of 10 years of directly relevant experience, including comprehensive experience as a business/process leader or industry expert.
• Note: One of the following alternatives may be accepted: PhD or Law degree + 8 years; Master's + 9 years; Associate's degree + 11 years; High School + 12 years.