eStaffing Inc.

Applied Data Scientist

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Applied Data Scientist on a 7-month contract, remote in Indiana, paying $41.68/hr. Requires a bachelor's degree or equivalent experience, proficiency in R/Python/SQL, and 2+ years in data manipulation, machine learning, and building analytic applications.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
328
-
πŸ—“οΈ - Date
April 10, 2026
πŸ•’ - Duration
More than 6 months
-
🏝️ - Location
Remote
-
πŸ“„ - Contract
Unknown
-
πŸ”’ - Security
Unknown
-
πŸ“ - Location detailed
Indiana, United States
-
🧠 - Skills detailed
#Neural Networks #Docker #Kubernetes #Mathematics #Data Pipeline #"ETL (Extract, Transform, Load)" #Data Engineering #Neo4J #Flask #Version Control #R #Automation #Regression #Spatial Data #Data Manipulation #GitHub #Decision Tree Learning #Data Mining #Computer Science #Statistics #BI (Business Intelligence) #Data Aggregation #ML (Machine Learning) #Databases #Visualization #Data Science #Streamlit #Clustering #SQL (Structured Query Language) #Python
Role description
Job Title: Data Scientist (798995)
Job Type: Contract, Remote
Location: Indiana
Job Duration: 7 months, with a possible 1-year extension
Shift: 8am to 5pm
Pay rate: $41.68/hr

• The position is remote, but only candidates located in Indiana will be considered for this role.

Job Description:
The Data Scientist plays a key role by creating in-depth analyses, leveraging data science techniques, methods, and interpretations to convey accurate, meaningful insights that empower IDOH and other partners to make informed decisions in support of the health, safety, and well-being of the citizens of Indiana.

Essential Duties/Responsibilities:
The essential functions of this role are as follows:
• Provides mentoring and guidance to more junior Data Scientists and staff
• Supports the development of internal web applications and interactive tools that help operationalize and deliver data science products across the organization
• Acts as mentor and data science SME for more junior DS users across the state and for key external stakeholders
• Engages with key business stakeholders on large projects and initiatives to understand their analytical and operational challenges, and translates these needs into data solutions
• Assesses the structure, content, and quality of data through examination of source systems and data samples
• Collaborates with other DS professionals, data engineers, and BI professionals on data/table structures to optimize architecture, ETL procedures, dashboards, and other self-service needs
• Prioritizes requirements and creates rapid prototypes and minimally viable products for end users
• Looks for opportunities to improve current processes or find efficiencies by applying industry best practices as a DS professional
• Mines and analyzes data from state databases to drive insights into problems and process efficiency while maintaining the standards of organizational excellence
• Interprets data from multiple sources using a variety of analytical techniques, ranging from simple data aggregation through data mining to more complex statistical methodologies
• Uses and monitors code repositories such as GitHub for version control
• Provides end-user education on the interpretation of business data
• Tests and evaluates data solutions as they relate to upgrades of existing software
• Provides maintenance and support for existing data solutions for the agency
• Documents and communicates technical specifications to ensure that proper techniques and standards are incorporated into deliverables and understood by end users

This job profile is not designed to cover or contain a comprehensive listing of the activities, duties, or responsibilities required of the employee. Other duties, responsibilities, and activities may change or be assigned at any time, with or without notice.

Job Requirements:
The ideal candidate should minimally have either:
• A bachelor's degree with coursework in analytics, statistics, computer science, informatics, and/or mathematics and 2+ years of experience and passion for leveraging data to drive significant organizational impact; or
• A master's degree with coursework in analytics, statistics, computer science, informatics, and/or mathematics; or
• 4+ years of experience and passion for leveraging data to drive significant organizational impact.

Additional requirements:
• Experience with Shiny, Dash, Flask, or Streamlit to build user-facing interfaces, connect to backend data pipelines, and deploy lightweight analytic applications (2 years' experience required)
• Experience using R, Python, SQL, etc., to manipulate and draw insights from large data sets, as well as to develop software for automation (2 years' experience required)
• Advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and their proper usage, etc.) (2 years' experience required)
• Experience with data manipulation, including cleansing, standardizing, and transforming (2 years' experience required)
• Broad knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) (2 years' experience required)
• Strong understanding of relational and dimensional databases: theories, principles, and practices (2 years' experience required)
• Experience leading workshops or training sessions with a user community a plus
• Exceptional analytical, conceptual, and problem-solving abilities
• Experience generating and distributing visualizations to a broad range of audiences
• Must exhibit strategic thinking
• Strong written/oral communication and presentation skills
• Resourceful self-starter and highly motivated team player
• Able to perform well in a fast-paced environment
• Effective communicator who enjoys understanding the nuances of a problem
• Experience with the following concepts or tools: geocoding and geospatial data, Shiny, network diagramming, Neo4j, Docker, Kubernetes
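As a small illustration of the "R/Python/SQL for data manipulation and insight" requirement above, here is a minimal Python sketch using only the standard library; the table, counties, and counts are hypothetical and are not part of this posting:

```python
import sqlite3
from statistics import mean

# Hypothetical example: aggregate county-level weekly counts with SQL,
# then compute a simple summary statistic in Python.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cases (county TEXT, week INTEGER, count INTEGER)")
conn.executemany(
    "INSERT INTO cases VALUES (?, ?, ?)",
    [("Marion", 1, 120), ("Marion", 2, 95), ("Allen", 1, 40), ("Allen", 2, 55)],
)

# SQL handles the aggregation (group totals per county)...
rows = conn.execute(
    "SELECT county, SUM(count) FROM cases GROUP BY county ORDER BY county"
).fetchall()
print(rows)  # [('Allen', 95), ('Marion', 215)]

# ...and Python draws a simple insight from the aggregated result.
avg_total = mean(total for _, total in rows)
print(avg_total)  # 155
```

The same aggregate-then-summarize pattern scales to the larger pipelines the role describes, with the in-memory SQLite table replaced by a production database connection.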