Jobs via Dice

Data Scientist

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a remote Data Scientist contract position requiring proficiency in Python, Linux, and cloud environments (AWS, Azure, GCP). Key skills include AI/ML expertise, data integration, and experience with Kubernetes and CI/CD. A BS/MS in Computer Science or Data Science with relevant experience is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 6, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Databases #TensorFlow #Docker #AI (Artificial Intelligence) #Programming #Libraries #Data Pipeline #Data Science #PyTorch #NLP (Natural Language Processing) #Kubernetes #Cloud #Azure #DevOps #SpaCy #AWS (Amazon Web Services) #Data Integration #API (Application Programming Interface) #NLTK (Natural Language Toolkit) #Computer Science #Data Analysis #Linux #Keras #GCP (Google Cloud Platform) #Redshift #Automation #ML (Machine Learning) #Python #Visualization
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Hirekeyz, is seeking the following. Apply via Dice today!

Position: Data Scientist
Location: Remote
Employment Type: Contract
Job Type: W-2

Job Description:

Key Technical Skills:
• Must have: proficiency in Python programming and strong knowledge of Linux operating systems
• Extensive knowledge of AI (supervised, unsupervised, GenAI/LLM) and Python libraries for performing data analysis, building advanced machine learning models, and integrating data with source systems
• Design APIs for performance, real-time applications, scale, ease of use, and governance automation
• Train ML models and effectively communicate complex data insights through clear, informative visualizations
• Knowledge of and experience with cloud environments (AWS, Azure, Google Cloud Platform)
• Experience building and deploying end-to-end data pipelines using cloud and Docker services; experience with Kubernetes
• Strong knowledge of CI/CD, test automation, and DevOps
• Ability to integrate data, sourcing it from databases, files, APIs, and server logs
• Drive requirements for AI-related projects; work with internal customers and with ServiceNow and Salesforce product managers on feature planning, prioritization, and implementation
• Strong quantitative and statistical skills
• Apply GenAI, ML, DL, and NLP concepts in real-world applications to solve business problems
• Develop application-specific interfaces that leverage GenAI capabilities and LLMs
• Familiarity with common frameworks such as PyTorch, TensorFlow, Keras, NLTK, or spaCy
• Familiarity and experience with databases such as Postgres, Redshift, and MSSQL

Soft Skills:
• Proven ability to deliver timely results in professional work environments
• Strong communication skills
• Facility in building relationships with key work partners
• Self-motivation and accountability for delivering on work commitments
• Ability to create informative collaboration materials for teammates, cross-functional partners, and users

Formal Education:
• BS in Computer Science or Data Science, with 5 years' experience, or
• MS in Computer Science or Data Science, with 2-3 years' experience