Smart IT Frame LLC

Data Scientist

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AI Data Scientist in Charlotte, NC (Hybrid – 2 days), with a 6-month contract-to-hire. Key skills include Python, machine learning, cloud platforms (AWS, Azure, GCP), and MLOps tools. Experience with LLMs and data engineering is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 16, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#Data Quality #Data Engineering #Microsoft Power BI #Tableau #Programming #MLflow #ML (Machine Learning) #Cloud #Compliance #Data Pipeline #Distributed Computing #Plotly #Scala #Data Science #Deployment #Matplotlib #SageMaker #NoSQL #AWS (Amazon Web Services) #Databases #Langchain #SQL (Structured Query Language) #Data Privacy #NLP (Natural Language Processing) #Python #Pandas #Kubernetes #Hugging Face #Azure #NumPy #TensorFlow #AI (Artificial Intelligence) #AWS SageMaker #Deep Learning #Airflow #GCP (Google Cloud Platform) #Visualization #Docker #BI (Business Intelligence) #PyTorch
Role description
Dear Candidates,

We have a contract-to-hire role with one of our clients. Kindly find the details below and let me know if you are interested.

Job role: AI Data Scientist
Job Location: Charlotte, NC (Hybrid – 2 days)
Duration: 6 Months CTH

Job Description:
We are looking for an experienced Data Scientist who can design, develop, and deploy advanced AI/ML models and data-driven solutions. The ideal candidate will have strong expertise in machine learning, deep learning, LLMs, and cloud-based data platforms, along with hands-on experience in data engineering, vector databases, and end-to-end deployment.

Key Responsibilities

Model Development & Optimization
o Build and fine-tune ML/DL models, including LLMs for NLP tasks
o Implement RAG (Retrieval-Augmented Generation) and Agentic AI workflows for enterprise use cases
o Optimize models for performance, scalability, and cost efficiency

Data Engineering & Management
o Design and maintain data pipelines for structured and unstructured data
o Work with vector databases (e.g., Pinecone, Milvus, Weaviate) for semantic search and embeddings
o Ensure data quality, governance, and compliance

Deployment & MLOps
o Deploy models using Docker, Kubernetes, and cloud-native services (AWS, Azure, GCP)
o Implement CI/CD pipelines for ML workflows and automated retraining
o Monitor model performance and drift using MLOps tools

Collaboration & Communication
o Work closely with architects, engineers, and business stakeholders to translate requirements into solutions
o Present insights and recommendations using data visualization tools

Required Technical Skills
• Programming: Python (Pandas, NumPy, Scikit-learn, PyTorch, TensorFlow), SQL
• AI/ML Frameworks: LangChain, Hugging Face, LlamaIndex
• Cloud Platforms: AWS SageMaker, Azure ML, GCP Vertex AI
• Databases: SQL/NoSQL, Vector DBs (Pinecone, Milvus, Weaviate)
• Deployment: Docker, Kubernetes, Helm
• MLOps Tools: MLflow, Kubeflow, Airflow
• Visualization: Power BI, Tableau, Matplotlib, Plotly

Preferred Qualifications
• Experience with LLM fine-tuning and prompt engineering
• Knowledge of distributed computing and GPU acceleration
• Familiarity with data privacy regulations and responsible AI principles