

Data Engineer with Machine Learning and Python / Contract W2
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with Machine Learning and Python, offered as a W2 contract with 3 days onsite in Atlanta, GA. Key skills include data visualization, predictive analysis, statistical modeling, and proficiency in Python packages such as Pandas and TensorFlow.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
May 29, 2025
Project duration
Unknown
Location type
On-site
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
Atlanta, GA
Skills detailed
#TensorFlow #Distributed Computing #NumPy #Pandas #SciPy #Matplotlib #Forecasting #Deep Learning #Visualization #SpaCy #Mathematics #PySpark #Data Manipulation #Spark (Apache Spark) #Classification #Python #Neural Networks #NLP (Natural Language Processing) #Apache Spark #Big Data #ML (Machine Learning) #Clustering #ETL (Extract, Transform, Load) #Data Engineering #Time Series
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, iTvorks Inc, is seeking the following. Apply via Dice today!
Data Engineer with Machine Learning and Python
3 days onsite in Atlanta, GA.
Data Visualization: Creating visual representations of data to extract insights and communicate findings effectively.
Predictive Analysis: Analyzing historical data to make predictions about future outcomes.
Statistical Modeling: Building mathematical models to analyze relationships within data.
Data Preprocessing: Cleaning, transforming, and preparing data for analysis.
Clustering and Classification: Grouping data points into clusters or assigning them to predefined classes (a brief illustrative sketch follows this list).
Time Series Analysis and Forecasting: Analyzing time-dependent data and making predictions about future values.
Machine Learning: Designing and implementing algorithms that enable computers to learn from data.
Deep Learning Algorithms: Utilizing neural networks with multiple layers to solve complex problems.
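To make the responsibilities above more concrete, here is a minimal sketch of preprocessing, clustering, and classification in Python; it uses scikit-learn on synthetic data, and the library choices, dataset, and parameters are illustrative assumptions rather than requirements from the posting.
```python
# Minimal sketch: preprocessing, clustering, and classification on synthetic data.
# The dataset and all parameters are invented for illustration.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic feature matrix and labels stand in for real project data.
X, y = make_classification(n_samples=500, n_features=8, n_informative=5, random_state=42)

# Data preprocessing: scale features so distance-based methods behave well.
X_scaled = StandardScaler().fit_transform(X)

# Clustering: group observations into three unsupervised clusters.
clusters = KMeans(n_clusters=3, n_init=10, random_state=42).fit_predict(X_scaled)

# Classification: hold out a test set and fit a random forest on the labelled data.
X_train, X_test, y_train, y_test = train_test_split(X_scaled, y, test_size=0.2, random_state=42)
clf = RandomForestClassifier(random_state=42).fit(X_train, y_train)

print("cluster sizes:", np.bincount(clusters))
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```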
Python Packages:
Pandas: For data manipulation and analysis.
NumPy: For numerical computing with arrays and matrices.
Matplotlib: For creating static, interactive, and animated visualizations in Python.
scikit-learn: For machine learning algorithms and tools.
SciPy: For scientific computing and advanced mathematics.
spaCy: For natural language processing (NLP) tasks.
TensorFlow: For building and training deep learning models (a short example follows this list).
PySpark: For working with big data and distributed computing using Apache Spark with Python.
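As a rough illustration of how a couple of these packages fit together, the sketch below pairs Pandas with TensorFlow/Keras on a synthetic table; the column names, network shape, and hyperparameters are assumptions made for the example, not details from the posting.
```python
# Minimal sketch: Pandas for data handling plus a small Keras network.
# All data, column names, and hyperparameters are invented for illustration.
import numpy as np
import pandas as pd
import tensorflow as tf

# Pandas: hold the (synthetic) tabular data and separate features from the target.
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(256, 4)), columns=["f1", "f2", "f3", "f4"])
df["target"] = (df["f1"] + df["f2"] > 0).astype("float32")

features = df[["f1", "f2", "f3", "f4"]].to_numpy(dtype="float32")
labels = df["target"].to_numpy()

# TensorFlow/Keras: a small multi-layer network for binary classification.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(features, labels, epochs=5, batch_size=32, verbose=0)

loss, accuracy = model.evaluate(features, labels, verbose=0)
print(f"training-set loss={loss:.3f}, accuracy={accuracy:.3f}")
```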