

AWS Data Scientist Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Data Scientist Engineer in Boston, MA, hybrid (4 days onsite), with a contract length of "unknown" and a pay rate of "unknown." Requires 7+ years of AWS data engineering experience, expert Python skills, and familiarity with MLOps and containerization.
Country
United States
Currency
$ USD
Day rate
Unknown
Date discovered
August 13, 2025
Project duration
Unknown
Location type
Hybrid
Contract type
Unknown
Security clearance
Unknown
Location detailed
Boston, MA
Skills detailed
#Monitoring #Storage #Cloud #Statistics #Data Processing #GitHub #ML (Machine Learning) #Distributed Computing #Apache Spark #Snowflake #Docker #NoSQL #Security #Data Science #Model Deployment #SageMaker #AWS SageMaker #Python #Automation #Data Engineering #AWS (Amazon Web Services) #Computer Science #Visualization #Datasets #AWS Glue #Spark (Apache Spark) #Scala #SQL (Structured Query Language) #Data Architecture #Data Lifecycle #Deployment #S3 (Amazon Simple Storage Service) #Version Control
Role description
Role: AWS Data Scientist Engineer
Location: Boston, MA (Hybrid, 4 days onsite)
About the Opportunity:
We are seeking an experienced engineer with a passion for solving complex data problems and operationalizing machine learning at scale. In this position, you'll design and maintain cloud-based data ecosystems, craft reliable pipelines, and develop models that deliver measurable business impact. This role blends software engineering, data architecture, and applied machine learning in a fast-paced, collaborative setting.
What You'll Do:
• Create automated data workflows to collect, process, and integrate information from multiple sources using AWS Glue and related services (a minimal sketch follows this list).
• Use Python and distributed processing tools like Apache Spark to prepare datasets, engineer features, and construct predictive algorithms (see the PySpark sketch below).
• Architect and optimize storage and retrieval strategies leveraging Snowflake and Amazon S3 for both structured and semi-structured/unstructured data.
• Apply AWS SageMaker to train, test, deploy, and track machine learning models in production (see the SageMaker sketch below).
• Collaborate with analysts, product teams, and business leaders to translate requirements into scalable data science solutions.
• Conduct exploratory analysis, generate visualizations, and communicate technical findings to non-technical audiences.
• Implement best practices for security, governance, and quality control across the data lifecycle.
• Automate MLOps workflows with GitHub Actions to enable streamlined CI/CD for model deployment.
• Build and maintain Docker images to ensure consistent, portable deployments across environments.
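To make the workflow-automation bullet concrete, here is a minimal sketch of triggering an AWS Glue job from Python with boto3 and polling it to completion. The job name `example-events-etl` is a hypothetical placeholder, not part of this posting.

```python
# Minimal boto3 sketch: start an AWS Glue job run and wait for a terminal state.
# The job name below is a hypothetical placeholder.
import time
import boto3

glue = boto3.client("glue")

run = glue.start_job_run(JobName="example-events-etl")
run_id = run["JobRunId"]

# Poll the run until Glue reports a terminal state.
while True:
    status = glue.get_job_run(JobName="example-events-etl", RunId=run_id)
    state = status["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)

print(f"Glue job finished with state: {state}")
```

In practice this kind of trigger-and-poll logic would typically live inside an orchestrator or a CI/CD workflow rather than a standalone script.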
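For the dataset-preparation and feature-engineering work, a minimal PySpark sketch might look like the following. The S3 paths and column names are illustrative assumptions only.

```python
# Minimal PySpark sketch: load raw events from S3, engineer a few per-customer
# features, and write the result back for downstream training.
# Bucket, path, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("feature-prep").getOrCreate()

# Read semi-structured event data landed in S3 (e.g. by an AWS Glue job).
events = spark.read.parquet("s3://example-bucket/raw/events/")

# Simple feature engineering: per-customer aggregates over the event stream.
features = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("customer_id")
    .agg(
        F.count("*").alias("event_count"),
        F.countDistinct("event_date").alias("active_days"),
        F.avg("order_value").alias("avg_order_value"),
    )
)

# Persist the feature table for model training (e.g. in SageMaker).
features.write.mode("overwrite").parquet("s3://example-bucket/features/customer/")
```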
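And for training and deploying models, a rough sketch using the SageMaker Python SDK's scikit-learn estimator could look like this. The IAM role ARN, S3 paths, and `train.py` entry point are placeholders, not details from the role.

```python
# Minimal SageMaker SDK sketch: train a scikit-learn model and deploy it to a
# real-time endpoint. Role ARN, S3 paths, and train.py are placeholders.
import sagemaker
from sagemaker.sklearn.estimator import SKLearn

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/ExampleSageMakerRole"  # hypothetical role

estimator = SKLearn(
    entry_point="train.py",          # your training script
    role=role,
    instance_type="ml.m5.xlarge",
    instance_count=1,
    framework_version="1.2-1",
    sagemaker_session=session,
)

# Train against the feature table produced upstream.
estimator.fit({"train": "s3://example-bucket/features/customer/"})

# Deploy the trained model behind a real-time inference endpoint.
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")
```

In a real engagement the entry-point script, instance types, and framework version would be tailored to the model being productionized, and deployment would be driven by the CI/CD workflow rather than run by hand.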
Your Background:
• Degree in Computer Science, Data Science, Statistics, or a related technical field.
• 7+ years' experience with AWS-based data engineering and analytics, including Glue, S3, SageMaker, and Snowflake.
• Expert-level skills in Python for data processing, automation, and machine learning.
• Proficiency with distributed computing platforms such as Spark for large-scale workloads.
• Knowledge of supervised and unsupervised ML methods, performance evaluation, and tuning strategies.
• Familiarity with SQL, NoSQL, and data warehousing best practices.
• Experience delivering ML models into live environments and monitoring their accuracy and stability over time.
• Understanding of MLOps concepts, version control for models, and deployment automation.
• Hands-on experience creating CI/CD workflows using GitHub Actions and working with containerized environments via Docker.
• Ability to translate technical concepts into actionable recommendations for stakeholders.
• Certifications in AWS, Spark, or other relevant technologies are a plus.