SageMaker Analyst

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a SageMaker Analyst on a 12+ month remote contract requiring 12 years of experience. Key skills include Amazon SageMaker, Redshift datasets, and Python. Familiarity with Agile/Scrum methodologies and large-scale datasets in a cloud environment is essential.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
πŸ—“οΈ - Date discovered
August 5, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
πŸ“ - Location detailed
United States
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #AWS SageMaker #SageMaker #Scripting #Data Quality #Amazon Redshift #Scrum #AWS (Amazon Web Services) #Cloud #Data Manipulation #Datasets #Redshift #Python #Automation #Data Pipeline #Agile #SQL (Structured Query Language)
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Digitive LLC, is seeking the following. Apply via Dice today!

SageMaker Analyst
Remote role | 12+ month contract | Experience level: 12 years
Required skills: Amazon SageMaker (including SageMaker Studio, Data Wrangler, and Pipelines), Redshift datasets

Job description:
• Design and implement data transformation workflows and recipes in Amazon SageMaker using Redshift datasets.
• Collaborate with actuaries to understand data requirements for actuarial models and reporting.
• Develop reusable templates and pipelines in SageMaker Studio or Data Wrangler for common actuarial data preparation tasks (a minimal pipeline sketch follows this description).
• Mentor and train business users on SageMaker tools for self-service data preparation and exploration.
• Ensure data quality, consistency, and lineage across transformation workflows.
• Optimize the performance of data pipelines for large-scale actuarial datasets.
• Document workflows, best practices, and user guides for internal stakeholders.
• Participate in sprint planning, demos, and retrospectives as part of the Scrum team.

Requirements:
• Experience working with AWS SageMaker, including SageMaker Studio, Data Wrangler, or Pipelines.
• Strong experience with Amazon Redshift and SQL-based data transformation.
• Proficiency in Python for data manipulation, automation, and scripting.
• Experience working with large-scale datasets in a cloud environment.
• Ability to explain technical concepts to non-technical users and support user enablement.
• Familiarity with Agile/Scrum methodologies.
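For illustration only, below is a minimal sketch of the kind of reusable SageMaker Pipelines template the role describes: a single processing step that runs a data preparation script against a Redshift dataset. The IAM role ARN, the prepare_actuarial_data.py script name, and the instance settings are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch (assumptions noted above): register a reusable SageMaker
# pipeline with one processing step for actuarial data preparation.
from sagemaker.sklearn.processing import SKLearnProcessor
from sagemaker.processing import ProcessingOutput
from sagemaker.workflow.steps import ProcessingStep
from sagemaker.workflow.pipeline import Pipeline

ROLE_ARN = "arn:aws:iam::123456789012:role/ExampleSageMakerRole"  # hypothetical IAM role

# A scikit-learn processing container; any processing image with pandas and a
# Redshift driver would work equally well for the preparation script.
processor = SKLearnProcessor(
    framework_version="1.2-1",
    role=ROLE_ARN,
    instance_type="ml.m5.xlarge",
    instance_count=1,
)

# One step that runs the (hypothetical) preparation script, which would query
# the Redshift dataset and write a cleaned file for downstream actuarial models.
prep_step = ProcessingStep(
    name="PrepareActuarialData",
    processor=processor,
    code="prepare_actuarial_data.py",  # hypothetical transformation script
    outputs=[
        ProcessingOutput(
            output_name="prepared",
            source="/opt/ml/processing/output",
        )
    ],
)

# Create or update the pipeline so business users can re-run it from
# SageMaker Studio as a self-service template, then start one execution.
pipeline = Pipeline(name="ActuarialDataPrep", steps=[prep_step])
pipeline.upsert(role_arn=ROLE_ARN)
pipeline.start()
```

Using SKLearnProcessor here is just one convenient choice; the same ProcessingStep pattern works with other SageMaker processing images, and Data Wrangler flows can also be exported from SageMaker Studio into an equivalent pipeline definition.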