DSM-H Consulting

Analyst/Developer (Data Operations, Data Management)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Analyst/Developer (Data Operations, Data Management) with a contract length of "unknown," offering a pay rate of "unknown." Located in Chicago or Peoria, it requires 5+ years of experience, proficiency in Python, SQL, and AWS services, plus data management expertise.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 28, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#SageMaker #SQS (Simple Queue Service) #Python #SNS (Simple Notification Service) #RDS (Amazon Relational Database Service) #IAM (Identity and Access Management) #Lambda (AWS Lambda) #Cloud #Automation #Microservices #Monitoring #Data Engineering #DynamoDB #Data Triage #Debugging #AWS (Amazon Web Services) #Jenkins #Aurora #Data Pipeline #S3 (Amazon Simple Storage Service) #Data Management #SQL (Structured Query Language)
Role description
Typical task breakdown:
- Identify, investigate, and obtain resolution commitments for platform and data issues to maintain and improve the quality and performance of assigned digital product data.
- Issue identification: reports in all forms from customers, dealers, industry representatives, and subsidiaries.
- Issue investigation: statistical analysis, data triage, and infrastructure problem-solving.
- Issue resolution: identify root causes, create SageMaker scripts to fix data, and perform break/fix tasks on data pipeline code.
- Develop scripts and automation tools to better detect and correct data issues.
- Develop monitoring and alerting capabilities to proactively detect data issues.
- Work directly on complex application and technical problem identification and resolution, including responding to off-shift and weekend support calls.
- Communicate with end users and internal customers to help direct the development, debugging, and testing of application software for accuracy, integrity, interoperability, and completeness.
- The employee is also responsible for performing other job duties as assigned by CLIENT management from time to time.
Interaction with team:
- Liaise with designers, engineers, and support teams to improve data pipeline performance and reliability.
Work environment: Chicago or Peoria office (hybrid schedule, 2 days per week in office; could move to 5 days in office in the future).
Education & Experience Required:
- Degree with 5+ years' experience in this capacity.
- Master's degree with 4+ years' experience in this capacity.
- No degree: technical certifications with 8+ years' experience in this capacity are also welcome.
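The "develop monitoring and alerting capabilities" duty above might look something like the following minimal sketch: a completeness check that flags records with missing required fields. All names here (`check_completeness`, `QualityIssue`, the field names) are illustrative assumptions, not part of the role description; in a real deployment the issue list would feed an alerting channel such as SNS rather than be returned to the caller.

```python
from dataclasses import dataclass

@dataclass
class QualityIssue:
    """One detected data problem, suitable for routing to an alert."""
    record_id: str
    field: str
    reason: str

def check_completeness(records, required_fields):
    """Flag records whose required fields are missing or empty."""
    issues = []
    for rec in records:
        for field in required_fields:
            value = rec.get(field)
            if value is None or value == "":
                issues.append(QualityIssue(str(rec.get("id", "?")), field, "missing or empty"))
    return issues

# In production, each QualityIssue would be published to an alert topic
# (e.g. via boto3's sns.publish) instead of just collected in a list.
```

Running `check_completeness` over a nightly extract, and alerting when the issue count is nonzero, is one simple way to turn reactive break/fix work into proactive detection.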
Required Technical Skills:
- 2-4 years of Python and SQL experience
- Experience with development and delivery of microservices using serverless AWS services (S3, CloudWatch, RDS, Aurora, DynamoDB, Lambda, SNS, SQS, Kinesis, IAM)
- Background in data management, data engineering, or data operations
- Familiarity with the ADO pipeline framework, or CI/CD experience (Jenkins)
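For context on the serverless-microservice skill above, here is a minimal sketch of an SQS-triggered AWS Lambda handler in Python. The event shape (`event["Records"][i]["body"]` carrying a JSON string) is the standard SQS-to-Lambda format; the `serial` field and the normalization step are hypothetical examples, not requirements from this posting.

```python
import json

def handler(event, context):
    """Minimal AWS Lambda handler for an SQS-triggered microservice.

    SQS delivers messages in batches under event["Records"]; each
    record's payload is a JSON string in its "body" field.
    """
    processed = []
    for record in event.get("Records", []):
        payload = json.loads(record["body"])
        # Hypothetical data fix-up: normalize a field before it flows
        # downstream (the kind of break/fix task the role describes).
        payload["serial"] = payload.get("serial", "").strip().upper()
        processed.append(payload)
    return {"statusCode": 200, "processed": len(processed)}
```

Because the handler is a plain function, it can be unit-tested locally with a hand-built event dict before being wired to a real queue.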