Signify Technology

ML Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a senior Python ML Engineer on a 3-month contract, remote within U.S. time zones. It requires 7+ years of ML application development experience, hands-on SQL migration to Redshift, and expertise in testing and containerization tools.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 21, 2025
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Python #Deployment #ETL (Extract, Transform, Load) #Snowflake #Pytest #Redshift #Airflow #Data Warehouse #Docker #Data Quality #Documentation #Data Access #Cloud #Migration #Model Validation #Amazon Redshift #BigQuery #ML (Machine Learning) #Data Ingestion #SQL (Structured Query Language) #Automation #Databases #Kubernetes #Logging #Automated Testing #dbt (data build tool) #Data Pipeline
Role description
URGENT ML contractor
Term: 3 months
Openings: 1
Location: Remote (U.S. time zones; must overlap at least 4 hours with U.S. Central Time)
Start: ASAP

Overview
We are seeking a senior Python ML engineer to lead the migration of multiple analytics and machine learning applications from a legacy SQL environment to Amazon Redshift. In addition, the codebases need to be standardized on a modern Python architecture that supports best practices for deployment, testing, and maintainability. This role combines hands-on work with mentoring, ensuring sustainable practices across the team.

Key Responsibilities
• Review existing Python applications to map dependencies, data access patterns, configuration, and deployment processes.
• Transition data pipelines to pull from Redshift while eliminating legacy SQL dependencies (an illustrative data-access sketch follows this description).
• Standardize code organization, packaging, configuration, logging, and containerization according to a modern reference framework.
• Develop unit and integration tests for data ingestion, transformations, and model outputs, integrating them into CI/CD pipelines (see the parity-test sketch at the end of this description).
• Document code, add clear type hints, improve readability, and produce operational runbooks for all applications.
• Update deployment pipelines using containerization and orchestration tools to ensure repeatable, automated releases.
• Provide guidance and training to engineers on modern development standards, testing practices, and Redshift integration.

Expected Deliverables
• Week 1: Conduct an application inventory, define architecture targets, and begin updating the first application (data layer, tests, documentation).
• Week 2: Complete the first app migration, validate it in a staging environment, and begin work on a second application.
• Week 3+: Continue migrating ~2 applications per week, including code standardization, testing, documentation, and deployment automation, until all applications are fully transitioned.

Required Skills and Experience
• 7+ years of professional experience developing production ML or analytics applications in Python.
• Strong knowledge of Python project structures, dependency management, and packaging tools (pip, poetry, conda).
• Experience migrating applications from legacy SQL databases to cloud data warehouses (Redshift, Snowflake, BigQuery), ensuring data consistency.
• Proficiency in SQL and experience optimizing queries for cloud warehouses.
• Demonstrated ability to write robust tests (pytest/unittest) and integrate them with CI/CD pipelines.
• Familiarity with containerization, orchestration, and workflow tools such as Docker, Kubernetes, Airflow, or Step Functions.
• Strong documentation skills and the ability to coach other engineers on sustainable development practices.

Preferred Skills
• Experience with dbt-modeled data warehouses and collaboration with analytics engineers.
• Knowledge of MLOps tools, model validation frameworks, and feature stores.
• Ability to implement automated testing frameworks and data quality checks for ML pipelines.

Success Metrics
• All Python ML and analytics applications migrated to Redshift with verified parity.
• Applications updated to a modern architecture, complete with testing, documentation, and deployment automation.
• Team empowered with guidance, processes, and runbooks to maintain the applications independently after the engagement.
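To make the migration responsibility above concrete, here is a minimal sketch of what a standardized Redshift data-access layer in one of these applications could look like. It assumes SQLAlchemy with the psycopg2 driver (Redshift speaks the Postgres wire protocol) plus pandas; the environment variable names, schema, and table are placeholders for illustration, not details from the posting.

```python
# Illustrative sketch only: a standardized data-access layer reading from
# Redshift instead of a legacy SQL server. Env var names, schema, and table
# are hypothetical.
import os

import pandas as pd
from sqlalchemy import create_engine, text
from sqlalchemy.engine import URL, Engine


def redshift_engine() -> Engine:
    """Build a SQLAlchemy engine for Redshift from environment variables."""
    url = URL.create(
        drivername="postgresql+psycopg2",  # Redshift accepts the Postgres driver
        username=os.environ["REDSHIFT_USER"],
        password=os.environ["REDSHIFT_PASSWORD"],
        host=os.environ["REDSHIFT_HOST"],
        port=int(os.environ.get("REDSHIFT_PORT", "5439")),
        database=os.environ["REDSHIFT_DB"],
    )
    return create_engine(url)


def load_features(engine: Engine, run_date: str) -> pd.DataFrame:
    """Pull a model's input features from Redshift; the table is a placeholder."""
    query = text(
        "SELECT * FROM analytics.model_features WHERE run_date = :run_date"
    )
    return pd.read_sql(query, engine, params={"run_date": run_date})
```

Centralizing connection handling like this is one way to keep the per-application data layer thin, so that swapping the legacy SQL source for Redshift touches a single module rather than every pipeline.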
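Similarly, the "verified parity" success metric could be backed by CI-runnable tests along these lines. Everything here is hypothetical: the LEGACY_SQL_URL and REDSHIFT_URL environment variables, the table list, and the choice of a row-count comparison as the first-pass check.

```python
# Illustrative pytest sketch of a migration parity check. Env vars and table
# names are placeholders; both databases are assumed reachable from CI.
import os

import pandas as pd
import pytest
from sqlalchemy import create_engine
from sqlalchemy.engine import Engine

TABLES = ["analytics.model_features", "analytics.model_scores"]  # placeholders


@pytest.fixture(scope="session")
def legacy_engine() -> Engine:
    return create_engine(os.environ["LEGACY_SQL_URL"])


@pytest.fixture(scope="session")
def redshift_engine() -> Engine:
    return create_engine(os.environ["REDSHIFT_URL"])


@pytest.mark.parametrize("table", TABLES)
def test_row_counts_match(legacy_engine: Engine, redshift_engine: Engine, table: str):
    """Cheap first-pass parity check: row counts must agree in both systems."""
    query = f"SELECT COUNT(*) AS n FROM {table}"  # table names come from a fixed list
    legacy_n = pd.read_sql(query, legacy_engine)["n"].iloc[0]
    redshift_n = pd.read_sql(query, redshift_engine)["n"].iloc[0]
    assert legacy_n == redshift_n, (
        f"{table}: {legacy_n} rows (legacy) != {redshift_n} rows (Redshift)"
    )
```

Row counts are only a smoke test; column-level aggregates or sampled row comparisons would be the natural next layer once counts agree.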