Northern Trust

ETL Developer (contract)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an ETL Developer (contract) with an unknown duration and a pay rate of $67-$74 USD hourly. Key skills include Snowflake, Python, SQL, and Airflow. It requires 10+ years of experience and expertise in data analysis and automation.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
October 10, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Azure #XML (eXtensible Markup Language) #Scala #Python #Scripting #Automation #Compliance #Snowflake #Migration #Unix #Azure DevOps #"ETL (Extract, Transform, Load)" #JSON (JavaScript Object Notation) #DevOps #DataStage #Deployment #Data Quality #Data Analysis #Airflow #Shell Scripting #SQL Server #Oracle #SQL (Structured Query Language) #Databases
Role description
Project Overview
This is to support the new work forthcoming in Risk:
• Migration of legacy data workloads to Snowflake.
• Development to support source system replacements.
• Murex Risk docket.
Looking for an experienced Snowflake developer responsible for enhancing and supporting data ingest/egress integrations with Risk & Compliance applications.
Contractor's Role
The role will revolve around building and supporting the Snowflake data workloads, as well as working with the current application development team to drive superior business value, enhanced customer experience, and compliance with NT standards.
Experience Level
Senior resource: 10+ years
Skills/Qualifications (must haves)
• Expertise in design/development in Snowflake and Python.
• Expertise in data analysis/analytics.
• Expertise in SQL.
• Experience with Airflow.
• Strong PL/SQL and UNIX shell scripting.
• Experience with XML transformation and consumption of messages from queues is a plus.
• Hands-on improvement of operational stability via automation (e.g., auto-healing where possible); raise design changes to data analysts, SMEs, and/or architects, and raise improvement opportunities to application/service managers.
• Willingness to work some off-hours as demanded by projects, operations, and stakeholders, such as releases, testing, and/or critical bug resolutions.
Nice to have
• Experience with ETL tools such as DataStage.
• Experience with Control-M.
Tasks and Responsibilities
• Migrate staging and output layers from legacy platforms (e.g., Oracle, SQL Server) to Snowflake-based schemas.
• Design and implement scalable data models in Snowflake, ensuring alignment with business logic and reporting needs.
• Build and maintain Airflow DAGs to automate data movement between Snowflake schemas (see the illustrative sketch at the end of this listing).
• Use orchestration tools such as Airflow, Azure DevOps, and Control-M to manage deployments and scheduling.
• Ingest data from different sources such as flat files (CSV, TXT, XML, JSON) and relational databases (Oracle, SQL Server).
• Transform raw data into structured formats based on business requirements, ensuring consistency and accuracy across layers.
• Clean and restructure unformatted or semi-structured files to make them compatible with Snowflake ingestion pipelines.
• Implement error detection and handling logic, including creation of error tables to capture rejected records (e.g., duplicates, schema mismatches).
• Monitor pipeline health and troubleshoot data quality issues proactively.
Pay Rate Range
Min Pay Rate: 67
Max Pay Rate: 74
Currency: USD
Unit: hourly
Additional Notes
The above listed pay range is a good faith estimate of what the employer reasonably expects to pay for this position.
Benefits Information
Optional benefits offering includes medical, dental, vision, and retirement benefits via Hiregenics.
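Illustrative example
The sketch below shows the kind of Airflow DAG the "Build and maintain Airflow DAGs" responsibility refers to: moving records from a staging schema to an output schema in Snowflake and capturing duplicate rejects in an error table. It is a minimal, hedged sketch only; the DAG name, connection ID, schema and table names, and dedup key are hypothetical placeholders, not objects from this engagement.

```python
# Minimal sketch: one daily Airflow DAG that promotes staged Snowflake data
# to an output schema and writes duplicate rows to an error table.
# All object names and the connection ID below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator
# Note: newer Snowflake provider releases may prefer SQLExecuteQueryOperator.

with DAG(
    dag_id="risk_staging_to_output",      # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["snowflake", "risk"],
) as dag:

    # Keep only the latest row per POSITION_ID and load it to the output layer.
    load_output = SnowflakeOperator(
        task_id="load_output_layer",
        snowflake_conn_id="snowflake_default",  # assumed Airflow connection ID
        sql="""
            INSERT INTO OUTPUT.RISK_POSITIONS
            SELECT *
            FROM STAGING.RISK_POSITIONS_RAW
            QUALIFY ROW_NUMBER() OVER (
                PARTITION BY POSITION_ID ORDER BY LOAD_TS DESC
            ) = 1;
        """,
    )

    # Route the older duplicate rows to an error table with a reject reason.
    capture_errors = SnowflakeOperator(
        task_id="capture_duplicate_errors",
        snowflake_conn_id="snowflake_default",
        sql="""
            INSERT INTO OUTPUT.RISK_POSITIONS_ERRORS
            SELECT *, 'duplicate_key' AS reject_reason
            FROM STAGING.RISK_POSITIONS_RAW
            QUALIFY ROW_NUMBER() OVER (
                PARTITION BY POSITION_ID ORDER BY LOAD_TS DESC
            ) > 1;
        """,
    )

    load_output >> capture_errors
```

In practice the same pattern extends to the other responsibilities listed above, e.g., adding ingestion tasks for flat files or relational sources ahead of the load step and scheduling the DAG through the team's orchestration tooling.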