

Northern Trust
Sr. Data Engineer (contract)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. Data Engineer (contract) focused on financial risk, lasting 6 months, with a pay rate of $67-$74/hour. Key skills include Snowflake, SQL, Python, and data orchestration tools. A Bachelor's degree in a related field is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 4, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Data Orchestration #Data Warehouse #Apache Airflow #DataStage #ETL (Extract, Transform, Load) #Compliance #Cloud #Data Manipulation #Airflow #Deployment #Computer Science #Strategy #SQL (Structured Query Language) #API (Application Programming Interface) #Python #Automation #Data Architecture #Data Modeling #Scala #Snowflake #Data Engineering #BI (Business Intelligence) #Data Lineage
Role description
Senior Data Engineer, Data Platforms (Financial Risk Focus)
Project Overview: Lead the design and implementation of the enterprise data ecosystem, driving the modernization and consolidation of data platforms to support business intelligence, advanced analytics, and regulatory reporting. This includes a critical focus on the design, development, and maintenance of data warehousing solutions that support credit risk modeling.
Contractor's Role:
• Provide strategic vision for data architecture, platforms, and governance, translating business objectives into effective, scalable, and secure technical solutions.
• Lead the end-to-end design and implementation of a scalable enterprise data ecosystem, migrating legacy systems into a modern, consolidated cloud environment.
• Architect and maintain robust data warehouse structures in Snowflake to support complex credit risk models, ensuring data lineage, accuracy, and auditability for regulatory compliance.
• Build and optimize high-performance ETL/ELT pipelines using Python, ensuring seamless data flow from disparate sources into the central warehouse.
• Design and manage complex workflows using data orchestration tools (e.g., Airflow) to ensure high availability and reliability of data feeds.
• Drive the transition from fragmented data silos to a unified platform, implementing best practices in data modeling.
Experience Level - 3 - Senior
• 8+ years in Data Engineering, with a proven track record of leading large-scale data modernization or consolidation projects.
• Deep experience in Snowflake architecture, including performance tuning, data sharing, and cost management.
• Expert-level knowledge of SQL and dimensional data modeling techniques.
• Mastery of Python for data manipulation, automation, and API integrations.
• Extensive experience with enterprise-grade orchestration tools (e.g., Apache Airflow).
Qualifications
• Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent professional experience.
• In-depth knowledge of data architecture principles and data modeling techniques (including dimensional modeling for regulatory reporting).
• Strong communication and presentation skills, with the ability to convey complex technical concepts and strategy, including regulatory requirements related to data, to both technical and executive-level stakeholders.
• Expertise in data warehousing and ETL/ELT patterns, particularly within a financial risk context.
Nice to Have
• DataStage ETL experience.
• CI/CD pipeline deployment processes.
Daily Tasks and Responsibilities
• Lead technical workshops to capture business and technical requirements, particularly those related to data warehousing solutions that support credit risk modeling.
• Build, test, and deploy processes to extract, transform, and load data from various sources.
• Validate data to ensure accuracy and consistency, which is crucial for reliable analysis.
• Perform technical evaluations of new data technologies and features, building proof-of-concepts to validate architectural approaches.
• Serve as the key technical liaison across the engineering teams.
Pay Rate Range
67 - 74 USD hourly
Additional Notes
The above-listed pay range is a good-faith estimate of what the employer reasonably expects to pay for this position.
Benefits Information
Optional benefits offering includes medical, dental, vision and retirement benefits via Hiregenics.






