

Prairie Consulting Services
Senior ETL Developer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior ETL Developer (Contract, Hybrid, Chicago, IL) with 10+ years in data engineering, strong Snowflake expertise, and skills in SQL, Python, and Airflow. The expected duration is more than 6 months, with a focus on data migration and integration for financial risk applications.
Country
United States
Currency
$ USD
-
Day rate
576
-
Date
October 16, 2025
Duration
More than 6 months
-
Location
Hybrid
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Chicago, IL
-
Skills detailed
#DataStage #Migration #SQL Server #Compliance #Scala #Data Integration #Oracle #Scripting #Snowflake #Data Quality #Databases #Python #JSON (JavaScript Object Notation) #Unix #DevOps #ETL (Extract, Transform, Load) #XML (eXtensible Markup Language) #Azure #Airflow #SQL (Structured Query Language) #Azure DevOps #Data Analysis #Shell Scripting #Automation
Role description
Senior Snowflake Developer – Contract (Hybrid, Chicago, IL)
Location: Chicago, IL (Hybrid – 3 days onsite per week)
We're seeking a Senior Snowflake Developer to support new initiatives at a leading financial institution. This role focuses on the migration of legacy data workloads to Snowflake, as well as building and supporting modern data integrations for key Risk applications, including the Murex Risk docket and other strategic modernization efforts.
Required Skills & Experience:
• 10+ years of experience in data development and engineering roles.
• Strong Snowflake design and development expertise.
• Advanced SQL, Python, and PL/SQL skills.
• Experience with Airflow for orchestration and scheduling.
• Solid understanding of UNIX shell scripting.
• Strong data analysis and analytics background.
• Exposure to XML transformation and message queue consumption (a plus).
Nice to Have:
• Experience with ETL tools such as DataStage.
• Familiarity with Control-M and Azure DevOps.
Key Responsibilities:
• Migrate existing data workloads from legacy databases (Oracle, SQL Server) to Snowflake.
• Design and implement scalable Snowflake data models that align with business and reporting needs.
• Build and maintain Airflow DAGs to automate data movement between Snowflake schemas (see the sketch after this list).
• Develop and optimize SQL, PL/SQL, and Python scripts for data transformation and analytics.
• Integrate diverse data sources (flat files, JSON, XML, relational databases) into Snowflake pipelines.
• Monitor and enhance pipeline performance, error handling, and data quality.
• Support automation initiatives to improve operational stability (auto-healing, process optimization).
• Collaborate with cross-functional teams to ensure solutions meet governance and compliance standards.
• Participate in releases, testing, and critical support activities as needed.
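To give a concrete feel for the Airflow and Snowflake work described above, here is a minimal sketch of a daily DAG that lands semi-structured JSON from a named stage into a staging table and then publishes it to a reporting schema. It assumes Airflow 2.4+ with the common SQL and Snowflake providers installed and a configured Snowflake connection; the DAG id, connection id, stage, schema, table, and field names are illustrative placeholders, not details from this role.

```python
# Minimal illustrative sketch; all identifiers (connection id, stage, schemas,
# tables, JSON fields) are hypothetical and not taken from the job description.
from datetime import datetime

from airflow import DAG
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator

with DAG(
    dag_id="risk_positions_daily",       # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Land JSON files from a named stage into a staging table that is assumed
    # to have a single VARIANT column named raw_data.
    copy_json_into_staging = SQLExecuteQueryOperator(
        task_id="copy_json_into_staging",
        conn_id="snowflake_default",     # assumed Airflow connection id
        sql="""
            COPY INTO STAGING.RISK_POSITIONS_RAW
            FROM @STAGING.RISK_POSITIONS_STAGE
            FILE_FORMAT = (TYPE = 'JSON');
        """,
    )

    # Flatten the variant payload and publish it to the reporting schema.
    publish_to_reporting = SQLExecuteQueryOperator(
        task_id="publish_to_reporting",
        conn_id="snowflake_default",
        sql="""
            INSERT INTO REPORTING.RISK_POSITIONS (trade_id, exposure, load_date)
            SELECT raw_data:trade_id::STRING,
                   raw_data:exposure::NUMBER,
                   TO_DATE('{{ ds }}')
            FROM STAGING.RISK_POSITIONS_RAW;
        """,
    )

    copy_json_into_staging >> publish_to_reporting
```

Splitting the load and publish steps into separate tasks keeps each one independently retryable, which makes failures easier to isolate during releases and critical support windows.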
This is a high-impact, long-term contract role where you'll contribute to critical risk modernization efforts within a global financial institution. You'll work in a collaborative, hybrid environment using modern data technologies to build resilient, automated, and scalable Snowflake solutions.
Apply now or reach out directly to learn more about this opportunity!