NAM Info Inc

Sr. Data Engineer - W2 or Full-time

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. Data Engineer in Chicago, IL (Hybrid) with a contract duration of more than 6 months, offering a competitive pay rate. Key skills include Python, Snowflake, and experience with BCBS 239 compliance in financial services.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 4, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Data Access #Programming #PySpark #Alation #ETL (Extract, Transform, Load) #SnowPipe #Pandas #Compliance #Cloud #Documentation #Metadata #GCP (Google Cloud Platform) #Airflow #DevOps #Data Management #Security #Data Accuracy #Spark (Apache Spark) #SQL (Structured Query Language) #Data Mart #Python #Automation #Data Pipeline #Docker #Kubernetes #AWS (Amazon Web Services) #Leadership #Data Catalog #Data Governance #Collibra #Data Modeling #Scala #Data Aggregation #Monitoring #Snowflake #Data Quality #Clustering #Data Engineering #Code Reviews #Azure #Data Lineage
Role description
Job Title: Senior Data Engineer
Location: Chicago, IL (Hybrid)
Department: Data & Analytics
Reports To: Head of Data Engineering / Data Platform Lead

Role Overview
We are seeking a highly skilled Senior Data Engineer with strong Python development expertise and deep experience in Snowflake to design, build, and optimize scalable enterprise data solutions. This role is based in Chicago, IL, and will support regulatory and risk data initiatives in a highly governed environment. The ideal candidate has hands-on experience building modern cloud data platforms and is familiar with risk management frameworks, BCBS 239 principles, and Governance, Risk & Compliance (GRC) requirements within financial services.

Key Responsibilities

Data Engineering & Architecture
- Design, develop, and maintain scalable data pipelines using Python.
- Build and optimize data models, transformations, and data marts within Snowflake.
- Develop robust ELT/ETL frameworks for structured and semi-structured data.
- Optimize Snowflake performance, cost efficiency, clustering, and workload management.
- Implement automation, monitoring, and CI/CD for data pipelines.

Risk & Regulatory Data Management
- Support regulatory reporting aligned with BCBS 239 (risk data aggregation and reporting).
- Ensure data traceability, lineage, reconciliation, and auditability.
- Implement controls aligned with Governance, Risk & Compliance (GRC) frameworks.
- Partner with Risk, Finance, Compliance, and Audit teams to deliver accurate and governed data assets.

Data Governance & Quality
- Develop and enforce data quality validation frameworks.
- Maintain metadata, lineage documentation, and data catalog integration.
- Implement data access controls and security best practices.

Technical Leadership
- Provide mentorship and code reviews for data engineering team members.
- Promote engineering best practices and documentation standards.
- Collaborate cross-functionally with architects, analysts, and business stakeholders.
Required Qualifications
- 7+ years of experience in Data Engineering or Data Platform development.
- Strong Python programming expertise (Pandas, PySpark, Airflow, etc.).
- Hands-on experience with Snowflake (data modeling, Snowpipe, Streams & Tasks, performance tuning).
- Advanced SQL skills and a deep understanding of data warehousing concepts.
- Experience supporting BCBS 239 compliance or similar regulatory reporting frameworks.
- Experience working within Governance, Risk & Compliance (GRC) structures.
- Experience in cloud environments (AWS, Azure, or GCP).
- Strong understanding of data lineage, controls, reconciliation, and audit requirements.

Preferred Qualifications
- Experience in banking, capital markets, or financial services.
- Knowledge of credit risk, market risk, liquidity risk, or regulatory reporting domains.
- Experience with data governance tools (Collibra, Alation, etc.).
- Familiarity with DevOps practices, Docker, and Kubernetes.
- Experience building enterprise data platforms in highly regulated environments.

Key Competencies
- Strong problem-solving and analytical thinking.
- Ability to operate in a regulated, audit-driven environment.
- Excellent communication and stakeholder management skills.
- Detail-oriented with a focus on data accuracy and integrity.
- Leadership mindset with hands-on technical capability.