

CloudHive
Principal Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Principal Data Engineer in Chicago (Hybrid) for an 8-month contract, requiring expertise in Medallion lakehouse architecture, Apache Iceberg, Aurora MySQL, DynamoDB, Snowflake, Kafka, and AWS, with FinTech compliance experience essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 19, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Snowflake #Compliance #Migration #Data Quality #S3 (Amazon Simple Storage Service) #PCI (Payment Card Industry) #Lambda (AWS Lambda) #AWS S3 (Amazon Simple Storage Service) #Security #Replication #Cloud #Airflow #Kafka (Apache Kafka) #ETL (Extract, Transform, Load) #IAM (Identity and Access Management) #DynamoDB #AWS (Amazon Web Services) #dbt (data build tool) #Aurora #Apache Iceberg #Data Engineering #MySQL
Role description
Job Title: Principal Data Engineer
Location: Chicago (Hybrid)
Contract Length: 8 months
One of CloudHive's largest customers operating within the FinTech space requires a Principal Data Engineer with a strong track record of delivering production systems at scale in high-throughput transactional environments.
Requirements:
• Medallion lakehouse architecture (Bronze/Silver/Gold): end-to-end standup and migration ownership in production
• Apache Iceberg: partition evolution, schema evolution, compaction, time travel, and catalog integration
• Aurora MySQL: schema management, CDC via Debezium, binlog replication, and OLTP performance tuning
• DynamoDB: table design, GSIs, TTL lifecycle, and multi-region operational constraints
• Snowflake: anti-pattern identification, cost driver analysis, compute right-sizing, and workload migration to lakehouse
• Kafka / Confluent: topic design, consumer group management, Debezium CDC connectors
• AWS: S3, Lake Formation, Glue, Lambda, IAM; security controls in PCI/PII environments
• dbt: Silver layer transformation orchestration, incremental models, and data quality tests
• Airflow: major version upgrades or migration to AWS MWAA
• PCI-DSS: payments or FinTech compliance experience in a regulated data environment
