

Optomi
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with 5+ years of experience, focusing on financial data ingestion and modeling using Snowflake. Contract length is unspecified; the listed day rate is $720. Key skills include advanced SQL, Python, and experience with financial datasets.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
720
-
🗓️ - Date
May 12, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Burbank, CA
-
🧠 - Skills detailed
#Snowflake #Data Pipeline #dbt (data build tool) #SAP #Data Mart #Scala #Airflow #AI (Artificial Intelligence) #ETL (Extract, Transform, Load) #Datasets #Data Ingestion #ML (Machine Learning) #SQL (Structured Query Language) #Python #Automation #Data Engineering
Role description
Optomi, in partnership with a leading Fortune 500 media and entertainment company, is seeking a Senior Data Engineer to join their team. The successful candidate will play a critical role in building a new Snowflake data platform, focusing on financial data ingestion and modeling. This role requires a deep understanding of Snowflake, advanced SQL skills, and experience with financial datasets. The candidate will collaborate with various teams to deliver high-quality solutions on schedule.
Job Requirements:
• 5+ years of experience as a Data Engineer
• This is a senior-level position; it is not intended for a junior or lightly experienced Snowflake engineer.
• Hands-on experience delivering robust, maintainable data pipelines for complex enterprise environments.
• Expertise in Snowflake: building data pipelines, designing and structuring data objects, creating data marts.
• Expertise in advanced SQL for data transformation & modeling (e.g., window functions, CTEs).
• Python for ETL/ELT, automation, or analytics
• Experience with financial datasets and the financial industry
• Prior experience developing financial cost models (Financial planning, financial spend, cost allocation & recovery, etc.)
• Knowledge of financial, accounting, or technology operations systems and data such as SAP, Clarity, ServiceNow, Cognos
• Familiarity with workflow orchestration and transformation tools (dbt, Airflow, etc.).
• Strong communication skills
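As an illustration of the advanced SQL the requirements call for (window functions and CTEs), here is a hedged sketch run against an in-memory SQLite database. The table, column names, and spend figures are invented for the example; in the actual role the same pattern would run in Snowflake.

```python
import sqlite3

# Illustrative spend data; names and amounts are made up for this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE spend (dept TEXT, month TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO spend VALUES (?, ?, ?)",
    [("eng", "2026-01", 100.0), ("eng", "2026-02", 150.0),
     ("fin", "2026-01", 80.0), ("fin", "2026-02", 70.0)],
)

# A CTE aggregates spend per department and month; a window function
# then computes a running total within each department.
query = """
WITH monthly AS (
    SELECT dept, month, SUM(amount) AS total
    FROM spend
    GROUP BY dept, month
)
SELECT dept, month, total,
       SUM(total) OVER (PARTITION BY dept ORDER BY month) AS running_total
FROM monthly
ORDER BY dept, month
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
```

The same CTE-plus-window-function structure carries over directly to Snowflake SQL, which supports both constructs.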
Job Responsibilities:
• Develop, automate, and maintain scalable data pipelines
• Partner with architects to develop robust data models
• Write performant SQL and Python code for cost allocation
• Collaborate with product managers, finance teams, and AI/ML engineers
• Implement reconciliation processes and IT controls
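To make the "cost allocation and recovery" and "reconciliation" responsibilities above concrete, here is a minimal Python sketch: a shared cost pool is split across departments in proportion to a usage driver, with a reconciliation step ensuring the allocations sum back to the pool. All names and figures are illustrative assumptions, not details from the posting.

```python
# Hedged sketch of proportional cost allocation with a reconciliation control.

def allocate(pool: float, drivers: dict[str, float]) -> dict[str, float]:
    """Split `pool` across the keys of `drivers` proportionally to their values."""
    total = sum(drivers.values())
    alloc = {k: round(pool * v / total, 2) for k, v in drivers.items()}
    # Reconciliation: push any rounding residue onto the largest consumer
    # so the allocations sum exactly to the pool (a simple IT control).
    residue = round(pool - sum(alloc.values()), 2)
    largest = max(drivers, key=drivers.get)
    alloc[largest] = round(alloc[largest] + residue, 2)
    return alloc

shared_cost = 10000.00  # e.g. a shared platform compute bill (illustrative)
usage = {"marketing": 3.0, "finance": 1.0, "streaming": 6.0}
allocations = allocate(shared_cost, usage)
assert round(sum(allocations.values()), 2) == shared_cost
print(allocations)
```

In practice this logic would run as SQL or Python inside the pipeline, with the reconciliation assertion surfaced as an automated control rather than an inline check.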






