Addison Group

Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer based in Houston, TX, offering a day rate of $680; the contract length is unspecified. Key skills include Snowflake expertise, advanced SQL, and strong Python. Experience with ETRM systems is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
680
-
🗓️ - Date
May 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Houston, TX
-
🧠 - Skills detailed
#PCI (Payment Card Industry) #Forecasting #SQL (Structured Query Language) #Python #Scala #Automation #Data Pipeline #Snowflake #Data Engineering
Role description
Position Overview
We are seeking a highly independent Senior Data Engineer to own, manage, and evolve our enterprise data platform, with Snowflake serving as the central data repository. This role focuses on building, maintaining, and optimizing data pipelines that support commercial, trading, pricing, and operational functions across the business.

Core Responsibilities
• Own and manage the Snowflake data platform as the enterprise data repository
• Maintain, enhance, and support 8–10 core integration pipelines
• Build new data pipelines as new vendors, pricing sources, and commercial systems come online
• Manage ingestion and integration of data from:
  • ETRM systems (PCI strongly preferred)
  • Pricing and market data vendors
  • Commercial and operational systems
  • Internal systems that pull external data feeds
• Ensure high reliability, performance, and quality for business-critical data flows
• Provide day-to-day operational support, including backfills and issue resolution
• Partner with commercial, trading, and operations teams to understand data needs and deliver scalable solutions
• Support analytics, reporting, forecasting, and downstream data consumers

Must-Have Technical Requirements
• Senior-level Data Engineering experience
• Strong, hands-on Snowflake expertise (platform ownership experience required)
• Advanced SQL
• Strong Python for pipelines, automation, and integrations
• Proven experience building and supporting production-grade data pipelines
• Ability to work independently with minimal direction

Data Environment
• Snowflake as the central enterprise data repository
• 8–10 existing integration pipelines
• Data sources include:
  • ETRM systems
  • Pricing vendors
  • Commercial and operational systems
  • External data feeds pulled by internal systems
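To give candidates a feel for the day-to-day work, the following is a minimal sketch of the kind of Python ingestion step described above: normalizing a vendor price feed and generating a Snowflake MERGE upsert. The feed shape, table name, and column names are hypothetical illustrations, not details from this posting.

```python
from datetime import date
from decimal import Decimal


def normalize_price_rows(raw_rows):
    """Coerce a hypothetical vendor price feed into typed records."""
    normalized = []
    for row in raw_rows:
        normalized.append({
            "price_date": date.fromisoformat(row["date"]),
            "symbol": row["symbol"].strip().upper(),   # vendor feeds are often messy
            "price": Decimal(row["price"]),            # avoid float for prices
        })
    return normalized


def build_merge_sql(table, rows):
    """Render a Snowflake-style MERGE upsert keyed on (price_date, symbol).

    Table and column names are illustrative; a production pipeline would
    use bound parameters or a staging table rather than inlined literals.
    """
    values = ",\n        ".join(
        f"('{r['price_date']}', '{r['symbol']}', {r['price']})" for r in rows
    )
    return (
        f"MERGE INTO {table} AS tgt\n"
        f"USING (\n"
        f"    SELECT * FROM VALUES\n"
        f"        {values}\n"
        f"    AS v(price_date, symbol, price)\n"
        f") AS src\n"
        f"ON tgt.price_date = src.price_date AND tgt.symbol = src.symbol\n"
        f"WHEN MATCHED THEN UPDATE SET tgt.price = src.price\n"
        f"WHEN NOT MATCHED THEN INSERT (price_date, symbol, price)\n"
        f"    VALUES (src.price_date, src.symbol, src.price);"
    )
```

In practice the rendered statement would be executed through a Snowflake session (for example via the snowflake-connector-python package), with idempotent MERGE semantics making backfills and re-runs safe.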