Addison Group

Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with a contract length of "Unknown," offering a day rate of $760 USD. Key skills include Snowflake, AWS/Azure, advanced SQL, and Python. Requires 7+ years in data engineering, with upstream oil & gas experience preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
760
-
🗓️ - Date
October 31, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Houston, TX
-
🧠 - Skills detailed
#Computer Science #AWS (Amazon Web Services) #Data Modeling #Airflow #Programming #Python #GIT #Data Engineering #Lambda (AWS Lambda) #Tableau #Data Security #Snowflake #Data Science #Compliance #Databricks #ML (Machine Learning) #Scala #SQL (Structured Query Language) #Data Integration #Data Pipeline #AI (Artificial Intelligence) #BI (Business Intelligence) #Cloud #Data Processing #Security #Automated Testing #Leadership #Automation #Azure #S3 (Amazon Simple Storage Service) #Data Architecture #Data Warehouse #Data Quality #ADF (Azure Data Factory) #Spotfire #ETL (Extract, Transform, Load) #Azure Data Factory #Microsoft Power BI #Data Governance
Role description
About the Role
The Senior Data Engineer will play a critical role in building and scaling an enterprise data platform to enable analytics, reporting, and operational insights across the organization. This position requires deep expertise in Snowflake and cloud technologies (AWS or Azure), along with strong upstream oil & gas domain experience. The engineer will design and optimize data pipelines, enforce data governance and quality standards, and collaborate with cross-functional teams to deliver reliable, scalable data solutions.

Key Responsibilities

Data Architecture & Engineering
• Design, develop, and maintain scalable data pipelines using Snowflake, AWS/Azure, and modern data engineering tools.
• Implement ETL/ELT processes integrating data from upstream systems (SCADA, production accounting, drilling, completions, etc.).
• Architect data models supporting both operational reporting and advanced analytics.
• Establish and maintain frameworks for data quality, validation, and lineage to ensure enterprise data trust.

Platform Development & Optimization
• Lead the build and optimization of Snowflake-based data warehouses for performance and cost efficiency.
• Design cloud-native data solutions leveraging AWS/Azure services (S3, Lambda, Azure Data Factory, Databricks).
• Manage large-scale time-series and operational data processing workflows.
• Implement strong security, access control, and governance practices.

Technical Leadership & Innovation
• Mentor junior data engineers and provide technical leadership across the data platform team.
• Research and introduce new technologies to enhance platform scalability and automation.
• Build reusable frameworks, components, and utilities to streamline delivery.
• Support AI/ML initiatives by delivering production-ready, high-quality data pipelines.

Business Partnership
• Collaborate with stakeholders across business units to translate requirements into technical solutions.
• Work with analysts and data scientists to enable self-service analytics and reporting.
• Ensure data integration supports regulatory and compliance reporting.
• Act as a bridge between business and technical teams to ensure alignment and impact.

Qualifications & Experience

Education
• Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field.
• Advanced degree or relevant certifications (SnowPro, AWS/Azure Data Engineer, Databricks) preferred.

Experience
• 7+ years in data engineering roles, with at least 3 years on cloud data platforms.
• Proven expertise in Snowflake and at least one major cloud platform (AWS or Azure).
• Hands-on experience with upstream oil & gas data (wells, completions, SCADA, production, reserves, etc.).
• Demonstrated success delivering operational and analytical data pipelines.

Technical Skills
• Advanced SQL and Python programming skills.
• Strong background in data modeling, ETL/ELT, cataloging, lineage, and data security.
• Familiarity with Airflow, Azure Data Factory, or similar orchestration tools.
• Experience with CI/CD, Git, and automated testing.
• Knowledge of BI tools such as Power BI, Spotfire, or Tableau.
• Understanding of AI/ML data preparation and integration.
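To give candidates a concrete feel for the data-quality and validation work described above, here is a minimal sketch of a pre-load validation pass. It is illustrative only: the record shape, field names, and quality rules are hypothetical assumptions, not taken from this posting or from any specific employer system.

```python
"""Illustrative sketch of an ETL-style data-quality gate: split incoming
upstream production records into clean and rejected sets before loading.
All names and rules here are hypothetical examples."""

from dataclasses import dataclass
from datetime import date


@dataclass
class WellReading:
    # Hypothetical upstream record: one daily production reading per well.
    well_id: str
    reading_date: date
    oil_bbl: float  # barrels of oil reported for the day


def validate(readings):
    """Return (clean, rejected) lists; rejected items carry a reason string."""
    clean, rejected = [], []
    for r in readings:
        if not r.well_id:
            rejected.append((r, "missing well_id"))
        elif r.oil_bbl < 0:
            rejected.append((r, "negative volume"))
        else:
            clean.append(r)
    return clean, rejected


readings = [
    WellReading("W-001", date(2025, 10, 1), 120.5),
    WellReading("", date(2025, 10, 1), 90.0),      # fails: no well id
    WellReading("W-002", date(2025, 10, 1), -4.0),  # fails: negative volume
]
clean, rejected = validate(readings)
print(len(clean), len(rejected))  # prints "1 2"
```

In a production pipeline, the same pattern would typically run inside an orchestration tool such as Airflow, with rejected records routed to a quarantine table for review rather than silently dropped.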