

Delphi-US, LLC - Peacemakers in the Talent War
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (Contract, Remote) requiring 5+ years of Snowflake experience, 3+ years with dbt, and strong skills in data operations, CI/CD, and SQL. Retail domain experience is essential. Contract length and pay rate are unspecified.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 22, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, NY
-
🧠 - Skills detailed
#Data Strategy #Cloud #dbt (data build tool) #Data Modeling #Automation #Data Management #Fivetran #Data Quality #Airflow #Continuous Deployment #Big Data #Deployment #GitHub #SQL (Structured Query Language) #Data Manipulation #Data Catalog #Snowflake #BI (Business Intelligence) #Data Engineering #Strategy #Version Control #SAS #ETL (Extract, Transform, Load)
Role description
Job Title: Senior Data Engineer (Data Operations) - (Contract) - Job#5782
Location: Remote, Eastern Time Zone Hours
Job Description
Our client has an immediate requirement for a Senior Data Engineer to join their team. The selected candidate will play a key role in defining data strategy and operations in a CI/CD (Continuous Integration/Continuous Deployment) environment.
Responsibilities
• Establish and maintain a robust CI/CD pipeline for ELT (Extract, Load, Transform) processes.
• Consolidate data from various source systems into Snowflake.
• Centralize data from SAS applications and Cloud SAS applications.
• Develop and implement a data quality program to support automation tools and processes.
• Create and manage BI (Business Intelligence) processes using GitHub.
• Utilize dbt (data build tool) to transform data and document data models.
• Monitor data quality and implement necessary improvements.
• Contribute to the enhancement of the Customer Data Platform (CDP).
• Drive data implementation using Snowflake.
• Contribute to a complex data strategy involving multiple sources and systems.
• Implement and test data solutions using Snowflake.
• Catalog and audit current data to contribute to the overall data strategy.
• Set up data platforms for efficient data management.
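Several of the responsibilities above combine in practice: raw data lands in the warehouse first, is transformed there with SQL, and quality checks gate the pipeline. A minimal sketch of that ELT pattern, using Python's built-in sqlite3 as a stand-in for Snowflake (the table and column names are hypothetical, not from this posting):

```python
import sqlite3

# Illustrative only: sqlite3 stands in for Snowflake. The ELT pattern is:
# load raw rows untouched, transform with SQL inside the warehouse, then
# run a data-quality check before downstream use.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Load: land raw source data as-is (strings, duplicates and all).
cur.execute("CREATE TABLE raw_orders (order_id TEXT, amount TEXT, store TEXT)")
cur.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [("A1", "19.99", "NYC"), ("A2", "5.00", "NYC"), ("A2", "5.00", "NYC")],
)

# Transform: deduplicate and cast types in SQL, inside the warehouse.
cur.execute("""
    CREATE TABLE orders AS
    SELECT DISTINCT order_id, CAST(amount AS REAL) AS amount, store
    FROM raw_orders
""")

# Data-quality check: primary-key uniqueness, the kind of test dbt automates.
dupes = cur.execute(
    "SELECT COUNT(*) FROM "
    "(SELECT order_id FROM orders GROUP BY order_id HAVING COUNT(*) > 1)"
).fetchone()[0]
assert dupes == 0, f"{dupes} duplicate order_id values found"
```

In a dbt project, the transform would be a model file and the uniqueness check a declarative `unique` test in `schema.yml`, with a CI/CD pipeline running `dbt build` on each pull request.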
Required Skills
• 3+ years of hands-on experience with data modeling tools such as dbt.
• 5+ years of experience working with a cloud data warehouse such as Snowflake.
• 2+ years of experience working with orchestration tools such as Airflow or Astro.
• Strong understanding of data operations and best practices.
• Ability to design, configure, and manage CI/CD pipelines for ELT processes.
• Strong understanding of Customer Data Platforms (CDPs) and experience in the retail domain.
• Proven experience in data platform management.
• Ability to centralize and effectively manage enterprise data.
• Proficiency in version control using GitHub and experience managing code in GitHub repositories.
• Strong SQL skills for data manipulation and analysis.
• Familiarity with no-code ETL tools such as Fivetran.
• Experience working with BW for SAS applications.
• Demonstrated expertise in data cataloguing and auditing.
• Proven experience working with transactional data.
• Experience handling complex data from diverse sources and systems.
• Familiarity with Big Data technologies.