

Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in White Plains, NY, with a contract length of unspecified duration and a pay rate of $60 - $70/hour. Key skills include 7+ years in Data Engineering, ETL, SQL, and experience with SAP BTP and cloud platforms.
Country
United States
Currency
$ USD
-
Day rate
560
-
Date discovered
May 21, 2025
Project duration
Unknown
-
Location type
On-site
-
Contract type
Unknown
-
Security clearance
Unknown
-
Location detailed
White Plains, NY
-
Skills detailed
#Azure DevOps #Python #Data Analysis #SQL Queries #Automation #PySpark #AWS (Amazon Web Services) #API (Application Programming Interface) #Security #Storage #Data Architecture #Collibra #Azure #ADF (Azure Data Factory) #Cloud #Business Analysis #Data Engineering #Data Integration #GCP (Google Cloud Platform) #Lambda (AWS Lambda) #ETL (Extract, Transform, Load) #Data Profiling #GitHub #Data Management #Metadata #Data Security #Data Lineage #Spark (Apache Spark) #Data Quality #SQL (Structured Query Language) #SAP #DevOps #Data Extraction #Data Processing #Compliance #Scala #Data Governance #Deployment #Version Control #Data Accuracy #Data Pipeline #Documentation
Role description
Job Number: 25-04373
Want to be part of the Energy Industry? ECLARO is looking for a Senior Data Engineer for our client in White Plains, NY.
ECLARO's Client is America's largest state power organization and is a national leader in energy efficiency and clean energy technology. If you're up to the challenge, then take a chance at this rewarding opportunity!
Position Overview:
• Client's current on-premises Enterprise Resource Planning (ERP) system, SAP ECC 6.0, has been in use for over a decade and is nearing technological obsolescence; the Client therefore needs to select and implement a new ERP system.
• The objective of deploying a new ERP system is to integrate all business functions, including finance, operations, and human resources, into a cohesive platform.
β’ This implementation aims to enhance organizational efficiency, improve data accuracy, and provide real-time reporting capabilities.
β’ The goal is to streamline processes, reduce operational costs, and support informed decision-making across all departments.
Responsibilities:
β’ Cloud Data Engineering & Integration:
β’ Design and implement data pipelines across AWS, Azure, and Google Cloud.
β’ Develop SAP BTP integrations with cloud and on-premise systems.
β’ Ensure seamless data movement and storage between cloud platforms.
β’ ETL & Data Pipeline Development:
• Develop and optimize ETL workflows using Pentaho, Microsoft ADF, or equivalent ETL tools.
β’ Design scalable and efficient data transformation, movement, and ingestion processes.
β’ Monitor and troubleshoot ETL jobs to ensure high availability and performance.
β’ API Development & Data Integration:
β’ Develop and integrate RESTful APIs to support data exchange between SAP and other platforms.
• Work with API gateways and authentication methods such as OAuth, JWT, and API keys.
β’ Implement API-based data extractions and real-time event-driven architectures.
β’ Data Analysis & SQL Development:
β’ Write and optimize SQL queries, stored procedures, and scripts for data analysis, reporting, and integration.
β’ Perform data profiling, validation, and reconciliation to ensure data accuracy and consistency.
β’ Support data transformation logic and business rules for ERP reporting needs.
β’ Data Governance & Quality (Ataccama, Collibra):
β’ Work with Ataccama and Collibra to define and enforce data quality and governance policies.
β’ Implement data lineage, metadata management, and compliance tracking across systems.
β’ Ensure compliance with enterprise data security and governance standards.
β’ Cloud & DevOps (AWS, Azure, GCP):
• Utilize Azure DevOps and GitHub for version control, CI/CD, and deployment automation.
β’ Deploy and manage data pipelines on AWS, Azure, and Google Cloud.
β’ Work with serverless computing (Lambda, Azure Functions, Google Cloud Functions) to automate data workflows.
β’ Collaboration & Documentation:
β’ Collaborate with SAP functional teams, business analysts, and data architects to understand integration requirements.
β’ Document ETL workflows, API specifications, data models, and governance policies.
β’ Provide technical support and troubleshooting for data pipelines and integrations.
Required Skills:
β’ 7+ years of experience in Data Engineering, ETL, and SQL development.
β’ Hands-on experience with SAP BTP Integration Suite for SAP and non-SAP integrations.
β’ Strong expertise in Pentaho (PDI), Microsoft ADF, and API development.
β’ Proficiency in SQL (stored procedures, query optimization, performance tuning).
• Experience working with Azure DevOps, GitHub, and CI/CD for data pipelines.
β’ Good understanding of data governance tools (Ataccama, Collibra) and data quality management.
β’ Experience working with AWS, Azure, and Google Cloud (GCP) for data integration and cloud-based workflows.
β’ Strong problem-solving skills and ability to work independently in a fast-paced environment.
Preferred Experience:
• Experience working on SAP S/4HANA and cloud-based ERP implementations.
• Familiarity with Python and PySpark for data processing and automation.
• Experience working with Pentaho, Microsoft ADF, or equivalent ETL tools.
β’ Knowledge of event-driven architectures.
• Familiarity with CI/CD for data pipelines (Azure DevOps, GitHub Actions, etc.).
Pay Rate: $60 - $70/hour
If hired, you will enjoy the following ECLARO Benefits:
• 401(k) Retirement Savings Plan administered by Merrill Lynch
β’ Commuter Check Pretax Commuter Benefits
β’ Eligibility to purchase Medical, Dental & Vision Insurance through ECLARO
If interested, you may contact:
Cedric Ceballo
cedric.ceballo@eclaro.com
646-357-1237
Cedric Ceballo | LinkedIn
Equal Opportunity Employer: ECLARO values diversity and does not discriminate based on Race, Color, Religion, Sex, Sexual Orientation, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status, in compliance with all applicable laws.