Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in White Plains, NY, on a contract of unspecified duration at a pay rate of $60.00 - $70.00/hour. Key requirements include 7+ years of ETL development experience, Azure Databricks, ADF, and API integration.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
560
πŸ—“οΈ - Date discovered
May 21, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
πŸ“ - Location detailed
White Plains, NY
🧠 - Skills detailed
#Scripting #Azure DevOps #Python #Azure Data Factory #Data Analysis #Automation #PySpark #AWS (Amazon Web Services) #API (Application Programming Interface) #Security #Complex Queries #Azure #ADF (Azure Data Factory) #Cloud #Business Analysis #Data Engineering #Data Integration #ETL (Extract, Transform, Load) #GitHub #REST (Representational State Transfer) #Monitoring #Databricks #Spark (Apache Spark) #Data Quality #Big Data #SQL (Structured Query Language) #DevOps #Data Extraction #Data Mapping #Data Modeling #Data Processing #Azure Databricks #Compliance #Scala #Data Governance #Data Reconciliation #Data Pipeline #Documentation
Role description
Job Number: 25-04370

Want to be part of the Energy Industry? ECLARO is looking for a Senior Data Engineer for our client in White Plains, NY. ECLARO's client is America's largest state power organization and a national leader in energy efficiency and clean energy technology. If you're up to the challenge, then take a chance at this rewarding opportunity!

Project Overview:
• Responsible for managing critical Business-As-Usual (BAU) services that support enterprise-wide data operations.
• These services include the development, maintenance, and monitoring of data pipelines, integrations, and reporting infrastructure that are essential for ongoing business functions.

Pay Rate: $60.00 - $70.00/Hour

Responsibilities:
• Maintaining and troubleshooting ETL workflows (Pentaho, Databricks, ADF).
• Supporting daily data loads and ensuring data availability for business reporting.
• Responding to ad-hoc requests from business users.
• Coordinating with DBAs and application teams for incident resolution.
• Performing enhancements to support evolving business data needs.
These BAU services are essential for keeping business operations running smoothly and delivering timely insights across multiple departments.

ETL & Data Integration:
• Design, develop, and optimize ETL pipelines using Azure Databricks, ADF, and Pentaho to support enterprise data workflows.
• Implement and maintain data movement, transformation, and integration across multiple systems.
• Ensure seamless data exchange between cloud, on-prem, and hybrid environments.
• Work with Globalscape FTP for secure file transfers and automation.

API Development and Integration:
• Develop, consume, and integrate RESTful and SOAP APIs to facilitate data exchange.
• Work with API gateways and authentication methods such as OAuth, JWT, certificates, and API keys (see the sketch after this list).
• Implement and optimize API-based data extractions and real-time data integrations.
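To make the API duties above concrete, here is a minimal, non-authoritative sketch of the kind of work described: an OAuth 2.0 client-credentials flow feeding a paged REST extraction, written in Python (one of the posting's listed skills). Every URL, credential, and response field name below is a hypothetical placeholder; the client's actual systems are not named in this posting.

```python
# Sketch only: OAuth 2.0 client-credentials flow + paged REST extraction.
# All endpoints, credentials, and payload fields are hypothetical
# placeholders, not details from this job posting.
import requests

TOKEN_URL = "https://auth.example.com/oauth2/token"  # hypothetical
DATA_URL = "https://api.example.com/v1/records"      # hypothetical

def get_access_token(client_id: str, client_secret: str) -> str:
    """Exchange client credentials for a short-lived bearer token."""
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=(client_id, client_secret),  # HTTP Basic auth of the client
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def extract_all(token: str, page_size: int = 500) -> list[dict]:
    """Pull every page from the REST endpoint into a list of records."""
    records, page = [], 1
    while True:
        resp = requests.get(
            DATA_URL,
            headers={"Authorization": f"Bearer {token}"},
            params={"page": page, "page_size": page_size},
            timeout=60,
        )
        resp.raise_for_status()
        batch = resp.json().get("items", [])  # assumed response shape
        if not batch:
            return records
        records.extend(batch)
        page += 1
```

In a Databricks or ADF setting, the same pattern would typically sit behind a secret scope or linked service rather than hard-coded credentials.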
Data Quality & Governance:
• Implement data validation, cleansing, and enrichment techniques.
• Develop and execute data reconciliation processes to ensure accuracy and completeness.
• Adhere to data governance policies and security compliance standards.

BAU Support & Performance Optimization:
• Troubleshoot and resolve ETL failures, data load issues, and performance bottlenecks.
• Optimize SQL stored procedures and complex queries for better performance.
• Support ongoing enhancements and provide operational support for existing data pipelines.

Collaboration & Documentation:
• Work closely with Data Analysts, Business Analysts, and stakeholders to understand data needs.
• Document ETL processes, data mappings, and workflows for maintainability and knowledge sharing.
• Provide guidance and best practices to ensure scalability and efficiency of data solutions.

Required Qualifications:
• 7+ years of experience in ETL development, data integration, and SQL scripting.
• Strong expertise in Azure Databricks, ADF (Azure Data Factory), and Pentaho.
• Experience handling secure file transfers using Globalscape FTP.
• Hands-on experience in developing and consuming APIs (REST/SOAP).
• Experience working with API security protocols (OAuth, JWT, API keys, etc.).
• Proficiency in SQL, stored procedures, performance tuning, and query optimization.
• Understanding of data modeling, data warehousing, and data governance best practices.
• Hands-on experience with cloud-based data platforms (Azure/AWS) is a plus.
• Strong problem-solving skills, troubleshooting abilities, and the ability to work independently.
• Excellent communication skills and the ability to work in a fast-paced environment.

Preferred Qualifications:
• Experience working on large-scale enterprise data integration projects.
• Knowledge of Python and PySpark for big data processing.
• Familiarity with CI/CD for data pipelines (Azure DevOps, GitHub Actions, etc.).

If hired, you will enjoy the following ECLARO Benefits:
• 401k Retirement Savings Plan administered by Merrill Lynch
• Commuter Check Pretax Commuter Benefits
• Eligibility to purchase Medical, Dental & Vision Insurance through ECLARO

If interested, you may contact:
Eileen Sares
esares@eclaro.com
646-755-9301

Equal Opportunity Employer: ECLARO values diversity and does not discriminate based on Race, Color, Religion, Sex, Sexual Orientation, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status, in compliance with all applicable laws.