

Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in White Plains, NY, for 12 months at a competitive pay rate. Requires 7+ years in ETL development, expertise in Azure Databricks and ADF, and relevant certifications. Hybrid work allowed.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: May 23, 2025
Project duration: More than 6 months
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: White Plains, NY
Skills detailed: #ETL (Extract, Transform, Load) #Python #Cloud #SQL (Structured Query Language) #Data Governance #Scripting #DevOps #GitHub #Security #Data Processing #Azure Databricks #PySpark #Azure #AWS (Amazon Web Services) #Databricks #ADF (Azure Data Factory) #Spark (Apache Spark) #REST (Representational State Transfer) #API (Application Programming Interface) #Computer Science #Data Engineering #Data Pipeline #Big Data #Data Modeling #Azure DevOps #Data Integration #Monitoring #Azure Data Factory
Role description
Job Description (Hybrid):
Position: Senior Data Engineer
Location: White Plains, NY
Duration: 12 Months
Note: Hybrid schedules are permitted, with a minimum of 3 days on-site depending on assignment; some assignments may be fully on-site, depending on business needs.
Project Overview:
Responsible for managing critical Business-As-Usual (BAU) services that support enterprise-wide data operations.
These services include the development, maintenance, and monitoring of data pipelines, integrations, and reporting infrastructure that are essential for ongoing business functions.
Key responsibilities include:
• Maintaining and troubleshooting ETL workflows (Pentaho, Databricks, ADF)
• Supporting daily data loads and ensuring data availability for business reporting
• Responding to ad-hoc requests from business users
• Coordinating with DBAs and application teams for incident resolution
• Performing enhancements to support evolving business data needs
These BAU services are essential for keeping business operations running smoothly and delivering timely insights across multiple departments; a sketch of a representative daily-load check follows below.
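For illustration, here is a minimal sketch of the kind of daily-load availability check these responsibilities imply, assuming an active SparkSession (as in a Databricks notebook or job). The table name `sales.daily_orders` and partition column `load_date` are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch of a daily-load availability check, assuming an active
# SparkSession (Databricks notebook/job). Table and column names are
# hypothetical placeholders, not details taken from this posting.
from datetime import date

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

def check_daily_load(table: str, load_date: date) -> int:
    """Return the row count for the given load date; raise if nothing landed."""
    rows = (
        spark.table(table)
        .where(F.col("load_date") == F.lit(load_date.isoformat()))
        .count()
    )
    if rows == 0:
        # In a BAU setting this would raise an alert or open an incident.
        raise RuntimeError(f"No rows landed in {table} for {load_date}")
    return rows

print(check_daily_load("sales.daily_orders", date.today()))
```

A check like this typically runs as the final task of the nightly pipeline, so a missing load is caught before business reporting opens.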
Required Skills & Experience:
• 7+ years of experience in ETL development, data integration, and SQL scripting.
• Strong expertise in Azure Databricks, ADF (Azure Data Factory), and Pentaho.
• Experience handling secure file transfers using Globalscape FTP.
• Hands-on experience in developing and consuming APIs (REST/SOAP).
• Experience working with API security protocols (OAuth, JWT, API keys, etc.); see the sketch after this list.
• Proficiency in SQL, stored procedures, performance tuning, and query optimization.
• Understanding of data modeling, data warehousing, and data governance best practices.
• Hands-on experience with cloud-based data platforms (Azure/AWS) is a plus.
• Strong problem-solving skills, troubleshooting abilities, and the ability to work independently.
• Excellent communication skills and ability to work in a fast-paced environment.
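As a concrete example of the API and OAuth experience asked for above, here is a minimal sketch of consuming a REST endpoint protected by OAuth 2.0 client credentials. The token URL, API URL, and field names are placeholders, not details from the posting.

```python
# Minimal sketch of calling a REST API secured with OAuth 2.0 client
# credentials. All URLs and credentials below are hypothetical.
import requests

TOKEN_URL = "https://login.example.com/oauth2/token"  # placeholder
API_URL = "https://api.example.com/v1/orders"         # placeholder

def fetch_orders(client_id: str, client_secret: str) -> list[dict]:
    # Exchange client credentials for a short-lived bearer token.
    token_resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,
        },
        timeout=30,
    )
    token_resp.raise_for_status()
    access_token = token_resp.json()["access_token"]

    # Call the protected endpoint with the bearer token.
    api_resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=30,
    )
    api_resp.raise_for_status()
    return api_resp.json()
```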
Preferred Qualifications:
• Experience working in large-scale enterprise data integration projects.
• Knowledge of Python and PySpark for big data processing.
• Familiarity with CI/CD for data pipelines (Azure DevOps, GitHub Actions, etc.); a sketch of a CI-runnable pipeline test follows below.
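In the spirit of the CI/CD item above, here is a minimal sketch of the kind of unit test an Azure DevOps or GitHub Actions workflow would run on every commit. The `dedupe_orders` transformation and its columns are hypothetical examples, not code from the posting.

```python
# Minimal sketch of a pipeline unit test that a CI workflow would run
# (e.g., via pytest). The transformation and columns are hypothetical.
import pandas as pd

def dedupe_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Keep the latest record per order_id, a common BAU-style cleanup step."""
    return (
        df.sort_values("updated_at")
        .drop_duplicates(subset="order_id", keep="last")
        .reset_index(drop=True)
    )

def test_dedupe_orders_keeps_latest():
    df = pd.DataFrame(
        {
            "order_id": [1, 1, 2],
            "updated_at": ["2025-05-01", "2025-05-02", "2025-05-01"],
            "status": ["new", "shipped", "new"],
        }
    )
    out = dedupe_orders(df)
    assert len(out) == 2
    assert out.loc[out.order_id == 1, "status"].item() == "shipped"
```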
Education & Certifications:
• Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field.
Certifications:
• Databricks Certified Data Engineer
• Azure Data Engineer Associate