Senior Data Engineer - Local to NY, NJ, CT - W2 Contract

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in White Plains, NY, with a 12-month W2 contract at $X/hour. Requires 7+ years in ETL development, strong skills in Azure Databricks, ADF, and API integration. Bachelor's degree preferred.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
August 15, 2025
πŸ•’ - Project duration
More than 6 months
-
🏝️ - Location type
Hybrid
-
πŸ“„ - Contract type
W2 Contractor
-
πŸ”’ - Security clearance
Unknown
-
πŸ“ - Location detailed
White Plains, NY
-
🧠 - Skills detailed
#DevOps #Automation #Azure DevOps #Data Mapping #Data Reconciliation #Azure Data Factory #Big Data #AI (Artificial Intelligence) #Documentation #GitHub #Data Modeling #API (Application Programming Interface) #Monitoring #Data Pipeline #Spark (Apache Spark) #AWS (Amazon Web Services) #Computer Science #Complex Queries #Cybersecurity #Cloud #Data Integration #Databricks #PySpark #Python #Compliance #Data Governance #Consulting #Data Analysis #Business Analysis #Scripting #Scala #Security #Data Processing #Data Engineering #Azure Databricks #Data Quality #Data Extraction #REST (Representational State Transfer) #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #ADF (Azure Data Factory) #Azure
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Cogent Infotech Corp, is seeking the following. Apply via Dice today!

Job Title: Senior Data Engineer
Location: White Plains, NY
Duration: 12 months
Hours: 37.5 hrs/week
Work arrangement: Hybrid, onsite 3 days per week
Contract type: W2 only

Project Overview
Responsible for managing critical Business-As-Usual (BAU) services that support enterprise-wide data operations. These services include the development, maintenance, and monitoring of data pipelines, integrations, and reporting infrastructure essential for ongoing business functions. Key responsibilities include:
1) Maintaining and troubleshooting ETL workflows (Pentaho, Databricks, ADF)
2) Supporting daily data loads and ensuring data availability for business reporting
3) Responding to ad-hoc requests from business users
4) Coordinating with DBAs and application teams for incident resolution
5) Performing enhancements to support evolving business data needs
These BAU services are essential for keeping business operations running smoothly and delivering timely insights across multiple departments.

Job Functions and Responsibilities

ETL & Data Integration:
• Design, develop, and optimize ETL pipelines using Azure Databricks, ADF, and Pentaho to support enterprise data workflows.
• Implement and maintain data movement, transformation, and integration across multiple systems.
• Ensure seamless data exchange between cloud, on-prem, and hybrid environments.
• Work with Globalscape FTP for secure file transfers and automation.

API Development and Integration (a brief illustrative sketch follows the responsibilities list):
• Develop, consume, and integrate RESTful and SOAP APIs to facilitate data exchange.
• Work with API gateways and authentication methods such as OAuth, JWT, certificates, and API keys.
• Implement and optimize API-based data extractions and real-time data integrations.

Data Quality & Governance:
• Implement data validation, cleansing, and enrichment techniques.
• Develop and execute data reconciliation processes to ensure accuracy and completeness.
• Adhere to data governance policies and security compliance standards.

BAU Support & Performance Optimization:
• Troubleshoot and resolve ETL failures, data load issues, and performance bottlenecks.
• Optimize SQL stored procedures and complex queries for better performance.
• Support ongoing enhancements and provide operational support for existing data pipelines.

Collaboration & Documentation:
• Work closely with Data Analysts, Business Analysts, and stakeholders to understand data needs.
• Document ETL processes, data mappings, and workflows for maintainability and knowledge sharing.
• Provide guidance and best practices to ensure scalability and efficiency of data solutions.
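The API-integration duties above typically come down to authenticating against a gateway and pulling data for downstream pipelines. The sketch below is illustrative only and not part of the posting: it assumes a hypothetical OAuth 2.0 client-credentials endpoint and a paginated JSON API (TOKEN_URL, API_URL, and the updated_since/page parameters are invented for the example) and uses Python's requests library.

```python
import requests

# Hypothetical endpoints for illustration only -- not from the job posting.
TOKEN_URL = "https://auth.example.com/oauth2/token"
API_URL = "https://api.example.com/v1/records"


def get_access_token(client_id: str, client_secret: str) -> str:
    """Fetch an OAuth 2.0 bearer token using the client-credentials grant."""
    resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def extract_records(token: str, updated_since: str) -> list:
    """Page through a REST endpoint and collect records for downstream ETL."""
    records, page = [], 1
    while True:
        resp = requests.get(
            API_URL,
            headers={"Authorization": f"Bearer {token}"},
            params={"updated_since": updated_since, "page": page},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json().get("items", [])
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records


if __name__ == "__main__":
    token = get_access_token("my-client-id", "my-client-secret")
    rows = extract_records(token, updated_since="2025-01-01")
    print(f"Extracted {len(rows)} records")
```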
Skills Required

Required Skills & Experience:
• 7+ years of experience in ETL development, data integration, and SQL scripting.
• Strong expertise in Azure Databricks, ADF (Azure Data Factory), and Pentaho.
• Experience handling secure file transfers using Globalscape FTP.
• Hands-on experience developing and consuming APIs (REST/SOAP).
• Experience working with API security protocols (OAuth, JWT, API keys, etc.).
• Proficiency in SQL, stored procedures, performance tuning, and query optimization.
• Understanding of data modeling, data warehousing, and data governance best practices.
• Hands-on experience with cloud-based data platforms (Azure/AWS) is a plus.
• Strong problem-solving skills, troubleshooting abilities, and the ability to work independently.
• Excellent communication skills and the ability to work in a fast-paced environment.

Preferred Qualifications:
• Experience working on large-scale enterprise data integration projects.
• Knowledge of Python and PySpark for big data processing.
• Familiarity with CI/CD for data pipelines (Azure DevOps, GitHub Actions, etc.).

Education and Certifications
• Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field.
Nice-to-have certifications:
a) Databricks Certified Data Engineer
b) Azure Data Engineer Associate

Why Choose Cogent?
Cogent Infotech stands at the forefront of technology consulting and is recognized globally for its award-winning services. With our headquarters in Pittsburgh, PA, USA, we specialize in guiding enterprises through digital transformation, leveraging the power of emerging technologies such as Cloud Computing, Cybersecurity, Data Analytics, and AI. Our mission is to provide innovative workforce solutions that address the complex challenges faced by today's businesses. As an ISO-certified firm appraised at CMMI Level 3, our reputation for excellence is well-established. We are proud to collaborate with over 70 Fortune 500 companies and more than 150 Federal and State agencies, delivering cutting-edge technology solutions that drive success.

Cogent is an equal opportunity employer. Cogent will not discriminate against applicants or employees based on race, color, religion, national origin, age, sex, pregnancy (including childbirth or related medical conditions), genetic information, sexual orientation, gender identity, military status, citizenship, or any other class protected by applicable law.