Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
September 9, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#SQL (Structured Query Language) #Synapse #Base #ADaM (Analysis Data Model) #Azure #Data Lake #Documentation #Automation #Data Management #Data Warehouse #Scala #Azure Data Factory #Replication #Data Modeling #SaaS (Software as a Service) #Python #Datasets #Azure cloud #Cloud #Database Performance #BI (Business Intelligence) #Data Pipeline #Data Quality #Programming #Microsoft Azure #API (Application Programming Interface) #Data Backup #ETL (Extract, Transform, Load) #Data Integration #Schema Design #Data Analysis #Data Access #Computer Science #Talend #ADF (Azure Data Factory) #Monitoring #Data Mining #Data Engineering #Security #Microservices #Informatica
Role description
Description: NOTE: Must be a US Citizen or Green Card Holder. No sponsorship or C2C at this time. You must also live within 4 hours of Philadelphia and be willing to travel quarterly to the client's offices in Pennsylvania for important meetings. No exceptions.

Our client, a stable company with offices in Pennsylvania, is hiring a Senior Data Engineer to join their team. This is a full-time, salaried, remote position (must be in the EST time zone) with a competitive base salary and a comprehensive benefits package.

As a Senior Data Engineer, you will be responsible for integrating and analyzing raw data, developing and maintaining data pipelines and datasets, and improving data quality and efficiency. This role requires strong analytical skills and involves applying a variety of methods to integrate disparate data sources and transform raw data into useful business drivers. You should be well versed in multiple data stores, integration tools and technologies, data transformation methods, data modeling, and analysis. The candidate should be able to work across projects involving multiple stakeholders and have strong technical and programming skills, collaborating with business teams, architects, and partners to resolve complex system issues.

Responsibilities:
• Develops and maintains scalable data pipelines and integration tools, and builds out new API integrations to support continuing increases in data usage, volume, and complexity.
• Builds analytical tools that utilize the data pipeline, providing actionable insight into key business performance metrics, including operational efficiency and customer acquisition.
• Designs data integrations and a data quality framework, and integrates them with monitoring services. Strong hands-on capabilities across integration toolsets, including but not limited to ADF, replication, CDC, etc.
• Performs the data analysis required to troubleshoot data-related issues and assists in their resolution.
• Identifies, designs, and implements internal process improvements, including re-designing infrastructure for greater scalability; database, SQL, and data pipeline performance tuning; optimizing data delivery; and automating manual processes.
• Builds solution prototypes as needed and writes algorithms against data.
• Implements processes and systems to monitor data quality, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it.
• Strong programming and SQL skills across structured and unstructured data. Well versed in most Azure services, including ADF, Azure Functions, ASB, API, Synapse, and SQL.
• Collaborates with analytics and business teams to improve the data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
• Works closely with a team of frontend and backend engineers, product managers, and analysts.
• Data management, modeling, architecture, and integration.
• ETL and ELT with Data Lake and Data Warehouse data platforms.
• Should be a self-starter with a strong execution and delivery mindset.
• Monitors and manages database performance.
• Excellent knowledge of data backup, recovery, security, integrity, and SQL.
• Hands-on experience with database standards and end-user applications.
• Develops and maintains database documentation, including data standards, procedures, and definitions for data elements and tables across the company.
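The data-quality monitoring responsibility above can be sketched in a few lines. This is a minimal, hypothetical example (the column names and rules are illustrative, not from the client): it flags records with missing or empty required fields, the kind of check a pipeline would run before publishing production data.

```python
# Minimal data-quality check sketch. REQUIRED_COLUMNS and the sample
# records below are hypothetical examples, not the client's schema.

REQUIRED_COLUMNS = {"customer_id", "order_date", "amount"}

def check_row(row: dict) -> list[str]:
    """Return a list of data-quality issues found in a single record."""
    issues = []
    missing = REQUIRED_COLUMNS - row.keys()
    if missing:
        issues.append(f"missing columns: {sorted(missing)}")
    for col in REQUIRED_COLUMNS & row.keys():
        if row[col] in (None, ""):
            issues.append(f"empty value in '{col}'")
    return issues

def check_dataset(rows: list[dict]) -> dict[int, list[str]]:
    """Map row index -> issues, keeping only rows that failed a check."""
    return {i: issues for i, row in enumerate(rows)
            if (issues := check_row(row))}

if __name__ == "__main__":
    sample = [
        {"customer_id": "C1", "order_date": "2025-09-09", "amount": 10.0},
        {"customer_id": "", "order_date": "2025-09-09", "amount": 5.0},
        {"customer_id": "C3", "amount": 7.5},
    ]
    print(check_dataset(sample))
```

In a real pipeline this logic would typically live in an ADF or Azure Functions step wired to a monitoring service, as the responsibilities describe, rather than a standalone script.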
Experience and Competencies:
• At least 7 years of experience in technology, specifically in the data field.
• 7+ years of hands-on experience with SQL, data integration, data analysis, data modeling, and design.
• 4+ years of Python or Node.js.
• 4+ years of experience with schema design and dimensional data modeling.
• Experience across technology platforms including Microsoft Azure, cloud computing, Software as a Service (SaaS), Integration Platform as a Service (iPaaS), and Infrastructure as a Service (IaaS).
• Previous experience as a data engineer or in a similar role.
• Technical expertise with data models, data mining, and segmentation techniques.
• Experience implementing solutions with a microservices architecture.
• Experience working with data warehousing and data lake solutions.
• Demonstrated ability to instrument data quality and management standards and processes.
• 4+ years working with data integration and ETL tools such as Azure Data Factory, CDC, replication, Talend, Informatica, etc.
• Strong problem-solving and analytical skills.
• Excellent communication and collaboration abilities.
• Adaptability and a willingness to learn new technologies and techniques.
• Ability to work with CI/CD pipelines and automation.
• Ability to apply multiple technical solutions to enable future-state business capabilities that, in turn, drive targeted business outcomes.
• Ability to quickly comprehend the functions and capabilities of existing, new, and emerging technologies that enable and drive new business designs and models.
• Bachelor's degree in computer science or a related field is a plus.