e&e IT Consulting Services, Inc.

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is an on-site Data Engineer contract in Philadelphia, PA, offering competitive pay. Key skills include ETL/ELT development, data modeling, cloud data platforms, and compliance with data governance standards. Experience with data pipelines, SQL, and Agile methodologies is essential.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
November 14, 2025
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Philadelphia, PA
🧠 - Skills detailed
#Data Warehouse #Security #Data Quality #GDPR (General Data Protection Regulation) #ML (Machine Learning) #JSON (JavaScript Object Notation) #Matillion #EDW (Enterprise Data Warehouse) #Leadership #Cloud #Data Processing #Data Governance #Scrum #Spark (Apache Spark) #Data Lake #SQL (Structured Query Language) #Data Analysis #Data Mart #Kafka (Apache Kafka) #Informatica #Agile #Data Privacy #Spark SQL #Compliance #Data Engineering #Data Pipeline #dbt (data build tool) #Fivetran #Data Modeling #XML (eXtensible Markup Language) #Automation #ETL (Extract, Transform, Load) #AI (Artificial Intelligence) #Python #Documentation #Talend #Complex Queries #Scala
Role description
e&e is seeking a Data Engineer for an on-site contract opportunity in Philadelphia, PA! The Data Engineer is responsible for designing, developing, and optimizing scalable data pipelines and modern data infrastructure to support operational, analytical, and AI/ML initiatives. This role partners closely with data leadership, analytics engineering, business users, product teams, and data modelers to gather requirements, improve data quality, and ensure data is delivered in a reliable, secure, and timely manner. The ideal candidate brings strong experience across ETL/ELT development, data modeling, cloud-based data platforms, and modern data engineering tools, while maintaining compliance with data governance and security standards.

Responsibilities:
• Design and develop scalable, modern data infrastructure to support enterprise-wide data and analytics initiatives.
• Build and maintain high-performance ETL/ELT pipelines and big-data processing solutions to reliably deliver curated data assets.
• Optimize data pipeline performance and automate processes to enhance speed, reliability, and cost efficiency.
• Collaborate with analytics engineering teams to capture and standardize key business metrics and KPIs.
• Develop and maintain documentation, including process flows, source-to-target mappings, error handling, and pipeline recovery procedures.
• Ensure compliance with data governance policies, security standards, and industry regulatory requirements.
• Drive continuous improvements to enhance automation, scalability, and overall data engineering best practices.
• Partner with Operations teams to implement control objectives, manage data movement, and support incident management processes.
• Perform additional data engineering duties as assigned.

Requirements:
• Deep understanding of relational, dimensional, and non-relational data modeling techniques.
• Extensive experience acquiring data through Change Data Capture (CDC), pub/sub patterns, real-time streaming (Kafka, Kinesis), APIs, and similar methods.
• Hands-on experience building data pipelines using ETL/ELT tools (Talend, Informatica, Fivetran, Matillion, dbt), Python, Spark, SQL, and similar technologies, and delivering curated data to data lakes, enterprise data warehouses, and data marts.
• Ability to analyze and interpret various data formats, including XML, JSON, AVRO, and CSV.
• Experience developing Data-as-a-Service (DaaS) APIs to deliver data products to internal or external consumers.
• Strong SQL proficiency, including writing complex queries for data analysis, profiling, validation, and reconciliation.
• Familiarity with data privacy and compliance regulations such as GDPR, CCPA, and HIPAA.
• Experience working within Agile/Scrum teams.
• Excellent analytical and problem-solving skills with a proactive approach to identifying and resolving issues.
• Commitment to staying current with emerging data engineering technologies, cloud platforms, and best practices.