

PeopleCaddie
Data Engineer - Manager
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer - Manager on a 6-12 month, fully remote contract with a pay rate of $80–$85/hour (C2C). It requires 8+ years of data engineering experience and strong skills in Python, SQL, and modern data platforms, especially Microsoft Azure.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
680
-
🗓️ - Date
December 25, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Java #Data Integration #AI (Artificial Intelligence) #Data Modeling #Computer Science #SQL Server #Databricks #Data Warehouse #Informatica Cloud #Automation #SQL (Structured Query Language) #Strategy #Data Engineering #C# #Datasets #Data Strategy #Indexing #ML (Machine Learning) #MDM (Master Data Management) #Informatica #Data Cleansing #Programming #Microsoft Power BI #Cloud #Snowflake #Data Science #Synapse #Data Quality #Data Pipeline #ETL (Extract, Transform, Load) #ADF (Azure Data Factory) #Scala #Azure Data Factory #Microsoft Azure #BI (Business Intelligence) #Data Management #Leadership #Azure #Python #Data Accuracy #Databases
Role description
Job Description
Title: Senior Data Engineer – Contract
Client: Large Public Accounting Firm
Engagement: 6-12 Months+ Contract
Work Model: Fully Remote
Rate: $80–$85/hour (C2C)
Role Overview
The Senior Data Engineer plays a critical role in advancing a modern, enterprise-wide data strategy focused on enabling data-driven decision-making, advanced analytics, and AI-powered insights. This role supports a centralized Data & Analytics function aligned to core technology pillars including Application Modernization, AI, and Data.
You will be responsible for designing, building, optimizing, and maintaining scalable data platforms and pipelines that support analytics, reporting, AI/ML, and operational intelligence across the organization. This is a senior, hands-on engineering role requiring deep technical expertise, strong collaboration skills, and the ability to translate complex data challenges into reliable, high-value solutions.
Key Responsibilities
• Design, develop, and maintain scalable and resilient data pipelines for ingesting, transforming, and delivering data from diverse internal and external sources.
• Integrate data across databases, data warehouses, APIs, and third-party platforms while ensuring data accuracy, consistency, and integrity.
• Apply data cleansing, validation, aggregation, enrichment, and transformation techniques to prepare analytics-ready datasets.
• Optimize data pipelines and processing workflows for performance, scalability, reliability, and cost efficiency.
• Monitor and tune data systems; identify performance bottlenecks and implement indexing, caching, and optimization strategies.
• Embed data quality checks, validation rules, and governance controls directly within data pipelines (see the illustrative sketch after this list).
• Collaborate with architects, data scientists, AI engineers, and analysts to support advanced analytics, business intelligence, and AI/ML use cases.
• Take ownership and accountability for maximizing the value of enterprise data assets used for insights, automation, and decision support.
• Clearly communicate complex technical concepts to both technical and non-technical stakeholders, including senior leadership.
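To make the pipeline and data-quality responsibilities above concrete, here is a minimal sketch in Python (one of the posting's required languages). It assumes a hypothetical raw billing extract with client_id, engagement_date, and billed_hours columns and a simple CSV-in / Parquet-out flow; it is illustrative only, not the client's actual pipeline or schema.

```python
# Minimal, hypothetical sketch of a pipeline step with embedded data quality
# checks -- illustrative only, not the client's actual pipeline or schema.
import pandas as pd

# Hypothetical columns for a raw billing extract.
REQUIRED_COLUMNS = {"client_id", "engagement_date", "billed_hours"}

def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Embed validation rules directly in the pipeline step."""
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"Extract is missing required columns: {missing}")
    negative = df["billed_hours"] < 0
    if negative.any():
        raise ValueError(f"{int(negative.sum())} rows have negative billed_hours")
    return df

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Cleanse and aggregate into an analytics-ready dataset."""
    df = df.dropna(subset=["client_id"]).copy()
    df["engagement_date"] = pd.to_datetime(df["engagement_date"], errors="coerce")
    df = df.dropna(subset=["engagement_date"])
    return (
        df.groupby(["client_id", df["engagement_date"].dt.to_period("M")])
          ["billed_hours"]
          .sum()
          .reset_index(name="monthly_billed_hours")
    )

def run(source_csv: str, target_parquet: str) -> None:
    raw = pd.read_csv(source_csv)           # ingest
    clean = transform(validate(raw))        # validate, cleanse, aggregate
    clean.to_parquet(target_parquet)        # deliver analytics-ready output
```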
Required Experience & Qualifications
Bachelor’s degree in Computer Science, Data Science, Software Engineering, Information Systems, or a related quantitative field.
8+ years of experience in data engineering, including:
• Data modeling and architecture
• ETL / ELT and data integration
• Data warehousing and analytics platforms
• Data quality, master data management, and governance
• Business intelligence and advanced analytics (predictive and prescriptive)
Strong programming experience with Python, SQL, Java, and/or C#.
Hands-on experience with modern data platforms and tools, including:
• Microsoft Azure technologies (SQL Server IaaS/PaaS, Synapse, Cosmos DB, Azure Data Factory, Databricks, HDInsight, Fabric, Power BI)
• Informatica Cloud (CIH, DIH, CDGC, Master Data Management, Data Quality)
• Snowflake and other leading cloud data technologies
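As a hedged illustration of the Snowflake tooling listed above, the sketch below uses the Snowflake Python connector (snowflake-connector-python with its pandas helpers) to land the analytics-ready DataFrame from the earlier sketch in a reporting table. The account, credentials, warehouse, database, schema, and table names are placeholders, it assumes a recent connector version that supports auto_create_table, and an equivalent flow could just as well be built with Azure Data Factory, Databricks, or Synapse pipelines.

```python
# Hypothetical example of landing the analytics-ready output in Snowflake
# using the Snowflake Python connector; account, credentials, warehouse,
# database, schema, and table names are placeholders.
import os
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

def load_to_snowflake(df: pd.DataFrame) -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",   # placeholder warehouse
        database="ANALYTICS",       # placeholder database
        schema="REPORTING",         # placeholder schema
    )
    try:
        # Bulk-load the DataFrame; returns a success flag and row count, among others.
        success, _, nrows, _ = write_pandas(
            conn, df, table_name="MONTHLY_BILLED_HOURS", auto_create_table=True
        )
        print(f"loaded={success} rows={nrows}")
    finally:
        conn.close()
```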
#PCIT






