

PeopleCaddie
Data Engineer - Manager
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer – Contract, lasting 6-12 months, with a pay rate of $80–$85/hour. Key skills include 8+ years in data engineering, proficiency in Python, SQL, and Azure technologies, and experience with data modeling and ETL processes.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
680
-
🗓️ - Date
May 5, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Microsoft Azure #AI (Artificial Intelligence) #ETL (Extract, Transform, Load) #Cloud #Azure #Data Science #ADF (Azure Data Factory) #Computer Science #Data Engineering #Databricks #Data Pipeline #Databases #Indexing #C# #Data Management #Data Warehouse #Strategy #Data Integration #Informatica Cloud #Data Quality #Azure Data Factory #Python #Automation #Scala #Programming #Data Strategy #MDM (Master Data Management) #Informatica #Synapse #Data Cleansing #ML (Machine Learning) #Datasets #Data Accuracy #SQL Server #Snowflake #Java #Data Modeling #Leadership #SQL (Structured Query Language) #Microsoft Power BI #BI (Business Intelligence)
Role description
Job Description
Title: Senior Data Engineer – Contract
Client: Large Public Accounting Firm
Engagement: 6-12 Months+ Contract
Work Model: Fully Remote
Rate: $80–$85/hour (C2C)
Role Overview
The Senior Data Engineer plays a critical role in advancing a modern, enterprise-wide data strategy focused on enabling data-driven decision-making, advanced analytics, and AI-powered insights. This role supports a centralized Data & Analytics function aligned to core technology pillars including Application Modernization, AI, and Data.
You will be responsible for designing, building, optimizing, and maintaining scalable data platforms and pipelines that support analytics, reporting, AI/ML, and operational intelligence across the organization. This is a senior, hands-on engineering role requiring deep technical expertise, strong collaboration skills, and the ability to translate complex data challenges into reliable, high-value solutions.
Key Responsibilities
• Design, develop, and maintain scalable and resilient data pipelines for ingesting, transforming, and delivering data from diverse internal and external sources.
• Integrate data across databases, data warehouses, APIs, and third-party platforms while ensuring data accuracy, consistency, and integrity.
• Apply data cleansing, validation, aggregation, enrichment, and transformation techniques to prepare analytics-ready datasets.
• Optimize data pipelines and processing workflows for performance, scalability, reliability, and cost efficiency.
• Monitor and tune data systems; identify performance bottlenecks and implement indexing, caching, and optimization strategies.
• Embed data quality checks, validation rules, and governance controls directly within data pipelines.
• Collaborate with architects, data scientists, AI engineers, and analysts to support advanced analytics, business intelligence, and AI/ML use cases.
• Take ownership and accountability for maximizing the value of enterprise data assets used for insights, automation, and decision support.
• Clearly communicate complex technical concepts to both technical and non-technical stakeholders, including senior leadership.
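For candidates gauging fit, the "embed data quality checks, validation rules, and governance controls directly within data pipelines" responsibility might look like the following minimal Python sketch. This is illustrative only and not part of the client's stack; the function and field names (`validate_records`, `id`, `amount`) are hypothetical assumptions.

```python
def validate_records(records):
    """Split records into valid rows and rejects, applying basic
    quality rules inline as a pipeline step would.

    All field names here are illustrative, not from the posting.
    """
    valid, rejected = [], []
    for rec in records:
        # Rule 1: required fields must be present and non-empty.
        if not rec.get("id") or not rec.get("amount"):
            rejected.append((rec, "missing required field"))
            continue
        # Rule 2: amount must parse as a non-negative number.
        try:
            if float(rec["amount"]) < 0:
                rejected.append((rec, "negative amount"))
                continue
        except (TypeError, ValueError):
            rejected.append((rec, "non-numeric amount"))
            continue
        valid.append(rec)
    return valid, rejected
```

In a production pipeline the rejected rows would typically be routed to a quarantine table with their rejection reason, so data stewards can remediate them without blocking the main flow.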
Required Experience & Qualifications
Bachelor’s degree in Computer Science, Data Science, Software Engineering, Information Systems, or a related quantitative field.
8+ years of experience in data engineering, including:
• Data modeling and architecture
• ETL / ELT and data integration
• Data warehousing and analytics platforms
• Data quality, master data management, and governance
• Business intelligence and advanced analytics (predictive and prescriptive)
Strong programming experience with Python, SQL, Java, and/or C#.
Hands-on experience with modern data platforms and tools, including:
• Microsoft Azure technologies (SQL Server IaaS/PaaS, Synapse, Cosmos DB, Azure Data Factory, Databricks, HDInsight, Fabric, Power BI)
• Informatica Cloud (CIH, DIH, CDGC, Master Data Management, Data Quality)
• Snowflake and other leading cloud data technologies
#PCIT