Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with a 6-month contract-to-hire in Boston, MA. Key skills include SQL, Python, Azure Data Factory, and data modeling. Candidates must have 5+ years of experience and expertise in cloud technologies.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
September 26, 2025
πŸ•’ - Project duration
More than 6 months
-
🏝️ - Location type
Hybrid
-
πŸ“„ - Contract type
Unknown
-
πŸ”’ - Security clearance
Unknown
-
πŸ“ - Location detailed
Boston, MA
-
🧠 - Skills detailed
#Security #Visualization #Azure Data Factory #Microsoft Azure #Microsoft SQL Server #ADF (Azure Data Factory) #Data Quality #Spark (Apache Spark) #Data Lakehouse #Synapse #Data Processing #Data Management #ETL (Extract, Transform, Load) #Scripting #Cloud #Data Lake #SQL Server #Automation #Azure #SQL (Structured Query Language) #Python #Data Warehouse #Microsoft SQL #Leadership #MS SQL (Microsoft SQL Server) #Scala #Azure cloud #Data Architecture #Data Engineering #Data Modeling #Shell Scripting
Role description
Senior Data Engineer (Contract-to-Hire)
This role is with a DeWinter Legal Partner. Boston, MA, hybrid: we are targeting local candidates who can be in the Boston office 3 days per week. 6-month contract-to-hire.

About the Opportunity
Join a prominent leader in the legal industry for a high-impact 6-month contract-to-perm engagement. This is a crucial role focused on transforming the organization's core data capabilities, intended for a Senior Data Engineer ready to drive significant, measurable value from day one. Our client is addressing a clear need for modernization and efficiency, specifically targeting a skills gap in advanced data visualization and rapid project delivery. We are looking for a highly autonomous, dedicated professional capable of hitting the ground running to tackle these challenges head-on. The successful candidate will be the primary driver behind enhancing data scalability, security, and accessibility, ultimately accelerating turnaround times for insights across the entire organization. Given the client's preference for assessing candidate fit, this role is explicitly structured as a contract-to-perm opportunity.

Key Responsibilities & Deliverables
Your focus will be on the successful execution of specific, high-value deliverables that directly impact business operations:
• Data Platform Modernization & Management: Manage, evaluate, and enhance the firm's hybrid data platforms, spanning both on-premises Microsoft SQL Server and the Microsoft Azure cloud. A major deliverable will be identifying and implementing emerging cloud-based solutions (Azure/Fabric preferred) to drastically improve overall scalability and security.
• Data Lakehouse Design: Design and implement a robust, cloud-based Data Lake/Lakehouse architecture. Monitor and optimize data engineering processes to boost data processing efficiency, data quality, and overall system performance.
• Process Optimization and Governance: Establish, document, and optimize standardized procedures for data management, integration, quality management, and lifecycle management. This work ensures data consistency, accuracy, and timeliness, applying advanced PowerShell scripting for key automation tasks.
• Stakeholder Collaboration & Leadership: Act as a key technical contact for the Data Services Team, providing expert guidance to IT teams, business units, and senior management. You must translate complex technical solutions into clear, concise business communication, driving successful engagement and fostering a data-driven culture across the firm.

Required Skills & Experience
The ideal candidate will have a proven track record of successful engagements and the following deep technical expertise:
• Experience & Technology: 5+ years of experience as a Database Architect or in a highly relevant senior data role, with 2–3 years of hands-on experience in cloud data technologies.
• Data Expertise: Expertise in data modeling from a warehouse perspective is mandatory (in lieu of pure Data Architect experience). Deep technical proficiency in SQL, Python, and the Microsoft Azure ecosystem is required.
• Cloud Proficiency: Advanced, hands-on experience with Azure Data Factory and the Azure Spark platform is necessary, as is expertise with the Azure data warehouse platform. While Fabric is the preferred future direction, candidates with deep, production-level expertise in Azure Synapse will be strongly considered.