DTI (Diversified Technology Inc.)

Data Pipeline Engineer - Remote - W2

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Pipeline Engineer on a 7+ month remote contract at a pay rate of $65.00/hr. Key skills include advanced Databricks and Azure SQL Server; healthcare data integration experience is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
520
🗓️ - Date
December 17, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#SQL Server #Data Pipeline #Databricks #Data Modeling #Data Processing #Azure #SQL (Structured Query Language) #Data Governance #Data Security #Storage #Data Architecture #Data Engineering #Docker #Big Data #Security #Normalization #Python #Automation #PySpark #ETL (Extract, Transform, Load) #Consulting #Data Integration #Data Storage #CRM (Customer Relationship Management) #Spark (Apache Spark) #Data Quality #Kubernetes #Azure SQL #Scala
Role description
WHO WE ARE: Founded in 2007, DTI (Diversified Technology, Inc.) is a successful African American-owned IT consulting/staffing firm based in Chicago's Loop.

WHAT WE DO: We focus on providing delivery, staffing, and supported services such as enterprise integrations/implementations including, but not limited to, CRM, EAM, ERP, PMO, and QA. We serve clients in SLED (state/local government and education), financial services, Fortune-ranked companies, and public utilities, and we regularly partner with Big 4 SI firms.

Are you a Data Engineer looking for your next contract? If so, we want to speak to you! DTI has an immediate need for a Data Engineer on a 7+ month contract. Candidates must work on our W2.

Job Title: Data Pipeline Engineer (3 roles)
Location: Remote
Duration: 7+ month contract (01/01/2026 to 07/31/2026)

Description:
As a Data Engineer contractor, you will support the client project team by designing and implementing large-scale data solutions to meet business needs. Responsibilities include:
• Design and develop scalable big data architectures to handle large volumes of data.
• Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
• Implement data integration, data processing, and data storage solutions using big data technologies.
• Ensure data security, data quality, and data governance standards are met.
• Optimize data architectures for performance, scalability, and cost-efficiency.

Primary Skill Required for the Role: Databricks
Level Required for Primary Skill: Advanced (6-9 years of experience)

Skills:
• Databricks (Python and SQL/PySpark)
• Azure SQL Server (Managed Instance, Azure SQL DB, SQL Server VMs)
• Data modeling and ETL/ELT design patterns
• Docker and Azure Kubernetes Service (AKS) for select automation
• Performance optimization and scalability

Experience:
• 7+ years of enterprise data engineering
• Healthcare data integration experience strongly preferred
• Experience with large-scale data harmonization and normalization

Pay rate: $65.00/hr, depending on experience

Please click on the link below for our company benefits:
https://docs.google.com/document/d/1q2mHf0U1akaC1ZKC65-VyyG3FrKBGNCahx3WpS2mj7M/edit?usp=sharing

DTI is an Equal Opportunity Employer. We do not discriminate based on race, color, religion, sex, gender identity, sexual orientation, national origin, ancestry, age, disability, marital status, veteran status, or any other characteristic protected under Illinois state or federal law. All qualified applicants are encouraged to apply, and employment decisions are based solely on merit, qualifications, and business needs.