

Databricks Data Engineer (Only W2)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Databricks Data Engineer (Only W2) contract position, remote, offering a pay rate of "$XX per hour." Requires 7–12 years of data engineering experience, 3+ years with Databricks/Spark, strong SQL, and Private Equity/Private Credit industry knowledge.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
September 5, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Dimensional Data Models #AWS (Amazon Web Services) #Data Engineering #Data Integration #SQL (Structured Query Language) #Data Profiling #Data Architecture #Azure #Data Modeling #Cloud #Databricks #ETL (Extract, Transform, Load) #Scala #Spark (Apache Spark)
Role description
Job Title: Databricks Data Engineer (Only W2)
Location: Remote
Position Type: Contract
Job Description:
Responsibilities:
• Lead the design and development of scalable data solutions using Databricks.
• Apply Private Equity/Private Credit industry knowledge to ensure data models align with business requirements.
• Develop and optimize relational and dimensional data models to support reporting and analytics.
• Perform data profiling, source-to-target mapping, and ELT transformation design to enable high-quality, accurate data flows (see the sketch after this list).
• Collaborate with cross-functional teams, including business stakeholders, data architects, and engineers, to ensure consistency and scalability.
Requirements and Qualifications:
• 7–12 years of experience in data engineering or related roles, including 3+ years working with Databricks/Spark in production environments.
• Strong SQL expertise and hands-on experience with cloud platforms (AWS or Azure preferred).
• Proven background in data modeling, data integration, and ETL/ELT pipeline design.
• Excellent problem-solving and communication skills, with the ability to work in fast-paced, client-facing environments.
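For context on the Databricks/ELT work described above, here is a minimal, hypothetical PySpark sketch of a source-to-target transformation that conforms a raw feed into a dimensional Delta table. All table and column names (raw.pe_deals, analytics.dim_deals, deal_id, and so on) are illustrative assumptions, not details taken from the posting.

# Minimal sketch only; table and column names below are assumptions for illustration.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # already provided in a Databricks notebook

# Extract/Load: read a raw source table already landed in the lakehouse
raw = spark.table("raw.pe_deals")

# Transform: basic profiling-driven cleanup and conforming to a dimensional model
dim_deals = (
    raw
    .dropDuplicates(["deal_id"])                      # enforce one row per business key
    .withColumn("deal_date", F.to_date("deal_date"))  # normalize types
    .withColumn("commitment_usd", F.col("commitment_amount").cast("decimal(18,2)"))
    .filter(F.col("deal_id").isNotNull())             # reject rows failing a basic quality rule
    .select("deal_id", "fund_id", "deal_date", "commitment_usd")
)

# Write the conformed dimension as a Delta table for reporting and analytics
(dim_deals.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("analytics.dim_deals"))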
If you believe you are qualified for this position and are currently in the job market or interested in making a change, please email your resume and contact details to roshni@nytpcorp.com