Jobs via Dice

Databricks Engineer - Remote - USA - W2 - Contract

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Databricks Engineer on a W2 contract, remote in the USA, focusing on developing ETL/ELT pipelines with Databricks, Apache Spark, and PySpark. Key skills include SQL, Python, and data modeling; experience with Azure data platforms is required.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
Unknown
-
πŸ—“οΈ - Date
November 12, 2025
πŸ•’ - Duration
Unknown
-
🏝️ - Location
Remote
-
πŸ“„ - Contract
W2 Contractor
-
πŸ”’ - Security
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #SQL (Structured Query Language) #PySpark #Apache Spark #Data Wrangling #Delta Lake #Data Science #Azure #Datasets #Data Quality #Data Pipeline #Scala #ML (Machine Learning) #Databricks #Azure Data Platforms #Python #Visualization #Spark (Apache Spark) #Data Analysis #Data Modeling
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, TechVirtue LLC, is seeking the following. Apply via Dice today!

Databricks Engineer
Location: Remote - USA

We are seeking a Databricks Engineer with analytical expertise to design, develop, and optimize scalable data pipelines and analytics solutions using the Databricks platform. You will work closely with data scientists, analysts, and business stakeholders to transform raw data into actionable insights that drive strategic decision-making.

Key Responsibilities:
- Develop and maintain robust ETL/ELT pipelines using Apache Spark and PySpark on Databricks.
- Collaborate with cross-functional teams to understand data requirements and deliver analytical solutions.
- Implement data quality checks, performance tuning, and scalable data models.
- Leverage SQL, Python, and Spark to perform advanced data analysis and reporting.
- Monitor and troubleshoot data workflows to ensure reliability and accuracy.

Required Skills:
- Proficiency in Databricks, Apache Spark, PySpark, and Azure data platforms.
- Strong analytical and problem-solving skills with experience in data wrangling and visualization.
- Expertise in SQL, Python, and data modeling techniques.
- Familiarity with Delta Lake, MLflow, and notebook-based development.
- Ability to interpret complex datasets and communicate insights effectively.
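For candidates sizing up the stack, here is a minimal, illustrative PySpark sketch of the kind of ETL step the responsibilities describe: extract a raw file, apply a data quality rule, and load the result as a Delta table. The paths, column names, and quality rule are hypothetical, and writing Delta requires the Databricks runtime or the delta-spark package.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# In a Databricks notebook a SparkSession named `spark` already exists;
# the builder below is only needed when running the sketch elsewhere.
spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read a raw CSV file (hypothetical path and schema).
raw = spark.read.option("header", True).csv("/mnt/raw/orders.csv")

# Transform: cast types and derive a proper date column.
orders = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date"))
)

# Data quality check (hypothetical rule): keep non-negative amounts,
# set aside everything else for inspection.
clean = orders.filter(F.col("amount") >= 0)
rejected = orders.subtract(clean)

# Load: write the curated data as a Delta table (Delta is the default
# format on Databricks; elsewhere it needs the delta-spark package).
clean.write.format("delta").mode("overwrite").save("/mnt/curated/orders")

print(f"loaded {clean.count()} rows, rejected {rejected.count()}")
```

On Databricks itself the `spark` session is pre-created in notebooks, and rejected-row counts like the one above would typically feed monitoring or alerting rather than a `print`.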