Youth Power Technosoft LLC

Databricks

⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a W2, on-site Databricks Data Engineer position for professionals with 5+ years of experience. Pay rate is competitive. Key skills include PySpark, SQL, and cloud platform integration (AWS, Azure, GCP). Databricks certification preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
October 3, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Plano, TX
-
🧠 - Skills detailed
#Security #Compliance #dbt (data build tool) #Code Reviews #"ETL (Extract, Transform, Load)" #Azure #Data Governance #Delta Lake #DevOps #Data Engineering #S3 (Amazon Simple Storage Service) #ADLS (Azure Data Lake Storage) #Spark SQL #Airflow #Data Quality #Data Lake #MLflow #Scala #BigQuery #GIT #GCP (Google Cloud Platform) #SQL (Structured Query Language) #Computer Science #Distributed Computing #Databricks #Cloud #AWS (Amazon Web Services) #Data Lakehouse #PySpark #Spark (Apache Spark) #Data Science
Role description
W2 role, on-site.

Key Responsibilities:
• Build and maintain robust ETL/ELT pipelines using PySpark and Databricks notebooks
• Design and implement scalable data solutions leveraging Delta Lake, Unity Catalog, and Lakehouse architecture
• Optimize Spark jobs for performance and cost efficiency
• Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality data products
• Integrate Databricks with cloud platforms (AWS, Azure, or GCP) and external data sources
• Ensure data quality, governance, and security compliance across pipelines
• Participate in code reviews, architecture discussions, and platform enhancements

Required Skills:
• 5+ years of experience in data engineering, with 2+ years on Databricks
• Proficiency in PySpark, SQL, and Databricks notebooks
• Strong understanding of Delta Lake, data lakehouse principles, and distributed computing
• Experience with cloud platforms (AWS, Azure, or GCP) and associated services (e.g., S3, ADLS, BigQuery)
• Familiarity with CI/CD pipelines, Git, and DevOps practices
• Knowledge of data governance tools (Unity Catalog, Purview, etc.)
• Excellent problem-solving and communication skills

Preferred Qualifications:
• Databricks Certified Data Engineer Associate or Professional
• Experience with MLflow, Auto Loader, and structured streaming
• Exposure to Airflow, dbt, or other orchestration tools

Education:
• Bachelor's or Master's degree in Computer Science, Data Engineering, or related field