

Jobs via Dice
Contract W2 Databricks Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Contract W2 Databricks Engineer in Adelphi, MD, lasting 4 months with possible extensions. Key skills include Databricks, ETL/ELT, Python/Spark, and data modeling. A Bachelor's degree and Databricks/Azure certifications are preferred. Hybrid work is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 13, 2026
-
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Adelphi, MD
-
🧠 - Skills detailed
#Data Processing #Scala #Python #Data Engineering #Computer Science #Agile #Data Accuracy #Documentation #Spark (Apache Spark) #Monitoring #Security #Data Science #Data Modeling #Databricks #SQL (Structured Query Language) #Data Pipeline #AI (Artificial Intelligence) #NLP (Natural Language Processing) #Data Ingestion #Dimensional Data Models #Data Quality #Compliance #Data Lake #ETL (Extract, Transform, Load) #Data Cleansing #Data Lakehouse #Base #Data Governance #SQL Queries #Azure
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Logicplanet, Inc., is seeking the following. Apply via Dice today!
Databricks Engineer
Location: Adelphi, MD
Duration: 4 months base; possible extensions
Interview: 2 rounds
NOTES:
Preference for candidates able to work hybrid, 1-2 days a week, in the client's Adelphi, MD office.
Contract valid until June 30, 2026 (possible extensions thereafter)
JOB DESCRIPTION:
The Person
You are a detail-oriented and analytical problem solver who enjoys tackling complex data challenges. You communicate clearly and collaborate effectively with cross-functional teams to deliver meaningful, data-driven solutions. You are adaptable, service-oriented, curious, and passionate about using modern data technologies to unlock insights. Highly organized and proactive, you can manage multiple priorities while maintaining a strong focus on quality, scalability, and innovation.
Key Responsibilities
• Implement and optimize data models and structures within Databricks to support efficient querying, analytics, and reporting.
• Design, develop, and maintain scalable data pipelines and ETL/ELT workflows, with a strong emphasis on dimensional data modeling and data quality (a minimal sketch follows this list).
• Partner with engineering teams and business stakeholders to gather requirements and deliver reliable analytics solutions.
• Develop, optimize, and maintain SQL queries, notebooks, and scripts for data ingestion, transformation, and processing.
• Ensure data accuracy, consistency, and integrity through validation, monitoring, and data cleansing processes.
• Create and maintain comprehensive documentation for data pipelines, models, and analytics solutions.
• Monitor, troubleshoot, and optimize data pipelines and analytics workloads to ensure performance and reliability.
• Stay current with industry trends, including AI-driven analytics tools, semantic modeling, and emerging data engineering best practices.
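To make the pipeline and data-quality responsibilities above concrete, here is a minimal PySpark sketch of the kind of Databricks ETL step described: ingest, cleanse, validate, and load into a Delta table. All paths, table names, columns, and the quality threshold are hypothetical illustrations, not details of the client's environment.

```python
# Minimal ETL sketch for a Databricks-style pipeline. Every name below
# (paths, tables, columns, threshold) is a hypothetical example.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # Databricks provides `spark` automatically

# Ingest: read raw records from a (hypothetical) landing zone.
raw = spark.read.json("/mnt/landing/orders/")

# Transform: standardize types and cleanse obviously bad rows.
clean = (
    raw
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("order_id").isNotNull() & (F.col("amount") >= 0))
)

# Validate: a simple data-quality gate before loading downstream.
total, valid = raw.count(), clean.count()
if total and valid / total < 0.99:  # illustrative threshold only
    raise ValueError(f"Data-quality gate failed: {valid}/{total} rows valid")

# Load: append into a Delta table for analytics and reporting.
clean.write.format("delta").mode("append").saveAsTable("analytics.orders_clean")
```

In practice a step like this would typically be scheduled as a Databricks job and paired with monitoring, but the shape above (ingest, cleanse, validate, load) matches the responsibilities listed.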
Preferred Experience
• Hands-on experience implementing and operating solutions in Databricks.
• Strong understanding of ETL/ELT architectures and data ingestion patterns.
• Proficiency in Python and/or Spark for large-scale data processing.
• Experience designing and implementing dimensional data models in data lakehouse environments (see the star-schema sketch after this list).
• Familiarity with AI-driven analytics platforms, semantic modeling concepts, and NLP techniques.
• Experience working in SAFe Agile or other scaled Agile environments.
• Solid understanding of data governance, security, and compliance best practices in global environments with numerous data providers and consumers.
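As a hedged illustration of the dimensional-modeling experience listed above, the following sketch builds a small star schema (one dimension, one fact) on Delta tables in a lakehouse, reusing the hypothetical cleansed table from the earlier sketch. The schema, keys, and column names are assumptions for demonstration, not the client's actual model.

```python
# Star-schema sketch on Delta tables. Table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

orders = spark.table("analytics.orders_clean")

# Dimension: one row per customer, with a surrogate key for fact joins.
dim_customer = (
    orders.select("customer_id", "customer_name")
    .dropDuplicates(["customer_id"])
    .withColumn("customer_key", F.monotonically_increasing_id())
)
dim_customer.write.format("delta").mode("overwrite").saveAsTable("analytics.dim_customer")

# Fact: grain is one row per order, keyed to the customer dimension.
fact_orders = (
    orders.join(dim_customer.select("customer_id", "customer_key"), "customer_id")
    .select("order_id", "customer_key", "order_ts", "amount")
)
fact_orders.write.format("delta").mode("overwrite").saveAsTable("analytics.fact_orders")
```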
Academic Credentials
• Bachelor's degree in Computer Science, Data Science, Engineering, or a related field preferred.
• Master's degree is a plus.
• Databricks and Azure certifications strongly preferred.





