Technogen, Inc.

Azure Developer-Databricks

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer specializing in Databricks and Python, with a focus on medallion architecture. Contract length and pay rate are unspecified, and remote work is allowed. Requires 5+ years of experience, SQL proficiency, and a degree in Computer Science or a related field.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
December 9, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Data Pipeline #Scala #Python #Data Lake #Data Modeling #Cybersecurity #Compliance #Data Lakehouse #SQL (Structured Query Language) #Databricks #Data Processing #Cloud #ETL (Extract, Transform, Load) #Azure #Data Quality #Data Engineering #Computer Science #Security #Data Science #Delta Lake
Role description
About the Role:
We are seeking a highly skilled Senior Data Engineer with deep expertise in Databricks and Python, and substantial experience implementing medallion architecture. The ideal candidate will play a key role in designing and maintaining scalable, secure, and reliable data platforms that support advanced analytics and business insights. Experience in cybersecurity and/or healthcare data environments is highly preferred.

Key Responsibilities:
• Design, implement, and optimize robust data pipelines and ETL processes using Databricks and Python.
• Architect and manage data solutions leveraging the medallion architecture (Bronze, Silver, Gold layers) for scalable and efficient data processing.
• Ensure data quality, integrity, and compliance with security and privacy regulations, particularly in healthcare or cybersecurity contexts.
• Collaborate with data scientists, analysts, and cross-functional teams to deliver high-quality, secure data solutions.
• Develop and maintain data lakehouse architectures, utilizing Delta Lake and Unity Catalog.
• Monitor, troubleshoot, and optimize data workflows in production environments.
• Contribute to the development and enforcement of data engineering standards, especially with respect to secure data handling.

Required Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 5+ years of experience in data engineering roles, with a strong emphasis on Databricks and Python.
• Proven track record implementing medallion architecture in cloud-based environments.
• Advanced proficiency in SQL, data modeling, and ETL frameworks.
• Excellent problem-solving, communication, and collaboration skills.
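For candidates unfamiliar with the term, the medallion architecture mentioned above refers to refining data through Bronze (raw), Silver (cleaned), and Gold (business-level) layers. Below is a minimal pure-Python sketch of that flow; the field names and records are hypothetical, and in a real Databricks deployment each layer would typically be a Delta Lake table processed with PySpark rather than Python lists.

```python
# Hypothetical medallion-architecture sketch: Bronze -> Silver -> Gold.
# Field names ("patient_id", "reading") are illustrative assumptions only.

# Bronze layer: raw data ingested as-is, duplicates and bad values included.
bronze = [
    {"patient_id": "p1", "reading": "98.6"},
    {"patient_id": "p1", "reading": "98.6"},   # duplicate row
    {"patient_id": "p2", "reading": "bad"},    # unparsable value
    {"patient_id": "p2", "reading": "101.2"},
]

def to_silver(rows):
    """Silver layer: deduplicate and enforce types, dropping invalid rows."""
    seen, silver = set(), []
    for row in rows:
        key = (row["patient_id"], row["reading"])
        if key in seen:
            continue
        seen.add(key)
        try:
            silver.append({"patient_id": row["patient_id"],
                           "reading": float(row["reading"])})
        except ValueError:
            pass  # example data-quality rule: discard unparsable readings
    return silver

def to_gold(rows):
    """Gold layer: business-level aggregate (mean reading per patient)."""
    totals = {}
    for row in rows:
        pid = row["patient_id"]
        s, n = totals.get(pid, (0.0, 0))
        totals[pid] = (s + row["reading"], n + 1)
    return {pid: s / n for pid, (s, n) in totals.items()}

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # one averaged reading per patient
```

The key design point the layers encode: raw data is never mutated (Bronze is append-only), quality rules live in one place (the Silver transform), and consumers read only curated aggregates (Gold).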