Celebal Technologies

Solutions Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Solutions Architect in Denver, Colorado, on a contract for 1 to 3 months, offering a competitive pay rate. Key skills include Databricks, Delta Live Tables, Unity Catalog, and Microsoft Fabric, with 10+ years of relevant experience required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
December 3, 2025
🕒 - Duration
1 to 3 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Denver, CO
-
🧠 - Skills detailed
#Data Engineering #Big Data #Delta Lake #Azure #GCP (Google Cloud Platform) #Kafka (Apache Kafka) #Databricks #Automation #Data Quality #Data Pipeline #Data Architecture #Data Science #Spark (Apache Spark) #Scala #AWS (Amazon Web Services) #Security #Terraform #PySpark #Microsoft Azure #DevOps #Cloud
Role description
Celebal Technologies is looking for a Solutions Architect.

Company Profile: Celebal Technologies is a premier Microsoft Azure partner, recently funded by Norwest Venture Capital. It is a premier software services company in the fields of Data Science, Big Data, Enterprise Cloud & Automation. Established in 2016, the company has grown to a headcount of 2,200+ in this short span, helping customers achieve a competitive advantage with intelligent data solutions built on cutting-edge technology.

Job Location: Denver, Colorado
Job Type: Contract
Work Mode: Initial 2 months onsite, followed by hybrid (note: fully remote is not an option)

Job Description: Celebal Technologies is seeking a highly skilled Resident Solutions Architect (RSA) with strong expertise in Databricks, Delta Live Tables, Unity Catalog, and Microsoft Fabric. The ideal candidate will work closely with our customers to architect, implement, and optimize scalable data solutions, ensuring high performance, security, and reliability across cloud environments.

Responsibilities:
• Design, architect, and implement end-to-end data solutions on Databricks, including Delta Live Tables and Unity Catalog.
• Build scalable and reliable data pipelines, ensuring data quality, lineage, and governance.
• Work directly with customer teams to understand business requirements and translate them into technical solutions.

Required Skills & Experience:
• 10+ years of experience in data engineering, data architecture, or cloud solution design.
• Strong hands-on experience with Databricks (PySpark, Delta Lake, Delta Live Tables).
• Deep understanding of Unity Catalog, RBAC, governance, and workspace security.
• Experience implementing data solutions on Microsoft Fabric (Lakehouse, Warehouse, Data Pipelines, etc.).
• Proficiency in cloud platforms (Azure preferred; AWS/GCP a plus).
• Strong knowledge of modern data architectures: Lakehouse, Medallion Architecture, ELT frameworks.
• Experience developing and optimizing large-scale data pipelines.
• Strong problem-solving, communication, and stakeholder management skills.

Good to Have:
• Databricks certifications (Data Engineer Professional, Solution Architect).
• Experience with CI/CD, Terraform, or DevOps practices.
• Exposure to streaming technologies (Kafka, Spark Structured Streaming).
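For readers unfamiliar with the stack named above, the sketch below illustrates a minimal medallion-style Delta Live Tables pipeline in PySpark, the kind of data-quality-aware, bronze/silver/gold pipeline this role would design. It is an illustrative example only, not part of the posting: the source path, column names, and table names are hypothetical, and it assumes a Databricks DLT runtime where `dlt` and `spark` are available.

```python
# Illustrative sketch only; paths, columns, and table names are hypothetical.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Bronze: raw orders ingested incrementally from cloud storage (path is assumed).")
def orders_bronze():
    return (
        spark.readStream.format("cloudFiles")      # Auto Loader incremental ingestion
        .option("cloudFiles.format", "json")
        .load("/mnt/raw/orders")                   # hypothetical landing path
    )


@dlt.table(comment="Silver: cleaned orders with a data-quality expectation enforced.")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")   # drop rows failing the check
def orders_silver():
    return (
        dlt.read_stream("orders_bronze")
        .withColumn("order_ts", F.to_timestamp("order_ts"))     # normalize timestamp column
    )


@dlt.table(comment="Gold: daily revenue aggregate for downstream reporting.")
def orders_gold():
    return (
        dlt.read("orders_silver")
        .groupBy(F.to_date("order_ts").alias("order_date"))
        .agg(F.sum("amount").alias("daily_revenue"))
    )
```

In practice, the lineage and data-quality metrics mentioned in the responsibilities are surfaced automatically by Delta Live Tables from expectations like the one above, while governance and access control for the resulting tables would be handled in Unity Catalog.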