

Generis Tek Inc
Databricks Solution Architect (Hybrid)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Solution Architect (Hybrid) in Dallas, TX, with a contract duration of 5+ months at $70/hr. Requires expertise in Databricks, cloud platforms (AWS/Azure/GCP), data engineering, and strong communication skills. Databricks certification preferred.
Country
United States
Currency
$ USD
Day rate
560
Date
February 27, 2026
Duration
More than 6 months
Location
Hybrid
Contract
W2 Contractor
Security
Unknown
Location detailed
Dallas, TX
Skills detailed
#DevOps #Migration #GitHub #Python #Code Reviews #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Security #Data Architecture #Monitoring #Compliance #Databricks #Data Quality #MLflow #Tableau #Azure DevOps #Azure #Storage #Scala #Leadership #ML (Machine Learning) #Agile #Data Governance #Data Engineering #Data Processing #SQL (Structured Query Language) #BI (Business Intelligence) #Delta Lake #Microsoft Power BI #GCP (Google Cloud Platform) #Spark (Apache Spark) #Cloud #Kafka (Apache Kafka) #Jenkins #Looker #Data Lineage #Data Pipeline
Role description
Please Contact: To discuss this opportunity, reach out to our Talent Acquisition Specialist Abhinav Chakraborty at Abhinav.Chakraborty@generistek.com or 630-576-1925.
We have a contract role, Databricks Solution Architect (Hybrid), for our client in Dallas, TX. Please let me know if you or anyone you know would be interested in this position.
Position Details:
Databricks Solution Architect (Hybrid), Dallas, TX
Location: Dallas, TX 75201 (Hybrid)
Project Duration: 5+ months (contract-to-hire)
Pay Rate: $70/hr on W2
Role Summary:
β’ We are looking for a highly skilled Databricks Solution Architect to lead the design and implementation of scalable, enterprise-grade data platforms using Databricks. The ideal candidate will combine strong technical expertise in data engineering and cloud platforms (AWS/Azure/GCP) with architectural leadership, solution design capability, and strong stakeholder engagement skills.
Key Responsibilities:
1. Solution Architecture & Design
• Design end-to-end data architectures using the Databricks Lakehouse Platform.
β’ Architect scalable ETL/ELT pipelines, real-time streaming solutions, and advanced analytics platforms.
β’ Define data models, storage strategies, and integration patterns aligned with business and enterprise architecture standards.
β’ Provide guidance on cluster configuration, performance optimization, cost management, and workspace governance.
2. Technical Leadership
β’ Lead technical discussions and design workshops with engineering teams and business stakeholders.
β’ Provide best practices, frameworks, and reusable component designs for consistent delivery.
β’ Perform code reviews and provide technical mentoring to data engineers and developers.
3. Stakeholder & Project Engagement
β’ Collaborate with product owners, business leaders, and analytics teams to translate business requirements into scalable technical solutions.
β’ Create and present solution proposals, architectural diagrams, and implementation strategies.
β’ Support pre-sales or discovery phases with technical input when needed.
4. Data Governance, Security & Compliance
β’ Define and implement governance standards across Databricks workspaces (data lineage, cataloging, access control, etc.).
β’ Ensure compliance with regulatory and organizational security frameworks.
β’ Implement best practices for monitoring, auditing, and data quality management.
5. Continuous Improvement & Innovation
β’ Stay updated on Databricks features, roadmap, and industry trends.
β’ Recommend improvements, optimizations, and modernization opportunities across the data ecosystem.
β’ Evaluate integration of complementary technologies (Delta Live Tables, MLflow, Unity Catalog, streaming frameworks, etc.).
Required Skills & Experience
Technical Skills
β’ Databricks Expertise: Strong hands-on experience with Databricks (clusters, notebooks, Delta Lake, MLflow, Unity Catalog).
β’ Cloud Platforms: Experience with at least one cloud provider (AWS, Azure, GCP).
β’ Data Engineering: Strong proficiency in Spark, Python, SQL, and distributed data processing.
β’ Architecture: Experience designing large-scale data solutions including ingestion, transformation, storage, and analytics.
β’ Streaming: Experience with streaming technologies (Structured Streaming, Kafka, Kinesis, Event Hub).
β’ DevOps: CI/CD practices for data pipelines (Azure DevOps, GitHub Actions, Jenkins, etc.).
Soft Skills
β’ Strong communication skills with the ability to engage both technical and business teams.
β’ Experience working in Agile environments.
β’ Ability to simplify complex technical concepts for non-technical audiences.
β’ Strong analytical, problem-solving, and decision-making abilities.
Preferred Qualifications:
β’ Databricks Certified Data Engineer Professional / Architect certification.
β’ AWS/Azure/GCP cloud architect certifications.
• Experience with BI tools such as Tableau, Power BI, or Looker.
β’ Experience in machine learning workflows and ML operations.
β’ Background in large-scale data modernization or cloud migration projects.
To discuss this opportunity, reach out to our Talent Acquisition Specialist Abhinav Chakraborty at Abhinav.Chakraborty@generistek.com or 630-576-1925.






