

Method360, Inc.
SAP Databricks Solution Architect
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an SAP Databricks Solution Architect on a contract-to-hire basis, starting on 11/3/2025, with a hybrid work location in Plano, TX. Requires 12–15+ years in data architecture, SAP BDC experience, and relevant certifications. Pay rate is unspecified.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 22, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Plano, TX
-
🧠 - Skills detailed
#Data Lake #ML (Machine Learning) #GDPR (General Data Protection Regulation) #Leadership #Cloud #Apache Spark #Data Modeling #Automation #Agile #Data Migration #AI (Artificial Intelligence) #Data Management #Security #Data Processing #AWS S3 (Amazon Simple Storage Service) #Azure Data Factory #Data Integrity #Data Quality #Data Pipeline #Azure #Databricks #Data Lifecycle #Data Integration #SQL (Structured Query Language) #Scala #Spark (Apache Spark) #Python #Data Science #GCP (Google Cloud Platform) #Data Governance #Libraries #Metadata #DevOps #Snowflake #Migration #ADF (Azure Data Factory) #Compliance #SAP #Data Engineering #AWS (Amazon Web Services) #Data Privacy #Strategy #Data Architecture #S3 (Amazon Simple Storage Service) #Batch #BigQuery #Snowpark #ETL (Extract, Transform, Load) #Microsoft Azure
Role description
Job Title: SAP BDC / Databricks Solution Architect
Employment Type: Contract to Hire
Start: 11/3/2025
Workplace Type: Hybrid
Location: Plano, TX
Travel: <25%
• NO C2C
Description: As an SAP Enterprise Databricks Architect, you will lead the definition, design, and execution of scalable data architecture strategies to support enterprise-wide data migration and transformation programs. This role ensures data integrity, consistency, scalability, and alignment with the organization’s enterprise data standards, while enabling long-term data governance. You will collaborate with technical, functional, and business teams to design optimal data models, flows, and frameworks that support SAP BDC processes and accelerate digital transformation.
As a Databricks Subject Matter Expert (SME), you will be the technical authority and hands-on expert driving the design, development, and optimization of the data platform on the Databricks Unified Analytics platform. You will work closely with enterprise architects, data engineers, and SAP BDC teams to leverage Databricks’ capabilities for scalable ETL, advanced analytics, and AI/ML workloads. Your role is critical in enabling seamless data integration, processing, and transformation between SAP systems and cloud-based data lakes and warehouses.
Key Responsibilities:
• Lead the end-to-end data architecture strategy across SAP BDC initiatives, including data modeling, transformation, and governance.
• Define target and transition architectures for both SAP and non-SAP data domains, integrating with modern platforms such as Azure, AWS, Snowflake, and Databricks.
• Ensure data platform alignment with architectural principles across hybrid cloud environments, leveraging third-party tools from Microsoft, Nvidia, Google, and others.
• Design and validate data flows, lineage, and integration points between legacy systems, SAP S/4HANA, and cloud-based platforms.
• Collaborate with migration teams to drive effective ETL/ELT strategy and data quality frameworks, ensuring consistency across enterprise systems.
• Identify and mitigate data risks, including data duplication, data loss, latency, or security compliance gaps.
• Guide tool selection, integration architecture, and usage of cloud-native services (e.g., Azure Data Factory, BigQuery, Snowpark, Databricks notebooks) to support data lifecycle processes.
• Lead the design and implementation of data pipelines, notebooks, and workflows on the Databricks platform supporting SAP data migration and analytics use cases.
• Develop, optimize, and tune Spark jobs for large-scale, distributed data processing.
• Collaborate with data architects to align Databricks solutions with enterprise data governance and architecture principles.
• Enable data engineers and scientists by building reusable libraries, APIs, and data transformation frameworks using Databricks and related technologies.
• Integrate Databricks with SAP data sources and external platforms such as Snowflake, Azure Data Lake, and AWS S3.
• Establish and promote best practices around security, cost optimization, and performance on Databricks.
• Participate in tooling and automation efforts such as CI/CD pipelines for Databricks assets.
• Provide training, mentorship, and knowledge sharing to junior team members and other stakeholders.
• Stay current with Databricks and Apache Spark ecosystem updates and innovations.
• Collaborate with cross-functional teams leveraging NVIDIA GPU acceleration or AI/ML frameworks integrated with Databricks.
Skills and Attributes for Success:
• 12–15+ years of experience in enterprise data architecture, with at least 5 years in SAP BDC or related data transformation environments and 3+ years specifically on Databricks and Spark-based data platforms.
• Strong background in enterprise data architecture and SAP data management (SAP ECC, S/4HANA, MDG, BW/4HANA).
• Experience working with cloud providers and platforms: Microsoft Azure, AWS, Google Cloud, Snowflake, Nvidia RAPIDS, Databricks.
• Proficiency in data modeling (conceptual, logical, physical), data governance, and metadata management.
• Experience designing and governing large-scale data migration and transformation initiatives.
• Agile mindset and experience working in DevOps/CI-CD environments.
• Knowledge of data migration tools such as SAP BDC, LSMW, IDocs, or CPI.
• Solid understanding of data privacy and compliance standards (GDPR, HIPAA, etc.).
• Strong leadership, stakeholder management, and cross-functional collaboration skills.
• Familiarity with architecture frameworks such as TOGAF or DAMA-DMBOK.
• Strong expertise in Databricks platform and Apache Spark (Scala, Python, or SQL).
• Experience designing and developing large-scale ETL pipelines and batch/streaming data workflows.
• Familiarity with cloud ecosystems, particularly Azure, AWS, or GCP, and integrating Databricks within these environments.
• Knowledge of data architecture, governance, and security practices on cloud data platforms.
• Ability to troubleshoot and optimize Spark job performance and resource utilization.
• Understanding of data science and machine learning workflows on Databricks is a plus.
• Strong communication skills to collaborate with SAP, data architecture, and delivery teams.
Must-Have Requirements:
• Relevant certifications such as Databricks Certified Data Engineer and ML Architect
• Tangible experience working on SAP BDC projects
Please apply through our online portal with your resume and contact information. Applications will be reviewed and assessed against position requirements. Qualified candidates will be contacted by the lead recruiter within 48 hours of submittal. No phone calls, please. Method360 is proud to be an Equal Opportunity Employer.