AMISEQ

Databricks Solutions Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Databricks Solutions Architect with 8+ years in Data Engineering, 4+ years in Databricks, and strong Informatica expertise. Contract length and pay rate are unspecified. Remote work is available.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 11, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#GCP (Google Cloud Platform) #Data Modeling #Informatica #PySpark #Snowflake #Terraform #Informatica PowerCenter #AWS (Amazon Web Services) #Cloud #Metadata #S3 (Amazon Simple Storage Service) #MLflow #Data Engineering #Azure #Alation #Data Framework #Data Architecture #Data Integration #Python #Automation #Databricks #Delta Lake #Collibra #Scala #ADLS (Azure Data Lake Storage) #Spark (Apache Spark) #IICS (Informatica Intelligent Cloud Services) #Data Pipeline #BigQuery #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #GIT #Data Governance
Role description
Position Overview:
We are seeking an experienced Databricks RSA (Reference Solution Architect) with strong Informatica expertise to design, implement, and optimize modern data solutions on the Databricks Lakehouse Platform. The ideal candidate will have a solid background in data engineering, ETL design, performance tuning, and data integration, with hands-on experience in Informatica PowerCenter / IICS and Databricks. This role involves working closely with data architects, business stakeholders, and data engineering teams to design scalable, efficient, and secure data pipelines and analytics frameworks in a cloud environment (Azure/AWS/GCP).

Required Skills and Qualifications:
• 8+ years of experience in Data Engineering / Data Architecture roles.
• 4+ years of hands-on experience with Databricks (PySpark, Delta Lake, SQL, MLflow).
• Strong expertise in Informatica PowerCenter or Informatica Intelligent Cloud Services (IICS).
• Proven experience building and optimizing ETL/ELT pipelines in large-scale data environments.
• Solid understanding of data modeling, data warehousing, and lakehouse design patterns.
• Experience with cloud platforms (Azure, AWS, or GCP) and corresponding data services (e.g., ADLS, S3, BigQuery, Snowflake).
• Proficiency in Python, SQL, and Spark.
• Familiarity with CI/CD for Databricks, Git integration, and automation using Terraform or Databricks Asset Bundles.
• Excellent communication, analytical, and problem-solving skills.

Preferred Qualifications:
• Databricks Certified Data Engineer Associate/Professional.
• Informatica Certified Professional (PowerCenter / IICS).
• Experience with Unity Catalog, Delta Live Tables, or Databricks SQL Warehouses.
• Exposure to Data Governance tools (Collibra, Alation) and metadata frameworks.