

AMISEQ
Databricks Architect - REMOTE
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Architect (Remote); the contract length and pay rate are unspecified. It requires 8+ years in Data Engineering, 4+ years of hands-on Databricks experience, strong Informatica (PowerCenter/IICS) skills, and cloud platform experience (Azure/AWS/GCP). Certifications are preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
October 29, 2025
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Data Pipeline #SQL (Structured Query Language) #PySpark #Data Governance #Data Modeling #Data Engineering #Terraform #Collibra #Informatica #Spark (Apache Spark) #Cloud #Automation #Informatica PowerCenter #Alation #Snowflake #Data Integration #GCP (Google Cloud Platform) #Azure #BigQuery #Delta Lake #Python #IICS (Informatica Intelligent Cloud Services) #Scala #GIT #ADLS (Azure Data Lake Storage) #Data Architecture #MLflow #ETL (Extract, Transform, Load) #Data Framework #Databricks #S3 (Amazon Simple Storage Service) #AWS (Amazon Web Services) #Metadata
Role description
Position Overview:
We are seeking an experienced Databricks RSA (Reference Solution Architect) with strong Informatica expertise to design, implement, and optimize modern data solutions on the Databricks Lakehouse Platform. The ideal candidate will have a solid background in data engineering, ETL design, performance tuning, and data integration, with hands-on experience in Informatica PowerCenter / IICS and Databricks.
This role involves working closely with data architects, business stakeholders, and data engineering teams to design scalable, efficient, and secure data pipelines and analytics frameworks in a cloud environment (Azure/AWS/GCP).
Required Skills and Qualifications:
• 8+ years of experience in Data Engineering / Data Architecture roles.
• 4+ years of hands-on experience with Databricks (PySpark, Delta Lake, SQL, MLflow).
• Strong expertise in Informatica PowerCenter or Informatica Intelligent Cloud Services (IICS).
• Proven experience building and optimizing ETL/ELT pipelines in large-scale data environments (a minimal PySpark sketch follows this list).
• Solid understanding of data modeling, data warehousing, and lakehouse design patterns.
• Experience with cloud platforms (Azure, AWS, or GCP) and corresponding data services (e.g., ADLS, S3, BigQuery, Snowflake).
• Proficiency in Python, SQL, and Spark.
• Familiarity with CI/CD for Databricks, Git integration, and automation using Terraform or Databricks Asset Bundles.
• Excellent communication, analytical, and problem-solving skills.
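For context on the Databricks and pipeline bullets above, here is a minimal PySpark + Delta Lake sketch of the kind of ETL/ELT work described: batch-ingest raw files and upsert them into a Delta table. The path, table name, and columns (orders, order_id, order_ts) are hypothetical, not taken from the posting.

# Minimal PySpark + Delta Lake upsert sketch; all names and paths are illustrative.
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` is already provided

# Extract: read raw JSON from a hypothetical landing zone.
raw = spark.read.format("json").load("/mnt/landing/orders/")

# Transform: type the timestamp column and de-duplicate on the key.
orders = (raw
          .withColumn("order_ts", F.to_timestamp("order_ts"))
          .dropDuplicates(["order_id"]))

# Load: MERGE into the target Delta table so reruns stay idempotent.
target = DeltaTable.forName(spark, "analytics.orders")
(target.alias("t")
 .merge(orders.alias("s"), "t.order_id = s.order_id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())

An idempotent MERGE rather than a blind append is one common Delta Lake answer to the "optimizing ETL/ELT pipelines" requirement, since reruns and late-arriving data don't create duplicates.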
Preferred Qualifications:
• Databricks Certified Data Engineer Associate/Professional.
• Informatica Certified Professional (PowerCenter / IICS).
• Experience with Unity Catalog, Delta Live Tables, or Databricks SQL Warehouses (see the Delta Live Tables sketch after this list).
• Exposure to Data Governance tools (Collibra, Alation) and metadata frameworks.
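As a rough illustration of the Delta Live Tables item above, the sketch below declares a two-table bronze/silver pipeline with a data-quality expectation. It only runs inside a Databricks DLT pipeline, and the source path and column names are again hypothetical.

# Minimal Delta Live Tables sketch; runs only inside a Databricks DLT pipeline.
import dlt
from pyspark.sql import functions as F  # `spark` is provided by the DLT runtime

@dlt.table(comment="Raw orders from a hypothetical landing path")
def orders_bronze():
    return spark.read.format("json").load("/mnt/landing/orders/")

@dlt.table(comment="Cleaned, de-duplicated orders")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")  # drop rows failing the check
def orders_silver():
    return (dlt.read("orders_bronze")
            .withColumn("order_ts", F.to_timestamp("order_ts"))
            .dropDuplicates(["order_id"]))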