

Senior Data Architect
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Architect with a contract length of "unknown" and a pay rate of "unknown." Key requirements include 15+ years in data architecture, proficiency in Spark, Kafka, and cloud platforms (AWS, Azure, GCP), and CDMP certification.
Country
United States
Currency
$ USD
Day rate
Unknown
Date discovered
September 23, 2025
Project duration
Unknown
Location type
Unknown
Contract type
Unknown
Security clearance
Unknown
Location detailed
New York, NY
Skills detailed
#Spark (Apache Spark) #Data Governance #Snowflake #Cloud #Computer Science #Security #Azure #Kafka (Apache Kafka) #Data Architecture #API (Application Programming Interface) #Leadership #BigQuery #Metadata #GCP (Google Cloud Platform) #AWS (Amazon Web Services) #Data Management #Data Engineering #Databricks
Role description
Job Description:
• Design and implement data mesh frameworks across multiple domains.
• Collaborate with engineering, data, and business teams to enable self-serve analytics.
• Define and enforce data governance, quality, and security policies.
• Implement modern data platforms using cloud-native technologies (AWS, Azure, GCP).
• Mentor teams on domain-oriented data ownership and best practices.
Required Skills & Qualifications:
• Bachelor's or Master's in Computer Science, Data Engineering, or a related field.
• 15+ years in data architecture, cloud data engineering, or enterprise data platforms.
• Strong proficiency in Spark, Kafka, Snowflake, Databricks, or BigQuery.
• Experience with data governance frameworks, metadata management, and API-driven data services.
• Excellent problem-solving, collaboration, and leadership skills.
Certifications:
• Certified Data Management Professional (CDMP).
• Cloud certifications (AWS Solutions Architect, GCP Professional Data Engineer).