Xyant Services

Cloud Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Cloud Data Architect on a long-term contract in Lehi, UT or San Jose, CA (Hybrid), offering competitive pay. Requires a Bachelor's degree, 5+ years in data engineering, and expertise in Azure, Spark, and data modeling.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 31, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Lehi, UT or San Jose, CA
-
🧠 - Skills detailed
#Computer Science #Distributed Computing #Scripting #Data Modeling #Big Data #Python #Data Engineering #Data Security #Data Science #Databricks #Visualization #SQL (Structured Query Language) #Spark (Apache Spark) #Cloud #Data Lake #Security #Hadoop #Leadership #Azure #Documentation #Data Architecture #Database Systems
Role description
Cloud Data Architect
Long-Term Contract
Lehi, UT or San Jose, CA (Hybrid)

Job Description:
The Cloud Data Architect will be a key contributor to designing, evolving, and optimizing our company's cloud-based data architecture. This role requires a strong background in data engineering, hands-on experience building cloud data solutions, and a talent for communicating complex designs through clear diagrams and documentation.

Required Qualifications
• Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field.
• Minimum of 5 years of hands-on data engineering experience using distributed computing approaches (Spark, MapReduce, Databricks).
• Proven track record of successfully designing and implementing cloud-based data solutions in Azure.
• Deep understanding of data modeling concepts and techniques.
• Strong proficiency with database systems (relational and non-relational).
• Exceptional diagramming skills with tools such as Visio, Lucidchart, or other data visualization software.

Preferred Qualifications
• Advanced knowledge of cloud-specific data services (e.g., Databricks, Azure Data Lake).
• Expertise in big data technologies (e.g., Hadoop, Spark).
• Strong understanding of data security and governance principles.
• Experience with scripting languages (Python, SQL).

Additional Skills
• Communication: Exemplary written and verbal communication skills to collaborate effectively with all teams and stakeholders.
• Problem-solving: Outstanding analytical and problem-solving skills for tackling complex data challenges.
• Teamwork & Leadership: Ability to work effectively in cross-functional teams and demonstrate potential for technical leadership.