Clevanoo LLC

Cloud Data Architect

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Cloud Data Architect in San Jose, CA, on a long-term contract. It requires a Bachelor's degree, 5+ years in data engineering, expertise in Azure and big data technologies, and strong diagramming skills. Must work EST hours.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
December 16, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
San Jose, CA
-
🧠 - Skills detailed
#Data Lake #Data Engineering #Hadoop #Monitoring #ML (Machine Learning) #Cloud #GDPR (General Data Protection Regulation) #Computer Science #Compliance #Security #Data Security #Data Access #Distributed Computing #Scala #Data Science #Deployment #Databricks #Data Architecture #Scripting #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Strategy #Agile #Documentation #Python #Spark (Apache Spark) #Kubernetes #Big Data #Azure #Automation #Storage #Batch #Database Systems #AI (Artificial Intelligence) #Visualization #Data Ingestion
Role description
Job Title: Cloud Data Architect
Location: San Jose, CA (Day 1 On-Site)
Duration: Long Term

Required Qualifications
• Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field.
• Minimum of 5 years of hands-on data engineering experience using distributed computing approaches (Spark, MapReduce, Databricks).
• Proven track record of successfully designing and implementing cloud-based data solutions in Azure.
• Deep understanding of data modeling concepts and techniques.
• Strong proficiency with database systems (relational and non-relational).
• Exceptional diagramming skills with tools such as Visio, Lucidchart, or other data visualization software.

Preferred Qualifications
• Advanced knowledge of cloud-specific data services (e.g., Databricks, Azure Data Lake).
• Expertise in big data technologies (e.g., Hadoop, Spark).
• Strong understanding of data security and governance principles.
• Experience with scripting languages (Python, SQL).

What You Do
The Cloud Data Architect will be a key contributor to designing, evolving, and optimizing our company's cloud-based data architecture. This role requires a strong background in data engineering, hands-on experience building cloud data solutions, and a talent for communicating complex designs through clear diagrams and documentation. Must work EST hours.

• Strategy, Planning, and Roadmap Development: Align AI and ML system design with broader business objectives, shaping technology roadmaps and architectural standards for end-to-end cloud-driven analytics and AI adoption.
• Designing End-to-End AI/ML Workflows: Architect and oversee all stages of AI/ML pipeline development: data ingestion, preprocessing, model training, validation, deployment, monitoring, and lifecycle management within cloud environments.
• Selecting Technologies and Services: Evaluate and choose optimal cloud services, AI/ML platforms, infrastructure components (compute, storage, orchestration), frameworks, and tools that meet operational, financial, and security requirements.
• Infrastructure Scalability and Optimization: Design and scale distributed cloud solutions capable of supporting real-time and batch processing workloads for AI/ML, leveraging technologies such as Kubernetes, managed ML platforms, and hybrid/multi-cloud strategies for optimal performance.
• MLOps, Automation, and CI/CD Integration: Implement automated build, test, and deployment pipelines for machine learning models, enabling continuous delivery, rapid prototyping, and agile transformation for data- and AI-driven products.
• Security, Compliance, and Governance: Establish robust protocols for data access, privacy, encryption, and regulatory compliance (e.g., GDPR, ethical AI), coordinating with security experts to continuously assess risks and enforce governance.
• Business and Technical Collaboration: Serve as the liaison between business stakeholders, development teams, and data scientists, translating company needs into technical solutions and driving alignment and innovation across departments.