Newlineinfo Corp - IT Services and IT Consulting

Cloud Architect (Only Local Visa Independent Candidates)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Cloud Architect contract position in San Jose, CA / Lehi, UT, requiring local visa-independent candidates. Key skills include data engineering, Azure solutions, and AI/ML workflows. A Bachelor's degree and 5 years of experience are mandatory.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
Unknown
-
πŸ—“οΈ - Date
October 31, 2025
πŸ•’ - Duration
Unknown
-
🏝️ - Location
Hybrid
-
πŸ“„ - Contract
Unknown
-
πŸ”’ - Security
Unknown
-
πŸ“ - Location detailed
San Jose, CA
-
🧠 - Skills detailed
#Deployment #Data Ingestion #Computer Science #Monitoring #Distributed Computing #Data Access #Scripting #Data Modeling #Agile #Big Data #Python #Data Engineering #Batch #Data Security #Data Science #Compliance #GDPR (General Data Protection Regulation) #Databricks #ML (Machine Learning) #Visualization #Scala #SQL (Structured Query Language) #Spark (Apache Spark) #AI (Artificial Intelligence) #Cloud #Data Lake #Security #Strategy #Hadoop #Leadership #Storage #Automation #Azure #Documentation #Data Architecture #Kubernetes #Database Systems #ETL (Extract, Transform, Load)
Role description
Job Title: Cloud Architect
Location: San Jose, CA / Lehi, UT (hybrid)
Type: Contract
Note: Please emphasize Solution & Data Architecture and the need to work closely with the engineering teams to design, and help implement, solutions that adhere to the design.
Job Description: The Cloud Architect will be a key contributor to designing, evolving, and optimizing our company's cloud-based data architecture. This role requires a strong background in data engineering, hands-on experience building cloud data solutions, and a talent for communicating complex designs through clear diagrams and documentation. Must work EST hours.
• Strategy, Planning, and Roadmap Development: Align AI and ML system design with broader business objectives, shaping technology roadmaps and architectural standards for end-to-end cloud-driven analytics and AI adoption.
• Designing End-to-End AI/ML Workflows: Architect and oversee all stages of AI/ML pipeline development, including data ingestion, preprocessing, model training, validation, deployment, monitoring, and lifecycle management within cloud environments.
• Selecting Technologies and Services: Evaluate and choose optimal cloud services, AI/ML platforms, infrastructure components (compute, storage, orchestration), frameworks, and tools that fit operational, financial, and security requirements.
• Infrastructure Scalability and Optimization: Design and scale distributed cloud solutions capable of supporting real-time and batch processing workloads for AI/ML, leveraging technologies like Kubernetes, managed ML platforms, and hybrid/multi-cloud strategies for optimal performance.
• MLOps, Automation, and CI/CD Integration: Implement automated build, test, and deployment pipelines for machine learning models, facilitating continuous delivery, rapid prototyping, and agile transformation for data- and AI-driven products.
• Security, Compliance, and Governance: Establish robust protocols for data access, privacy, encryption, and regulatory compliance (e.g., GDPR, ethical AI), coordinating with security experts to continuously assess risks and enforce governance.
• Business and Technical Collaboration: Serve as the liaison between business stakeholders, development teams, and data scientists, translating company needs into technical solutions and driving alignment and innovation across departments.
• Performance Evaluation & System Monitoring: Monitor infrastructure and AI workloads, optimize resource allocation, troubleshoot bottlenecks, and fine-tune models and platforms for reliability and cost-efficiency at scale.
• Documentation and Best Practices: Create and maintain architectural diagrams, policy documentation, and knowledge bases for AI/ML and cloud infrastructure, fostering a culture of transparency, learning, and continuous improvement.
• Continuous Innovation: Stay abreast of new technologies, frameworks, and trends in AI, ML, and cloud computing; evaluate emerging approaches; and lead strategic pilots or proofs of concept for next-generation solutions.
This role blends leadership in technology and systems architecture with hands-on expertise in cloud infrastructure, artificial intelligence, and machine learning, and is pivotal for driving innovation, scalability, and resilience in a modern enterprise.
Required Qualifications
• Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field.
• Minimum of 5 years of hands-on data engineering experience using distributed computing approaches (Spark, MapReduce, Databricks).
• Proven track record of successfully designing and implementing cloud-based data solutions in Azure.
• Deep understanding of data modeling concepts and techniques.
• Strong proficiency with database systems (relational and non-relational).
• Exceptional diagramming skills with tools like Visio, Lucidchart, or other data visualization software.
Preferred Qualifications
• Advanced knowledge of cloud-specific data services (e.g., Databricks, Azure Data Lake).
• Expertise in big data technologies (e.g., Hadoop, Spark).
• Strong understanding of data security and governance principles.
• Experience with scripting and query languages (Python, SQL).
Additional Skills
• Communication: Exemplary written and verbal communication skills to collaborate effectively with all teams and stakeholders.
• Problem-solving: Outstanding analytical and problem-solving skills for complex data challenges.
• Teamwork & Leadership: Ability to work effectively in cross-functional teams and demonstrate potential for technical leadership.