Neurealm

Artificial Intelligence Consultant

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an "Artificial Intelligence Consultant" with a contract length of "unknown" and a pay rate of "unknown." Key skills include expertise in Data Architecture, GCP, AI/ML platforms, and Generative AI solutions. Preferred qualifications include a degree in a related field and GCP Professional certifications.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
February 14, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Santa Ana, CA
-
🧠 - Skills detailed
#Data Lake #Kafka (Apache Kafka) #BigQuery #Computer Science #Dataflow #Batch #ETL (Extract, Transform, Load) #Leadership #AI (Artificial Intelligence) #Consulting #Scala #GCP (Google Cloud Platform) #Tableau #Data Governance #ML (Machine Learning) #SQL (Structured Query Language) #BI (Business Intelligence) #Data Management #Metadata #Observability #PyTorch #Documentation #Data Science #Data Lakehouse #Microsoft Power BI #Data Architecture #Spark (Apache Spark) #Looker #TensorFlow #Data Engineering #IAM (Identity and Access Management) #Security #Storage #Python #Cloud #Monitoring #Deployment #Data Quality
Role description
We are seeking an AI Solution Architect with deep expertise in Data Architecture, AI/ML platforms, and Generative AI solutions to design and deliver scalable, secure, enterprise-grade data and AI solutions on Google Cloud Platform (GCP). The ideal candidate will have strong hands-on experience across data lakehouse architectures, modern BI platforms, ML/MLOps, Conversational Analytics, Generative AI, and Agentic AI frameworks, and will work closely with business, data engineering, and AI teams to drive end-to-end AI-led transformation.

Key Responsibilities

Data & Platform Architecture
• Design and own end-to-end data architectures including ingestion, processing, storage, governance, and consumption layers
• Architect modern data lakehouse platforms using GCP services (e.g., BigQuery, Dataproc, Cloud Storage)
• Define scalable data platforms supporting batch, streaming, and real-time analytics
• Establish data governance, metadata management, data quality, lineage, and security frameworks

AI, ML & MLOps Architecture
• Design ML/AI architectures supporting model training, deployment, monitoring, and lifecycle management
• Define and implement MLOps frameworks (CI/CD for ML, feature stores, model registries, observability)
• Collaborate with data scientists to productionize ML models at scale
• Evaluate and recommend ML frameworks, tools, and best practices

Generative AI & Agentic AI
• Architect and implement Generative AI solutions using LLMs (e.g., text, code, embeddings, multimodal use cases)
• Design Conversational Analytics and AI-powered BI solutions
• Build and evaluate Agentic AI platforms, including autonomous agents, orchestration frameworks, and tool integrations
• Lead solution evaluations, PoCs, and vendor/tool assessments for GenAI and agent-based systems

Business Intelligence & Analytics
• Design modern BI and analytics platforms enabling self-service analytics and AI-driven insights
• Integrate BI tools with data lakehouse and AI layers
• Enable semantic layers, metrics definitions, and governed analytics

Cloud & GCP Leadership
• Lead architecture and solution design on Google Cloud Platform (GCP)
• Utilize GCP services such as BigQuery, Vertex AI, Cloud Storage, Dataflow, Dataproc, Pub/Sub, Looker, and IAM
• Ensure architectures follow best practices for security, scalability, performance, and cost optimization

Stakeholder & Technical Leadership
• Partner with business leaders to translate business requirements into AI-driven solutions
• Lead technical design reviews and architecture governance
• Mentor engineers, architects, and data scientists
• Create architecture blueprints, reference architectures, and technical documentation

Required Skills & Qualifications

Core Technical Skills
• Strong experience in Data Architecture and data platforms
• Hands-on expertise in data lakehouse architectures
• Deep understanding of end-to-end data management
• Experience with modern BI platforms and analytics ecosystems
• Strong background in AI/ML architecture and MLOps
• Proven experience in Conversational Analytics and Generative AI
• Hands-on exposure to Agentic AI platforms, frameworks, and evaluations
• Strong expertise in Google Cloud Platform (GCP)

Tools & Technologies (preferred)
• GCP: BigQuery, Vertex AI, Cloud Storage, Dataflow, Dataproc, Pub/Sub, Looker
• AI/ML: TensorFlow, PyTorch, scikit-learn, LLM frameworks
• MLOps: CI/CD, feature stores, model registries, monitoring tools
• Data: SQL, Python, Spark, Kafka
• BI: Looker, Tableau, Power BI (or equivalent)

Preferred Qualifications
• Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field
• GCP Professional certifications (e.g., Professional Data Engineer, Professional ML Engineer, Cloud Architect)
• Experience working in large-scale enterprise or consulting environments
• Strong communication and stakeholder management skills