Sovereign AI on Hybrid Manager v1.3
What Sovereign AI means in Hybrid Manager
With AI Factory on Hybrid Manager, AI workloads are co-located with your data:
- Your environment. Assistants, model serving, and vector search run directly inside your Hybrid Manager project and Kubernetes cluster.
- Your data. Source documents, embeddings, and conversation history never leave your environment unless you explicitly configure it (see the sketch after this list).
- Your models. Model images are curated in the Model Library and served from endpoints inside your cluster.
- Your visibility. Conversation threads and model logs are observable through the same platform you use for Postgres and analytics.
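To make the co-location concrete, here is a minimal Python sketch of an application that searches embeddings and records conversation history in a Postgres database reachable only at an internal Kubernetes service address. The service name, database, and table layout (a `documents` table with a pgvector `embedding` column, a `conversations` table for thread history) are illustrative assumptions, not Hybrid Manager's actual schema or API.

```python
# Minimal sketch: vector search and conversation storage next to the data.
# All hostnames, credentials, and table names below are hypothetical.
import psycopg2

# Internal Kubernetes service DNS name -- traffic stays on the cluster network.
conn = psycopg2.connect(
    host="postgres-rw.my-project.svc.cluster.local",  # hypothetical service name
    dbname="appdb",
    user="app",
    password="...",  # in practice, sourced from a Kubernetes Secret
)

def top_k_documents(query_embedding: list[float], k: int = 5):
    """Similarity search over embeddings stored alongside the source documents."""
    vector_literal = "[" + ",".join(str(x) for x in query_embedding) + "]"
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT id, content
            FROM documents
            ORDER BY embedding <-> %s::vector
            LIMIT %s
            """,
            (vector_literal, k),
        )
        return cur.fetchall()

def log_turn(thread_id: str, role: str, message: str) -> None:
    """Conversation history is persisted in the same in-project database."""
    with conn.cursor() as cur:
        cur.execute(
            "INSERT INTO conversations (thread_id, role, message) VALUES (%s, %s, %s)",
            (thread_id, role, message),
        )
    conn.commit()
```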
Connectivity options
- Standard (connected). Most environments run with internet connectivity to pull container images, updates, or libraries. Even when connected, your data stays inside your project; unlike a SaaS AI service, it is not shared with an external provider.
- Air-gapped. For high-security deployments, Hybrid Manager can operate without internet access. You preload model images and registries (see the mirroring sketch below), and all workloads stay entirely within your controlled cluster.
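Preloading images is the main practical step behind an air-gapped install: the required model and component images are mirrored into a registry the cluster can reach before connectivity is cut off. The sketch below drives that step from Python with skopeo; the upstream image references and the private registry address are placeholders, not the actual contents of the Model Library.

```python
# Minimal sketch: mirror model images into a private registry ahead of an
# air-gapped install. Assumes skopeo is installed on the staging host.
# Image names and the registry address are placeholders.
import subprocess

UPSTREAM_IMAGES = [
    # Substitute the images your deployment actually needs.
    "quay.io/example/embedding-model:1.0",
    "quay.io/example/llm-model:2.1",
]
PRIVATE_REGISTRY = "registry.internal.example:5000"  # hypothetical in-network registry

def mirror(image: str) -> None:
    """Copy one image from its upstream registry into the private registry."""
    target = f"{PRIVATE_REGISTRY}/{image.split('/', 1)[1]}"
    subprocess.run(
        ["skopeo", "copy", f"docker://{image}", f"docker://{target}"],
        check=True,
    )

if __name__ == "__main__":
    for image in UPSTREAM_IMAGES:
        mirror(image)
```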
Contrast with external AI services
- External LLM services (e.g., OpenAI, Anthropic, Cohere): your prompts, documents, and embeddings leave your network and are processed in a third-party environment.
- Hybrid Manager: retrieval, generation, and storage all happen inside your project and infrastructure, as the retrieval loop sketched below illustrates. Models are served next to your data, so sensitive content never leaves your control. Air-gapped deployments go further by removing the need for any external connectivity at all.
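As an illustration of that round trip, the sketch below runs a retrieval-augmented generation loop in which every call targets an in-cluster address. It assumes the serving endpoints expose an OpenAI-compatible HTTP API (a common convention for model-serving stacks, not a statement about Hybrid Manager's endpoints) and reuses the hypothetical `top_k_documents()` helper from the earlier Postgres sketch.

```python
# Minimal sketch: a RAG loop whose every hop resolves to a cluster-internal
# address. Hostnames, model names, and the rag_store module are hypothetical.
import requests

from rag_store import top_k_documents  # the helper from the earlier sketch, module name illustrative

EMBEDDING_URL = "http://embedding-svc.my-project.svc.cluster.local/v1/embeddings"
CHAT_URL = "http://llm-svc.my-project.svc.cluster.local/v1/chat/completions"

def answer(question: str) -> str:
    # 1. Embed the question with the in-cluster embedding endpoint.
    emb = requests.post(
        EMBEDDING_URL,
        json={"model": "embedding-model", "input": question},
        timeout=30,
    ).json()["data"][0]["embedding"]

    # 2. Retrieve supporting passages from Postgres in the same project.
    context = "\n".join(content for _, content in top_k_documents(emb))

    # 3. Generate the answer with the in-cluster LLM endpoint.
    reply = requests.post(
        CHAT_URL,
        json={
            "model": "llm-model",
            "messages": [
                {"role": "system", "content": f"Answer using this context:\n{context}"},
                {"role": "user", "content": question},
            ],
        },
        timeout=60,
    ).json()
    return reply["choices"][0]["message"]["content"]
```

Because the model endpoints and the database all resolve to cluster-internal addresses, no request in this path crosses the project boundary.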
Learn more
- Sovereign AI (hub) — principles and patterns.
- AI Factory Hub — deeper guides on assistants, knowledge bases, and model serving.