AI Factory vs. “Generic AI” v1.3

AI Factory focuses on Gen AI and data infrastructure that runs alongside your Postgres estate and, primarily, within EDB Hybrid Manager. “Generic AI” refers to the general-purpose AI/ML stacks and services you might already use — for example, external LLM APIs, ML platforms, and data science tooling.

This page explains who each approach is for, what each does best, where they run, when to choose one or the other, and how they can work together. It also answers common questions we hear from teams adopting EDB Postgres® AI.


What we mean by each

  • AI Factory: A modular set of capabilities for Gen AI and intelligent data apps on your Postgres data. It includes Gen AI, Pipelines, Vector Engine, and Model Serving. It is designed Hybrid Manager-first: running AI Factory in Hybrid Manager gives you a unified control plane for governance and observability.

  • Generic AI: General-purpose AI and ML tooling outside of AI Factory — such as cloud LLM APIs or standalone ML platforms used for experimentation and model development. You can combine these with AI Factory to build end-to-end solutions.

For an overview of concepts and architecture, see AI Factory Concepts and AI Factory Explained.


Who it is for

  • AI Factory: Application developers, data engineers, and platform teams who want to build Gen AI assistants, document intelligence, and AI-powered features on trusted Postgres data with governance. See Learning Paths to get hands-on.

  • Generic AI: Data science and research teams doing model exploration, fine-tuning, or domain R&D on specialized ML tooling. These efforts can feed models and components into AI Factory for operationalization.


Where it runs

  • Hybrid Manager (hub): Most AI Factory capabilities are delivered and managed within Hybrid Manager’s AI Factory. This gives you lifecycle control, governance, and observability across Gen AI, pipelines, vector search, and model serving.

  • Outside Hybrid Manager (selected components): You can run Postgres-focused capabilities such as Vector Engine, Pipelines, or AIDB features where supported. When you do so, you own cluster lifecycle, security, and observability. For details, see Where AI Factory runs.


When to choose each

Choose AI Factory when you:

  • Build assistants, search, or workflow automation that rely on enterprise data in Postgres.
  • Need governed Retrieval-Augmented Generation (RAG) with Pipelines and Vector Engine.
  • Want unified observability and lifecycle management (Hybrid Manager).
  • Plan to serve models behind stable endpoints via Model Serving.

Choose Generic AI when you:

  • Experiment with model architectures, training data, or evaluation workflows in a research setting.
  • Use external APIs for prototyping or transient workloads that do not require integration with Postgres governance.
  • Run batch model training that is not part of your operational Postgres or Hybrid Manager environment.

Many organizations do both: Generic AI for exploration, AI Factory for productionizing data-centric AI on Postgres.


How they work together

  • Bring your own models: Package a vetted model and deploy it as an inference service with Model Serving. Use the Model Library patterns to organize model assets.

  • Use external providers: Build assistants and tools in Gen AI that can call external services through drivers. See Drivers overview.

  • Operationalize data flows: Use Pipelines to prepare, transform, and keep content fresh for vector search and RAG.

  • Query with vectors in Postgres: Store and search embeddings with Vector Engine alongside your relational data for simpler operations and security.
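To make the last point concrete, here is a minimal, illustrative sketch of what vector retrieval does: keep an embedding next to each row and rank rows by similarity to a query embedding. This is plain Python with toy vectors, not the Vector Engine API (which is SQL inside Postgres); all names and data here are hypothetical.

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "table": each row pairs relational fields with an embedding,
# mirroring how embeddings live alongside rows in Postgres.
documents = [
    {"id": 1, "text": "reset your password", "embedding": [0.9, 0.1, 0.0]},
    {"id": 2, "text": "rotate TLS certificates", "embedding": [0.1, 0.9, 0.2]},
    {"id": 3, "text": "change account password", "embedding": [0.8, 0.2, 0.1]},
]

def search(query_embedding, k=2):
    """Return the top-k rows ranked by cosine similarity to the query."""
    ranked = sorted(
        documents,
        key=lambda row: cosine_similarity(query_embedding, row["embedding"]),
        reverse=True,
    )
    return ranked[:k]

top = search([1.0, 0.0, 0.0])
print([row["id"] for row in top])  # → [1, 3]: the two password-related rows
```

In Postgres, the same ranking is expressed as a SQL `ORDER BY` over a distance operator, so embeddings inherit the security and operational model of the rest of your data.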


Use cases

  • Enterprise knowledge assistants: Build domain assistants that retrieve trusted answers from Knowledge Bases and Postgres. Start with Gen AI and Vector Engine.

  • Document intelligence: Classify, extract, and summarize documents using Pipelines integrated with Gen AI assistants.

  • AI-powered APIs for apps: Serve embeddings, recommendations, or language tasks behind Model Serving.

  • Model exploration and evaluation: Use your preferred “Generic AI” tooling to evaluate candidate models before promoting them into AI Factory for serving.
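As a sketch of the serving pattern above: models behind Model Serving are reached over HTTP as stable inference endpoints. The payload below follows a KServe-style v2 inference convention as an assumption — the endpoint URL, model name, and exact schema are hypothetical and should be taken from your actual deployment, not from this example.

```python
import json

# Hypothetical endpoint -- substitute the URL from your deployment.
ENDPOINT = "https://models.example.internal/v2/models/embedder/infer"

def build_inference_request(texts):
    """Build a KServe v2-style inference payload for a batch of texts.

    The field names here are an illustrative assumption, not a
    documented AI Factory schema.
    """
    return {
        "inputs": [
            {
                "name": "text",
                "shape": [len(texts)],
                "datatype": "BYTES",
                "data": texts,
            }
        ]
    }

payload = build_inference_request(["reset password", "rotate certificates"])
print(json.dumps(payload, indent=2))
# An HTTP client would POST this JSON to ENDPOINT and read the model's
# result (e.g., embeddings) from the "outputs" field of the response.
```

Keeping clients behind a stable endpoint like this is what lets you swap or upgrade the underlying model without changing application code.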


Prerequisites

  • Using Hybrid Manager: Access to a Hybrid Manager instance with AI Factory enabled and entitlements for Gen AI, pipelines, vector search, and model serving. See AI Factory in Hybrid Manager.

  • Running components outside Hybrid Manager: A supported Postgres environment for extensions and AIDB, and the infrastructure required for pipelines and serving where applicable. Validate platform compatibility, security, and licensing.

If you need precise support matrices for outside-HM usage, coordinate with your EDB representative.


FAQ

Q: Does AI Factory train models?

A: No. AI Factory focuses on Gen AI and data infrastructure: retrieving and augmenting with your data, and serving models against it. Use your Generic AI stack for model training and experimentation, then serve vetted models via Model Serving.

Q: Can I run AI Factory without Hybrid Manager?

A: Many capabilities are intended to run in Hybrid Manager. Selected components — such as Vector Engine, Pipelines, and AIDB features — can run outside Hybrid Manager when supported. See Where AI Factory runs.

Q: How do I bring my own LLM or embedding model?

A: Package and deploy it using Model Serving. Manage artifacts and metadata using the library patterns under Model Library.

Q: How do assistants access enterprise data safely?

A: Use Gen AI with governed Knowledge Bases, RAG via Pipelines, and Postgres-native Vector Engine. In Hybrid Manager, these workloads run under unified governance and observability.
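To illustrate the grounding step in that answer, here is a minimal sketch of assembling a prompt from retrieved knowledge-base chunks. The prompt wording and function name are illustrative assumptions, not the Gen AI product API; the point is that the model is instructed to answer only from governed, retrieved context.

```python
def build_grounded_prompt(question, retrieved_chunks):
    """Assemble a RAG prompt: retrieved context first, then the question.

    Instructing the model to answer only from the supplied context keeps
    responses grounded in governed enterprise data.
    """
    context = "\n\n".join(
        f"[{i + 1}] {chunk}" for i, chunk in enumerate(retrieved_chunks)
    )
    return (
        "Answer using only the context below. "
        "If the answer is not in the context, say you don't know.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )

chunks = [
    "Password resets are handled from the account settings page.",
    "Support tickets are triaged within one business day.",
]
prompt = build_grounded_prompt("How do I reset my password?", chunks)
print(prompt)
```

In AI Factory, the retrieval step is handled by Pipelines and Vector Engine against Knowledge Bases, so the chunks passed to the model come only from sources you govern.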

Q: Where do I start?

A: Review AI Factory Concepts and pick a Learning Path. If you operate in Hybrid Manager, start from AI Factory in Hybrid Manager.