Enabling the Migration Portal AI Copilot v1.3
After deploying the Hybrid Manager (HM), the Migration Portal, a Launchpad tool for migrating external database schemas, works out of the box.
However, the AI Copilot, a chatbot that helps you resolve schema migration compatibility issues within the Migration Portal, isn't enabled by default and requires additional configuration.
You can configure the AI Copilot using OpenAI, Azure OpenAI, or an alternative AI vendor. Alternatively, you can configure the AI Copilot to work with a self-hosted model you previously created within the HM with EDB AI Factory capabilities. This gives you the flexibility to use ready-made models from a third-party vendor or to retain complete control over your data and environment.
Third-party model (cloud)
Workflow: While using the Migration Portal, you can submit a prompt to the Migration Portal AI Copilot to obtain information and help on schema conversion. The AI Copilot then communicates with an external, third-party cloud vendor of your choice (for example, OpenAI) over the Internet. The third-party vendor uses its own model to process the prompt and generate an answer.
Models: Third-party vendors provide the core AI capabilities, including a chat completion model and an embeddings model. (For Azure OpenAI, you still have to deploy an embeddings model yourself.)
Benefits: This approach offers access to highly powerful and continuously updated models without the need for you to manage any AI infrastructure.
Considerations: This option requires uninterrupted Internet connectivity, and depending on the license, data may be stored with the vendor. Additionally, factor in per-query charges as well as the cost of the periodic health-check requests sent to the chat completion and embeddings models.
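The workflow above can be sketched as the two request types the Copilot sends to a third-party vendor: a chat completion request for answering a schema-conversion question, and an embeddings request for vectorizing text. The request shape follows the public OpenAI REST API; the model names and system prompt are illustrative placeholders, not Hybrid Manager defaults.

```python
import json

OPENAI_BASE = "https://api.openai.com/v1"


def build_chat_request(prompt, model="gpt-4o-mini"):
    """Chat completion request: asks the vendor's model to answer a
    schema-conversion question."""
    return {
        "url": f"{OPENAI_BASE}/chat/completions",
        "body": {
            "model": model,
            "messages": [
                {"role": "system",
                 "content": "You help resolve schema migration compatibility issues."},
                {"role": "user", "content": prompt},
            ],
        },
    }


def build_embeddings_request(text, model="text-embedding-3-small"):
    """Embeddings request: vectorizes text so relevant documentation
    passages can be retrieved for the answer."""
    return {
        "url": f"{OPENAI_BASE}/embeddings",
        "body": {"model": model, "input": text},
    }


req = build_chat_request("How do I convert Oracle's NUMBER type to Postgres?")
print(json.dumps(req["body"], indent=2))
```

Both request types travel over the Internet to the vendor, which is why connectivity and per-query cost appear in the considerations above.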
Self-hosted model (on-premises)
Workflow: While using the Migration Portal, you can submit a prompt to the Migration Portal AI Copilot to obtain information and help on schema conversion. The AI Copilot then communicates with models that are hosted locally within the Hybrid Manager environment itself. These models are previously created by your organization with EDB AI Factory capabilities. The self-hosted models process the prompt and generate an answer.
Models: You create and manage these models using the EDB AI Factory. They can be models like Llama 3, which is served using an NVIDIA NIM microservice.
Benefits: This approach is better suited for organizations that require a highly secure, air-gapped environment. It ensures that all data processing remains within your infrastructure, meeting strict compliance and privacy requirements.
Considerations: You manage the AI infrastructure yourself, which means your organization needs in-house AI expertise. Additionally, self-hosted models lack the built-in content filtering and safeguards provided by third-party vendors, so the responsibility and potential liability for any unsafe or harmful content generated by the model falls on your organization. You must also account for the resource consumption of the costly GPU nodes required to run AI workloads.
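Because NVIDIA NIM microservices expose an OpenAI-compatible API, the self-hosted workflow can be sketched as the same client logic with only the base URL changed to an in-cluster address, which is what keeps prompts inside your infrastructure. The service hostname and port below are assumed examples, not real Hybrid Manager defaults.

```python
CLOUD_BASE = "https://api.openai.com/v1"

# Hypothetical cluster-internal DNS name for a NIM service serving Llama 3.
SELF_HOSTED_BASE = "http://llama3-nim.ai-factory.svc.cluster.local:8000/v1"


def chat_endpoint(self_hosted: bool) -> str:
    """Return the chat-completions URL for the chosen deployment mode.
    With self_hosted=True, prompts never leave the cluster network."""
    base = SELF_HOSTED_BASE if self_hosted else CLOUD_BASE
    return f"{base}/chat/completions"


print(chat_endpoint(self_hosted=True))
```

The API compatibility means the choice between cloud and self-hosted is largely an infrastructure and compliance decision rather than a code change.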
Third-party model (OpenAI or Azure OpenAI)
Learn how to enable a third-party model to power the Migration Portal AI Copilot.
Self-hosted model (AI Factory)
Learn how to enable a self-hosted AI Factory model to power the Migration Portal AI Copilot.