This document provides detailed deployment instructions for users who need to deploy FineBI 6.1 and the AI components via FineOps.
Note:
Read the entire document first to understand the overall steps before proceeding.
If the FineBI 6.1 project is not deployed via FineOps, see Manual FineChatBI Deployment (Not by FineOps).
Deploy FineBI via FineOps first, as FineBI and the AI components must be deployed in sequence.
For details, see FineOps Deployment.
FanRuan applications rely on FineOps for deployment, so you need to deploy FineOps first.
Ensure that the FineOps version is V2.20.0 or later.
Prepare the FineBI deployment environment.
Confirming Server Configuration of the FineBI Project
Confirming Server Network of the FineBI Project
Preparing the FineBI Mounting Directory
Confirm that the image repository can connect to the FanRuan cloud repository. For details, see Ensuring Access to FanRuan Cloud Repository.
Images are required to deploy the components of a new project. Ensure that the required images already exist in the image repository, or pull them from the cloud.
New Project Deployment
Ensure that the FineBI version is V6.1.6 or later.
The AI components include:
FineAI (fine-ai): The algorithm toolkit that serves as the API gateway for Large Language Model (LLM) integration.
FineChatBI Semantic Parser (fine-chat-bi-parser): The Q&A BI engine that converts natural language queries into executable SQL statements.
FineAI Redis (fine-ai-redis): The Redis instance deployed together with the Q&A BI engine.
Given the substantial resource requirements of AI models and potential future LLM expansions, deploying AI components on a dedicated server is recommended.
The requirements on the server are as follows:
Recommended configuration: 16-core CPU, 64 GB available memory, 100 GB available disk space, where this server is exclusively used by AI components
Minimum configuration: 8-core CPU, 16 GB available memory, 80 GB available disk space, where AI components share the server with the FineBI project. (These figures cover the AI components only. After FineBI is deployed, verify that the server still has sufficient resources before deploying the AI components, as shown in the example below.)
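For reference, on a Linux server you can confirm the remaining resources with standard commands; the partition path /data below is only an illustration and should be replaced with the partition that will hold the mounting directory.
nproc          # number of CPU cores
free -h        # total and available memory
df -h /data    # available space on the target partition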
Basic server requirements
Time consistency
The AI component server must maintain time synchronization with the other servers of the project, with a maximum allowable drift of five seconds (a verification example is provided after this list).
Inconsistent server time can cause issues such as incorrect execution of scheduled tasks, disorganized log records, and data inconsistencies.
Time zone consistency
The AI component server must maintain identical time zone configuration with other servers of the project.
Inconsistent server time zones can cause issues such as incorrect execution of scheduled tasks, disorganized log records, and data inconsistencies.
Interconnection over the intranet
The AI component server must have intranet connectivity with other servers of the project or have open ports for cross-server communication.
For details about port requirements, see the following content.
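For reference, the time and time zone requirements above can be verified with standard commands (the supported Ubuntu, CentOS, Red Hat, and Rocky Linux releases all ship them); run the commands on every server of the project and compare the output.
timedatectl    # shows the current time, the configured time zone, and whether NTP synchronization is active
date +%s       # epoch seconds; the difference between any two servers should stay within 5 seconds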
Non-VM (Recommended)
Operating system
Operating system type
Recommended: Ubuntu 22
Supported:
Ubuntu 18.04.4 and later releases (Ubuntu 20.04 is not supported.)
CentOS 7.3 to 7.9
Red Hat 7.6 and later releases
Rocky Linux 8.8 to 9.4
You are advised to use Ubuntu since CentOS is discontinued.
When using Ubuntu, verify that the deployment user has the required privileges (the default Ubuntu user is not the root superuser) and confirm that the file system of the disk is XFS. For details, see the notes in the following content.
CPU
Disk
The server must have a partition with more than 100 GB of available space (recommended configuration).
The server must have a partition with more than 80 GB of available space (minimum configuration).
Ensure that the file system of the mounting directory is configured to be automatically mounted during boot.
Otherwise, containers may fail to access these directories, leading to data loss or application startup issues.
The mounting path cannot be a shared-use path.
Sharing the file system may cause performance degradation, file permission issues, and data inconsistency, affecting the reliability and response speed of applications running in the container.
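For reference, assuming the mounting directory resides on a partition mounted at /data (an illustrative path only), you can verify the current mount and its boot-time entry as follows.
findmnt /data            # confirms the partition is mounted and shows its file system type
grep /data /etc/fstab    # confirms an fstab entry exists so the mount is restored automatically at boot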
FineAI uses lightweight model computation, and a GPU can significantly accelerate processing for a better experience.
The system remains operational without a GPU but responds more slowly.
Using a GPU is recommended. NVIDIA RTX 4080, RTX 4090, and A100 GPUs are supported.
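If a GPU is installed and the NVIDIA driver is in place, you can confirm that the server detects it with the nvidia-smi utility shipped with the driver.
nvidia-smi    # lists the detected NVIDIA GPUs, the driver version, and current utilization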
Ensure the server has the tar command installed.
The tar command is a commonly used tool for packaging and compressing files.
FineOps requires this command for file extraction.
Ensure the server has the sed command installed.
The sed command is used for text processing.
FineOps requires this command for text processing.
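For reference, the following commands print the path and version of each tool only if it is installed.
command -v tar && tar --version | head -n 1    # confirms tar is available
command -v sed && sed --version | head -n 1    # confirms sed is available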
Ensure that the FineOps server can connect to this server over SSH using the user account you plan to use.
Ensure that the SSH password contains no single quotation marks ('); otherwise, privilege validation will fail during deployment.
The server user responsible for deploying the project must have the necessary sudo privileges.
1. You are advised to use the root user account for project deployment and operation.
2. To use a non-root user for deployment and operation, see Linux User Privilege Explanation.
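For reference, a quick manual check is shown below; the username deploy_user and the IP address 192.168.5.20 are placeholders for your own values.
ssh deploy_user@192.168.5.20    # run on the FineOps server; a successful login confirms SSH connectivity
sudo -v                         # run on the target server as that user; it succeeds only if the user has sudo privileges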
Ensure the port to be mapped automatically (default port) is not in use. If it is already in use, use a free port.
For instructions on port occupancy inspection and firewall configuration, see Port Occupancy Inspection and Firewall Configuration.
FineAI (fine-ai): 7666
FineChatBI Semantic Parser (fine-chat-bi-parser): 8666
FineAI Redis (fine-ai-redis): 6679
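For reference, assuming the ss utility from iproute2 is available (standard on the supported distributions), the following check prints nothing if all three default ports are free.
ss -lntp | grep -E ':(7666|8666|6679)\b'    # lists any process already listening on the default ports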
Certain ports on the server must be open to ensure proper communication between components.
1. FineAI (fine-ai)
Ensure the FanRuan internal gateway of the FineBI projects can access FineAI.
Ensure each FineBI - Application Node component of the FineBI project can access FineAI.
2. FineChatBI Semantic Parser (fine-chat-bi-parser)
Ensure each FineBI - Application Node component of the FineBI project can access FineChatBI Semantic Parser.
3. FineAI Redis (fine-ai-redis)
Ensure FineChatBI Semantic Parser can access FineAI Redis.
Ensure FineAI can access FineAI Redis.
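For reference, assuming the AI component server's intranet IP address is 192.168.5.20 (a placeholder) and the nc (netcat) tool is available, you can verify from the FineBI application node that each port is reachable.
nc -zv 192.168.5.20 7666    # FineAI
nc -zv 192.168.5.20 8666    # FineChatBI Semantic Parser
nc -zv 192.168.5.20 6679    # FineAI Redis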
Images of the FineAI and FineChatBI Semantic Parser components cannot be obtained automatically from the cloud image repository or the full FineKey installation package.
You need to push the image packages of these two components by referring to this section.
1. Obtain image packages.
You can download the image of the FineAI component: FineAI Image
You can download the image of the FineChatBI Semantic Parser component: FineChatBI Semantic Parser Image
2. Upload image packages.
Log in to FineOps as the admin and choose Platform Management > O&M Component.
Click Export Deployment Information. After the successful export, the file path is displayed.
Navigate to the server hosting FineOps. In the same path as the logs folder (where the exported file resides), locate the resources folder, which is the designated location for uploading images.
For example, if the exported file is in /home/ops/fanruan_5d15bea4/ops/logs, the image upload path is /home/ops/fanruan_5d15bea4/ops/resources.
Upload the two downloaded .tar.gz image files to the resources folder.
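For reference, if you copy the files from another machine, an scp command such as the following can be used; the file names, the user, and the path are illustrative, so substitute the actual package names and the resources path displayed on your server.
scp fine-ai-*.tar.gz fine-chat-bi-parser-*.tar.gz ops@<FineOps-server-IP>:/home/ops/fanruan_5d15bea4/ops/resources/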
3. Push images to the repository.
Log in to FineOps as the admin, choose Maintenance Center > Image Management, and click Load Image.
Select the two AI component image files in the resources folder and load them. After loading, the image files in the resources folder are deleted.
4. Check and modify deployed version numbers.
Log in to FineOps as the admin after the successful push and choose Maintenance Center > Image Management to view the new images pushed into the repository.
Locate the recently pushed fine-ai and fine-chat-bi-parser images and note down their version numbers.
Log in to FineOps as the admin and choose Platform Management > Update & Upgrade > Deployed Version List.
Modify the image version numbers of the two AI components to exactly match those in Image Management. Only then can FineOps detect available image updates.
FineAI Redis refers to the redis component in the image repository.
Log in to FineOps as the admin and choose Maintenance Center > Image Management. Check if the redis image exists in the image repository.
Ensure that a redis image of v20.3.0-6.2.17 or later is available. If no redis image meets the version requirement, ensure that the image repository can connect to the FanRuan cloud repository. For details, see Ensuring Access to FanRuan Cloud Repository.
1. Enter the component adding page.
Log in to FineOps as the admin, select a project, and choose Maintenance > Component Management.
Click Add Component, set Component Type to Business Service, and select AI.
2. Add the node (optional).
If you have prepared a new server for AI components, include this server in the project nodes first.
Click the Add Node button, enter server information, click Add Node, and wait for completion.
The following table describes specific settings.
Select Component.
Enter the host IP address (intranet IP address) of the node.
Enter a server username with sudo permission.
The supported verification methods include Password and Public Key.
1. Passwords/Secret keys are only used during project deployment and become unnecessary afterward. Project-FineOps integration relies solely on platform configuration.
Therefore, subsequent changes to the server password will not affect project O&M and monitoring.
2. If you select Public Key for authentication, upload a private key file with the .key/.pem/.crt extension (for example, id_rsa.key). Do not upload files with other extensions or public key files (for example, id_rsa.pub).
Enter the node installation path on the server, which is the mounting path set in the "Preparing the AI Component Server" section.
The default path is ~/data, where ~ represents the home directory of the server user you use.
Note: To view the home directory path, log in to the server with the user account in a terminal and run the echo $HOME command.
Optional
If the server supports intranet access only after its intranet IP address is mapped to an extranet IP address, enter a reachable extranet IP address.
3. Select a node.
Select the project node for deploying AI components.
Minimum requirements: 12-core CPU, 24 GB available memory, and 50 GB available disk space
Nodes where resources cannot meet the requirements are grayed out and unselectable.
4. Check service configuration.
Adjust port settings for each component according to the available ports prepared in the "Preparing the AI Component Server" section.
You must modify the password of the FineAI Redis component at this step, because the default password is randomly generated and cannot be changed after deployment.
5. Start deployment.
Click the Start Deployment button. The AI components will be automatically deployed on the selected node.
If component images are missing from the FineOps image repository, FineOps automatically pulls them from the cloud before deployment.
Once images are prepared, components will be deployed sequentially. Failure reasons will be displayed if the deployment fails.
1. You can download the installation package of the FineChatBI plugin: FineChatBI.zip
2. Log in to FineBI as the admin and choose System Management > Plugin Management > App Store.
3. Click Install From Local, select the obtained installation package of the FineChatBI plugin, and complete the installation.
The appearance of the FineChatBI icon in the lower right corner of the decision-making platform indicates that the intelligent Q&A configuration is complete.
For AI-related licensing, contact FanRuan sales personnel.
For details about network-specific operations, see Public Cloud Authentication (extranet) and New Project Registration (intranet).
The procedure of upgrading AI components deployed via FineOps is identical to that of upgrading FineOps-deployed project components.
Extra attention is required when you push the latest image packages and modify the image version numbers. Follow the steps in the "Preparing Images of AI Components" section.
For details, see Extranet-Based O&M Project Upgrade and Intranet-Based O&M Project Upgrade.
Additionally, upgrade the plugin installed in the "Installing the Plugin in FineBI" section after the upgrade completion.
For details about the LLM connection, see Service Architecture Overview.