FineChatBI can be connected to a large language model (LLM) as an auxiliary function to help handle ambiguous semantics. With an LLM connected, FineChatBI delivers higher efficiency and stronger expressiveness in data-driven intelligent decision-making.
FineChatBI can be connected to the LLM to achieve the following functions:
· Intelligent rewriting: The LLM standardizes and rewrites users' questions.
· Analysis idea: Vague or unclear semantics are understood and handled to help users clarify their question intent.
· Data interpretation: Intelligent data insight support is provided.
· Attribution analysis suggestion: Users can quickly locate key influencing factors of indicator changes.
· One-click configuration: Synonyms can be configured for field values through the LLM.
You can refer to FineChatBI Service Architecture Overview to deploy the LLM service.
After the LLM service is configured successfully, the admin can optionally enable the LLM functions under Intelligent Q&A Configuration > Other Configurations, as shown in the following figure.
The intelligent rewriting function uses the LLM to understand user intent and standardize the posed questions so that they match the standard fields in the database more precisely, thereby improving query accuracy and making questions easier to ask.
Example 1
· Question before rewriting: "Store that performs the best"
· Standardized question after rewriting: "Store name with the highest total sales"
Example 2
· Question before rewriting: "Help me compare the sales amounts between the last year and this year"
· Standardized question after rewriting: "Last year's sales amount vs this year's sales amount"
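Conceptually, rewriting amounts to prompting the LLM with the user's question plus the dataset's standard field names. The sketch below only illustrates that idea — FineChatBI's actual prompt and interface are internal, and the function and field names here are hypothetical.

```python
def build_rewrite_prompt(question: str, standard_fields: list[str]) -> str:
    """Assemble an illustrative rewriting prompt (hypothetical, not FineChatBI's real prompt)."""
    field_list = ", ".join(standard_fields)
    return (
        "Rewrite the user's question so that it refers only to these "
        f"standard fields: {field_list}.\n"
        f"User question: {question}\n"
        "Standardized question:"
    )

# Example 1 above, with made-up field names for a sales dataset:
prompt = build_rewrite_prompt(
    "Store that performs the best",
    ["Store Name", "Total Sales", "Order Date"],
)
print(prompt)
```

Sending such a prompt to the configured LLM would yield a standardized question like "Store name with the highest total sales".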
1. Assume that you have set Default Q&A Mode to Intelligence and saved the configuration under Intelligent Q&A Configuration > Other Configurations, as shown in the following figure.
The Extreme Speed and Intelligent buttons will appear on the question page for switching the Q&A mode.
· Intelligent is selected by default. In this case, every question is rewritten by the LLM to improve response accuracy, at the cost of lower system speed and performance.
· If you manually select Extreme Speed, no question will be rewritten.
2. Assume that you have set Default Q&A Mode to Extreme Speed and saved the configuration under Intelligent Q&A Configuration > Other Configurations. Only when you click Unsatisfied with the result? Regenerate will the question be intelligently rewritten, as shown in the following figure.
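The two cases above reduce to a simple decision rule: Intelligent mode rewrites every question, while Extreme Speed mode rewrites only when the user clicks Regenerate. A minimal sketch of that rule (the function name is ours, not a FineChatBI API):

```python
def should_rewrite(mode: str, regenerate_clicked: bool = False) -> bool:
    """Return True if a question will be intelligently rewritten.

    mode: "Intelligent" or "Extreme Speed" (the Q&A mode selected on the page).
    regenerate_clicked: whether "Unsatisfied with the result? Regenerate" was clicked.
    """
    if mode == "Intelligent":
        return True                # every question is rewritten by the LLM
    if mode == "Extreme Speed":
        return regenerate_clicked  # rewritten only on regeneration
    raise ValueError(f"Unknown Q&A mode: {mode}")

print(should_rewrite("Intelligent"))                            # → True
print(should_rewrite("Extreme Speed"))                          # → False
print(should_rewrite("Extreme Speed", regenerate_clicked=True)) # → True
```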
After the Analysis Idea function is enabled, the LLM can understand and process vague or unclear semantics. It can also provide inspiring ideas and suggestions to assist in deep thinking and decision-making.
1. Enable Analysis Idea under Intelligent Q&A Configuration > Other Configurations, as shown in the following figure.
2. On the Q&A page, click Ask Idea and enter an idea (for example, Help me see which products are worth promoting). The system will then disassemble the idea so that you can directly view the disassembled analysis steps.
After the Data Interpretation function is enabled, FineChatBI will provide intelligent data insight support. After a chart is generated, the LLM can automatically generate an interpretation report that analyzes data performance, associates external influencing factors, and recommends subsequent analysis directions, helping you fully understand the chart.
1. Enable Data Interpretation under Intelligent Q&A Configuration > Other Configurations. Unlike other functions, Data Interpretation requires separate LLM configuration. For example, since the DeepSeek model is more suitable for data interpretation, you can enter the LLM information of DeepSeek in the area below Data Interpretation, as shown in the following figure.
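The separate configuration can be pictured as Data Interpretation carrying its own model settings, independent of the model used by the other LLM functions. The structure below only illustrates that separation — the keys and endpoint values are hypothetical placeholders, not FineChatBI's actual configuration format.

```python
# Hypothetical illustration: Data Interpretation has its own LLM settings,
# separate from the model used by rewriting / analysis idea / attribution.
llm_config = {
    "default": {                      # used by the other LLM functions
        "api_base": "https://llm.example.com/v1",   # placeholder endpoint
        "model": "general-chat-model",              # placeholder model name
    },
    "data_interpretation": {          # configured separately under Data Interpretation
        "api_base": "https://api.deepseek.com",     # assumed DeepSeek endpoint
        "model": "deepseek-chat",
    },
}

def model_for(function: str) -> str:
    """Pick the model for a given function, falling back to the default."""
    return llm_config.get(function, llm_config["default"])["model"]

print(model_for("data_interpretation"))   # → deepseek-chat
print(model_for("intelligent_rewriting")) # → general-chat-model
```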
2. After a chart is generated on the Q&A page, click Data Interpretation below. Then the LLM will provide a data interpretation report, which will include:
· Data performance analysis: The system provides a detailed interpretation of the data performance in the chart based on the question.
· Association with external factors: The system automatically associates external information (if any) affecting the data to reasonably explain the chart's data performance and the deeper reasons behind it.
· Recommended follow-up question: Based on the current analysis direction, the system intelligently recommends a series of questions for further inquiry, guiding you to dig deeper into the data and expand the analysis dimensions.
Disassembling an indicator by multiple dimensions is an effective way to identify the cause of a problem. However, disassembling too many dimensions defocuses the analysis: the cause cannot be pinned down, and it is impractical to check for potential issues across every dimension. The Attribution Analysis Suggestion function uses LLM capabilities to help you quickly identify the key factors affecting indicator changes.
1. Enable Attribution Analysis Suggestion under Intelligent Q&A Configuration > Other Configurations, as shown in the following figure.
2. After asking a question, click the indicator value to find the important factors affecting it, break down the dimensions for targeted analysis, and improve analysis efficiency.
· For a question involving time and a single indicator, abnormal points can be detected automatically. Example: February 2022 and June 2022.
· For a question involving a single group and a single indicator, you can click a point on the component for attribution analysis. Example: Sales in June 2022.
· You can use Q&A for attribution analysis by time range or time point. Example: Why were sales in February 2022 abnormal?
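The automatic detection of abnormal points in the first case above can be approximated with a simple outlier test: flag periods whose value deviates strongly from the series mean. This z-score sketch is our own illustration of the idea, not FineChatBI's actual attribution algorithm.

```python
from statistics import mean, pstdev

def abnormal_points(series: dict[str, float], threshold: float = 1.5) -> list[str]:
    """Flag periods whose z-score exceeds the threshold (illustrative outlier test)."""
    values = list(series.values())
    mu, sigma = mean(values), pstdev(values)
    if sigma == 0:
        return []  # a flat series has no outliers
    return [period for period, v in series.items() if abs(v - mu) / sigma > threshold]

# Made-up monthly sales for 2022: February dips and June spikes relative to the rest.
sales = {f"2022-{m:02d}": 100.0 for m in range(1, 13)}
sales["2022-02"] = 10.0
sales["2022-06"] = 250.0
print(abnormal_points(sales))  # → ['2022-02', '2022-06']
```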
Use the LLM to configure synonyms for field values. For details, see Synonym Configuration.
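The effect of synonym configuration can be pictured as a mapping from alternative phrasings to standard field values, consulted before a question is matched against the data. A hand-written sketch of that mapping (in FineChatBI the LLM can generate the synonyms with one click; the values below are made up):

```python
# Hypothetical synonym map for a "Region" field's values; the LLM would
# generate entries like these automatically instead of hand-writing them.
synonyms = {
    "NYC": "New York",
    "The Big Apple": "New York",
    "LA": "Los Angeles",
}

def resolve_value(term: str) -> str:
    """Map a user-typed term to the standard field value, if a synonym exists."""
    return synonyms.get(term, term)

print(resolve_value("NYC"))      # → New York
print(resolve_value("Chicago"))  # → Chicago (no synonym; unchanged)
```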
During preloading configuration, the subject introduction can be intelligently paraphrased using the LLM, as shown in the following figure.