Data Connection FAQs

  • Last update: March 02, 2026
  • SQL Server

    Data Connection Failure 

    Error message: The server selected protocol version TLS10 is not accepted by client preferences [TLS12]

    Issue Description

    The driver is unable to establish a secure connection to the SQL Server database using Secure Sockets Layer (SSL) encryption.

    Solution

    • Check for a driver version mismatch: Look up the driver versions supported for your database version on the official database website, and confirm that the driver you are using is among them.

    • Check the JDK version: Try changing the JDK version or deploying on a different server.

    • Verify that a Java environment exists, and adjust the files in the jre directory if needed.

    Preview Error When Field Names Contain Parentheses 

    Issue Description

    A preview error occurs when a field name contains parentheses. The error message reads "Data connection is abnormal - DataBase[sqlserver2016] get data failed! 'Field name' is not a recognized built-in function name."

    Solution

    Wrap the field name in square brackets ([]), for example, [Field name].
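    For illustration, SQLite accepts the same square-bracket identifier quoting as SQL Server, so the fix can be sketched without a SQL Server instance (the table and column names below are made up):

```python
import sqlite3

# SQLite shares SQL Server's [bracket] identifier quoting, so it can
# demonstrate the fix. "Field name(1)" is a hypothetical column name.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE demo ([Field name(1)] TEXT)")
conn.execute("INSERT INTO demo VALUES ('ok')")
row = conn.execute("SELECT [Field name(1)] FROM demo").fetchone()
print(row[0])  # → ok
```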

    Data Connection Configuration Failure

    Issue Description

    Configuring the data connection fails with the error message indicating that the driver is unable to establish a secure connection to the SQL Server database using Secure Sockets Layer (SSL) encryption.

    Cause Analysis

    This is caused by new cipher suites introduced in newer JDK versions.

    Solution

    Open the java.security file under FineDataLink installation directory\jre\lib\security\, remove or comment out the 3DES_EDE_CBC, TLS1, TLS1.1, and TLS1.2 entries, save the file, and restart FineDataLink. The connection can then be configured normally.
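    For reference, the entries mentioned above live in the jdk.tls.disabledAlgorithms property of java.security. A sketch of the edit follows; the exact algorithm list varies by JDK build, so treat it as illustrative rather than verbatim:

```properties
# Before (abridged; the actual list depends on the JDK version):
#   jdk.tls.disabledAlgorithms=SSLv3, TLSv1, TLSv1.1, RC4, DES, 3DES_EDE_CBC, ...
# After removing the entries that block the old protocol:
jdk.tls.disabledAlgorithms=SSLv3, RC4, DES, ...
```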

    MySQL

    RDS MySQL Database Connection Error

    Issue Description

    An error occurs when you configure a data connection to an RDS MySQL database.

    Error message: com.fr.third.alibaba.druid.pool.GetConnectionTimeoutException: wait millis 10003, active 0, maxActive 50, creating 1, createElapseMillis 20014 at com.fr.third.alibaba.druid.pool.

    Cause Analysis

    The IP address is not included in the allowlist of Alibaba Cloud.

    Solution

    Add the IP address of the FineDataLink server to the allowlist of Alibaba Cloud.

    MySQL Data Connections Established Yet Unusable

    Issue Description

    A data connection has actually been created with the driver org.gjt.mm.mysql.Driver.

    However, the corresponding MySQL data connection is not displayed in the option list.

    Cause Analysis

    This MySQL driver is not yet supported.

    Solution

    Use one of the currently supported MySQL drivers instead: com.mysql.jdbc.Driver or com.mysql.cj.jdbc.Driver.

    Data Cleanup Failure of FineDataLink

    Issue Description

    Data is output to a MySQL database with Write Method set to Write Data into Target Table After Emptying It. However, the target table is not cleared during each task execution.

    Cause Analysis

    The database user used to configure the MySQL data connection does not have the DROP privilege.

    Solution

    Grant the user the DROP privilege.

    Data Connection Error: Handshake Failure

    Issue Description

    After the TLS protocol of the MySQL database is upgraded, a data connection error occurs, saying "Caused by: javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure." The SSL handshake fails during connection.

    Solution

    Append ?useSSL=false&allowPublicKeyRetrieval=true to the MySQL data connection URL, for example, jdbc:mysql://hostname:3306/dbname?useSSL=false&allowPublicKeyRetrieval=true (hostname, port, and database name are placeholders).

    Data Connection Error: Protocol Version Issue

    Issue Description

    An error occurs during connection, saying "Caused by: javax.net.ssl.SSLException: Received fatal alert: protocol_version."

    Solution

    Append ?useSSL=false&allowPublicKeyRetrieval=true to the data connection URL.

    Oracle

    Data Connection Error: ORA-12505

    Issue Description

    An error occurs during connection, saying "ORA-12505: TNS:listener does not currently know of SID given in connect descriptor."

    Cause Analysis

    The provided SID is incorrect, most likely because the database name and SID are being confused.

    Solution 

    You can write the database URL in two ways: 

    • Use a colon to specify the SID, meaning the SID is orcl.

    Example: database.url=jdbc:oracle:thin:@171.xxx.96.xx:xxxx:orcl 

    • Use a slash to specify the service name, meaning the service name is orcl.

    Example: database.url=jdbc:oracle:thin:@171.xxx.96.xx:xxxx/orcl 

    The error occurs when the service name is mistakenly used as the SID. Replace the colon (:) before orcl with a slash (/) to resolve the issue.

    Data Preview Error: ORA-00942

    Issue Description

    No data is displayed in Data Preview, and the ORA-00942 error is recorded in the log, indicating that the table or view does not exist.

    Cause Analysis

    The SQL query does not account for case sensitivity in schema and table names, which are treated as case-sensitive in Oracle databases.

    Solution

    Modify the case of the schema and table names to match the actual names in the database.

    Greenplum

    Data Connection Error: gpfdist Environment Check Failure

    Solution

    When uploading gpfdist files, first upload the compressed package to the Linux server and then decompress it. For details, see Greenplum Data Connection.

    Execution Failure of SQL Statements in FineDataLink Only

    Issue Description

    The SQL query succeeds in the Greenplum database but fails in FineDataLink.

    Solution

    Include the schema name in the SQL statement.

    Data Synchronization Hang

    Issue Description

    Both the source and target databases are Greenplum. Data synchronization hangs after 300 million records are extracted. Both read and write row counts drop to 0.

    Cause Analysis

    The read operation stalls due to network fluctuations.

    Solution

    Add the socketTimeout parameter to the data connection URL of the source end Greenplum database, for example, jdbc:postgresql://hostname:port/database?socketTimeout=1800.

    HTTP Response Code 404 from gpfdist

    Issue Description

    The target end is a Greenplum database. The task fails with the error "ERROR: http response code 404 from gpfdist (gpfdist://192.168.60.197:15500/fdda13be3f2f01e7_BLANK_CHANCHU.csv): HTTP/1.0 404 file not found (seg33 slice1 192.168.60.177:40001 pid=27049)."

    Cause Analysis

    The old gpfdist process is not terminated during the FineDataLink restart, leaving it occupying the port and preventing new tasks from being executed.

    Solution

    Run kill -9 to kill the old gpfdist process and rerun the scheduled task to generate a new gpfdist service process.
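    A minimal sketch of that cleanup on the FineDataLink server (the kill line is commented out; substitute the PID that pgrep prints):

```shell
# List any stale gpfdist processes; the [g] keeps the search
# from matching this command itself.
pgrep -af '[g]pfdist' || echo "no stale gpfdist process"
# If a PID is printed, terminate it and rerun the scheduled task:
# kill -9 <PID>
```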

    Error When Connecting to gpfdist

    Issue Description

    The target end is a Greenplum database. The task fails with the error "error when connecting to gpfdist http://10.145.1.74:15500/491b2d5143c97c36_ods_oa_workflow_currentoperator.csv, quit after 11 tries (seg10 slice1 10.145.1.79:40002 pid=46296)."

    Cause Analysis

    A Greenplum node is abnormal and cannot access port 15500 on the FineDataLink server.

    Solution

    Check the Greenplum nodes and verify whether the database server can establish a telnet connection to the FineDataLink IP address and port.

    Presto

    Data Connection Error: SQL Execution Failure

    Issue Description

    Error message: java.sql.SQLException: Connection is in auto-commit mode

    Cause Analysis

    This is caused by driver version issues.

    Solution

    For Presto 0.273.2, use a driver of version 0.169 or later.

    Data Connection Failure During Data Synchronization

    Issue Description

    The Presto data connection is successful under System Management > Data Connection > Data Connection Management, but fails in the Data Synchronization node.

    Cause Analysis

    This is caused by a product logic issue.

    Under System Management > Data Connection > Data Connection Management, if the system cannot obtain the database schema, it will leave the schema field empty and consider the connection successful. However, in the Data Synchronization node, failure to obtain the schema results in an error.

    PostgreSQL

    Failure to Delete PostgreSQL Database Tables Using DELETE/DROP/TRUNCATE in Navicat

    Cause Analysis

    The table is locked by another transaction, preventing deletion operations.

    Solution

    Run select pid, query from pg_stat_activity to list all current processes and identify the PID associated with the table that cannot be deleted.

    Run the following command to terminate the corresponding PID: 

    select pg_terminate_backend(pid), query from pg_stat_activity where query ~* 'order_table' and pid <> pg_backend_pid()

    The table can then be deleted successfully.

    SAP

    Connection Error: Handler Dispatch Failure

    Issue Description

    Error message: Handler dispatch failed: nested exception is java.lang.NoClassDefFoundError: com/sap/conn/jco/JCoException

    Solution

    Restart the FineDataLink server after placing the SAP JCo driver.

    Doris

    Data Transfer Error: Connection Reset

    Issue Description

    In the Data Synchronization node, an error occurs during data output, saying "Exception when job run com.fr.dp.exception.FineDPException: An error was encountered during data transfer - Connection reset - Connection reset."

    Cause Analysis

    The FE node address in the Doris data connection configuration is entered incorrectly.

    Solution

    Enter the IP address and the HTTP port of the FE node.

    Data Write Error

    Issue Description

    A scheduled task with Doris as the target end fails with the error "Within the Doris transaction, only INSERT INTO SELECT, UPDATE, and DELETE statements are supported. -errCode = 2, detailMessage = This is in a transaction, only insert, update, delete, commit, rollback is acceptable."

    Solution

    Before enabling Transaction Control, add useLocalSessionState=true to the Doris data connection URL.

    For example, jdbc:mysql://192.168.5.199:9099/mysql?useLocalSessionState=true.

    SSH

    Connection Timeout

    Issue Description

    Testing the SSH data connection fails with the error "SSH connection failed - timeout."

    Cause Analysis

    The FineDataLink server cannot establish a telnet connection to port 22 of the target server. Check the firewall rules, IP address blocklists/allowlists, and other network restrictions.

    Connection Failure: Authentication Failure

    Issue Description

    The command ssh root@IP address succeeds when you configure the SSH data connection to the local host. However, an authentication failure occurs when you use localhost in the connection URL in FineDataLink, and a timeout occurs when you use the IP address.

    Solution

    Note: First, verify connectivity by running ssh root@IP address from the FineDataLink server. If that fails, check the password. If the password is correct, proceed with the following steps.

    This issue may be caused by the following reasons.

    1. The user account (for example, root on CentOS 6) is not allowed to log in remotely.

    2. The GSSAPIAuthentication value in the server configuration file (/etc/ssh/sshd_config) is set to yes.

    Troubleshooting steps:

    1. Open the sshd_config file (/etc/ssh/sshd_config) and uncomment the line PermitRootLogin yes to allow remote logins for root.

    2. Set the GSSAPIAuthentication value in the sshd_config file to no, or add session.setConfig("userauth.gssapi-with-mic", "no") and session.setConfig("StrictHostKeyChecking", "no") in the code.

    3. To speed up SSH logins, change the value of UseDNS from yes to no in the sshd_config file.

    4. Restart the SSH service. The connection speed will be significantly improved.
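    After the steps above, the relevant sshd_config lines would look roughly like this (a sketch; other settings in the file are untouched):

```
# /etc/ssh/sshd_config — lines changed by the steps above
PermitRootLogin yes
GSSAPIAuthentication no
UseDNS no
```

    Restart the SSH service afterwards, for example with systemctl restart sshd.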

    SSH Connection Verification Failure

    Issue Description

    Configuring an SSH data connection fails with the error "SSH Connection Failed - verify: false."

    Solution

    The built-in JSCH version in the platform fully supports only OpenSSH 6.x. For later versions of OpenSSH, additional encryption methods need to be configured.

    Apache Impala

    Impala Connection Failure

    Issue Description

    A project that is deployed in a containerized manner encounters an Impala connection failure with the error "Unable to obtain Principal Name for authentication."

    Solution

    Open the krb5.conf configuration file, delete the renew_lifetime parameter, and then reconnect FineDataLink to the database.

    Hadoop Hive

    Data Connection Error: Permission Denied

    Issue Description

    Error message: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/":hdfs:supergroup:drwxr-xr-x

    Solution One

    Open the catalina.sh file on the FineDataLink server and add the variable declaration export HADOOP_USER_NAME=Username. This specifies that the user connecting to HDFS is Username.
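    The added line would look like this (hive is a placeholder; use the HDFS account that has write access to the target path):

```shell
# Added in catalina.sh on the FineDataLink server.
# "hive" is a placeholder; substitute the HDFS user FineDataLink should act as.
export HADOOP_USER_NAME=hive
echo "$HADOOP_USER_NAME"  # prints: hive
```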

    Solution Two (Not Recommended)

    If no user is specified, the root account is used for connection to HDFS by default. This solution requires disabling HDFS user authentication, which may introduce security risks, as it allows all users to access HDFS and execute commands.

    Note: This configuration is not present in CDH by default and must be added manually.

    The procedures are as follows:

    1. Locate the advanced configuration snippet (safety valve) of the HDFS service in hdfs-site.xml.

    2. Set the value of dfs.permissions.enabled to false, save the change, and restart HDFS.
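    The safety-valve entry added to hdfs-site.xml would look roughly like this:

```xml
<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>
```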

    HDFS Write File Error

    Issue Description

    Hive (HDFS) data connection fails with the error "hdfs write file error.Caused by: java.lang.UnsupportedOperationException" in the log.

    Cause Analysis

    The value format of HDFS Setting in the data connection configuration should be hdfs://IP address:Port number, but you may have incorrectly entered http://IP address:Port number.

    More Rows in the Hive Database Than in the Source Database After Data Synchronization

    Issue Description

    After the execution of a scheduled task that synchronizes data from a Kingbase database V8R6 to a Hadoop Hive database 2.1, the row count in the Hadoop Hive database (90,496) is higher than that in the source database (87,719). Additionally, the Kingbase table has a non-null constraint on the id column, but null values appear in the corresponding column in the Hadoop Hive table.

    Cause Analysis

    The target table in the Hadoop Hive database is stored in the TEXT format. The data from the Kingbase database may contain delimiter characters, causing it to be split into new rows when written to HDFS.

    Solution

    Manually create a table using the following SQL statement that specifies the storage format as ORC:

    CREATE TABLE your_table (
      id INT,
      name STRING
    )
    ROW FORMAT SERDE 'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
    STORED AS INPUTFORMAT
      'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
    OUTPUTFORMAT
      'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat';

    Hadoop Hive as Target: Error in Data Writing After Table Cleanup

    Issue Description

    Hadoop Hive is used as the target end with Write Method set to Write Data into Target Table After Emptying It. An error occurs, saying "FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.Exception while processing."

    Cause Analysis

    The configured HDFS user does not have sufficient privileges.

    Solution

    Use an HDFS user with the appropriate privilege.

    Data Connection Error: No Common Protection Layer Between the Client and the Server

    Issue Description

    The Kerberos authentication is enabled for a Hadoop Hive data connection. The test connection fails with the error "No common protection layer between client and server."

    Cause Analysis

    The SASL security layer is incorrectly configured or mismatched. Both the client and the server must support the same SASL mechanism, and the sasl.qop parameter in the configuration file must be set correctly.

    Solution

    Add sasl.qop=auth to the data connection URL.

    Example URL: 

    jdbc:hive2://192.168.6.140:19023/default;principal=hive/primary.cxfund.com@CGJH.COM;auth=KERBEROS;sasl.qop=auth

    Hadoop Hive as Target: Error in Retrieving Existing Tables

    Issue Description

    Hadoop Hive is used as the target end, and an error occurs when retrieving existing tables.

    Solution

    Check the Hive configuration hive.support.quoted.identifiers. If it is set to none, backticks are not supported, causing the error. Modify this parameter value accordingly.
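    For illustration, the parameter can be switched per session in Hive (column is the value that enables backtick-quoted identifiers):

```sql
SET hive.support.quoted.identifiers=column;
```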

    Local Server Directory

    Data Connection Creation Failure

    Issue Description

    The Local Server Directory option is not displayed when you create a data connection.

    Solution

    The data connection of the Local Server Directory type can only be created by the super administrator. For details, see Data Connection to a Local Server Directory.

    ClickHouse

    Response Without Column Names

    Issue Description

    Error message: ClickHouse response without column names

    Solution

    Append urlcompress=false to the connection URL.

    Primary Key Retrieval Failure in Data Synchronization

    Issue Description

    A ClickHouse table has a primary key, but it cannot be retrieved in the Data Synchronization node.

    Cause Analysis

    Primary keys in ClickHouse databases differ from conventional primary keys.

    In ClickHouse databases, a primary key functions more like a sorting key, providing a primary index but not enforcing uniqueness.

    In FineDataLink, the primary key is used to synchronize UPDATE/DELETE operations. Since ClickHouse primary keys are not unique constraints, selecting a primary key may be ineffective.

    Automatic Table Creation Failure

    Issue Description

    The automatic creation of target tables in ClickHouse databases fails.

    Cause Analysis

    The ClickHouse database was created with the MySQL database engine, which does not support table creation.

    Solution

    Switch it to the Atomic engine, which supports automatic table creation during write operations.

    Slow ClickHouse Primary Key Updates

    Issue Description

    You use the Data Synchronization node to write data to a ClickHouse database with Strategy for Primary Key Conflict set to Update Data. Only tens of rows are written per second.

    Cause Analysis

    ClickHouse is an MPP database, which is fast in inserts but slow in updates.

    Solution

    Use a Data Comparison node in FineDataLink to check the amount of data to be inserted before executing synchronization.

    Field Names Prefixed with Table Names After a Join

    Issue Description

    You use an SQL statement to join tables for data synchronization. Field names in Data Preview appear prefixed with the table name (in the Table name.Field name format).

    Cause Analysis

    Unlike MySQL databases, if you perform joins on ClickHouse table fields and the second table has a field present in the first table, the field name in the result set will be prefixed with the table name, even if the SELECT clause refers to it unambiguously. For example, SELECT t2.bit1 FROM "fdl"."all_type_03" t1 JOIN "fdl"."all_type_03" t2 ON t1.bit1 = t2.bit1 results in a column named t2.bit1.

    Solution

    Use AS to rename the field and remove the table name prefix.
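    Applied to the example query above, the rewrite would look like:

```sql
SELECT t2.bit1 AS bit1
FROM "fdl"."all_type_03" t1
JOIN "fdl"."all_type_03" t2 ON t1.bit1 = t2.bit1
```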

    TDengine

    Connection Error: No taos in java.library.path

    Issue Description

    During the connection to TDengine, the error "no taos in java.library.path" occurs.

    Solution

    The native library taos is not found. Add the corresponding dynamic link library to the system where FineDataLink is deployed. 
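    A sketch of the fix on a Linux deployment (paths are examples; libtaos.so ships with the TDengine client package):

```
# Make the TDengine native library visible to the JVM (example paths):
cp /usr/local/taos/driver/libtaos.so /usr/lib/
# Alternatively, start the JVM with the library path set explicitly:
# -Djava.library.path=/usr/local/taos/driver
```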

    Fine+ Project

    Failure to Get a Token from FineBI

    Issue Description

    During the connection to FineBI’s Public Data, the error "Failed to get token from FineBI" occurs.

    Cause Analysis

    Single sign-on (SSO) causes the login API requests to be redirected with the username parameter removed.

    Solution

    Enter the following URL in a browser: http://IP address:Port number/webroot/decision/login/cross/domain?fine_username=Username&fine_password=Password&validity=-1&callback=. If FineBI can be accessed successfully, the relevant token value will be returned.

    If the access fails, disable the single sign-on plugin in the FineBI project and then perform reconnection.

    StarRocks

    Write Error: Too Many Filtered Rows

    Issue Description

    A scheduled task writing to a StarRocks database fails with the error "too many filtered rows."

    Cause Analysis

    The source field corresponding to a non-null primary key field in the target table is not a primary key and allows null values.

    Solution

    Delete the target table in the database and set the target table to Auto Created Table in Data Destination and Mapping.

    SelectDB

    Data Write Error

    Issue Description

    A scheduled task with SelectDB as the target end fails with the error "Within the SelectDB transaction, only INSERT INTO SELECT, UPDATE, and DELETE statements are supported.-errCode = 2, detailMessage = This is in a transaction, only insert, update, delete, commit, rollback is acceptable."

    Solution

    Before enabling Transaction Control, add useLocalSessionState=true to the SelectDB data connection URL.

    For example, jdbc:mysql://192.168.5.199:9099/mysql?useLocalSessionState=true.

    Transwarp ArgoDB

    Data Connection Success with HDFS Authentication Failure

    Issue Description

    The Transwarp ArgoDB data connection is successful, but HDFS authentication fails.

    Cause Analysis

    The time difference between the KDC server and the FineDataLink server exceeds five minutes.

    Application Data Source

    Issue Description

    During project startup, if the request to the application data source server takes more than five seconds, FineDataLink will actively interrupt the connection and fail to retrieve the application data source. An INFO-level log message will be printed:

    INFO [standard] Fetched FineApp data sources from cloud cost 5019 ms.

    Solution

    Manually refresh the connection through /webroot/decision/v10/config/connection/fineapp/refresh.

    Others

    Abnormal Display on the Data Connection Page

    Issue Description

    Content on the data connection page is displayed abnormally, with issues such as misaligned text.

    Solution

    The browser version is outdated. Upgrade to a recent version of Google Chrome or Microsoft Edge.

    Test Connection Error: SSH Connection Timeout

    Issue Description

    The data connection test fails with the error "SSH data connection error-timeout."

    Solution

    This error occurs due to network connectivity issues. The FineDataLink server cannot establish a telnet connection to the port on the database server. Check the firewall rules, IP address blocklists/allowlists, and other network restrictions.

    Data Connection Creation Failure During Scheduled Task Execution

    Issue Description

    The data connection test succeeds, but a data connection creation failure occurs during scheduled task execution.

    Cause Analysis

    The data connection times out.

    Solution

    Increase the maximum wait time in the data connection configuration. For details, see Connection Pool Setting.

    Data Connection Error: Table Not Exist

    Cause Analysis

    The database user used to configure the data connection may lack table creation privileges.

    Solution

    Check whether the user is assigned the necessary privileges on the relevant tables.

    Data Preview Error: Data Fetching Failure

    Issue Description

    Error message: -XXXX get data failed

    Cause Analysis

    No database is specified for the SQL statement.

    Solution

    Select the appropriate database before executing the query.

    No Tables Displayed After Data Connection Selection

    Issue Description

    On the Data Destination and Mapping tab page in a Data Synchronization node, after you select a data connection and set the target table to Existing Table, no existing tables are displayed.

    Cause Analysis

    The data connection name contains brackets ([]).

    Solution

    Remove the brackets from the data connection name.

    Communication Link Failure

    Issue Description

    A TiDB data connection error occurs, saying "Communications link failure. The last packet successfully received from the server was 1,165,470 milliseconds ago. The last packet sent successfully to the server was 1,165,470 milliseconds ago."

    Cause Analysis

    Maximum Wait Time in the data connection configuration is set to 0 milliseconds, under the mistaken assumption that 0 means never timeout.

    Solution

    Set Maximum Wait Time to an appropriate positive value.

    JDBC Column Type Mismatch: No Reader Function Mapping for java.sql.Types Code: 0

    Issue Description

    The connection test fails with the error "JDBC column type does not match - No reader function mapping for java.sql.Types code: 0."

    Solution

    The JDBC driver does not match the database version. Download the correct driver from the help document or the official database website and upload it to FineDataLink.

    Topic: Data Source Configuration