Overview
Version
Version | Functional Change |
---|---|
4.0.20.1 | Supported Kafka releases later than 0.10.2. |
4.1.11.4 | Real-Time Task supported the Kafka data source. |
4.1.13.2 | Supported Kerberos authentication. |
Application Scenario
FineDataLink can connect to Kafka and fetch real-time data from it by configuring a Kafka consumer.

Preparation
Prerequisite
For details, see Overview of Data Connection.
Connection Information Collection
Collect the following information before creating the connection.
1. IP address and port number of the Kafka service
2. Username and password (if username/password authentication is used). To use Kerberos authentication in FineDataLink 4.1.13.2 and later versions, obtain the krb5.conf configuration file, the keytab file (Keytab file name.keytab), and the principal from the environment where the Kafka service is running.
Keytab file name.keytab is the key table file. Locate it on the application server that provides the Kerberos service.
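A Kerberos principal usually follows the primary/instance@REALM pattern described later in this article. As a quick sanity check before uploading the keytab, a small script (illustrative only, not part of FineDataLink; the sample principal is hypothetical) can split a principal into its parts:

```python
# Illustrative helper: split a Kerberos principal of the form
# primary/instance@REALM (e.g. username/department@COMPANY.COM)
# into its components. The sample principal below is hypothetical.
def parse_principal(principal):
    # Everything after "@" is the realm.
    primary_instance, _, realm = principal.partition("@")
    # The part before "@" may itself be split by "/" into primary/instance.
    primary, _, instance = primary_instance.partition("/")
    return {"primary": primary, "instance": instance, "realm": realm}

parts = parse_principal("hive/bigdata@COMPANY.COM")
# parts["primary"] is "hive", parts["instance"] is "bigdata",
# parts["realm"] is "COMPANY.COM"
```

If the realm or instance comes out empty or unexpected, the principal string is likely malformed and the keytab should be re-exported.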
Configuring the hosts File

Configure the hosts file on the local server (for example, the hosts file in the C:\Windows\System32\drivers\etc path on Windows). Add the mapping for the remote machine in the format IP address Machine name, as shown in the following figure.
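For example, assuming a Kafka broker at 192.168.1.100 whose machine name is kafka-node1 (both values are placeholders), the line added to the hosts file would look like:

```
192.168.1.100    kafka-node1
```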

Procedure
1. Log in to FineDataLink as the admin, choose System Management > Data Connection > Data Connection Management, select a folder, and create a data connection, as shown in the following figure.
2. Set the data connection name. You can also modify the directory of the data connection.
3. Find the data source by searching the data source name or filtering the data source by Data Source Type, Supported Form, and Compatible Module, as shown in the following figure.
4. Fill in the information of the data source, as shown in the following figure.
Setting items are described in the following table.
Setting Item | Description |
---|---|
Kafka Service Address | Address of the Kafka service, in the format IP address (or hostname) plus port number, for example, demo.fanruan.com:9093. Separate multiple addresses with commas (,). |
Authentication Method | Supported authentication methods include No Authentication, Username Password, and Kerberos. Username Password: Enter the username and password of the Kafka service. Kerberos: 1. Kerberos Service Name: Enter the name of the Kerberos service. 2. Keytab File: Upload the .keytab key table file. 3. Client Principal: The principal is the name of the client registered with the KDC, and it is filled in automatically after the keytab file is parsed. The principal format is usually Username/Department name@Company name. To check whether the principal is correct, execute klist or kinit -k -t Keytab file path Principal name in the shell of the database server, or connect to the authenticated service with tools such as Beeline and Impala Shell to view the principal information. For example, the principal of the Hive service is hive/bigdata@Company name.COM, and that of the Impala service is impala/bigdata@Company name.COM. 4. krb5.conf File: Upload the krb5.conf file. Only CONF files can be uploaded. |
Encoding | Character encoding used for reading data when the type of the KEY-type key or the MESSAGE-type value is set to STRING. |
Minimum Number of Bytes in One Fetch (Byte) | Manual input is supported. |
Single Fetch Duration (Millisecond) | Manual input is supported. |
Parameter Expansion | You can add and delete parameters and parameter values. |
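The fetch settings above presumably correspond to the standard Kafka consumer properties fetch.min.bytes and fetch.max.wait.ms, and the service address to bootstrap.servers. As a rough sketch (not FineDataLink's actual internals; the SASL property names follow librdkafka/confluent-kafka conventions, and all hosts and credentials are placeholders), the form fields could be assembled into a consumer configuration like this:

```python
# Sketch of how the connection form's fields might map to common Kafka
# consumer properties. Property names for SASL follow librdkafka-style
# conventions; the Java client configures SASL differently.
def build_consumer_config(addresses, username=None, password=None,
                          fetch_min_bytes=1, fetch_max_wait_ms=500):
    config = {
        # Kafka Service Address: multiple addresses joined by commas
        "bootstrap.servers": ",".join(addresses),
        # Minimum Number of Bytes in One Fetch (Byte)
        "fetch.min.bytes": fetch_min_bytes,
        # Single Fetch Duration (Millisecond)
        "fetch.max.wait.ms": fetch_max_wait_ms,
    }
    if username is not None:
        # Username Password authentication (SASL/PLAIN is one common setup)
        config.update({
            "security.protocol": "SASL_PLAINTEXT",
            "sasl.mechanism": "PLAIN",
            "sasl.username": username,
            "sasl.password": password,
        })
    return config

cfg = build_consumer_config(["demo.fanruan.com:9093", "10.0.0.2:9093"],
                            username="kafka_user", password="secret")
```

A larger fetch.min.bytes trades latency for throughput: the broker waits until that many bytes are available (or fetch.max.wait.ms elapses) before responding to a fetch.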
5. Click Test Connection. If the connection is successful, click Save, as shown in the following figure.
Data Source Usage
You can use the Kafka data source in Data Pipeline and Real-Time Task. For details, see Pipeline Task - Kafka.