FineDataLink can connect to Kafka and fetch real-time data by configuring a Kafka consumer.
For details, see Overview of Data Connection.
Collect the following information before connection.
1. IP address and port number of the Kafka service
2. Username and password (if username/password authentication is used). To use Kerberos authentication in FineDataLink 4.1.13.2 and later versions, obtain the krb5.conf configuration file, the keytab file (Keytab file name.keytab), and the principal from the environment where the Kafka service is running.
Keytab file name.keytab is the key table file. Locate this file on the application server that provides the Kerberos service.
Configure the hosts file on the local server (for example, C:\Windows\System32\drivers\etc\hosts on Windows). Add the mapping for the remote machine in the format IP address Machine name, as shown in the following figure.
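For example, assuming the Kafka broker runs on a machine named kafka01 at 192.168.5.206 (both values are hypothetical placeholders for your own environment), the hosts entry would look like:

```
192.168.5.206  kafka01
```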
1. Log in to FineDataLink as the admin, choose System Management > Data Connection > Data Connection Management, select a folder, and create a data connection, as shown in the following figure.
2. Set the data connection name. You can also modify the directory of the data connection.
3. Find the data source by searching the data source name or filtering the data source by Data Source Type, Supported Form, and Compatible Module, as shown in the following figure.
4. Fill in the information of the data source, as shown in the following figure.
Setting items are described in the following table.
Configure the Kafka address.
Example format: demo.fanruan.com:9093
Fill in the IP address or hostname and the port number; separate multiple addresses with commas (,).
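As a sanity check before saving the connection, an address list in the format above can be validated with a short script. This is an illustrative sketch, not part of FineDataLink; the address values are hypothetical.

```python
import re

def parse_kafka_addresses(addresses: str) -> list[tuple[str, int]]:
    """Split a comma-separated Kafka address list into (host, port) pairs.

    Each entry must look like "host:port", e.g. "demo.fanruan.com:9093".
    """
    pairs = []
    for entry in addresses.split(","):
        entry = entry.strip()
        match = re.fullmatch(r"(?P<host>[\w.\-]+):(?P<port>\d+)", entry)
        if not match:
            raise ValueError(f"Invalid Kafka address: {entry!r}")
        pairs.append((match["host"], int(match["port"])))
    return pairs

# Example with two (hypothetical) brokers:
print(parse_kafka_addresses("demo.fanruan.com:9093,192.168.5.206:9092"))
```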
Supported authentication methods include No Authentication, Username Password, and Kerberos.
Username Password: Enter the username and password of the Kafka service.
Kerberos:
1. Kerberos Service Name
It refers to the service name used by the client to access the service. Kerberos authentication uses the service name to match the service credential, ensuring that the client connects to the correct service.
The default value is kafka.
2. Keytab File
Only KEYTAB files can be uploaded.
3. Client Principal
The principal is the name of the client that has registered with the KDC; it is filled in automatically after the keytab file is parsed.
The format of the principal is usually Username/Department name@Company name. To check whether the principal is correct, you can run klist, or kinit -k -t Keytab file path Principal name, in the shell of the database server.
You can also connect to the authenticated service by using tools such as Beeline and Impala Shell to view the principal information.
For example, the principal name corresponding to the Hive service is hive/bigdata@Company name.COM, and the one corresponding to the Impala service is impala/bigdata@Company name.COM.
4. krb5.conf File
Only CONF files can be uploaded. Upload the krb5.conf file.
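The Kerberos fields above map onto a small set of client-side settings. The following is a minimal sketch using parameter names from the kafka-python library; the broker address, keytab path, and principal are hypothetical placeholders, and the config assumes a ticket has already been obtained.

```python
# Client-side settings corresponding to the Kerberos fields above.
# Parameter names follow the kafka-python library; all values are
# hypothetical placeholders, not real credentials.
kerberos_config = {
    "bootstrap_servers": "demo.fanruan.com:9093",
    "security_protocol": "SASL_PLAINTEXT",  # or SASL_SSL if TLS is in use
    "sasl_mechanism": "GSSAPI",             # Kerberos authentication
    "sasl_kerberos_service_name": "kafka",  # matches Kerberos Service Name
}

# A Kerberos ticket must already exist, e.g. obtained beforehand with:
#   kinit -k -t /path/to/user.keytab user/department@COMPANY.COM
# The dict could then be passed to a consumer, for example:
#   KafkaConsumer("some-topic", **kerberos_config)
print(kerberos_config["sasl_mechanism"])
```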
This encoding parameter determines the character encoding used for reading data when the KEY-type key or the MESSAGE-type value is set to STRING.
Manual input is supported; only positive integers are allowed, and spaces are not allowed.
For custom parameters, you can add and delete parameters and parameter values. The parameter name and the parameter value each cannot exceed 50 characters and cannot be empty. A new parameter name can contain English letters and periods (.) only; a parameter value can contain English letters and numbers only.
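The custom-parameter rules above can be expressed as a small validation helper. This is an illustrative sketch based only on the rules stated in this section; the function name and the example parameter are not part of FineDataLink.

```python
import re

def validate_custom_parameter(name: str, value: str) -> None:
    """Check a custom parameter against the rules described above.

    Names may contain English letters and periods, values English letters
    and digits; both must be non-empty and at most 50 characters long.
    Raises ValueError on the first violated rule.
    """
    if not name or not value:
        raise ValueError("Parameter name and value cannot be empty.")
    if len(name) > 50 or len(value) > 50:
        raise ValueError("Name and value cannot exceed 50 characters.")
    if not re.fullmatch(r"[A-Za-z.]+", name):
        raise ValueError("Name may contain English letters and periods only.")
    if not re.fullmatch(r"[A-Za-z0-9]+", value):
        raise ValueError("Value may contain English letters and numbers only.")

# A standard Kafka consumer setting passes the checks:
validate_custom_parameter("max.poll.records", "500")
```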
5. Click Test Connection. If the connection is successful, click Save, as shown in the following figure.
You can use the Kafka data source in Data Pipeline and Real-Time Task. For details, see Pipeline Task - Kafka.