Create a Smart Connect task to dump Kafka instance data to OBS for message data backup.
Data in the source Kafka instance is synchronized to the dumping files in real time.
Select the region where your Kafka instance is located.
| Parameter | Description |
|---|---|
| Regular expression | A regular expression used to subscribe to the topics whose messages you want to dump. |
| Enter/Select | Enter or select the names of the topics to be dumped. Separate multiple topic names with commas (,). A maximum of 20 topics can be entered or selected. |
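To illustrate how a regular-expression subscription selects topics, the sketch below filters a list of topic names with a pattern. The topic names and the pattern (`payment-.*`) are hypothetical, and full-name matching is assumed:

```python
import re

def matching_topics(pattern: str, topics: list[str]) -> list[str]:
    """Return the topics a subscription regex would select (full-name match assumed)."""
    compiled = re.compile(pattern)
    return [t for t in topics if compiled.fullmatch(t)]

# Hypothetical topic names for illustration.
topics = ["payment-orders", "payment-refunds", "inventory-events"]
print(matching_topics(r"payment-.*", topics))  # → ['payment-orders', 'payment-refunds']
```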
| Parameter | Description |
|---|---|
| Offset | Options: |
| Dumping Period (s) | Interval, in seconds, for periodically dumping data. The default is 300 seconds. No package files are generated if there is no new data within an interval. |
| AK | Access key ID. For details about how to obtain the AK, see Access Keys. |
| SK | Secret access key used together with the access key ID. For details about how to obtain the SK, see Access Keys. |
| Dumping Address | The OBS bucket used to store the topic data. |
| Dumping Directory | Directory for storing topic files dumped to OBS. Use slashes (/) to separate directory levels. |
| Time Directory Format | Data is saved to a hierarchical time directory under the dumping directory. For example, if the time directory is accurate to the day, the directory is in the format *bucket name*/*file directory*/*year*/*month*/*day*. |
| Record Separator | Separator used to separate OBS dumping records. |
| Use Storage Key | Specifies whether to dump message keys. Do not use the key of a message as the dumping file name. |
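The Time Directory Format described above nests year/month/day directories under the dumping directory. As a rough sketch, the helper below builds the resulting OBS path for a time directory accurate to the day; the bucket and directory names are hypothetical, and zero-padded month/day components are an assumption:

```python
from datetime import datetime, timezone

def dump_object_prefix(bucket: str, directory: str, ts: datetime) -> str:
    """Build an OBS path of the form bucket name/file directory/year/month/day
    (layout assumed from the Time Directory Format description)."""
    return f"{bucket}/{directory}/{ts.year:04d}/{ts.month:02d}/{ts.day:02d}"

# Hypothetical bucket and dumping directory for illustration.
ts = datetime(2024, 5, 3, tzinfo=timezone.utc)
print(dump_object_prefix("my-obs-bucket", "kafka-dump", ts))
# → my-obs-bucket/kafka-dump/2024/05/03
```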