<a name="dli_08_0410"></a>
<h1 class="topictitle1">Confluent Avro</h1>
<div id="body8662426"><div class="section" id="dli_08_0410__en-us_topic_0000001310095793_section152911439362"><h4 class="sectiontitle">Function</h4><p id="dli_08_0410__en-us_topic_0000001310095793_p7340555103613">The Avro Schema Registry (<strong id="dli_08_0410__en-us_topic_0000001310095793_b1946116350417">avro-confluent</strong>) format allows you to read records that were serialized by the <strong id="dli_08_0410__en-us_topic_0000001310095793_b270013432415">io.confluent.kafka.serializers.KafkaAvroSerializer</strong> and to write records that can in turn be read by the <strong id="dli_08_0410__en-us_topic_0000001310095793_b2091552945">io.confluent.kafka.serializers.KafkaAvroDeserializer</strong>.</p>
<p id="dli_08_0410__en-us_topic_0000001310095793_p834019551364">When reading (deserializing) a record with this format, the Avro writer schema is fetched from the configured Confluent Schema Registry based on the schema version ID encoded in the record, while the reader schema is inferred from the table schema.</p>
<p id="dli_08_0410__en-us_topic_0000001310095793_p2034085543615">When writing (serializing) a record with this format, the Avro schema is inferred from the table schema and used to retrieve the schema ID to be encoded with the data. The lookup is performed in the configured Confluent Schema Registry under the <a href="https://docs.confluent.io/current/schema-registry/index.html#schemas-subjects-and-topics" target="_blank" rel="noopener noreferrer">subject</a>. The subject is specified by <strong id="dli_08_0410__en-us_topic_0000001310095793_b1374112234717">avro-confluent.schema-registry.subject</strong>.</p>
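<p>For example, once a job using this format has registered a schema, you can inspect it through the Schema Registry REST API (a sketch; <yourEcsIp> and <yourSubject> are the same placeholders used in the example below):</p>
<pre class="screen">curl http://<yourEcsIp>:8081/subjects
curl http://<yourEcsIp>:8081/subjects/<yourSubject>/versions/latest</pre>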
</div>
<div class="section" id="dli_08_0410__en-us_topic_0000001310095793_section382515542379"><h4 class="sectiontitle">Supported Connectors</h4><ul id="dli_08_0410__en-us_topic_0000001310095793_ul123811916163812"><li id="dli_08_0410__en-us_topic_0000001310095793_li203823165389">kafka</li><li id="dli_08_0410__en-us_topic_0000001310095793_li115222184514">upsert-kafka</li></ul>
</div>
<div class="section" id="dli_08_0410__en-us_topic_0000001310095793_section17656257385"><h4 class="sectiontitle">Parameters</h4>
<div class="tablenoborder"><table cellpadding="4" cellspacing="0" summary="" id="dli_08_0410__en-us_topic_0000001310095793_table151408218395" frame="border" border="1" rules="all"><caption><b>Table 1 </b>Parameter description</caption><thead align="left"><tr id="dli_08_0410__en-us_topic_0000001310095793_row91401425390"><th align="left" class="cellrowborder" valign="top" width="27.98%" id="mcps1.3.3.2.2.6.1.1"><p id="dli_08_0410__en-us_topic_0000001310095793_p12140520394">Parameter</p>
</th>
<th align="left" class="cellrowborder" valign="top" width="8.25%" id="mcps1.3.3.2.2.6.1.2"><p id="dli_08_0410__en-us_topic_0000001310095793_p1414019215399">Mandatory</p>
</th>
<th align="left" class="cellrowborder" valign="top" width="8.15%" id="mcps1.3.3.2.2.6.1.3"><p id="dli_08_0410__en-us_topic_0000001310095793_p41401421390">Default Value</p>
</th>
<th align="left" class="cellrowborder" valign="top" width="12.32%" id="mcps1.3.3.2.2.6.1.4"><p id="dli_08_0410__en-us_topic_0000001310095793_p11402212398">Type</p>
</th>
<th align="left" class="cellrowborder" valign="top" width="43.3%" id="mcps1.3.3.2.2.6.1.5"><p id="dli_08_0410__en-us_topic_0000001310095793_p1414019217396">Description</p>
</th>
</tr>
</thead>
<tbody><tr id="dli_08_0410__en-us_topic_0000001310095793_row91405223918"><td class="cellrowborder" valign="top" width="27.98%" headers="mcps1.3.3.2.2.6.1.1 "><p id="dli_08_0410__en-us_topic_0000001310095793_p1267125415390">format</p>
</td>
<td class="cellrowborder" valign="top" width="8.25%" headers="mcps1.3.3.2.2.6.1.2 "><p id="dli_08_0410__en-us_topic_0000001310095793_p3141623396">Yes</p>
</td>
<td class="cellrowborder" valign="top" width="8.15%" headers="mcps1.3.3.2.2.6.1.3 "><p id="dli_08_0410__en-us_topic_0000001310095793_p61411426391">None</p>
</td>
<td class="cellrowborder" valign="top" width="12.32%" headers="mcps1.3.3.2.2.6.1.4 "><p id="dli_08_0410__en-us_topic_0000001310095793_p14141124399">String</p>
</td>
<td class="cellrowborder" valign="top" width="43.3%" headers="mcps1.3.3.2.2.6.1.5 "><p id="dli_08_0410__en-us_topic_0000001310095793_p214116214399">Format to be used. Set this parameter to <strong id="dli_08_0410__en-us_topic_0000001310095793_b1575774813812">avro-confluent</strong>.</p>
</td>
</tr>
<tr id="dli_08_0410__en-us_topic_0000001310095793_row20141142183917"><td class="cellrowborder" valign="top" width="27.98%" headers="mcps1.3.3.2.2.6.1.1 "><p id="dli_08_0410__en-us_topic_0000001310095793_p111418293914">avro-confluent.schema-registry.subject</p>
</td>
<td class="cellrowborder" valign="top" width="8.25%" headers="mcps1.3.3.2.2.6.1.2 "><p id="dli_08_0410__en-us_topic_0000001310095793_p16141192153913">No</p>
</td>
<td class="cellrowborder" valign="top" width="8.15%" headers="mcps1.3.3.2.2.6.1.3 "><p id="dli_08_0410__en-us_topic_0000001310095793_p81416243919">None</p>
</td>
<td class="cellrowborder" valign="top" width="12.32%" headers="mcps1.3.3.2.2.6.1.4 "><p id="dli_08_0410__en-us_topic_0000001310095793_p20141922393">String</p>
</td>
<td class="cellrowborder" valign="top" width="43.3%" headers="mcps1.3.3.2.2.6.1.5 "><p id="dli_08_0410__en-us_topic_0000001310095793_p1014117210397">The Confluent Schema Registry subject under which to register the schema used by this format during serialization.</p>
<p id="dli_08_0410__en-us_topic_0000001310095793_p75221341104217">By default, the <strong id="dli_08_0410__en-us_topic_0000001310095793_b75412438105">kafka</strong> and <strong id="dli_08_0410__en-us_topic_0000001310095793_b583418485104">upsert-kafka</strong> connectors use <strong id="dli_08_0410__en-us_topic_0000001310095793_b147701256151011"><topic_name>-value</strong> or <strong id="dli_08_0410__en-us_topic_0000001310095793_b118681516116"><topic_name>-key</strong> as the subject name if this format is used as the value or key format.</p>
</td>
</tr>
<tr id="dli_08_0410__en-us_topic_0000001310095793_row8603191010393"><td class="cellrowborder" valign="top" width="27.98%" headers="mcps1.3.3.2.2.6.1.1 "><p id="dli_08_0410__en-us_topic_0000001310095793_p3604181012390">avro-confluent.schema-registry.url</p>
</td>
<td class="cellrowborder" valign="top" width="8.25%" headers="mcps1.3.3.2.2.6.1.2 "><p id="dli_08_0410__en-us_topic_0000001310095793_p19604110203919">Yes</p>
</td>
<td class="cellrowborder" valign="top" width="8.15%" headers="mcps1.3.3.2.2.6.1.3 "><p id="dli_08_0410__en-us_topic_0000001310095793_p136048109394">None</p>
</td>
<td class="cellrowborder" valign="top" width="12.32%" headers="mcps1.3.3.2.2.6.1.4 "><p id="dli_08_0410__en-us_topic_0000001310095793_p1460461020392">String</p>
</td>
<td class="cellrowborder" valign="top" width="43.3%" headers="mcps1.3.3.2.2.6.1.5 "><p id="dli_08_0410__en-us_topic_0000001310095793_p76041610183916">URL of the Confluent Schema Registry to fetch/register schemas.</p>
</td>
</tr>
</tbody>
</table>
</div>
</div>
<div class="section" id="dli_08_0410__en-us_topic_0000001310095793_section8159127174419"><h4 class="sectiontitle">Example</h4><p id="dli_08_0410__en-us_topic_0000001310095793_p5192123014443">Read JSON data from the source Kafka topic and write the data in Confluent Avro format to the sink topic.</p>
<ol id="dli_08_0410__en-us_topic_0000001310095793_ol146916334611"><li id="dli_08_0410__en-us_topic_0000001310095793_li12693334616"><span>Create a datasource connection for communication with the VPC and subnet where Kafka and the ECS are located, and bind the connection to the queue. Set a security group and inbound rule to allow access to the queue, and test the connectivity of the queue using the Kafka and ECS IP addresses. For example, locate a general-purpose queue where the job runs and choose <strong id="dli_08_0410__en-us_topic_0000001310095793_b1958016478128">More</strong> > <strong id="dli_08_0410__en-us_topic_0000001310095793_b95811547101219">Test Address Connectivity</strong> in the <strong id="dli_08_0410__en-us_topic_0000001310095793_b1958174761214">Operation</strong> column. If the connection is successful, the datasource is bound to the queue. Otherwise, the binding fails.</span></li><li id="dli_08_0410__en-us_topic_0000001310095793_li48520717465"><span>Purchase an ECS cluster, download Confluent 5.5.2 (<a href="https://packages.confluent.io/archive/5.5/" target="_blank" rel="noopener noreferrer">https://packages.confluent.io/archive/5.5/</a>) and jdk1.8.0_232, and upload them to the ECS cluster. Run the following commands to decompress the packages (assume that the decompression directories are <strong id="dli_08_0410__en-us_topic_0000001310095793_b872314619146">confluent-5.5.2</strong> and <strong id="dli_08_0410__en-us_topic_0000001310095793_b44181549101410">jdk1.8.0_232</strong>):</span><p><pre class="screen" id="dli_08_0410__en-us_topic_0000001310095793_screen1034417402518">tar zxvf confluent-5.5.2-2.11.tar.gz
tar zxvf jdk1.8.0_232.tar.gz</pre>
</p></li><li id="dli_08_0410__en-us_topic_0000001310095793_li13463132814505"><span>Run the following commands to install jdk1.8.0_232 on the current ECS cluster. You can run the <strong id="dli_08_0410__en-us_topic_0000001310095793_b1642231017153">pwd</strong> command in the <strong id="dli_08_0410__en-us_topic_0000001310095793_b1823091618158">jdk1.8.0_232</strong> directory to view the value of <strong id="dli_08_0410__en-us_topic_0000001310095793_b12973326121516">yourJdkPath</strong>.</span><p><pre class="screen" id="dli_08_0410__en-us_topic_0000001310095793_screen1675211085518">export JAVA_HOME=<yourJdkPath>
export PATH=$JAVA_HOME/bin:$PATH
export CLASSPATH=.:$JAVA_HOME/lib:$JAVA_HOME/jre/lib</pre>
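<p>To confirm that the exports took effect (a quick check, assuming the commands above were run in the same shell session):</p>
<pre class="screen">java -version
echo $JAVA_HOME</pre>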
</p></li><li id="dli_08_0410__en-us_topic_0000001310095793_li18770519556"><span>Go to the <strong id="dli_08_0410__en-us_topic_0000001310095793_b958744421518">confluent-5.5.2/etc/schema-registry/</strong> directory and modify the following configuration items in the <strong id="dli_08_0410__en-us_topic_0000001310095793_b1734995316156">schema-registry.properties</strong> file:</span><p><pre class="screen" id="dli_08_0410__en-us_topic_0000001310095793_screen1970435215579">listeners=http://<yourEcsIp>:8081
kafkastore.bootstrap.servers=<yourKafkaAddress1>:<yourKafkaPort>,<yourKafkaAddress2>:<yourKafkaPort></pre>
</p></li><li id="dli_08_0410__en-us_topic_0000001310095793_li4179117231"><span>Switch to the <strong id="dli_08_0410__en-us_topic_0000001310095793_b125381117111614">confluent-5.5.2</strong> directory and run the following command to start Confluent:</span><p><pre class="screen" id="dli_08_0410__en-us_topic_0000001310095793_screen7999174822310">bin/schema-registry-start etc/schema-registry/schema-registry.properties</pre>
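<p>Once the process is up, you can sanity-check that the registry is reachable through its REST API; an empty list simply means no subjects have been registered yet:</p>
<pre class="screen">curl http://<yourEcsIp>:8081/subjects
# expected output before any job has written data: []</pre>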
</p></li><li id="dli_08_0410__en-us_topic_0000001310095793_li760285010574"><span>Create a Flink OpenSource SQL job, select Flink 1.12, and allow DLI to save job logs in OBS. Add the following statements to the job and submit it:</span><p><pre class="screen" id="dli_08_0410__en-us_topic_0000001310095793_screen3886121512115">CREATE TABLE kafkaSource (
  order_id string,
  order_channel string,
  order_time string,
  pay_amount double,
  real_pay double,
  pay_time string,
  user_id string,
  user_name string,
  area_id string
) WITH (
  'connector' = 'kafka',
  'properties.bootstrap.servers' = '<yourKafkaAddress1>:<yourKafkaPort>,<yourKafkaAddress2>:<yourKafkaPort>',
  'topic' = '<yourSourceTopic>',
  'properties.group.id' = '<yourGroupId>',
  'scan.startup.mode' = 'latest-offset',
  'format' = 'json'
);

CREATE TABLE kafkaSink (
  order_id string,
  order_channel string,
  order_time string,
  pay_amount double,
  real_pay double,
  pay_time string,
  user_id string,
  user_name string,
  area_id string
) WITH (
  'connector' = 'kafka',
  'properties.bootstrap.servers' = '<yourKafkaAddress1>:<yourKafkaPort>,<yourKafkaAddress2>:<yourKafkaPort>',
  'topic' = '<yourSinkTopic>',
  'format' = 'avro-confluent',
  'avro-confluent.schema-registry.url' = 'http://<yourEcsIp>:8081',
  'avro-confluent.schema-registry.subject' = '<yourSubject>'
);

insert into kafkaSink select * from kafkaSource;</pre>
</p></li><li id="dli_08_0410__en-us_topic_0000001310095793_li14392312811"><span>Insert the following data into the source Kafka topic (a producer sketch follows the records):</span><p><pre class="screen" id="dli_08_0410__en-us_topic_0000001310095793_screen1416212441032">{"order_id":"202103241000000001", "order_channel":"webShop", "order_time":"2021-03-24 10:00:00", "pay_amount":"100.00", "real_pay":"100.00", "pay_time":"2021-03-24 10:02:03", "user_id":"0001", "user_name":"Alice", "area_id":"330106"}
{"order_id":"202103241606060001", "order_channel":"appShop", "order_time":"2021-03-24 16:06:06", "pay_amount":"200.00", "real_pay":"180.00", "pay_time":"2021-03-24 16:10:06", "user_id":"0001", "user_name":"Alice", "area_id":"330106"}</pre>
</p></li><li id="dli_08_0410__en-us_topic_0000001310095793_li11163393320"><span>Read the data from the sink Kafka topic (a consumer sketch follows this list). You will find that the data has been written in Confluent Avro format and that the schema has been saved to the <strong id="dli_08_0410__en-us_topic_0000001310095793_b214761531912">_schemas</strong> topic of Kafka.</span></li></ol>
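<p>One way to read the Avro records back is the Avro console consumer shipped with Confluent 5.5.2 (a sketch; run it from the confluent-5.5.2 directory, with the same placeholders as above):</p>
<pre class="screen">bin/kafka-avro-console-consumer --bootstrap-server <yourKafkaAddress1>:<yourKafkaPort> --topic <yourSinkTopic> --from-beginning --property schema.registry.url=http://<yourEcsIp>:8081</pre>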
</div>
</div>
<div>
<div class="familylinks">
<div class="parentlink"><strong>Parent topic:</strong> <a href="dli_08_0407.html">Format</a></div>
</div>
</div>