mirror of https://github.com/dapr/docs.git
Merge branch 'v1.13' into update-sdk-feature-table
This commit is contained in:
commit c3c58bfdc1
@ -462,7 +462,8 @@ You can configure pub/sub to publish or consume data encoded using [Avro binary
### Configuration
{{% alert title="Important" color="warning" %}}
- Currently, only message value serialization/deserialization is supported. Since cloud events are not supported, the `rawPayload=true` metadata must be passed.
+ Currently, only message value serialization/deserialization is supported. Since cloud events are not supported, the `rawPayload=true` metadata must be passed when publishing Avro messages.
+ Please note that `rawPayload=true` should NOT be set for consumers, as the message value will be wrapped into a CloudEvent and base64-encoded. Leaving `rawPayload` as default (i.e. `false`) will send the Avro-decoded message to the application as a JSON payload.
{{% /alert %}}
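As a sketch of how this per-request metadata reaches the component: Dapr's HTTP publish endpoint accepts component metadata as `metadata.`-prefixed query parameters, so `rawPayload` and `valueSchemaType` can be attached to the publish URL. The port, pubsub component name, and topic below are illustrative placeholders, not values from this diff.

```python
from urllib.parse import urlencode

# Illustrative names; substitute your own sidecar port, pubsub component, and topic.
DAPR_PORT = 3500
PUBSUB_NAME = "pubsub"
TOPIC = "my-topic"

# Per-request component metadata is passed as "metadata.<key>" query parameters.
params = urlencode({
    "metadata.rawPayload": "true",       # required when publishing Avro messages
    "metadata.valueSchemaType": "Avro",  # ask the component to Avro-serialize the value
})
publish_url = f"http://localhost:{DAPR_PORT}/v1.0/publish/{PUBSUB_NAME}/{TOPIC}?{params}"
print(publish_url)
```

POSTing the message body to this URL (for example with `requests.post`) would then publish it with the metadata applied to that single request.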
When configuring the Kafka pub/sub component metadata, you must define:
@ -533,7 +534,6 @@ def subscribe():
        'topic': 'my-topic',
        'route': 'my_topic_subscriber',
        'metadata': {
            'rawPayload': 'true',
            'valueSchemaType': 'Avro',
        } }]
    return subscriptions
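Because `rawPayload` is left at its default for consumers, the component Avro-decodes the message value and delivers it to the subscription route as JSON inside a CloudEvent envelope. A minimal sketch of the handler behind such a route follows; the handler name matches the route above, but the sample event's fields are illustrative, not taken from this diff.

```python
import json

def my_topic_subscriber(body: bytes) -> dict:
    """Sketch of the handler behind the 'my_topic_subscriber' route.

    With rawPayload left as 'false' on the consumer side, Dapr wraps the
    Avro-decoded value in a CloudEvent, so the record arrives as JSON
    under the envelope's 'data' field.
    """
    event = json.loads(body)
    return event["data"]  # the Avro-decoded message value

# Illustrative CloudEvent as the app might receive it (fields abridged):
sample = json.dumps({"id": "1", "type": "com.dapr.event.sent",
                     "data": {"name": "alice", "age": 30}}).encode()
print(my_topic_subscriber(sample))
```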