Compare commits


54 Commits

Author SHA1 Message Date
Doug Davis 4ad3223f05
Merge pull request #1356 from duglin/releases
Update release process
2025-08-15 07:06:24 -07:00
Doug Davis 40cda2e99d more review edits
Signed-off-by: Doug Davis <duglin@gmail.com>
2025-08-15 14:04:35 +00:00
Doug Davis 6749e1fda2 more edits
Signed-off-by: Doug Davis <duglin@gmail.com>
2025-07-31 11:56:59 +00:00
Doug Davis 177d4fe98f Update release process
- create a new branch for each release instead of a tag/github-release

Signed-off-by: Doug Davis <duglin@gmail.com>
2025-07-30 18:02:57 +00:00
Doug Davis e2d119e26d
Merge pull request #1353 from yordis/yordis/causationid-extension
feat(extensions): add event tracing extension with correlationid and causationid
2025-07-24 16:03:37 -04:00
Doug Davis e007e0aecb
Merge pull request #1355 from duglin/v2items
Add a "v2.md" file so we can close v2 issues/PRs
2025-07-24 15:57:55 -04:00
Doug Davis 11da3778f3 Add a "v2.md" file so we can close v2 issues/PRs
Signed-off-by: Doug Davis <duglin@gmail.com>
2025-07-24 19:55:22 +00:00
Remi Cattiau c5eb14ae13
feat: update github adapter spec (#1306)
* fix: pages is an array so none can be considered the main one

Signed-off-by: Remi Cattiau <remi@cattiau.com>

* feat: add PackageEvent

Signed-off-by: Remi Cattiau <remi@cattiau.com>

* feat: add SponsorshipEvent

Signed-off-by: Remi Cattiau <remi@cattiau.com>

* feat: add BranchProtectionRuleEvent

Signed-off-by: Remi Cattiau <remi@cattiau.com>

* feat: add BranchProtectionConfigurationEvent and CustomPropertyEvent

Signed-off-by: Remi Cattiau <remi@cattiau.com>

* feat: add more missing events

Signed-off-by: Remi Cattiau <remi@cattiau.com>

* feat: add missing CodeScanningAlertEvent

Signed-off-by: Remi Cattiau <remi@cattiau.com>

* wip

Signed-off-by: Remi Cattiau <remi@cattiau.com>

* fix: dug last remarks

Signed-off-by: Remi Cattiau <remi@cattiau.com>

* fix: should case

Signed-off-by: Remi Cattiau <remi@cattiau.com>

---------

Signed-off-by: Remi Cattiau <remi@cattiau.com>
2025-07-17 12:54:02 -04:00
Yordis Prieto a43f83658e
feat(extensions): add event tracing extension with correlationid and causationid
closes #25

Signed-off-by: Yordis Prieto <yordis.prieto@gmail.com>
2025-07-14 15:02:06 -04:00
Jon Skeet 94c622d44a
Create a new issue template (#1350)
* Create a new issue template

Fixes #1348

Signed-off-by: Jon Skeet <jonskeet@google.com>

* Wrap text

Signed-off-by: Jon Skeet <skeet@pobox.com>

---------

Signed-off-by: Jon Skeet <jonskeet@google.com>
Signed-off-by: Jon Skeet <skeet@pobox.com>
2025-06-19 12:08:17 -04:00
Fabrizio Lazzaretti 3fd0012454
AsyncAPI with CloudEvents (#1349)
* AsyncAPI with CloudEvents

Adding a first draft according to https://github.com/cloudevents/spec/issues/1276

Signed-off-by: Lazzaretti <fabrizio@lazzaretti.me>

* add changes from code review

- remove intorudcion
- wrap the lines at 80 chars
- s/should/will/
- create translation files he/CN

Signed-off-by: Lazzaretti <fabrizio@lazzaretti.me>

* fix: title naming

Signed-off-by: Lazzaretti <fabrizio@lazzaretti.me>

* Add short samples on how to add CloudEvents support to AsyncAPI

Signed-off-by: Lazzaretti <fabrizio@lazzaretti.me>

* Change links for asyncapi-traits to location after merging

Signed-off-by: Lazzaretti <fabrizio@lazzaretti.me>

---------

Signed-off-by: Lazzaretti <fabrizio@lazzaretti.me>
2025-06-15 20:21:29 -04:00
Doug Davis c932302d7a
Merge pull request #1351 from duglin/fixtool
ignore .github files
2025-06-15 11:03:28 -04:00
Doug Davis 5f90878705 ignote .github files
Signed-off-by: Doug Davis <duglin@gmail.com>
2025-06-15 15:01:42 +00:00
Doug Davis 67163e50ef
Merge pull request #1341 from thompson-tomo/task/DocTweaks
Make adapters.md the readme for the addapters folder
2025-05-15 13:10:24 -04:00
James Thompson 24ad80a58e Make adapters.md the readme for the addapters folder
Signed-off-by: James Thompson <thompson.tomo@outlook.com>
2025-05-10 15:38:58 +10:00
Emmanuel Ferdman 43f1c5359c
Resolve deprecation warnings (#1336)
* Resolve bs4 deprecation warnings

Signed-off-by: Emmanuel Ferdman <emmanuelferdman@gmail.com>

* Resolve pytest deprecation warnings

Signed-off-by: Emmanuel Ferdman <emmanuelferdman@gmail.com>

---------

Signed-off-by: Emmanuel Ferdman <emmanuelferdman@gmail.com>
2025-04-17 16:57:12 -04:00
Doug Davis 8f8770737c
Merge pull request #1334 from duglin/issue1251
wording tweak for issue 1251
2025-04-17 13:41:17 -04:00
Doug Davis 8386fa6743 wording tweak for issue 1251
Signed-off-by: Doug Davis <duglin@gmail.com>
2025-04-17 17:40:24 +00:00
Doug Davis 37545bfcc8
Merge pull request #1333 from duglin/fixci
fix ci issue
2025-03-27 14:07:21 -04:00
Doug Davis 2e9e677f1d fix ci issue
Signed-off-by: Doug Davis <duglin@gmail.com>
2025-03-27 18:03:47 +00:00
Doug Davis a31214efab
Merge pull request #1332 from duglin/removebot
Remove the "stale" action/bot per 3/20 call
2025-03-27 13:01:31 -04:00
Doug Davis 721144c759 Remove the "stale" action/bot per 3/20 call
Signed-off-by: Doug Davis <duglin@gmail.com>
2025-03-20 16:43:09 +00:00
Doug Davis f65302fb7c
Merge pull request #1329 from xibz/main
Adds CDEvents to docs/open-source.md
2025-02-20 13:10:57 -05:00
benjamin-j-powell b5dc5ffd3e
update(docs): Add CDEvents to open-source.md
This commit adds the CDEvents project to teh open source list.

Signed-off-by: benjamin-j-powell <bjp@apple.com>
2025-02-13 15:12:50 -06:00
Doug Davis a85940fc1e
Merge pull request #1328 from duglin/xReg7
lowercase all attributes in subscription spec
2025-02-13 15:24:17 -05:00
Doug Davis 8c54946e50
Merge pull request #1327 from neuroglia-io/main
Add `Cloud Shapes` as a product that use the CloudEvents spec
2025-02-13 15:23:57 -05:00
Doug Davis fdab0b5c29 lowercase all attributes
Signed-off-by: Doug Davis <duglin@gmail.com>
2025-02-12 17:56:05 +00:00
Charles d'Avernas 0ce5ea4163
Add `Cloud Shapes` as products that use the CloudEvents spec
Signed-off-by: Charles d'Avernas <charles.davernas@neuroglia.io>
2025-02-12 12:27:29 +01:00
Charles d'Avernas fbb00366d6
Add `Cloud Streams` and `Synapse` as products that use the `CloudEvents` spec (#1326)
* Add CloudStreams and Synapse as products that use the CloudEvents spec

Signed-off-by: Charles d'Avernas <charles.davernas@neuroglia.io>

* Renamed cloud events into CloudEvents

Signed-off-by: Charles d'Avernas <charles.davernas@neuroglia.io>

---------

Signed-off-by: Charles d'Avernas <charles.davernas@neuroglia.io>
2025-02-06 13:27:10 -05:00
Doug Davis d130533898
Merge pull request #1322 from duglin/1317continue
Add appendix to data classification extension
2025-01-16 12:50:49 -05:00
Doug Davis fed55696ea Add appendix to data classification extension
See #1317

Signed-off-by: Doug Davis <duglin@gmail.com>
2025-01-14 13:23:45 +00:00
Rob b1643cf6f9
Add data-classification.md extension (#1317)
* Add data-classification.md extension

Signed-off-by: Rob Sessink <rob.sessink@gmail.com>

* FIX based upon PR comments: correct spelling, add link in extensions/README.md and usage of MUST keyword in example use case
-

Signed-off-by: Rob Sessink <rob.sessink@gmail.com>

* FIX based upon PR comments: improve spelling

Signed-off-by: Rob Sessink <rob.sessink@gmail.com>

* FIX based upon PR comments: improve description around recommended labels, remove 'applicability constraints', extend usage section.
-

Signed-off-by: Rob Sessink <rob.sessink@gmail.com>

* FIX based upon PR comments: improve wording and usage of notational conventions
-

Signed-off-by: Rob Sessink <rob.sessink@gmail.com>

* FIX: add missing 'of'

Signed-off-by: Rob Sessink <rob.sessink@gmail.com>

* FIX based upon PR comments: extend usage section to state expectations when intermediaries/consumers encounter unknown attribute values.
-

Signed-off-by: Rob Sessink <rob.sessink@gmail.com>

* FIX: must -> MUST

Signed-off-by: Rob Sessink <rob.sessink@gmail.com>

* FIX based upon PR comments: in Usage section change 'ignore event' into 'report error'.

Signed-off-by: Rob Sessink <rob.sessink@gmail.com>

---------

Signed-off-by: Rob Sessink <rob.sessink@gmail.com>
Co-authored-by: Rob Sessink <rob.sessink@gmail.com>
2024-12-12 14:09:56 -05:00
Doug Davis 68902074ee
Merge pull request #1320 from duglin/urlCheck
add another trusted host
2024-11-20 09:27:19 -05:00
Doug Davis 0c377f7cae add another trusted host
show the error when we fail

Signed-off-by: Doug Davis <duglin@gmail.com>
2024-11-20 14:26:15 +00:00
Doug Davis b131f9725c
Merge pull request #1319 from Bert-R/patch-1
docs: Textual correction
2024-11-14 12:21:58 -05:00
Bert Roos 0dad249627
docs: Textual correction
which --> that

Signed-off-by: Bert Roos <Bert-R@users.noreply.github.com>
2024-11-13 15:40:35 +01:00
Doug Davis 8463221d42
Merge pull request #1315 from vandewillysilva/add-contributor
docs: add vandewilly to contributors list
2024-10-17 14:00:05 -04:00
Vandewilly Oliveira da Silva 7f06c5c5c9
docs: add vandewilly to contributors list
Adding myself to the contributors list. It's a small thing, but
since I contributed to the deprecation extension, it's important
for me to be included.

Signed-off-by: Vandewilly Oliveira da Silva <vandewilly.oliveira-da.silva@hpe.com>
2024-10-16 13:13:04 -04:00
Doug Davis dc521c077d
Merge pull request #1314 from rob-sessink/feature/bam-extension-typos
Fix typos in bam.md
2024-10-03 13:22:05 -04:00
Rob Sessink 8c0111f18e Fix typos in bam.md
Signed-off-by: Rob Sessink <rob.sessink@gmail.com>
2024-09-30 21:44:47 +02:00
Doug Davis 45026679ee
Merge pull request #1311 from jskeet/rabbit-mq
Add RabbitMQ to proprietary specs and SDK support matrix
2024-09-26 13:44:39 -04:00
Jon Skeet beb886aa65 Add RabbitMQ to proprietary specs and SDK support matrix
Signed-off-by: Jon Skeet <jonskeet@google.com>
2024-09-26 11:50:09 +01:00
Doug Davis 6e33dd6b62
Merge pull request #1310 from henriqueblang/fix-typo-webhooks
chore: remove extra 'is'
2024-09-17 12:54:53 -04:00
Henrique Barcia Lang 9642361f6e chore: remove extra 'is'
Signed-off-by: Henrique Barcia Lang <henriquebarcia@hotmail.com>
2024-09-17 13:21:35 -03:00
Vandewilly Silva b86ed0a917
Add CloudEvents extension for deprecated events (#1307)
* Add CloudEvents extension for deprecated events

Introduced the `deprecation` attribute to indicate when an event type is deprecated,
and added the `sunset` attribute to specify the date and time when the event
will become unsupported. This extension provides clear guidelines and examples for
implementing these attributes, aiming to improve lifecycle management and ensure
better communication with consumers.

Signed-off-by: Vandewilly Oliveira da Silva <vandewilly.oliveira-da.silva@hpe.com>

* Fix PR comments: Add deprecationmigration and deprecationfrom

Signed-off-by: Vandewilly Oliveira da Silva <vandewilly.oliveira-da.silva@hpe.com>

* Fix typos

Signed-off-by: Vandewilly Oliveira da Silva <vandewilly.oliveira-da.silva@hpe.com>

* Fix PR comments, and renamed deprecated field

Signed-off-by: Vandewilly Oliveira da Silva <vandewilly.oliveira-da.silva@hpe.com>

* Remove reference to RFC 3339

Signed-off-by: Vandewilly Oliveira da Silva <vandewilly.oliveira-da.silva@hpe.com>

* Remove deprecated unnecessary constraint

Signed-off-by: Vandewilly Oliveira da Silva <vandewilly.oliveira-da.silva@hpe.com>

* Remove quotes from boolean value example

Signed-off-by: Vandewilly Oliveira da Silva <vandewilly.oliveira-da.silva@hpe.com>

* Change deprecated to be Required

Signed-off-by: Vandewilly Oliveira da Silva <vandewilly.oliveira-da.silva@hpe.com>

* Make deprecate have a value of true all the time

Signed-off-by: Vandewilly Oliveira da Silva <vandewilly.oliveira-da.silva@hpe.com>

* Improve clarity

Signed-off-by: Vandewilly Oliveira da Silva <vandewilly.oliveira-da.silva@hpe.com>

* Fix typo

Signed-off-by: Vandewilly Oliveira da Silva <vandewilly.oliveira-da.silva@hpe.com>

---------

Signed-off-by: Vandewilly Oliveira da Silva <vandewilly.oliveira-da.silva@hpe.com>
2024-08-26 22:20:14 -04:00
Doug Davis 04a47b7991
Merge pull request #1308 from duglin/avro
fix typo in avro spec
2024-08-23 07:23:21 -04:00
Doug Davis e1b4f2c5ea fix typo in avro spec
Signed-off-by: Doug Davis <dug@microsoft.com>
2024-08-21 11:57:05 +00:00
Doug Davis 25fa60b04e
Merge pull request #1304 from dynatrace-oss-contrib/feat/clarify-distributed-tracing-extension
docs: link distributed tracing extension to otel semconv
2024-07-25 12:37:34 -04:00
Doug Davis dc92e423fd
Merge pull request #1305 from duglin/securityExamples
Provide some guidance on how security can be layered on top of CE
2024-07-25 12:36:45 -04:00
Doug Davis 6ed43691d8 Provide some guidance on how security can be layered on top of CE
Fixes #1290
Fixes #1288

Signed-off-by: Doug Davis <dug@microsoft.com>
2024-07-18 16:43:02 +00:00
Klaus Deißner 96134ee30b
Clean up contributors.md (#1303)
* Removed company names and put names in alphabetical order

Signed-off-by: Klaus Deißner <klaus.deissner@sap.com>

* Removed company names and put names in alphabetical order

Signed-off-by: Klaus Deißner <klaus.deissner@sap.com>

* Update docs/contributors.md

Co-authored-by: Calum Murray <cmurray@redhat.com>
Signed-off-by: Klaus Deißner <klaus.deissner@sap.com>

---------

Signed-off-by: Klaus Deißner <klaus.deissner@sap.com>
Co-authored-by: Calum Murray <cmurray@redhat.com>
2024-07-18 12:26:06 -04:00
Joao Grassi 93565f4ae8 doc: link distributed tracing extension to otel semconv
Signed-off-by: Joao Grassi <5938087+joaopgrassi@users.noreply.github.com>
2024-07-16 10:24:15 +02:00
Doug Davis 6374c6fca7
Merge pull request #1299 from duglin/fixname
fixup some cesql names
2024-06-17 14:16:05 -04:00
Doug Davis b20cf0ac60 fixup some cersql names
Signed-off-by: Doug Davis <dug@microsoft.com>
2024-06-17 18:14:50 +00:00
53 changed files with 1607 additions and 287 deletions

.github/ISSUE_TEMPLATE/config.yml (new file)

@ -0,0 +1 @@
blank_issues_enabled: false

.github/ISSUE_TEMPLATE/issue.md (new file)

@ -0,0 +1,34 @@
---
name: New issue
about: 'Create a new issue'
title: ''
labels: ''
assignees: ''
---
(Please remove the text below after reading it.)
When filing an issue in this repository, please consider the following:
- Is this:
- A feature request for a new form of integration etc?
- A report of a technical issue with the existing spec?
- A suggestion for improving the clarity of the existing spec
(even if it's not "wrong" as such)?
- Something else?
- Is there context behind your request that would be useful for readers to
understand? (There's no need to go into huge amounts of detail, but a few
sentences about the motivation can be really helpful.)
- Do you know *roughly* what you'd expect a change to address this issue would
look like? If so, it's useful to create a PR at the same time, linking to
the issue. This doesn't need to be polished and ready to merge - it's just to
help clarify roughly what you're considering.
If the issue requires discussion, it's really useful if you're able to
attend the weekly working group meeting as described
[here](https://github.com/cloudevents/spec/?tab=readme-ov-file#meeting-time).
Often a discussion which would take multiple back-and-forth comments on an
issue can be resolved with a few minutes of conversation. We understand the
timing may not be convenient for everyone - please let us know if that's the
case, and we can potentially arrange something with the most relevant group
members at a more convenient time.

View File

@ -1,29 +0,0 @@
name: Stale
on:
schedule:
- cron: "0 1 * * *" # daily
jobs:
stale:
runs-on: "ubuntu-latest"
steps:
- uses: "actions/stale@v7"
with:
repo-token: "${{ secrets.GITHUB_TOKEN }}"
stale-issue-message: |-
This issue is stale because it has been open for 30 days with no
activity. Mark as fresh by updating e.g., adding the comment `/remove-lifecycle stale`.
stale-issue-label: "lifecycle/stale"
exempt-issue-labels: "lifecycle/frozen"
stale-pr-message: |-
This Pull Request is stale because it has been open for 30 days with
no activity. Mark as fresh by updating e.g., adding the comment `/remove-lifecycle stale`.
stale-pr-label: "lifecycle/stale"
exempt-pr-labels: "lifecycle/frozen"
days-before-stale: 30
days-before-close: -1 # never

README.md

@ -51,7 +51,7 @@ and a graduated project on [Jan 25, 2024](https://github.com/cncf/toc/pull/996)
| |
| **Additional Documentation:** |
| CloudEvents Primer | [v1.0.2](https://github.com/cloudevents/spec/blob/v1.0.2/cloudevents/primer.md) | [WIP](cloudevents/primer.md) |
| [CloudEvents Adapters](cloudevents/adapters.md) | - | [Not versioned](cloudevents/adapters.md) |
| [CloudEvents Adapters](cloudevents/adapters/README.md) | - | [Not versioned](cloudevents/adapters/README.md) |
| [CloudEvents SDK Requirements](cloudevents/SDK.md) | - | [Not versioned](cloudevents/SDK.md) |
| [Documented Extensions](cloudevents/extensions/README.md) | - | [Not versioned](cloudevents/extensions/README.md) |
| [Proprietary Specifications](cloudevents/proprietary-specs.md) | - | [Not versioned](cloudevents/proprietary-specs.md) |
@ -59,7 +59,7 @@ and a graduated project on [Jan 25, 2024](https://github.com/cncf/toc/pull/996)
## Other Specifications
| | Latest Release | Working Draft |
| :-------------- | :-------------------------------------------------------------: | :---------------------------: |
| CE SQL | [v1.0.0](https://github.com/cloudevents/spec/tree/cesql/v1.0.0) | [WIP](cesql/spec.md) |
| CE SQL | [v1.0.0](https://github.com/cloudevents/spec/tree/cesql/v1.0.0/cesql) | [WIP](cesql/spec.md) |
| Subscriptions | - | [WIP](subscriptions/spec.md) |
The Registry and Pagination specifications can now be found in the
@ -76,7 +76,7 @@ and design decisions, and then move on to the
Since not all event producers generate CloudEvents by default, there is
documentation describing the recommended process for adapting some popular
events into CloudEvents, see
[CloudEvents Adapters](cloudevents/adapters.md).
[CloudEvents Adapters](cloudevents/adapters/README.md).
## SDKs
@ -120,6 +120,7 @@ native ecosystem by making our systems interoperable with CloudEvents.
docs](https://drive.google.com/drive/folders/1eKH-tVNV25jwkuBEoi3ESqvVjNRlJqYX?usp=sharing)
- [Demos & open source](docs/README.md) -- if you have something to share
about your use of CloudEvents, please submit a PR!
- [Potential CloudEvents v2 work items](cloudevents/v2.md)
- [Code of Conduct](https://github.com/cncf/foundation/blob/master/code-of-conduct.md)
### Security Concerns

View File

@ -1,4 +1,4 @@
# CloudEvents SQL Expression Language - Version 1.0
# CloudEvents SQL Expression Language - Version 1.0.0
CloudEvents SQL expressions (also known as CESQL) allow computing values and
matching of CloudEvent attributes against complex expressions that lean on the

View File

@ -1,4 +1,4 @@
# CloudEvents SQL Expression Language - Version 1.0
# CloudEvents SQL Expression Language - Version 1.0.0
本文档尚未被翻译,请先阅读英文[原版文档](../../spec.md) 。

View File

@ -1,4 +1,4 @@
# CloudEvents SQL Expression Language - Version 1.0
# CloudEvents SQL Expression Language - Version 1.0.0
## Abstract

View File

@ -202,6 +202,7 @@ Undo it when done:
| [WebSockets Structured](https://github.com/cloudevents/spec/blob/v1.0.2/cloudevents/bindings/websockets-protocol-binding.md) | :x: | | :x: | :heavy_check_mark: | | | | :x: | :x: |
| Proprietary Bindings |
| [RocketMQ](https://github.com/apache/rocketmq-externals/blob/master/rocketmq-cloudevents-binding/rocketmq-transport-binding.md) | :x: | | :heavy_check_mark: | :x: | | | | :x: | :x: |
| [RabbitMQ](https://github.com/knative-extensions/eventing-rabbitmq/blob/main/cloudevents-protocol-spec/spec.md) | :x: | | | | | | | | |
| |
| **[v0.3](https://github.com/cloudevents/spec/tree/v0.3)** |
| [CloudEvents Core](https://github.com/cloudevents/spec/blob/v0.3/spec.md) | :x: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :x: | :x: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |

cloudevents/adapters/README.md

@ -9,8 +9,8 @@ CloudEvents attributes. In order to promote interoperability across multiple
implementations of these adapters, the following documents show the proposed
algorithms that should be used:
- [AWS S3](adapters/aws-s3.md)
- [AWS SNS](adapters/aws-sns.md)
- [CouchDB](adapters/couchdb.md)
- [GitHub](adapters/github.md)
- [GitLab](adapters/gitlab.md)
- [AWS S3](./aws-s3.md)
- [AWS SNS](./aws-sns.md)
- [CouchDB](./couchdb.md)
- [GitHub](./github.md)
- [GitLab](./gitlab.md)

cloudevents/adapters/github.md

@ -1,15 +1,47 @@
# GitHub CloudEvents Adapter
This document describes how to convert
[GitHub webhook events](https://developer.github.com/v3/activity/events/types/)
[GitHub webhook events](https://docs.github.com/en/webhooks/webhook-events-and-payloads)
into a CloudEvents.
GitHub webhook event documentation:
https://developer.github.com/v3/activity/events/types/
https://docs.github.com/en/webhooks/webhook-events-and-payloads
Each section below describes how to determine the CloudEvents attributes
based on the specified event.
For the `time` attribute, if the proposed value is null, the default timestamp SHOULD be used.
### BranchProtectionConfigurationEvent
| CloudEvents Attribute | Value |
| :-------------------- | :------------------------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.branch_protection_configuration.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | Omit |
| `time` | Current time |
| `data` | Content of HTTP request body |
### BranchProtectionRuleEvent
| CloudEvents Attribute | Value |
| :-------------------- | :---------------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.branch_protection_rule.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "rule.id" value |
| `time` | "rule.updated_at" value |
| `data` | Content of HTTP request body |
### CheckRunEvent
| CloudEvents Attribute | Value |
@ -40,6 +72,21 @@ based on the specified event.
| `time` | "check_suite.updated_at" value |
| `data` | Content of HTTP request body |
### CodeScanningAlertEvent
| CloudEvents Attribute | Value |
| :-------------------- | :------------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.code_scanning_alert.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "alert.number" value |
| `time` | Current time |
| `data` | Content of HTTP request body |
### CommitCommentEvent
| CloudEvents Attribute | Value |
@ -85,6 +132,36 @@ based on the specified event.
| `time` | Current time |
| `data` | Content of HTTP request body |
### CustomPropertyEvent
| CloudEvents Attribute | Value |
| :-------------------- | :--------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.custom_property.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "definition.property_name" value |
| `time` | Current time |
| `data` | Content of HTTP request body |
### CustomPropertyValuesEvent
| CloudEvents Attribute | Value |
| :-------------------- | :---------------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.custom_property_values.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | Omit |
| `time` | Current time |
| `data` | Content of HTTP request body |
### DeleteEvent
| CloudEvents Attribute | Value |
@ -100,6 +177,21 @@ based on the specified event.
| `time` | Current time |
| `data` | Content of HTTP request body |
### DependabotAlertEvent
| CloudEvents Attribute | Value |
| :-------------------- | :---------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.dependabot_alert.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "alert.id" value |
| `time` | Current time |
| `data` | Content of HTTP request body |
### DeployKeyEvent
| CloudEvents Attribute | Value |
@ -130,6 +222,36 @@ based on the specified event.
| `time` | "deployment.updated_at" value |
| `data` | Content of HTTP request body |
### DeploymentProtectionRuleEvent
| CloudEvents Attribute | Value |
| :-------------------- | :-------------------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "deployment.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.deployment_protection_rule.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "deployment.id" value if exists |
| `time` | Current time |
| `data` | Content of HTTP request body |
### DeploymentReviewEvent
| CloudEvents Attribute | Value |
| :-------------------- | :----------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "deployment.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.deployment_review.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "workflow_run.id" value |
| `time` | Current time |
| `data` | Content of HTTP request body |
### DeploymentStatusEvent
| CloudEvents Attribute | Value |
@ -145,6 +267,36 @@ based on the specified event.
| `time` | "deployment_status.updated_at" value |
| `data` | Content of HTTP request body |
### DiscussionEvent
| CloudEvents Attribute | Value |
| :-------------------- | :---------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.discussion.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | "discussion.updated_at" value |
| `subject` | "discussion.id" value |
| `time` | Current time |
| `data` | Content of HTTP request body |
### DiscussionCommentEvent
| CloudEvents Attribute | Value |
| :-------------------- | :------------------------------------------------ |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.discussion_comment.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "discussion.id" value |
| `time` | "comment.updated_at" value |
| `data` | Content of HTTP request body |
### ForkEvent
| CloudEvents Attribute | Value |
@ -177,18 +329,18 @@ based on the specified event.
### GollumEvent
| CloudEvents Attribute | Value |
| :-------------------- | :------------------------------------------ |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.gollum.` + "pages.action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "pages.page_name" value |
| `time` | Current time |
| `data` | Content of HTTP request body |
| CloudEvents Attribute | Value |
| :-------------------- | :------------------------------------ |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.gollum` |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | Omit |
| `time` | Current time |
| `data` | Content of HTTP request body |
### InstallationEvent
@ -207,18 +359,33 @@ based on the specified event.
### InstallationRepositoryEvent
| CloudEvents Attribute | Value |
| :-------------------- | :----------------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "installation.account.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.installation_repository.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "installation.id" value |
| `time` | "installation.updated_at" value # not a timestamp?? |
| `data` | Content of HTTP request body |
| CloudEvents Attribute | Value |
| :-------------------- | :------------------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "installation.account.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.installation_repositories.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "installation.id" value |
| `time` | "installation.updated_at" value |
| `data` | Content of HTTP request body |
### InstallationTargetEvent
| CloudEvents Attribute | Value |
| :-------------------- | :------------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "installation.account.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.installation_target.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "installation.id" value |
| `time` | Current time |
| `data` | Content of HTTP request body |
### IssueCommentEvent
@ -242,7 +409,7 @@ based on the specified event.
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.issue.` + "action" value |
| `type` | `com.github.issues.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
@ -310,6 +477,21 @@ based on the specified event.
| `time` | Current time |
| `data` | Content of HTTP request body |
### MergeGroupEvent
| CloudEvents Attribute | Value |
| :-------------------- | :----------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.merge_group.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "merge_group.head_ref" value |
| `time` | Current time |
| `data` | Content of HTTP request body |
### MetaEvent
| CloudEvents Attribute | Value |
@ -370,6 +552,21 @@ based on the specified event.
| `time` | Current time |
| `data` | Content of HTTP request body |
### PackageEvent
| CloudEvents Attribute | Value |
| :-------------------- | :------------------------------------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.package.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "package.id" value |
| `time` | "package.updated_at" value, unless "null", then "package.created_at" value |
| `data` | Content of HTTP request body |
### PageBuildEvent
| CloudEvents Attribute | Value |
@ -430,6 +627,51 @@ based on the specified event.
| `time` | "project.updated_at" value |
| `data` | Content of HTTP request body |
### ProjectsV2Event
| CloudEvents Attribute | Value |
| :-------------------- | :----------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.projects_v2.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "projects_v2.id" value |
| `time` | "projects_v2.updated_at" value |
| `data` | Content of HTTP request body |
### ProjectsV2ItemEvent
| CloudEvents Attribute | Value |
| :-------------------- | :---------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.projects_v2_item.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "projects_v2_item.id" value |
| `time` | "projects_v2_item.updated_at" value |
| `data` | Content of HTTP request body |
### ProjectsV2StatusUpdateEvent
| CloudEvents Attribute | Value |
| :-------------------- | :------------------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.projects_v2_status_update.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "projects_v2_status_update.id" value |
| `time` | "projects_v2_status_update.updated_at" value |
| `data` | Content of HTTP request body |
### PublicEvent
| CloudEvents Attribute | Value |
@ -490,6 +732,21 @@ based on the specified event.
| `time` | "pull_request.updated_at" value |
| `data` | Content of HTTP request body |
### PullRequestReviewThreadEvent
| CloudEvents Attribute | Value |
| :-------------------- | :-------------------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "pull_request.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.pull_request_review_thread.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "pull_request.id" value |
| `time` | Current time |
| `data` | Content of HTTP request body |
### PushEvent
| CloudEvents Attribute | Value |
@ -550,6 +807,36 @@ based on the specified event.
| `time` | "repository.updated_at" value |
| `data` | Content of HTTP request body |
### RepositoryAdvisoryEvent
| CloudEvents Attribute | Value |
| :-------------------- | :------------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.repository_advisory.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "repository_advisory.ghsa_id" value |
| `time` | "repository_advisory.updated_at" value |
| `data` | Content of HTTP request body |
### RepositoryDispatchEvent
| CloudEvents Attribute | Value |
| :-------------------- | :------------------------------------------------ |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.owner.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.repository_dispatch` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | Omit |
| `time` | Current time |
| `data` | Content of HTTP request body |
### RepositoryImportEvent
| CloudEvents Attribute | Value |
@ -565,6 +852,21 @@ based on the specified event.
| `time` | "repository.updated_at" value |
| `data` | Content of HTTP request body |
### RepositoryRulesetEvent
| CloudEvents Attribute | Value |
| :-------------------- | :----------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.owner.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.repository_ruleset` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "repository.name" value |
| `time` | "repository.updated_at" value |
| `data` | Content of HTTP request body |
### RepositoryVulnerabilityAlertEvent
| CloudEvents Attribute | Value |
@ -580,6 +882,36 @@ based on the specified event.
| `time` | Current time # repository.updated_id ? |
| `data` | Content of HTTP request body |
### SecretScanningAlertEvent
| CloudEvents Attribute | Value |
| :-------------------- | :---------------------------------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.secret_scanning_alert.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "alert.number" value |
| `time` | "alert.updated_at" value , unless "null", then "alert.created_at" value |
| `data` | Content of HTTP request body |
### SecretScanningAlertLocationEvent
| CloudEvents Attribute | Value |
| :-------------------- | :---------------------------------------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.secret_scanning_alert_location.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "alert.number" value |
| `time` | "alert.updated_at" value , unless "null", then "alert.created_at" value |
| `data` | Content of HTTP request body |
### SecurityAdvisoryEvent
| CloudEvents Attribute | Value |
@ -595,6 +927,36 @@ based on the specified event.
| `time` | "security_advisory.updated_at" value |
| `data` | Content of HTTP request body |
### SecurityAndAnalysisEvent
| CloudEvents Attribute | Value |
| :-------------------- | :------------------------------------ |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.security_and_analysis` |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | Omit |
| `time` | Current time |
| `data` | Content of HTTP request body |
### SponsorshipEvent
| CloudEvents Attribute | Value |
| :-------------------- | :----------------------------------------- |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.sponsorship.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "sponsorship.sponsor.login" |
| `time` | Current time |
| `data` | Content of HTTP request body |
### StarEvent
| CloudEvents Attribute | Value |
@ -670,3 +1032,47 @@ based on the specified event.
| `time` | Current time |
| `data` | Content of HTTP request body |
### WorkflowDispatchEvent
| CloudEvents Attribute | Value |
| :-------------------- | :------------------------------------ |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.workflow_dispatch` |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "workflow" value |
| `time` | Current time |
| `data` | Content of HTTP request body |
### WorkflowJobEvent
| CloudEvents Attribute | Value |
| :-------------------- | :------------------------------------------ |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.workflow_job.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "workflow_job.name" value |
| `time` | Current time |
| `data` | Content of HTTP request body |
### WorkflowRunEvent
| CloudEvents Attribute | Value |
| :-------------------- | :------------------------------------------ |
| `id` | "X-GitHub-Delivery" HTTP header value |
| `source` | "repository.url" value |
| `specversion` | `1.0` |
| `type` | `com.github.workflow_run.` + "action" value |
| `datacontentencoding` | Omit |
| `datacontenttype` | `application/json` |
| `dataschema` | Omit |
| `subject` | "workflow.name" value |
| `time` | Current time |
| `data` | Content of HTTP request body |
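To make these mappings concrete, here is a non-normative sketch of the CloudEvent that a `workflow_run` delivery might produce when converted per the table above; the delivery ID, repository URL, and payload below are hypothetical:

```json
{
  "specversion": "1.0",
  "id": "72d3162e-cc78-11e3-81ab-4c9367dc0958",
  "source": "https://api.github.com/repos/octocat/hello-world",
  "type": "com.github.workflow_run.completed",
  "datacontenttype": "application/json",
  "subject": "CI",
  "time": "2025-07-17T12:00:00Z",
  "data": {
    "action": "completed",
    "workflow": { "name": "CI" },
    "workflow_run": { "id": 30433642, "status": "completed" }
  }
}
```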

cloudevents/bindings/kafka-protocol-binding.md

@ -196,7 +196,8 @@ message will represent a _tombstone_ record, as described in the
#### 3.2.3. Metadata Headers
All [CloudEvents][ce] attributes and
[CloudEvent Attributes Extensions](../primer.md#cloudevent-extension-attributes)
[CloudEvent Attributes
Extensions](../primer.md#cloudevents-extension-attributes)
with exception of `data` MUST be individually mapped to and from the Header
fields in the Kafka message. Both header keys and header values MUST be encoded
as UTF-8 strings.
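For illustration only, a binary-mode Kafka message carrying a minimal event would therefore expose headers along these lines: each attribute as a `ce_`-prefixed, UTF-8 encoded header, `datacontenttype` carried in the Kafka `content-type` header, and the event `data` in the message value. The attribute values are hypothetical and the headers are shown as a JSON map purely for readability:

```json
{
  "ce_specversion": "1.0",
  "ce_type": "com.example.someevent",
  "ce_source": "/mycontext",
  "ce_id": "1234-1234-1234",
  "content-type": "application/json"
}
```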

cloudevents/extensions/README.md

@ -42,7 +42,9 @@ for more information.
- [Auth Context](authcontext.md)
- [BAM](bam.md)
- [Data Classification](data-classification.md)
- [Dataref (Claim Check Pattern)](dataref.md)
- [Deprecation](deprecation.md)
- [Distributed Tracing](distributed-tracing.md)
- [Expiry Time](expirytime.md)
- [OPC UA](opcua.md)

cloudevents/extensions/authcontext.md

@ -46,9 +46,10 @@ this extension is being used.
### authid
- Type: `String`
- Description: A unique identifier of the principal that triggered the
occurrence. This might, for example, be a unique ID in an identity database
(userID), an email of a platform user or service account, or the label for an
API key.
occurrence. This specification makes no statement as to what this value
ought to be, however including personally identifiable information (PII)
in a CloudEvent is often considered inappropriate, so some indirect reference
(e.g. a hash or label of an API key) might be considered.
- Constraints
- OPTIONAL

cloudevents/extensions/bam.md

@ -34,7 +34,7 @@ this extension is being used.
### bamtxid (BAM Transaction ID)
- Type: `String`
- Description: A unique identifer for the instance of a transaction.
- Description: A unique identifier for the instance of a transaction.
This identifier connects the actual processing in the distributed
system (e.g. payment, invoice, warehouse) with the model of this process.
- Constraints
@ -53,7 +53,7 @@ this extension is being used.
- Constraints
- REQUIRED
- MUST be a non-empty string
- RECOMMENDED a alphanumeric string that contains non-whitespace characters
- RECOMMENDED an alphanumeric string that contains non-whitespace characters
and only hyphens, underscores, and periods.
### bamptxid (BAM Process Transaction ID)
@ -64,7 +64,7 @@ this extension is being used.
- Constraints
- REQUIRED
- MUST be a non-empty string
- RECOMMENDED a alphanumeric string that contains non-whitespace characters
- RECOMMENDED an alphanumeric string that contains non-whitespace characters
and only hyphens, underscores, and periods.
### bamptxsid (BAM Process Transaction Step ID)
@ -75,7 +75,7 @@ this extension is being used.
- Constraints
- REQUIRED
- MUST be a non-empty string
- RECOMMENDED a alphanumeric string that contains non-whitespace characters
- RECOMMENDED an alphanumeric string that contains non-whitespace characters
and only hyphens, underscores, and periods.
### bamptxsstatus (BAM Transaction Step Status)
@ -86,7 +86,7 @@ this extension is being used.
- Constraints
- OPTIONAL
- if present, MUST be a non-empty string
- RECOMMENDED a alphanumeric string that contains non-whitespace characters
- RECOMMENDED an alphanumeric string that contains non-whitespace characters
and only hyphens, underscores, and periods.
### bamptxcompleted (BAM Process Transaction Completed)
@ -95,7 +95,7 @@ this extension is being used.
- Description: Indicates if the instance of the transaction (`bamtxid`) has
actually been completed, or if the transaction has somehow failed.
This is a mechanism to indicate a final completion or failure that is
not captures by the model of the business process..
not captured by the model of the business process.
- Constraints
- OPTIONAL
- if present, MUST be a boolean value
@ -103,12 +103,12 @@ this extension is being used.
## Usage
When this extension is used, producers MUST set the value of
the `bamtxid`, `bampid`, `bamptxid`, and `bamptxid` attributes
the `bamtxid`, `bampid`, `bamptxid`, and `bamptxsid` attributes
to the unique identifiers of the business process, transaction,
and transaction step associated with the event.
Intermediaries MUST NOT change the value of the `bamtxid`,
`bampid`, `bamptxid`, and `bamptxid` attributes.
`bampid`, `bamptxid`, and `bamptxsid` attributes.
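A minimal, non-normative sketch of an event carrying these attributes might look as follows (all identifiers and the status label are hypothetical):

```json
{
  "specversion": "1.0",
  "type": "com.example.payment.processed",
  "source": "/payments",
  "id": "evt-0001",
  "bamtxid": "tx-2025-0001",
  "bampid": "order-to-cash",
  "bamptxid": "ptx-42",
  "bamptxsid": "step-payment",
  "bamptxsstatus": "completed",
  "bamptxcompleted": true,
  "data": {}
}
```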
## Use cases

View File

@ -0,0 +1,228 @@
# Correlation
This extension defines attributes for tracking occurrence relationships and
causality in distributed systems, enabling comprehensive traceability through
correlation and causation identifiers.
## Notational Conventions
As with the main [CloudEvents specification](../spec.md), the key words "MUST",
"MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT",
"RECOMMENDED", "MAY", and "OPTIONAL" in this document are to be interpreted as
described in [RFC 2119](https://tools.ietf.org/html/rfc2119).
## Attributes
### correlationid
- Type: `String`
- Description: An identifier that groups related events within the same logical
flow or business transaction. All events sharing the same correlation ID are
part of the same workflow.
- Constraints
- OPTIONAL
- If present, MUST be a non-empty string
### causationid
- Type: `String`
- Description: The unique identifier of the event that directly caused this
event to be generated. This SHOULD be the `id` value of the causing event.
- Constraints
- OPTIONAL
- If present, MUST be a non-empty string
## Usage
The Correlation extension provides two complementary mechanisms for tracking
event relationships:
1. **Correlation ID**: Groups all events that are part of the same logical flow,
regardless of their causal relationships
2. **Causation ID**: Tracks the direct parent-child relationships between events
in a causal chain
These attributes can be used independently or together, depending on the correlation
requirements of your system.
### Correlation vs Causation
Understanding the distinction between these two concepts is crucial:
- **Correlation ID** answers: "Which events are part of the same business
transaction?"
- **Causation ID** answers: "Which specific event directly triggered this
event?"
### Example Scenario
Consider an e-commerce order processing flow:
1. User initiates checkout (correlation ID: "txn-abc-123" is created)
2. Order is placed (Event A)
3. Payment is processed (Event B, caused by A)
4. Inventory is checked (Event C, caused by A)
5. Shipping is scheduled (Event D, caused by C)
6. Notification is sent (Event E, caused by D)
In this scenario:
- All events share the same `correlationid`: "txn-abc-123"
- Each event has a `causationid` pointing to its direct trigger:
- Event B and C have `causationid`: "order-123" (Event A's ID)
- Event D has `causationid`: "inventory-456" (Event C's ID)
- Event E has `causationid`: "shipping-789" (Event D's ID)
## Examples
### Example 1: Complete Correlation Chain
Initial Order Event:
```json
{
"specversion": "1.0",
"type": "com.example.order.placed",
"source": "https://example.com/orders",
"id": "order-123",
"correlationid": "txn-abc-123",
"data": {
"orderId": "123",
"customerId": "456"
}
}
```
Payment Processing (triggered by order):
```json
{
"specversion": "1.0",
"type": "com.example.payment.processed",
"source": "https://example.com/payments",
"id": "payment-789",
"correlationid": "txn-abc-123",
"causationid": "order-123",
"data": {
"amount": 150.0,
"currency": "USD"
}
}
```
Inventory Check (also triggered by order):
```json
{
"specversion": "1.0",
"type": "com.example.inventory.checked",
"source": "https://example.com/inventory",
"id": "inventory-456",
"correlationid": "txn-abc-123",
"causationid": "order-123",
"data": {
"items": ["sku-001", "sku-002"],
"available": true
}
}
```
Shipping Scheduled (triggered by inventory check):
```json
{
"specversion": "1.0",
"type": "com.example.shipping.scheduled",
"source": "https://example.com/shipping",
"id": "shipping-012",
"correlationid": "txn-abc-123",
"causationid": "inventory-456",
"data": {
"carrier": "FastShip",
"estimatedDelivery": "2024-01-15"
}
}
```
### Example 2: Error Handling with Correlation
When an error occurs, the correlation attributes help identify both the affected
transaction and the specific trigger:
```json
{
"specversion": "1.0",
"type": "com.example.payment.failed",
"source": "https://example.com/payments",
"id": "error-345",
"correlationid": "txn-abc-123",
"causationid": "payment-789",
"data": {
"error": "Insufficient funds",
"retryable": true
}
}
```
### Example 3: Fan-out Pattern
A single event can cause multiple downstream events:
```json
{
"specversion": "1.0",
"type": "com.example.order.fulfilled",
"source": "https://example.com/fulfillment",
"id": "fulfillment-567",
"correlationid": "txn-abc-123",
"causationid": "shipping-012",
"data": {
"completedAt": "2024-01-14T10:30:00Z"
}
}
```
This might trigger multiple notification events, all with the same causationid:
```json
{
"specversion": "1.0",
"type": "com.example.notification.email",
"source": "https://example.com/notifications",
"id": "notify-email-890",
"correlationid": "txn-abc-123",
"causationid": "fulfillment-567",
"data": {
"recipient": "customer@example.com",
"template": "order-fulfilled"
}
}
```
```json
{
"specversion": "1.0",
"type": "com.example.notification.sms",
"source": "https://example.com/notifications",
"id": "notify-sms-891",
"correlationid": "txn-abc-123",
"causationid": "fulfillment-567",
"data": {
"recipient": "+1234567890",
"message": "Your order has been fulfilled!"
}
}
```
## Best Practices
1. **Correlation ID Generation**: Generate correlation IDs at the entry point of
your system (e.g., API gateway, UI interaction)
2. **Causation ID Propagation**: Always set the causation ID to the `id` of the
event that directly triggered the current event
3. **Consistent Usage**: If you start using these attributes in a flow, use them
consistently throughout
4. **ID Format**: Use globally unique identifiers (e.g., UUIDs) to avoid
collisions across distributed systems
5. **Retention**: Consider the retention implications when designing queries
based on these attributes

cloudevents/extensions/data-classification.md

@ -0,0 +1,110 @@
# Data Classification Extension
CloudEvents might contain payloads which are subjected to data protection
regulations like GDPR or HIPAA. For intermediaries and consumers knowing how
event payloads are classified, which data protection regulation applies and how
payloads are categorized, enables compliant processing of events.
This extension defines attributes to describe to
[consumers](../spec.md#consumer) or [intermediaries](../spec.md#intermediary)
how an event and its payload is classified, category of the payload and any
applicable data protection regulations.
These attributes are intended for classification at an event and payload level
and not at a `data` field level. Classification at a field level is best defined
in the schema specified via the `dataschema` attribute.
## Notational Conventions
As with the main [CloudEvents specification](../spec.md), the key words "MUST",
"MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT",
"RECOMMENDED", "MAY", and "OPTIONAL" in this document are to be interpreted as
described in [RFC 2119](https://tools.ietf.org/html/rfc2119).
However, the scope of these key words is limited to when this extension is used.
For example, an attribute being marked as "REQUIRED" does not mean it needs to
be in all CloudEvents, rather it needs to be included only when this extension
is being used.
## Attributes
### dataclassification
- Type: `String`
- Description: Data classification level for the event payload within the
context of a `dataregulation`. In situations where `dataregulation` is
undefined or the data protection regulation does not define any labels, then
RECOMMENDED labels are: `public`, `internal`, `confidential`, or
`restricted`.
- Constraints:
- REQUIRED
### dataregulation
- Type: `String`
- Description: A comma-delimited list of applicable data protection regulations.
For example: `GDPR`, `HIPAA`, `PCI-DSS`, `ISO-27001`, `NIST-800-53`, `CCPA`.
- Constraints:
- OPTIONAL
- if present, MUST be a non-empty string without internal spaces. Leading and
trailing spaces around each entry MUST be ignored.
### datacategory
- Type: `String`
- Description: Data category of the event payload within the context of a
`dataregulation` and `dataclassification`. For GDPR personal data typical
labels are: `non-sensitive`, `standard`, `sensitive`, `special-category`. For
US personal data this could be: `sensitive-pii`, `non-sensitive-pii`,
`non-pii`. And for personal health information under HIPAA: `phi`.
- Constraints:
- OPTIONAL
- if present, MUST be a non-empty string
## Usage
When this extension is used, producers MUST set the value of the
`dataclassification` attribute. When applicable the `dataregulation` and
`datacategory` attributes MAY be set to provide additional details on the
classification context.
When an implementation supports this extension, then intermediaries and
consumers MUST take these attributes into account and act accordingly to data
regulations and/or internal policies in processing the event and payload. If
intermediaries or consumers cannot meet such requirements, they MUST reject and
report an error through a protocol-level mechanism.
If intermediaries or consumers are unsure on how to interpret these attributes,
for example when they encounter an unknown classification level or data
regulation, they MUST assume they cannot meet requirements and MUST reject the
event and report an error through a protocol-level mechanism.
Intermediaries SHOULD NOT modify the `dataclassification`, `dataregulation`, and
`datacategory` attributes.
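As a non-normative sketch, an event classified under GDPR might carry these attributes as follows (all values are illustrative only):

```json
{
  "specversion": "1.0",
  "type": "com.example.customer.updated",
  "source": "/crm",
  "id": "C234-1234-1234",
  "dataclassification": "confidential",
  "dataregulation": "GDPR",
  "datacategory": "sensitive",
  "data": {
    "customerId": "9876"
  }
}
```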
## Use cases
Examples where data classification of events can be useful are:
- When an event contains PII or restricted information and therefore processing
by intermediaries or consumers need to adhere to certain policies. For example
having separate processing pipelines by sensitivity or having logging,
auditing and access policies based upon classification.
- When an event payload is subjected to regulation and therefore retention
policies apply. For example, having event retention policies based upon data
classification or to enable automated data purging of durable topics.
## Appendix: Data Protection and Privacy Regulations
For reference purposes, a catalog of common data protection and privacy
regulations and abbreviations is available from [UNCTAD
(United Nations Conference on Trade and
Development)](https://unctad.org/page/data-protection-and-privacy-legislation-worldwide),
under the `DOWNLOAD FULL DATA` button ([direct
link](https://unctad.org/system/files/information-document/DP.xlsx)). Others
might exist.
Some examples include:
- `GDPR` - General Data Protection Regulation, Europe
- `HIPAA` - Health Insurance Portability and Accountability Act, United States
- `NDPR` - Nigeria Data Protection Regulation, Nigeria

cloudevents/extensions/deprecation.md

@ -0,0 +1,86 @@
# Deprecation extension
This specification defines attributes that can be included in CloudEvents to
indicate to [consumers](../spec.md#consumer) or
[intermediaries](../spec.md#intermediary) the deprecation of events. These
attributes inform CloudEvents consumers about upcoming changes or removals,
facilitating smoother transitions and proactive adjustments.
## Notational Conventions
As with the main [CloudEvents specification](../spec.md), the key words "MUST",
"MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT",
"RECOMMENDED", "MAY", and "OPTIONAL" in this document are to be interpreted as
described in [RFC 2119](https://tools.ietf.org/html/rfc2119).
However, the scope of these key words is limited to when this extension is
used. For example, an attribute being marked as "REQUIRED" does not mean it
needs to be in all CloudEvents, rather it needs to be included only when this
extension is being used.
## Attributes
### deprecated
- Type: `Boolean`
- Description: Indicates whether the event type is deprecated.
- Constraints
- MUST be `true`
- REQUIRED
- Example: `"deprecated": true`
### deprecationfrom
- Type: `Timestamp`
- Description: Specifies the date and time when the event type was
officially marked as deprecated.
- Constraints
- OPTIONAL
- The `deprecationfrom` timestamp SHOULD remain stable once set and SHOULD
reflect a point in the past or present. Pre-announcing deprecation by
setting a future date is not encouraged.
- Example: `"deprecationfrom": "2024-10-11T00:00:00Z"`
### deprecationsunset
- Type: `Timestamp`
- Description: Specifies the future date and time when the event type will
become unsupported.
- Constraints
- OPTIONAL
- The timestamp MUST be later than or the same as the one given in the
`deprecationfrom` field, if present. It MAY be extended to a later date but
MUST NOT be shortened once set.
- Example: `"deprecationsunset": "2024-11-12T00:00:00Z"`
### deprecationmigration
- Type: `URI`
- Description: Provides a link to documentation or resources that describe
the migration path from the deprecated event to an alternative. This helps
consumers transition away from the deprecated event.
- Constraints
- OPTIONAL
- The URI SHOULD point to a valid and accessible resource that helps
consumers understand what SHOULD replace the deprecated event.
- Example: `"deprecationmigration": "https://example.com/migrate-to-new-evt"`
## Usage
When this extension is used, producers MUST set the value of the `deprecated`
attribute to `true`. This signals to consumers that they SHOULD begin
migrating to a new event or version.
Consumers SHOULD make efforts to switch to the suggested replacement before the
specified `deprecationsunset` timestamp. It is advisable to begin transitioning
as soon as the event is marked as deprecated to ensure a smooth migration and
avoid potential disruptions after the sunset date.
If an event is received after the `deprecationsunset` timestamp, consumers
SHOULD choose to stop processing such events, especially if unsupported events
can cause downstream issues.
Producers SHOULD stop emitting deprecated events after the `deprecationsunset`
timestamp. They SHOULD also provide detailed documentation via the
`deprecationmigration` attribute to guide consumers toward the correct replacement
event.
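Putting the attributes together, a minimal, non-normative sketch of a deprecated CloudEvent in the JSON format might look as follows; the `type`, `source`, and `id` values are illustrative only, and the deprecation values reuse the examples given above.
```json
{
  "specversion": "1.0",
  "type": "com.example.order.created",
  "source": "https://example.com/orders",
  "id": "7a8c2e14-0f5d-4b8e-9a3f-6c1d2e3f4a5b",
  "time": "2024-10-15T12:00:00Z",
  "deprecated": true,
  "deprecationfrom": "2024-10-11T00:00:00Z",
  "deprecationsunset": "2024-11-12T00:00:00Z",
  "deprecationmigration": "https://example.com/migrate-to-new-evt",
  "data": {}
}
```
A consumer receiving such an event knows it has until the `deprecationsunset` timestamp to move to the replacement described at the `deprecationmigration` URI.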

View File

@ -1,11 +1,28 @@
# Distributed Tracing extension
This extension embeds context from
[Distributed Tracing](https://w3c.github.io/trace-context/) so that distributed
systems can include traces that span an event-driven system. This extension is
meant to contain historical data of the parent trace, in order to diagnose
eventual failures of the system through tracing platforms like Jaeger, Zipkin,
etc.
[W3C TraceContext](https://www.w3.org/TR/trace-context/) into a CloudEvent.
The goal of this extension is to offer a means to carry context when
instrumenting CloudEvents-based systems with OpenTelemetry.
The [OpenTelemetry](https://opentelemetry.io/) project is a collection
of tools, APIs and SDKs that can be used to instrument, generate, collect,
and export telemetry data (metrics, logs, and traces) to help you
analyze your software's performance and behavior.
The OpenTelemetry specification defines both
[Context](https://github.com/open-telemetry/opentelemetry-specification/blob/v1.8.0/specification/context/context.md#overview)
and
[Distributed Tracing](https://github.com/open-telemetry/opentelemetry-specification/blob/v1.8.0/specification/overview.md#tracing-signal)
as:
> A `Context` is a propagation mechanism which carries execution-scoped values across
API boundaries and between logically associated execution units. Cross-cutting
concerns access their data in-process using the same shared `Context` object.
>
> A `Distributed Trace` is a set of events, triggered as a result of a single
logical operation, consolidated across various components of an application.
A distributed trace contains events that cross process, network and security boundaries.
## Notational Conventions
@ -16,12 +33,12 @@ described in [RFC 2119](https://tools.ietf.org/html/rfc2119).
However, the scope of these key words is limited to when this extension is
used. For example, an attribute being marked as "REQUIRED" does not mean
it needs to be in all CloudEvents, rather it needs to be included only when
it needs to be in all CloudEvents, rather it needs to be included only when
this extension is being used.
## Attributes
#### traceparent
### traceparent
- Type: `String`
- Description: Contains a version, trace ID, span ID, and trace options as
@ -29,7 +46,7 @@ this extension is being used.
- Constraints
- REQUIRED
#### tracestate
### tracestate
- Type: `String`
- Description: a comma-delimited list of key-value pairs, defined by
@ -50,6 +67,11 @@ carry the trace information of the starting trace of the transmission.
In other words, it MUST NOT carry trace information of each individual hop, since this information is usually
carried using protocol-specific headers, understood by tools like [OpenTelemetry](https://opentelemetry.io/).
The
[OpenTelemetry Semantic Conventions for CloudEvents](https://opentelemetry.io/docs/specs/semconv/cloudevents/cloudevents-spans/)
define the trace structure to follow when instrumenting CloudEvents systems,
the scenarios in which this extension can be used, and how to use it to achieve said structure.
Middleware between the source and the sink of the event could add a Distributed Tracing Extension
if the source didn't include one, in order to provide the sink with the starting trace of the transmission.
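As a non-normative illustration, a CloudEvent carrying this extension in the JSON format might look like the sketch below. The `traceparent` value follows the W3C Trace Context `version-trace-id-parent-id-trace-flags` layout, and all identifiers (as well as the `tracestate` entries) are made-up sample values.
```json
{
  "specversion": "1.0",
  "type": "com.example.someevent",
  "source": "https://example.com/storage/tenant/container",
  "id": "1234-1234-1234",
  "traceparent": "00-4bf92f3577b34da6a3ce929d0e0e4736-00f067aa0ba902b7-01",
  "tracestate": "rojo=00f067aa0ba902b7,congo=t61rcWkgMzE"
}
```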

View File

@ -169,7 +169,7 @@ The following table shows exemplary mappings:
| datacontenttype | string | `application/octet-stream` |
| dataschema | string | `http://registry.com/schema/v1/much.json` |
| subject | string | `mynewfile.jpg` |
| time | long | `2019-06-05T23:45:00Z` |
| time | string | `2019-06-05T23:45:00Z` |
| data | bytes | `[bytes]` |
## References

View File

@ -186,7 +186,7 @@ Reaching the delivery agreement is realized using the following validation
handshake. The handshake can either be executed immediately at registration time
or as a "pre-flight" request immediately preceding a delivery.
It is important to understand is that the handshake does not aim to establish an
It is important to understand that the handshake does not aim to establish an
authentication or authorization context. It only serves to protect the sender
from being told to push to a destination that is not expecting the traffic.
While this specification mandates use of an authorization model, this mandate is

View File

@ -1,2 +1,2 @@
# CloudEvents Adapters
This document has not yet been translated. Please use [the English version of the document](../../adapters.md) in the meantime.
This document has not yet been translated. Please use [the English version of the document](../../../adapters/README.md) in the meantime.

View File

@ -0,0 +1,7 @@
# Correlation
This document has not yet been translated. Please use [the English version of the document](../../../extensions/correlation.md) in the meantime.
If you would like to propose a Hebrew translation, you can do so by [opening an issue](https://github.com/cloudevents/spec/issues) on GitHub.
We ask that you propose translations into Hebrew only, not translations into English.

View File

@ -0,0 +1,2 @@
# Data Classification Extension
This document has not yet been translated. Please use [the English version of the document](../../../extensions/data-classification.md) in the meantime.

View File

@ -0,0 +1,2 @@
# Deprecation extension
This document has not yet been translated. Please use [the English version of the document](../../../extensions/deprecation.md) in the meantime.

View File

@ -22,7 +22,7 @@
| /cloudevents/formats/json-format.md | Ready to start | | |
| /cloudevents/formats/protobuf-format.md | Ready to start | | |
| /cloudevents/working-drafts/xml-format.md | Ready to start | | |
| /cloudevents/adapters.md | Ready to start | | |
| /cloudevents/adapters/README.md | Ready to start | | |
| /cloudevents/http-webhook.md | Ready to start | | |
| /cloudevents/primer.md | Ready to start | | |
| /cloudevents/proprietary-specs.md | Ready to start | | |

View File

@ -0,0 +1,2 @@
# CloudEvents Version 2 Consideration List
This document has not yet been translated. Please use [the English version of the document](../../v2.md) in the meantime.

View File

@ -0,0 +1,2 @@
# AsyncAPI With CloudEvents - Version 1.0.3-wip
This document has not yet been translated. Please use [the English version of the document](../../../working-drafts/asyncapi.md) in the meantime.

View File

@ -2,6 +2,6 @@
Since not all event producers natively produce events in the CloudEvents format, "adapters" are needed to convert these non-CloudEvents events into CloudEvents. Completing this conversion usually requires extracting the metadata of the non-CloudEvents event and using it as CloudEvents attributes. To better improve interoperability between different adapter implementations, the following documents list the recommended algorithms:
- [AWS S3](../../adapters/aws-s3.md)
- [GitHub](../../adapters/github.md)
- [GitLab](../../adapters/gitlab.md)
- [AWS S3](../../../adapters/aws-s3.md)
- [GitHub](../../../adapters/github.md)
- [GitLab](../../../adapters/gitlab.md)

View File

@ -0,0 +1,6 @@
# Correlation
This document has not yet been translated; please read the English [original document](../../../extensions/correlation.md) for now.
If you urgently need a Chinese translation of this document, please [submit an issue](https://github.com/cloudevents/spec/issues)
and we will arrange for someone to translate it as soon as possible.

View File

@ -0,0 +1,6 @@
# Data Classification Extension
This document has not yet been translated; please read the English [original document](../../../extensions/data-classification.md) for now.
If you urgently need a Chinese translation of this document, please [submit an issue](https://github.com/cloudevents/spec/issues)
and we will arrange for someone to translate it as soon as possible.

View File

@ -0,0 +1,6 @@
# Deprecation extension
This document has not yet been translated; please read the English [original document](../../../extensions/deprecation.md) for now.
If you urgently need a Chinese translation of this document, please [submit an issue](https://github.com/cloudevents/spec/issues)
and we will arrange for someone to translate it as soon as possible.

View File

@ -99,7 +99,7 @@ CloudEvents 的核心规范中定义了一组称之为属性的元数据,
It is then up to the event producer to define the CloudEvents attribute values to be used, just like any other event it might generate.
Since not all event producers publish their events in the form of CloudEvents,
we have defined a set of [adapters](../../adapters.md)
we have defined a set of [adapters](../../adapters/README.md)
to show how to map events from some popular event producers to CloudEvents.
These adapters are non-normative,
but they are the specification authors' best guess as to how the CloudEvents attributes would be populated when those producers' natively generated events are mapped to CloudEvents.

View File

@ -22,7 +22,7 @@
| /cloudevents/formats/json-format.md | Ready to start | | |
| /cloudevents/formats/protobuf-format.md | Ready to start | | |
| /cloudevents/working-drafts/xml-format.md | Ready to start | | |
| /cloudevents/adapters.md | Ready to start | | |
| /cloudevents/adapters/README.md | Ready to start | | |
| /cloudevents/http-webhook.md | Ready to start | | |
| /cloudevents/primer.md | Ready to start | | |
| /cloudevents/proprietary-specs.md | Ready to start | | |

View File

@ -0,0 +1,6 @@
# CloudEvents Version 2 Consideration List
This document has not yet been translated; please read the English [original document](../../v2.md) for now.
If you urgently need a Chinese translation of this document, please [submit an issue](https://github.com/cloudevents/spec/issues)
and we will arrange for someone to translate it as soon as possible.

View File

@ -0,0 +1,6 @@
# AsyncAPI With CloudEvents - Version 1.0.3-wip
This document has not yet been translated; please read the English [original document](../../../working-drafts/asyncapi.md) for now.
If you urgently need a Chinese translation of this document, please [submit an issue](https://github.com/cloudevents/spec/issues)
and we will arrange for someone to translate it as soon as possible.

View File

@ -5,10 +5,10 @@
## Abstract
This non-normative document provides an overview of the CloudEvents
specification. It is meant to complement the CloudEvent specification to provide
additional background and insight into the history and design decisions made
during the development of the specification. This allows the specification
itself to focus on the normative technical details.
specification. It is meant to complement the CloudEvents specification to
provide additional background and insight into the history and design
decisions made during the development of the specification. This allows the
specification itself to focus on the normative technical details.
## Table of Contents
@ -17,8 +17,9 @@ itself to focus on the normative technical details.
- [Design Goals](#design-goals)
- [Architecture](#architecture)
- [Versioning of CloudEvents](#versioning-of-cloudevents)
- [CloudEvent Core Attributes](#cloudevent-core-attributes)
- [CloudEvent Extension Attributes](#cloudevent-extension-attributes)
- [CloudEvents Core Attributes](#cloudevents-core-attributes)
- [CloudEvents Extension Attributes](#cloudevents-extension-attributes)
- [CloudEvents with Security](#cloudevents-with-security)
- [Creating CloudEvents](#creating-cloudevents)
- [Qualifying Protocols and Encodings](#qualifying-protocols-and-encodings)
- [Proprietary Protocols and Encodings](#proprietary-protocols-and-encodings)
@ -137,7 +138,7 @@ It would then be up to the event producer to define the CloudEvents attribute
values that would be used, just like any other event it might generate.
Since not all event producers generate their events as CloudEvents, there are
a set of [adapters](./adapters.md) defined that show how to map events from
a set of [adapters](./adapters/README.md) defined that show how to map events from
some popular event producers into CloudEvents. These adapters are non-normative
but are the specification authors' best guess as to how the CloudEvents
attribute would be populated if the event producer produced them natively.
@ -197,7 +198,7 @@ by the components implemented by the implementor of the specification
themselves. However, if the community observes a pattern in the usage of certain
extension attributes as a standard way to deal with a topic such as data
integrity, then such extension attributes can be declared as official
extensions to the CloudEvent specification.
extensions to the CloudEvents specification.
## Architecture
@ -317,10 +318,10 @@ When a CloudEvent's data changes in a backwardly-incompatible way,
the value of the `dataschema` attribute should generally change,
along with the `type` attribute as described above.
## CloudEvent Core Attributes
## CloudEvents Core Attributes
This section provides additional background and design points related to some of
the CloudEvent core attributes.
the CloudEvents core attributes.
### id
@ -346,11 +347,11 @@ then some additional data within the CloudEvent would be used for that purpose.
In this respect, while the exact value chosen by the event producer might be
some random string, or a string that has some semantic meaning in some other
context, for the purposes of this CloudEvent attribute those meanings are not
context, for the purposes of this CloudEvents attribute those meanings are not
relevant and therefore using `id` for some other purpose beyond uniqueness
checking is out of scope of the specification and not recommended.
## CloudEvent Extension Attributes
## CloudEvents Extension Attributes
In order to achieve the stated goals, the specification authors will attempt to
constrain the number of metadata attributes they define in CloudEvents. To that
@ -380,7 +381,7 @@ be included at all, the group uses use-cases and user-stories to explain the
rationale and need for them. This supporting information will be added to the
[Prior Art](#prior-art) section of this document.
Extension attributes to the CloudEvent specification are meant to be additional
Extension attributes to the CloudEvents specification are meant to be additional
metadata that needs to be included to help ensure proper routing and processing
of the CloudEvent. Additional metadata for other purposes, that is related to
the event itself and not needed in the transportation or processing of the
@ -448,6 +449,95 @@ serialization for unknown, or even new, properties. It was also noted that the
HTTP specification is now following a similar pattern by no longer suggesting
that extension HTTP headers be prefixed with `X-`.
## CloudEvents with Security
The core CloudEvents specification purposely does not address security beyond
suggesting that preexisting security mechanisms should be used, for example,
the use of TLS when using HTTP. The CloudEvents specification authors did not
see the need to invent something new; rather, composition with existing
technologies that are already being used seemed more appropriate.
With that in mind, below are a few CloudEvent serializations that are composed
with popular encryption technologies to give non-normative examples of how
security may be layered on top of a CloudEvent:
- a binary CloudEvent composed with the
[JOSE](https://datatracker.ietf.org/doc/rfc7516) specification being sent
over HTTP (line-breaks are added for display purposes only):
```http
POST /receiver HTTP/1.1
Host: example.com
Content-Type: application/jose
ce-specversion: 1.0
ce-type: PAYMENT.AUTHORIZATION.CREATED
ce-source: https://paymentprocessor.example.com/
ce-subject: c7bbb040-d458-4d47-82a8-45413f9f2d33
ce-id: a978702e-ef48-4032-ac18-a057e0104076
ce-time: 2024-05-30T17:31:00Z
eyJhbGciOiJSU0EtT0FFUCIsImVuYyI6IkEyNTZHQ00ifQ.OKOawDo13gRp2ojaH
V7LFpZcgV7T6DVZKTyKOMTYUmKoTCVJRgckCL9kiMT03JGeipsEdY3mx_etLbbWS
rFr05kLzcSr4qKAq7YN7e9jwQRb23nfa6c9d-StnImGyFDbSv04uVuxIp5Zms1gN
xKKK2Da14B8S4rzVRltdYwam_lDp5XnZAYpQdb76FdIKLaVmqgfwX7XWRxv2322i
-vDxRfqNzo_tETKzpVLzfiwQyeyPGLBIO56YJ7eObdv0je81860ppamavo35UgoR
dbYaBcoh9QcfylQr66oc6vFWXRcZ_ZT2LawVCWTIy3brGPi6UklfCpIMfIjf7iGd
XKHzg.48V1_ALb6US04U3b.5eym8TW_c8SuK0ltJ3rpYIzOeDQz7TALvtu6UG9oM
o4vpzs9tX_EFShS8iB7j6jiSdiwkIr3ajwQzaBtQD_A.XFBoMYUZodetZdvTiFvS
kQ
```
- an XML serialization of a structured CloudEvent composed with the
[XML Signature](https://www.w3.org/TR/xmldsig-core1/) specification
(line-breaks are added for display purposes only):
```xml
<?xml version="1.0" encoding="UTF-8"?>
<event xmlns="http://cloudevents.io/xmlformat/V1"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:xs="http://www.w3.org/2001/XMLSchema" specversion="1.0" >
<time>2020-03-19T12:54:00-07:00</time>
<datacontenttype>application/xml</datacontenttype>
<id>000-1111-2222</id>
<source>urn:uuid:123e4567-e89b-12d3-a456-426614174000</source>
<type>SOME.EVENT.TYPE</type>
<data xsi:type="xs:any" xml:id="data">
<geo:Location xmlns:geo="http://someauthority.example/">
<geo:Latitude>51.509865</geo:Latitude>
<geo:Longitude>-0.118092</geo:Longitude>
</geo:Location>
</data>
<!-- End of CloudEvent, below is the signature info. -->
<!-- The Values are examples, and not necessarily accurate for 'data'. -->
<dsig:Signature xmlns:dsig="http://www.w3.org/2000/09/xmldsig#">
<dsig:SignedInfo>
<dsig:CanonicalizationMethod
Algorithm="http://www.w3.org/TR/2001/REC-xml-c14n-20010315"/>
<dsig:SignatureMethod
Algorithm="http://www.w3.org/2000/09/xmldsig#rsa-sha1"/>
<dsig:Reference URI="#data">
<dsig:Transforms>
<dsig:Transform
Algorithm="http://www.w3.org/2000/09/xmldsig#enveloped-signature"/>
<dsig:Transform Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#"/>
</dsig:Transforms>
<dsig:DigestMethod Algorithm="http://www.w3.org/2000/09/xmldsig#sha1"/>
<dsig:DigestValue>DCDNxibEA3BHpFMtzvj7hxd7p5A=</dsig:DigestValue>
</dsig:Reference>
</dsig:SignedInfo>
<dsig:SignatureValue>
jocgUrZPKR8jvery4gG4V34qx7/yxOESPJq//iS3Q5Ps7lPADNBEVK4Y50HIdrkodcY
LZjBkvuGMT89nTeT24W/Dw/XEeMWXRmy/Mj1/rza8JMaP46F+2MZ6tlGWlyA2tRZNEx
e5TPA8Wo6jTSN3KX3aLoLkwRsLBt50Zr8zz8xFtadZciNWnsD6y/UgQzNYfLovMw54A
HGk+5FzRWMgwtTseISWxSF+9zsgiQStrrXzy1SaRycQTAjz4PF6HebGWJcECLa+r/iL
tigbTmgL3Mj7mkmw90M3mNncqZKBFmjNxTZCPiMQHbSvTgOBe8REwCrclHJkyYP14Ns
xEg6LZQ==
</dsig:SignatureValue>
</dsig:Signature>
</event>
```
## Creating CloudEvents
The CloudEvents specification purposely avoids being too prescriptive about how
@ -614,7 +704,7 @@ links to all specs.
## Prior Art
This section describes some of the input material used by the group during the
development of the CloudEvent specification.
development of the CloudEvents specification.
### Roles
@ -808,7 +898,7 @@ and this help event consumers safely work with event data as it evolves.
#### Normalizing Webhooks
Webhooks is a style of event publishing which does not use a common format.
Webhooks is a style of event publishing that does not use a common format.
Consumers of webhooks don't have a consistent way to develop, test, identify,
validate, and overall process event data delivered via webhooks.

View File

@ -8,6 +8,7 @@ the responsibility of the respective project maintainers.
- [Apache RocketMQ Transport Binding](https://github.com/apache/rocketmq-externals/blob/master/rocketmq-cloudevents-binding/rocketmq-transport-binding.md)
- [Google Cloud Pub/Sub Protocol Binding](https://github.com/googleapis/google-cloudevents/blob/main/docs/spec/pubsub.md)
- [RabbitMQ Transport Binding](https://github.com/knative-extensions/eventing-rabbitmq/blob/main/cloudevents-protocol-spec/spec.md)
**Want to add a binding to a proprietary transport?**

View File

@ -503,7 +503,7 @@ messages if the copied values differ from the cloud-event serialized values.
#### Defining Extensions
See
[CloudEvent Attributes Extensions](primer.md#cloudevent-extension-attributes)
[CloudEvent Attributes Extensions](primer.md#cloudevents-extension-attributes)
for additional information concerning the use and definition of extensions.
The definition of an extension SHOULD fully define all aspects of the
@ -610,6 +610,10 @@ Consider the following to prevent inadvertent leakage especially when leveraging
Protocol level security SHOULD be employed to ensure the trusted and secure
exchange of CloudEvents.
See the [CloudEvents Primer](primer.md#cloudevents-with-security) for more
information about how existing security mechanisms can be used with
CloudEvents.
## Example
The following example shows a CloudEvent serialized as JSON:

cloudevents/v2.md Normal file
View File

@ -0,0 +1,16 @@
# CloudEvents Version 2 Consideration List
<!-- no verify-specs -->
The list below contains the set of changes that the group has identified as
potential candidates for a v2 of the CloudEvents specification. There is no
guarantee as to when, or if, a v2 might happen, or that any of these will be
adopted. Rather, this list is just a reminder to consider them so that we can
reduce the number of open issues and PRs on our backlog.
- [Do we want a new optional "mustunderstand" core attribute](https://github.com/cloudevents/spec/issues/1321)
- [Why does the HTTP binding for batch require the same spec version for all events in the batch?](https://github.com/cloudevents/spec/issues/807)
- [handling of datacontenttype is inconsistent](https://github.com/cloudevents/spec/issues/558)
- [fix(protobuf)!: Expose CloudEvent structure with explicit fields](https://github.com/cloudevents/spec/pull/1354)
- [The case for minor version](https://github.com/cloudevents/spec/pull/1032)

View File

@ -0,0 +1,39 @@
# yaml-language-server: $schema=https://asyncapi.com/schema-store/3.0.0-without-$id.json
asyncapi: 3.0.0
info:
title: Light Switch Events With CloudEvents as Headers (Binary Mode)
version: 1.0.0
description: Informs about light switch changes.
operations:
onOfficeLightSwitchChanged:
title: Office light switch was triggered
channel:
$ref: '#/channels/officeLightSwitchChanged'
action: receive
channels:
officeLightSwitchChanged:
address: 'lightswitch.office.changed'
title: Office light switch changes
messages:
lightSwitchChanged:
$ref: '#/components/messages/lightSwitchChanged'
components:
messages:
lightSwitchChanged:
description: Light switch was triggered event with CloudEvents headers
traits:
- $ref: 'https://raw.githubusercontent.com/cloudevents/spec/main/cloudevents/working-drafts/asyncapi-traits/cloudevents-headers-kafka-binary.yaml'
payload:
type: object
properties:
lightSwitchId:
type: integer
examples:
- 1
position:
type: string
enum:
- ON
- OFF

View File

@ -0,0 +1,42 @@
# yaml-language-server: $schema=https://asyncapi.com/schema-store/3.0.0-without-$id.json
asyncapi: 3.0.0
info:
title: Light Switch Events With CloudEvents (Structured Mode)
version: 1.0.0
description: Informs about light switch changes.
operations:
onOfficeLightSwitchChanged:
title: Office light switch was triggered
channel:
$ref: '#/channels/officeLightSwitchChanged'
action: receive
channels:
officeLightSwitchChanged:
address: 'lightswitch.office.changed'
title: Office light switch changes
messages:
lightSwitchChanged:
$ref: '#/components/messages/lightSwitchChanged'
components:
messages:
lightSwitchChanged:
description: Light switch was triggered event with CloudEvents headers
payload:
type: object
allOf:
- $ref: 'https://raw.githubusercontent.com/cloudevents/spec/v1.0.2/cloudevents/formats/cloudevents.json'
properties:
data:
type: object
properties:
lightSwitchId:
type: integer
examples:
- 1
position:
type: string
enum:
- ON
- OFF

View File

@ -0,0 +1,56 @@
name: cloudevents-headers-kafka-binary
summary: Message headers for CloudEvents in binary content mode with Kafka (see https://github.com/cloudevents/spec/blob/v1.0.2/cloudevents/bindings/kafka-protocol-binding.md)
headers:
type: object
required:
- ce_id
- ce_source
- ce_specversion
- ce_type
properties:
ce_id:
type: string
minLength: 1
description: Identifies the event.
examples:
- "1234-1234-1234"
ce_source:
type: string
format: uri-reference
minLength: 1
description: Identifies the context in which an event happened.
examples:
- "https://example.com/storage/tenant/container"
ce_specversion:
type: string
description: The version of the CloudEvents specification which the event uses.
enum:
- "1.0"
ce_type:
type: string
minLength: 1
description: Describes the type of event related to the originating occurrence.
examples:
- "com.example.someevent"
content-type:
type: string
description: Kafka default field describing the content type of the data. Must be mapped directly to the CloudEvents datacontenttype attribute.
examples:
- "application/avro"
- "application/json;charset=utf-8"
ce_dataschema:
type: string
description: Identifies the schema that data adheres to.
examples:
- "http://registry.com/schema/v1/much.json"
ce_subject:
type: string
description: Describes the subject of the event in the context of the event producer (identified by source)
examples:
- mynewfile.jpg
ce_time:
type: string
format: date-time
description: Timestamp of when the occurrence happened. Must adhere to RFC 3339.
examples:
- "2018-04-05T17:31:00Z"

View File

@ -0,0 +1,67 @@
# AsyncAPI With CloudEvents - Version 1.0.3-wip
## Purpose
Asynchronous APIs, e.g., events, can be specified in AsyncAPI, similar to how
RESTful APIs can be specified in [OpenAPI](https://swagger.io/specification/).
When defining new events in an API-first approach, it can be hard to add
CloudEvents headers or fields according to the spec, which makes following the
standard harder. This document clarifies how CloudEvents headers can be
specified in AsyncAPI.
## Usage
Depending on the protocol and the mode (binary/structured), the inclusion of the
CloudEvents fields varies.
## Structured Mode
In structured mode, the entire event, attributes, and data are encoded in the
message body. When using structured mode, the usage varies only with the
serialization format:
| Format | Example                                                                 | Include                                  |
| ------ | ----------------------------------------------------------------------- | ---------------------------------------- |
| JSON   | [Short Example](#json-example) [Full Example](./asyncapi-examples/light-switch-events-structured-json.yaml) | [Reference](../formats/cloudevents.json) |
### JSON Example
To add CloudEvents in structured mode, the following `allOf` reference needs to
be added:
```yaml
components:
messages:
messageKey:
payload:
type: object
allOf:
- $ref: 'https://raw.githubusercontent.com/cloudevents/spec/v1.0.2/cloudevents/formats/cloudevents.json'
```
See also: [Full Example](./asyncapi-examples/light-switch-events-structured-json.yaml)
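For orientation, a message that validates against the schema above is simply a structured-mode CloudEvent in JSON. A minimal, non-normative sketch (with made-up `type`, `source`, and `id` values) could look like:
```json
{
  "specversion": "1.0",
  "type": "com.example.lightswitch.changed",
  "source": "https://example.com/lightswitch/office",
  "id": "1234-1234-1234",
  "datacontenttype": "application/json",
  "data": {
    "lightSwitchId": 1,
    "position": "ON"
  }
}
```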
## Binary Mode
In binary mode, protocol-specific bindings map the fields to the protocol's
content-type metadata property or headers; therefore, the AsyncAPI definition
depends on the protocol:
| Protocol Binding                               | Example                                                              | Trait                                                            |
| ---------------------------------------------- | -------------------------------------------------------------------- | ---------------------------------------------------------------- |
| [Kafka](../bindings/kafka-protocol-binding.md) | [Short Example](#avro-example) [Full Example](./asyncapi-examples/light-switch-events-binary-kafka.yaml) | [Trait](./asyncapi-traits/cloudevents-headers-kafka-binary.yaml) |
### Avro Example
To add CloudEvents in binary mode, the following `traits` reference needs to
be added:
```yaml
components:
messages:
messageKey:
traits:
- $ref: 'https://raw.githubusercontent.com/cloudevents/spec/main/cloudevents/working-drafts/asyncapi-traits/cloudevents-headers-kafka-binary.yaml'
```
See also: [Full Example](./asyncapi-examples/light-switch-events-binary-kafka.yaml)

View File

@ -5,15 +5,25 @@
This document describes the governance process under which the CloudEvents
project will manage this repository.
## Table of Contents & References
- [Meetings](#meetings)
- [Membership](#membership)
- [Admins](#admins)
- [PRs](#prs)
- [Voting](#voting)
- [Release Process and Versioning](#release-process-and-versioning)
- [Additional Information](#additional-information)
For easy reference, additional documentation related to how this project,
and its subprojects, operate is listed below:
- [Contributing](CONTRIBUTING.md)
- [List of contributors to the project](contributors.md)
- [Project Releases](RELEASES.md)
- [Project Roadmap](ROADMAP.md)
- [SDK Governance](SDK-GOVERNANCE.md)
- [SDK Maintainer Guidelines](SDK-maintainer-guidelines.md)
- [SDK PR Guidelines](SDK-PR-guidelines.md)
- [Contributing](CONTRIBUTING.md)
- [List of contributors to the project](contributors.md)
- [Project Releases](RELEASES.md)
- [Project Roadmap](ROADMAP.md)
- [SDK Governance](SDK-GOVERNANCE.md)
- [SDK Maintainer Guidelines](SDK-maintainer-guidelines.md)
- [SDK PR Guidelines](SDK-PR-guidelines.md)
## Meetings
@ -45,7 +55,7 @@ be followed:
There are three categories of project membership:
1. **Member.** This is anyone who participates in the group's activities in any
of our communication channels (email, github issues/PRs, meetings, etc.). No
of our communication channels (email, GitHub issues/PRs, meetings, etc.). No
formal registration process is needed.
2. **Voting Member.** See the [Voting](#voting) section below for more
@ -56,7 +66,7 @@ There are three categories of project membership:
3. **Admin.** Admins are Members of the group but have the ability to perform
administrative actions on behalf of the group. For example, manage the
website, github repos and moderate the meetings. Their actions should be done
website, GitHub repos and moderate the meetings. Their actions should be done
with the knowledge and consent of the group. They also have the ability to
merge/close PRs, but only per the group's approval. See the
[OWNERS](../OWNERS) file for the current list of Admins.
@ -137,20 +147,19 @@ If a vote is taken, the follow rules will be followed:
## Release Process and Versioning
### Versioning
The specifications produced will adhere to the following:
- The versioning scheme used will follow [semver](https://semver.org/)
- All normative specifications, and the Primer, will be grouped together into a
single logical unit and released at the same time, at the same version number.
This is true regardless of whether each individual document actually changed
during the release cycle.
- When a new release of a specification is ready, it will be given a version
number matching the appropriate semver version string but with a suffix of
`-rc#` (release candidate). This will indicate that the authors believe it
is ready for final release but it needs to go through a testing period to
allow for broader testing before it promoted to its final version number.
This will be true for updates to existing specifications and for new
specifications.
- The versioning scheme will follow [semver](https://semver.org/) for the
version number part of the version string.
- Specifications will be grouped into logical units, with all documents in a
group released at the same time, with the same version number. This is true
regardless of whether each individual document actually changed during the
release cycle. The determination of the number of groups, and which document
belongs in a group, can change over time.
- Since changing the CloudEvents `specversion` string could have a significant
impact on implementations, all non-breaking changes will be made as
"patch" version updates - this allows for the value "on the wire" to remain
@ -159,53 +168,95 @@ The specifications produced will adhere to the following:
that the "minor" version number will always be zero and the `specversion`
string will always be of the form `X.0`.
Note that these rules do not apply to the
- Each release will have both a tag and a branch. The tag will be kept
up-to-date with the tip of the branch. The purpose of having a branch is to
support very minor fixes (typos, clarifications) which amend a release in
place. The purpose of having a tag is to support GitHub releases, which can
act as a notification channel for interested users.
- Naming will adhere to the following pattern:
- Release Name: `SUBJECT@vX.Y.Z`
- Release Candidate Tag Name: `SUBJECT@vX.Y.Z-rc#`
- Release Branch Name: `SUBJECT@vX.Y.Z-branch`
- Release Tag Name: `SUBJECT@vX.Y.Z`
Note that these rules do not apply to unversioned documents, such as the
[documented extensions](../cloudevents/extensions/README.md).
All versions are tagged from the `main` branch, but the tag only applies to
the "subject" of the release - the directory containing the information
covered by that release (e.g. `subscriptions` or `cloudevents`). The
[CloudEvents web site](https://cloudevents.io/) takes appropriate content from
each tagged version. (If the directory containing the information covered
by the release is not in a top-level directory, the subject should be the full path,
e.g. `top-dir/sub-dir`.)
> Note: should the need arise, additional branches may be created. For example,
> it is likely that a `core-v2.0` branch will be created to collect changes for
> the core specification version 2.0 significantly before those changes are
> merged into the main branch, to allow for ongoing work on the main branch.
> Such branches should be deleted once their content is eventually merged.
To create a new release:
### Creating A New Release
- Periodically the group will examine the list of extensions to determine
if any action should be taken (e.g. removed due to it being stale). The
creation of a new release will be the reminder to do this check. If any
changes are needed then PRs will be created and reviewed by the group.
- Create a PR that modifies the [README](README.md), and all specifications (ie.
\*.md files) that include a version string, to the new release version string.
Make sure to remove `-wip` from all of the version strings.
- Merge the PR.
- Create a [new release](https://github.com/cloudevents/spec/releases/new):
- Choose a "Tag version" of the form: `<subject>/vX.Y.Z`, e.g.
`cloudevents/v1.0.4` or `subscriptions/v1.0.0`
- Target should be `main`, the default value
- Release title should be the same as the Tag - `<subject>/vX.Y.Z`
- Add some descriptive text, or the list of PRs that have been merged since
the previous release. The git query to get the list commits since the last
release is:
`git log --pretty=format:%s main...v0.1 | grep -v "Merge pull"`.
Just replace "v0.1" with the name of the previous release.
- Press `Publish release` button
- Create an "announcement" highlighting the key features of the new release and
any potential noteworthy activities of the group:
- Send it to the mailing list
- Announce the release on our
[twitter account](http://twitter.com/cloudeventsio)
- Determine the new release version string. It should be of the form:
`SUBJECT@vX.Y.Z`, e.g. `cloudevents@v1.0.4` or `subscriptions@v1.0.0`.
- Before a new release is finalized, a "release candidate" (rc) should be
created that identifies the versions of files in the repository that are to
be reviewed. The process for an RC is as follows:
- Create a PR (for the "main" branch) that modifies the appropriate files
to use the new version string appended with `-rc#`. Make sure to remove
all `-wip` suffixes as needed.
- Review and merge the PR. Note that this review is not really meant for
checking the functionality of the specs; rather, it is intended to verify
that the version string renaming was done properly.
- Create a GitHub tag pointing to the commit on the "main" branch after the
PR is merged, using the release version string suffixed with `-rc#`.
- Initiate a final review/test of the release, pointing reviewers to the tag.
- When review/testing is completed, update all of the version string references
to no longer use the `-rc#` suffix:
- Create a PR with the following changes:
- Modify the repo's files to use the new version string (without `-rc#`)
as appropriate.
- Update [RELEASES.md](RELEASES.md) to mention the new release, and
reference the yet-to-be-created release tag.
- Update the appropriate `*/RELEASE_NOTES.md` file with the changes
for the release. The list can be generated via:
`git log --pretty=format:%s main...cloudevents@v1.0.3 | grep -v "Merge pull"`
by replacing "cloudevents@v1.0.3" with the name of the previous release.
Or, use GitHub's
[new release](https://github.com/cloudevents/spec/releases/new) process
to generate the list without actually creating the release yet.
- Merge the PR.
- Note that the link checker should fail since any references to the new
release tag will not be valid yet. This is expected.
- Create the GitHub release and tag for the new release:
- Use GitHub to create a
[new release](https://github.com/cloudevents/spec/releases/new).
During that process, create a new tag with the new release version string
in the format `SUBJECT@vX.Y.Z`.
- Create a new branch with the new release version string in the format
`SUBJECT@vX.Y.Z-branch`.
- Rerun the GitHub CI actions from the previous PR and the "main" branch;
they should all pass now. This serves as a sanity check.
- Create an "announcement" highlighting the key features of the new release
and any potential noteworthy activities of the group:
- Send it to the mailing lists.
- Announce the release on our [X account](http://x.com/cloudeventsio).
- Add it to the "announcement" section of our
[website](https://cloudevents.io/)
[website](https://cloudevents.io/).
- If an update to a release is needed, create a PR for the appropriate
branches (including "main"), and merge when ready. For any release that's
updated, you'll need to move the tag for that release to point to the head
of that branch. We'll eventually set up a GitHub action to do this
automatically, but for now you can do it via the CLI:
- `git pull --tags` to make sure you have all latest branches and tags
- `git tag -d SUBJECT@vX.Y.Z` to delete the old tag for the release
- `git tag SUBJECT@vX.Y.Z SUBJECT@vX.Y.Z-branch` to create a new tag for
the head of the release branch
- `git push REMOTE SUBJECT@vX.Y.Z -f` to force the tag to be updated in the
GitHub repo, where `REMOTE` is replaced with the git "remote" name that
you have defined that references the GitHub repo
## Additional Information
- We adhere to the CNCF's
[Code of Conduct](https://github.com/cncf/foundation/blob/master/code-of-conduct.md) guidelines
[Code of
Conduct](https://github.com/cncf/foundation/blob/master/code-of-conduct.md)
guidelines.

View File

@ -2,7 +2,7 @@
| Specification Group | Version | Release Date | Release Notes |
| :------------------ | :------------------------------------------------------------------: | :----------- | :------------------------------------------------------: |
| CESQL | [1.0](https://github.com/cloudevents/spec/tree/cesql/v1.0.0) | 2024/06/13 | [Notes](../cesql/RELEASE_NOTES.md#v100---20240613) |
| CESQL | [1.0.0](https://github.com/cloudevents/spec/tree/cesql/v1.0.0) | 2024/06/13 | [Notes](../cesql/RELEASE_NOTES.md#v100---20240613) |
| CloudEvents | [1.0.2](https://github.com/cloudevents/spec/tree/v1.0.2/cloudevents) | 2022/02/05 | [Notes](../cloudevents/RELEASE_NOTES.md#v102---20220205) |
| CloudEvents | [1.0.1](https://github.com/cloudevents/spec/tree/v1.0.1) | 2020/12/12 | [Notes](../cloudevents/RELEASE_NOTES.md#v101---20201212) |
| CloudEvents | [1.0](https://github.com/cloudevents/spec/tree/v1.0) | 2019/10/24 | [Notes](../cloudevents/RELEASE_NOTES.md#v100---20191024) |

View File

@ -13,69 +13,97 @@ to see who has been involved already.
Contributions do not constitute an official endorsement.
- **Alibaba**
- Ryan Zhang - [@nerdyyatrice](https://github.com/nerdyyatrice)
- Haoran Yang, Hongqi Wang
- Heng Du - [@duhenglucky](https://github.com/duhenglucky)
- **Amazon**
- Arun Gupta, Ajay Nair, Rob Leidle, Orr Weinstein
- **commercetools**
- Christoph Neijenhuis - [@cneijenhuis](https://github.com/cneijenhuis)
- **Confluent**
- Neil Avery - [@bluemonk3y](https://github.com/bluemonk3y)
- **Google**
- Sarah Allen - [@ultrasaurus](https://github.com/ultrasaurus)
- Rachel Myers - [@rachelmyers](https://github.com/rachelmyers)
- Thomas Bouldin - [@inlined](https://github.com/inlined)
- Mike McDonald, Morgan Hallmon, Robert-Jan Huijsman
- Scott Nichols - [@n3wscott](https://github.com/n3wscott)
- Jon Skeet - [@jskeet](https://github.com/jskeet)
- **Huawei**
- Cathy Hong Zhang - [@cathyhongzhang](https://github.com/cathyhongzhang)
- Louis Fourie - [@lfourie](https://github.com/lfourie)
- **IBM**
- Doug Davis - [@duglin](https://github.com/duglin)
- Daniel Krook - [@krook](https://github.com/krook)
- Matt Rutkowski - [@mrutkows](https://github.com/mrutkows)
- Michael M Behrendt - [@mbehrendt](https://github.com/mbehrendt)
- **Iguazio**
- Yaron Haviv - [@yaronha](https://github.com/yaronha)
- Orit Nissan-Messing - [@oritnm](https://github.com/oritnm)
- **Intel**
- David Lyle - [@dklyle](https://github.com/dklyle)
- **Lightbend**
- James Roper - [@jroper](https://github.com/jroper)
- **Linkall**
- Jie Ding - [@jieding](https://github.com/jieding)
- **Microsoft**
- Clemens Vasters - [@clemensv](https://github.com/clemensv)
- Bahram Banisadr - [@banisadr](https://github.com/banisadr)
- Dan Rosanova - [@djrosanova](https://github.com/djrosanova)
- Cesar Ruiz-Meraz, Raja Ravipati
- **Oracle**
- Chad Arimura - [@carimura](https://github.com/banisadr)
- Stanley Halka - [@shalka](https://github.com/banisadr)
- Travis Reeder - [@treeder](https://github.com/banisadr)
- **Red Hat**
- Jim Curtis - [@jimcurtis64](https://github.com/jimcurtis2)
- William Markito Oliveira - [@william_markito](https://github.com/markito)
- Gunnar Morling - [@gunnarmorling](https://github.com/gunnarmorling/)
- Tihomir Surdilovic - [@tsurdilo](https://github.com/tsurdilo)
- Lance Ball - [@lance](https://github.com/lance)
- **SAP**
- Nathan Oyler - [@notque](https://github.com/notque)
- Stevo Slavić - [@sslavic](https://github.com/sslavic)
- Klaus Deissner - [@deissnerk](https://github.com/deissnerk)
- **Serverless Inc**
- Austen Collins - [@ac360](https://github.com/ac360)
- Rupak Ganguly - [@rupakg](https://github.com/rupakg)
- Brian Neisler - [@brianneisler](https://github.com/brianneisler)
- Jeremy Coffield, Ganesh Radhakirshnan
- **SolarWinds**
- Lee Calcote - [@leecalcote](https://github.com/leecalcote)
- **VMWare**
- Mark Peek - [@markpeek](https://github.com/markpeek)
- **Individuals**
- Rémi Cattiau - [@loopingz](https://github.com/loopingz)
- Jem Day - [@JemDay](https://github.com/JemDay)
- Vladimir Bacvanski
# A
- Sarah Allen - [@ultrasaurus](https://github.com/ultrasaurus)
- Chad Arimura - [@carimura](https://github.com/banisadr)
- Neil Avery - [@bluemonk3y](https://github.com/bluemonk3y)
# B
- Vladimir Bacvanski
- Lance Ball - [@lance](https://github.com/lance)
- Bahram Banisadr - [@banisadr](https://github.com/banisadr)
- Michael M Behrendt - [@mbehrendt](https://github.com/mbehrendt)
- Thomas Bouldin - [@inlined](https://github.com/inlined)
# C
- Rémi Cattiau - [@loopingz](https://github.com/loopingz)
- Lee Calcote - [@leecalcote](https://github.com/leecalcote)
- Jeremy Coffield
- Austen Collins - [@ac360](https://github.com/ac360)
- Jim Curtis - [@jimcurtis64](https://github.com/jimcurtis2)
# D
- Doug Davis - [@duglin](https://github.com/duglin)
- Jem Day - [@JemDay](https://github.com/JemDay)
- Klaus Deissner - [@deissnerk](https://github.com/deissnerk)
- Jie Ding - [@jieding](https://github.com/jieding)
- Heng Du - [@duhenglucky](https://github.com/duhenglucky)
# F
- Louis Fourie - [@lfourie](https://github.com/lfourie)
# G
- Rupak Ganguly - [@rupakg](https://github.com/rupakg)
- Arun Gupta
# H
- Stanley Halka - [@shalka](https://github.com/banisadr)
- Morgan Hallmon
- Yaron Haviv - [@yaronha](https://github.com/yaronha)
- Robert-Jan Huijsman
# K
- Daniel Krook - [@krook](https://github.com/krook)
# L
- Rob Leidle
- David Lyle - [@dklyle](https://github.com/dklyle)
# M
- Mike McDonald
- Gunnar Morling - [@gunnarmorling](https://github.com/gunnarmorling/)
- Rachel Myers - [@rachelmyers](https://github.com/rachelmyers)
- Calum Murray - [@Cali0707](https://github.com/Cali0707/)
# N
- Ajay Nair
- Christoph Neijenhuis - [@cneijenhuis](https://github.com/cneijenhuis)
- Brian Neisler - [@brianneisler](https://github.com/brianneisler)
- Scott Nichols - [@n3wscott](https://github.com/n3wscott)
- Orit Nissan-Messing - [@oritnm](https://github.com/oritnm)
# O
- William Markito Oliveira - [@william_markito](https://github.com/markito)
- Nathan Oyler - [@notque](https://github.com/notque)
# P
- Mark Peek - [@markpeek](https://github.com/markpeek)
# R
- Ganesh Radhakirshnan
- Raja Ravipati
- Travis Reeder - [@treeder](https://github.com/banisadr)
- James Roper - [@jroper](https://github.com/jroper)
- Dan Rosanova - [@djrosanova](https://github.com/djrosanova)
- Cesar Ruiz-Meraz
- Matt Rutkowski - [@mrutkows](https://github.com/mrutkows)
# S
- Vandewilly Silva - [@vandewillysilva](https://github.com/vandewillysilva)
- Jon Skeet - [@jskeet](https://github.com/jskeet)
- Stevo Slavić - [@sslavic](https://github.com/sslavic)
- Tihomir Surdilovic - [@tsurdilo](https://github.com/tsurdilo)
# V
- Clemens Vasters - [@clemensv](https://github.com/clemensv)
# W
- Hongqi Wang
- Orr Weinstein
# Y
- Haoran Yang
# Z
- Cathy Hong Zhang - [@cathyhongzhang](https://github.com/cathyhongzhang)
- Ryan Zhang - [@nerdyyatrice](https://github.com/nerdyyatrice)

View File

@ -1,8 +1,6 @@
<!-- no verify-specs -->
If you have a demo of CloudEvents in action, please add a link here. If there
isn't an associated blog or github repo, feel free to add descriptive text as a
markdown file in `docs/demos/`.
If you have a demo of CloudEvents in action, please add a link here. If there isn't an associated blog or github repo, feel free to add descriptive text as a markdown file in `docs/demos/`.
- Simulating CloudEvents with [AsyncAPI](https://www.asyncapi.com/) and [Microcks](https://microcks.io/) - Nov 18, 2021
@ -11,14 +9,14 @@ markdown file in `docs/demos/`.
- [Orchestrating Cloud Events](https://salaboy.com/2020/05/18/orchestrating-cloud-events-with-zeebe/) - May 18, 2020
- Blog post about Cloud Events Orchestration using a workflow engine ([Zeebe](http://zeebe.io)).
- Instructions: You can find the [Demo Code and instructions here](https://github.com/salaboy/zeebe-cloud-events-examples).
- [Gaining business visibility into processes implemented by Cloud Events](https://blog.bernd-ruecker.com/gaining-visibility-into-processes-spanning-multiple-microservices-a1fc751c4c13) - Jun 23, 2020
- Blog post on how to use process events monitoring to understand the business process behind a choreography using Cloud Events.
- Instructions: You can find the [used sample application on GitHub](https://github.com/berndruecker/flowing-retail/). [This recording walks you through](https://www.youtube.com/watch?v=JptEJZ10Ra4)
- CNCF/Kube-Con EU - May, 2018
- CloudEvents interop demo during Austen Collins'
@ -32,7 +30,7 @@ markdown file in `docs/demos/`.
- CNCF/Kube-Con NA - Dec, 2018
- CloudEvents interop demo during Serverless WG Intro
[session](https://sched.co/Grcc)
[session](https://kccna18.sched.com/event/Grcc)
([video](https://www.youtube.com/watch?v=iNlqLr9vlD4&feature=youtu.be))
- Participants: IBM, Knative, Oracle, OpenFaaS, VMware, Microsoft, Huawei,
SAP, PayPal

View File

@ -3,6 +3,8 @@ CloudEvents v1.0, please include it in the list below.
- [Argo-Events](https://github.com/argoproj/argo-events): An event-based
dependency manager for Kubernetes which uses sensors to act on CloudEvents.
- [CDEvents](https://github.com/cdevents/spec): A common specification for SDLC
events to bring interoperability across tools and services.
- [CloudEvents Extend API](https://github.com/goextend/cloudevents-extend-api)
is a JavaScript programming model for Extend which allows handling
CloudEvents.
@ -15,11 +17,25 @@ CloudEvents v1.0, please include it in the list below.
- [CloudEvents Verify](https://github.com/btbd/CEVerify): is a tool to help
verify CloudEvents according to the proper specifications. It is currently
being hosted publicly [here](http://soaphub.org/ceverify).
- [Cloud Shapes](https://github.com/neuroglia-io/cloud-shapes): Cloud Shapes is
an event-driven database designed for real-time projection materialization
based on CloudEvents.
- [Cloud Streams](https://github.com/neuroglia-io/cloud-streams): Cloud Streams
is a cloud-native tool that empowers users to capture and process CloudEvents
in real-time, enabling event-driven architectures that are both scalable and
efficient.
- [Gloo](https://github.com/solo-io/gloo): is a function gateway built on top of
[Envoy Proxy](https://envoyproxy.io/) by [Solo.io](https://www.solo.io/) that
supports CloudEvents.
- [Knative Eventing](https://knative.dev) implements CloudEvents-based sources
and event delivery abstractions built on top of Kubernetes.
- [Microcks](https://microcks.io/) is a Cloud Native Computing [Sandbox project](https://landscape.cncf.io/?selected=microcks) dedicated to API Mocking and Testing. See our CloudEvents integration 👉 [Simulating CloudEvents with AsyncAPI and Microcks](https://microcks.io/blog/simulating-cloudevents-with-asyncapi/).
- [VMware Event Broker Appliance](https://vmweventbroker.io) enables event
- [Microcks](https://microcks.io/) is a Cloud Native Computing
[Sandbox project](https://landscape.cncf.io/?selected=microcks) dedicated to
API Mocking and Testing. See our CloudEvents integration 👉
[Simulating CloudEvents with AsyncAPI and Microcks](https://microcks.io/blog/simulating-cloudevents-with-asyncapi/).
- [Synapse](https://github.com/serverlessworkflow/synapse): Synapse is a
vendor-neutral, open-source, and community-driven Workflow Management System
(WFMS) designed to implement the
[Serverless Workflow specification](https://github.com/serverlessworkflow/specification).
- [VMware Event Broker Appliance](https://github.com/vmware-samples/vcenter-event-broker-appliance) enables event
driven workflows from vCenter Server Events.

View File

@ -42,7 +42,7 @@ CloudEvents 是一个以通用格式来描述事件数据的规范。它提供
| |
| **Additional documents:** |
| CloudEvents Primer | [v1.0.2](https://github.com/cloudevents/spec/blob/v1.0.2/cloudevents/primer.md) | [WIP](../../cloudevents/languages/zh-CN/primer.md) |
| [CloudEvents Adapters](../../cloudevents/languages/zh-CN/adapters.md) | - | [Unversioned working draft](../../cloudevents/languages/zh-CN/adapters.md) |
| [CloudEvents Adapters](../../cloudevents/languages/zh-CN/adapters/README.md) | - | [Unversioned working draft](../../cloudevents/languages/zh-CN/adapters/README.md) |
| [CloudEvents SDK Requirements](../../cloudevents/languages/zh-CN/SDK.md) | - | [Unversioned working draft](../../cloudevents/languages/zh-CN/SDK.md) |
| [Documented Extensions](../../cloudevents/languages/zh-CN/extensions/README.md) | - | [Unversioned working draft](../../cloudevents/languages/zh-CN/extensions/README.md) |
| [Proprietary Specifications](../../cloudevents/languages/zh-CN/proprietary-specs.md) | - | [Unversioned working draft](../../cloudevents/languages/zh-CN/proprietary-specs.md) |
@ -59,7 +59,7 @@ CloudEvents 是一个以通用格式来描述事件数据的规范。它提供
If you are new to CloudEvents and want a comprehensive understanding of it, you can read the [Primer](../../cloudevents/languages/zh-CN/primer.md) to learn about the goals and design philosophy of the CloudEvents specification.
If you want to get started with CloudEvents quickly, you can read the [core specification](../../cloudevents/languages/zh-CN/spec.md) directly.
Since not all event producers produce events conforming to the CloudEvents specification by default, the [CloudEvents adapters](../../cloudevents/languages/zh-CN/adapters.md)
Since not all event producers produce events conforming to the CloudEvents specification by default, the [CloudEvents adapters](../../cloudevents/languages/zh-CN/adapters/README.md)
can be used to adapt existing events to CloudEvents.
## SDKs

View File

@ -388,7 +388,7 @@ Each subscription is represented by an object that has the following properties:
- Examples:
- `https://example.com/event-processor`
##### sinkCredential
##### sinkcredential
- Type: Map of attributes
- Description: A set of settings carrying credential information that
@ -643,7 +643,7 @@ settings. All other settings SHOULD be supported.
A sink credential provides authentication or authorization information necessary
to enable delivery of events to a target.
##### credentialType
##### credentialtype
- Type: `String`
- Description: Identifier of a credential type. The predefined types are "PLAIN",
@ -662,7 +662,7 @@ to enable delivery of events to a target.
username.
- Constraints:
- REQUIRED for credentialType="PLAIN"
- REQUIRED for credentialtype="PLAIN"
##### secret
@ -671,45 +671,45 @@ to enable delivery of events to a target.
passphrase or key.
- Constraints:
- REQUIRED for credentialType="PLAIN"
- REQUIRED for credentialtype="PLAIN"
- SHOULD NOT be returned during enumeration or retrieval
##### accessToken
##### accesstoken
- Type: String
- Description: An access token is a previously acquired token granting access to
the target resource.
- Constraints:
- REQUIRED for credentialType="ACCESSTOKEN" and credentialType="REFRESHTOKEN"
- REQUIRED for credentialtype="ACCESSTOKEN" and credentialtype="REFRESHTOKEN"
- SHOULD NOT be returned during enumeration or retrieval
##### accessTokenExpiresUtc
##### accesstokenexpiresutc
- Type: Timestamp
- Description: An absolute UTC instant at which the token SHALL be considered
expired.
- Constraints:
- REQUIRED for credentialType="ACCESSTOKEN" and credentialType="REFRESHTOKEN"
- REQUIRED for credentialtype="ACCESSTOKEN" and credentialtype="REFRESHTOKEN"
##### accessTokenType
##### accesstokentype
- Type: String
- Description: Type of the access token (See [OAuth 2.0](https://tools.ietf.org/html/rfc6749#section-7.1)).
- Constraints:
- REQUIRED for credentialType="ACCESSTOKEN" and credentialType="REFRESHTOKEN"
- REQUIRED for credentialtype="ACCESSTOKEN" and credentialtype="REFRESHTOKEN"
##### refreshToken
##### refreshtoken
- Type: String
- Description: A refresh token credential used to acquire access tokens.
- Constraints:
- REQUIRED for credentialType="REFRESHTOKEN"
- REQUIRED for credentialtype="REFRESHTOKEN"
##### refreshTokenEndpoint
##### refreshtokenendpoint
- Type: String
- Description: A URL at which the refresh token can be traded for an access
@ -720,7 +720,7 @@ to enable delivery of events to a target.
endpoint MUST be authorized. The credentials for this authorization
relationship, which exists between the delivery service managed by the
subscription API and the refresh endpoint, are out of scope for this
specification. The sinkCredentials represent the authorization relationship
specification. The sinkcredentials represent the authorization relationship
between the subscriber and the delivery target it points the subscription to.
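As a non-normative illustration of how these attributes fit together, a subscription's `sinkcredential` using the `ACCESSTOKEN` type, serialized as JSON, might look like the sketch below; the token and expiry values are placeholders.
```json
{
  "sinkcredential": {
    "credentialtype": "ACCESSTOKEN",
    "accesstoken": "example-opaque-token-value",
    "accesstokenexpiresutc": "2025-01-01T00:00:00Z",
    "accesstokentype": "bearer"
  }
}
```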
#### 3.2.4 Filters

View File

@ -139,7 +139,7 @@ components:
properties:
protocol:
$ref: "#/components/schemas/Protocol"
protocolSettings:
protocolsettings:
oneOf:
- $ref: "#/components/schemas/ProtocolSettings"
- $ref: "#/components/schemas/AMQPSettings"
@ -152,7 +152,7 @@ components:
format: url
description: REQUIRED. The address to which events shall be delivered using the selected protocol.
example: "https://endpoint.example.com/webhook"
sinkCredential:
sinkcredential:
oneOf:
- $ref: "#/components/schemas/SinkCredential"
- $ref: "#/components/schemas/AccessTokenCredential"
@ -365,7 +365,7 @@ components:
allOf:
- $ref: "#/components/schemas/ProtocolSettings"
- properties:
topicName:
topicname:
type: string
qos:
type: integer
@ -375,10 +375,10 @@ components:
expiry:
type: integer
format: int32
userProperties:
userproperties:
type: object
required:
- topicName
- topicname
AMQPSettings:
type: object
allOf:
@ -388,7 +388,7 @@ components:
type: string
linkName:
type: string
senderSettlementMode:
sendersettlementmode:
type: string
enum: ["settled", "unsettled"]
linkProperties:
@ -400,16 +400,16 @@ components:
allOf:
- $ref: "#/components/schemas/ProtocolSettings"
- properties:
topicName:
topicname:
type: string
partitionKeyExtractor:
partitionkeyextractor:
type: string
clientId:
clientid:
type: string
ackMode:
ackmode:
type: integer
required:
- topicName
- topicname
NATSSettings:
type: object
allOf:
@ -422,7 +422,7 @@ components:
SinkCredential:
type: object
properties:
credentialType:
credentialtype:
type: string
enum: ["PLAIN", "ACCESSTOKEN", "REFRESHTOKEN"]
description: "The type of the credential."
@ -445,20 +445,20 @@ components:
allOf:
- $ref: "#/components/schemas/SinkCredential"
- properties:
accessToken:
accesstoken:
description: REQUIRED. An access token is a previously acquired token granting access to the target resource.
type: string
accessTokenExpiresUtc:
accesstokenexpiresutc:
type: string
format: date-time
description: RECOMMENDED. An absolute UTC instant at which the token shall be considered expired.
accessTokenType:
accesstokentype:
description: OPTIONAL. Type of the access token (See https://tools.ietf.org/html/rfc6749#section-7.1).
type: string
default: bearer
required:
- accessToken
- accessTokenExpiresUtc
- accesstoken
- accesstokenexpiresutc
RefreshTokenCredential:
type: object
description: An access token credential with a refresh token.
@ -466,13 +466,13 @@ components:
- $ref: "#/components/schemas/AccessTokenCredential"
- type: object
properties:
refreshToken:
refreshtoken:
description: REQUIRED. A refresh token credential used to acquire access tokens.
type: string
refreshTokenEndpoint:
refreshtokenendpoint:
type: string
format: uri
description: REQUIRED. A URL at which the refresh token can be traded for an access token.
required:
- refreshToken
- refreshTokenEndpoint
- refreshtoken
- refreshtokenendpoint

View File

@ -7,7 +7,7 @@ verify:
@python tools/verify.py .
test_tools:
@pytest tools/
@pytest -o asyncio_mode=auto -o asyncio_default_fixture_loop_scope=function tools/
docker:
@docker run -ti -v $(PWD):/tmp/spec -w /tmp/spec python:latest \

View File

@ -29,6 +29,7 @@ _TOOLS_DIR = Path(__file__).parent
_REPO_ROOT = _TOOLS_DIR.parent
_FAKE_DOCS_DIR = Path(__file__).parent / "fake-docs"
_FAKE_DOCS = set(_FAKE_DOCS_DIR.rglob("**/*"))
_FAKE_DOCS.update(Path(".github").rglob("**/*"))
_LANGUAGES_DIR_NAME = "languages"
_ROOT_LANGUAGES_DIR = _REPO_ROOT / _LANGUAGES_DIR_NAME
@ -125,7 +126,7 @@ def _skip_type(text: str) -> Optional[str]:
def _find_all_uris(html: HtmlText) -> Iterable[Uri]:
for a in _html_parser(html).findAll("a"):
for a in _html_parser(html).find_all("a"):
uri = a.get("href")
if uri:
yield Uri(uri.strip())
@ -135,6 +136,7 @@ async def _uri_availability_issues(uri: HttpUri, settings: Settings) -> Sequence
if "example.com" in uri: return []
if "ietf.org" in uri: return []
if "rfc-edit.org" in uri: return []
if "iso20022.org" in uri: return []
try:
for attempt in Retrying(stop=stop_after_attempt(settings.http_max_get_attemps)):
@ -142,7 +144,7 @@ async def _uri_availability_issues(uri: HttpUri, settings: Settings) -> Sequence
async with ClientSession() as session:
with closing(
await session.get(
uri, timeout=settings.http_timeout_seconds, ssl=False
uri, timeout=settings.http_timeout_seconds, ssl=False, max_field_size=81900
)
) as response:
match response.status:
@ -151,7 +153,8 @@ async def _uri_availability_issues(uri: HttpUri, settings: Settings) -> Sequence
case _:
return [] # no issues
except Exception: # noqa
except Exception as e: # noqa
print(f"Exception: {e}")
return [Issue(f"Could Not access {repr(uri)}")]
else:
return []