Compare commits

...

24 Commits
1.7.0 ... main

Author SHA1 Message Date
Yurii Serhiichuk a38933d7ab
Drop EOL Python 3.8 support (#249)
* chore: add missing changelog items

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: drop Python 3.8 support

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: add a changelog item on Python 3.8 removal

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: remove mypy-constrains reference as we don't need it anymore

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Update pre-commit check versions.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: fix isort pre-commit

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* chore: Use Python 3.12 as base version

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

---------

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-06-02 14:58:00 -04:00
Hal Blackburn 37ae369ced
Improve public API type annotations & fix unit test type errors (#248)
* chore: improve typing of functions returning AnyCloudEvent

kafka.conversion.from_binary() and from_structured() return
AnyCloudEvent type var according to their event_type argument, but when
event_type is None, type checkers cannot infer the return type. We now
use an overload to declare that the return type is http.CloudEvent when
event_type is None.

Previously users had to explicitly annotate this type when calling
without event_type. This happens frequently in this repo's
test_kafka_conversions.py; the overload fixes many type errors like:

> error: Need type annotation for "result"  [var-annotated]

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>
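The overload pattern described above can be sketched as follows. This is a minimal illustration, not the SDK's actual code: the `CloudEvent` stand-in and the `from_binary` body here are hypothetical.

```python
from typing import Any, Dict, Optional, Type, TypeVar, overload


class CloudEvent:
    """Stand-in for the SDK's default http.CloudEvent class."""

    def __init__(self, attrs: Dict[str, Any]) -> None:
        self.attrs = attrs


E = TypeVar("E", bound=CloudEvent)


@overload
def from_binary(message: Dict[str, Any], event_type: None = None) -> CloudEvent: ...


@overload
def from_binary(message: Dict[str, Any], event_type: Type[E]) -> E: ...


def from_binary(
    message: Dict[str, Any], event_type: Optional[Type[CloudEvent]] = None
) -> CloudEvent:
    # With no event_type, fall back to the default class; the overloads
    # let a type checker infer CloudEvent instead of an unbound TypeVar.
    cls = event_type or CloudEvent
    return cls(message)


event = from_binary({"id": "1"})  # inferred as CloudEvent, no annotation needed
```

With the overloads in place, a checker no longer reports `var-annotated` errors for calls that omit `event_type`.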

* chore: type v1.Event chainable Set*() methods

The v1.Event self-returning Set*() methods like SetData() were returning
BaseEvent, which doesn't declare the same Set* methods. As a result,
chaining more than one Set* method would make the return type unknown.

This was causing type errors in test_event_pipeline.py.

The Set*() methods now return the Self type.

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>
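The self-returning setter fix can be sketched with a TypeVar bound to the class, the classic pre-3.11 equivalent of `typing.Self`. `Event`, `SetData`, and `SetSource` here are illustrative stand-ins, not the SDK's actual `v1.Event`:

```python
from typing import Any, Optional, TypeVar

# A TypeVar bound to the class makes each Set*() return the caller's own
# type, so chained calls keep a known, precise static type.
TEvent = TypeVar("TEvent", bound="Event")


class Event:
    def __init__(self) -> None:
        self.data: Optional[Any] = None
        self.source: Optional[str] = None

    def SetData(self: TEvent, data: Any) -> TEvent:
        self.data = data
        return self

    def SetSource(self: TEvent, source: str) -> TEvent:
        self.source = source
        return self


# Chaining preserves the type: a checker still sees Event (not a base class).
event = Event().SetData({"x": 1}).SetSource("demo")
```

Had `SetData` been annotated to return a base class, the second call in the chain would already have an unknown or too-wide type.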

* chore: fix type errors in tests

mypy was failing with lots of type errors in test modules. I've not
annotated all fixtures, mostly fixed existing type errors.

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>

* chore: allow non-dict headers types in from_http()

The from_http() conversion function required its headers argument to be
a typing.Dict, making it incompatible with the header types of HTTP
libraries, which support features like multiple values per key.
typing.Mapping and even _typeshed.SupportsItems do not cover these
types. For example,
samples/http-image-cloudevents/image_sample_server.py was failing to
type check where it calls `from_http(request.headers, ...)`.

To support these kinds of header types in from_http(), we now define our
own SupportsDuplicateItems protocol, which is broader than
_typeshed.SupportsItems.

I've only applied this to from_http(), as typing.Mapping is OK for most
other methods that accept dict-like objects, and using this more lenient
interface everywhere would impose restrictions on our implementation,
even though it might be more flexible for users.

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>
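The shape of such a protocol can be sketched as below. The protocol body is a guess at the minimal interface implied above, and `MultiHeaders` and `header_keys` are hypothetical examples, not SDK code:

```python
from typing import Iterable, List, Protocol, Tuple, TypeVar

K = TypeVar("K")
V = TypeVar("V")


class SupportsDuplicateItems(Protocol[K, V]):
    # Broader than Mapping: items() may yield the same key more than
    # once, as multi-value HTTP header containers do.
    def items(self) -> Iterable[Tuple[K, V]]: ...


class MultiHeaders:
    """Hypothetical multi-value header container (not a Mapping)."""

    def __init__(self, pairs: Iterable[Tuple[str, str]]) -> None:
        self._pairs = list(pairs)

    def items(self) -> Iterable[Tuple[str, str]]:
        return iter(self._pairs)


def header_keys(headers: "SupportsDuplicateItems[str, str]") -> List[str]:
    # Any object with a compatible items() satisfies the protocol,
    # including containers that repeat keys.
    return [key for key, _ in headers.items()]


keys = header_keys(MultiHeaders([("accept", "text/html"), ("accept", "application/json")]))
```

A plain `typing.Mapping` annotation would reject such a container, since `Mapping` requires `__getitem__`, `__len__`, and unique keys.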

* build: run mypy via tox

Tox now runs mypy on cloudevents itself, and the samples.

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>

* build(ci): run mypy in CI alongside linting

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>

* chore: fix minor mypy type complaint in samples

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>

* feat: use Mapping, not Dict for input arguments

Mapping imposes fewer restrictions on callers: because it's read-only,
non-dict types can be passed directly, instead of copying them with
dict() or passing dict-like values and ignoring the resulting type error.

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>
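A small sketch of why `Mapping` is friendlier to callers than `Dict` for read-only inputs; `event_source` is a hypothetical function, not part of the SDK:

```python
from types import MappingProxyType
from typing import Any, Mapping


def event_source(attrs: Mapping[str, Any]) -> Any:
    # Accepting Mapping (a read-only view) lets callers pass dict-like
    # objects such as MappingProxyType without copying them into a dict.
    return attrs["source"]


readonly = MappingProxyType({"source": "demo"})
value = event_source(readonly)  # accepted by both the runtime and a checker
```

Had the parameter been annotated `Dict[str, Any]`, passing `readonly` would be a type error even though it works at runtime.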

* chore: fix tests on py3.8

Tests were failing because the sanic dependency dropped support for
py3.8 in its current release. sanic is now pinned to the last compatible
version for py3.8 only.

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>

* feat: support new model_validate_json() kwargs

Pydantic added by_alias and by_name keyword arguments to
BaseModel.model_validate_json in 2.11.1:

acb0f10fda

This caused mypy to report that the Pydantic v2 CloudEvent did not
override model_validate_json() correctly. Our override now accepts these
newly-added arguments. They have no effect, as the implementation does
not use Pydantic to validate the JSON, but we also don't use field
aliases, so the only effect they could have in the superclass would be
to raise an error if they're both False.

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>

* chore: accept Mapping as well as SupportsDuplicateItems

Although our types.SupportsDuplicateItems type is wider than Dict and
Mapping, it's not a familiar type to users, so explicitly accepting
Mapping in the from_http() functions should make it clearer to users
that a dict-like object is required for the headers argument.

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>

* chore: constrain deps to maintain py 3.8 support

Python 3.8 is unsupported upstream, and dependencies (such as pydantic)
now ship releases that fail to type check under mypy's 3.8
compatibility mode. Since we run mypy in that mode, the mypy tox
environments must only use deps that support 3.8, and the unit tests
run on py 3.8 must do the same.

To constrain the deps for 3.8 support, we use two constraint files, one
for general environments that only constrains the dependencies that
python 3.8 interpreters use, and another for mypy that constrains the
dependencies that all interpreters use.

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>

---------

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>
2025-05-23 22:26:18 +03:00
Yurii Serhiichuk c5645d8fcf
chore: disable attestations while we're not using trusted publishing (#243)
Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2024-11-09 20:27:52 +02:00
Yurii Serhiichuk 96cfaa6529
chore: release 1.11.1 (#241)
Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2024-10-30 11:54:36 +02:00
Christoph Hösler efca352e21
fix kafka unmarshaller args typing and defaults (#240)
* fix kafka unmarshaller args typing and defaults

Signed-off-by: Christoph Hösler <christoph.hoesler@inovex.de>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Signed-off-by: Christoph Hösler <christoph.hoesler@inovex.de>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-10-30 11:41:03 +02:00
Yurii Serhiichuk c6c7e8c2f9
Release/v1.11.0 (#237)
* Add missing changelog items

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Bump version

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

---------

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2024-06-20 09:31:13 +03:00
Vivian 16441d79f4
Modified content-type to abide by attribute naming conventions for cloudevents (#232)
* fix: changed content-type to a valid attribute

Signed-off-by: vivjd <vivjdeng@hotmail.com>

* fix: changed headers back to content-type

Signed-off-by: Vivian <118199397+vivjd@users.noreply.github.com>
Signed-off-by: vivjd <vivjdeng@hotmail.com>

* modified kafka test cases to match datacontenttype

Signed-off-by: vivjd <vivjdeng@hotmail.com>

* fix: updated kafka/conversion.py and test cases to check for valid attributes

Signed-off-by: vivjd <vivjdeng@hotmail.com>

---------

Signed-off-by: vivjd <vivjdeng@hotmail.com>
Signed-off-by: Vivian <118199397+vivjd@users.noreply.github.com>
Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>
2024-05-26 21:56:16 +03:00
Fábio D. Batista 11520e35e1
Pydantic v2 (#235)
* Fixes examples when using Pydantic V2

Signed-off-by: Fabio Batista <fabio@atelie.dev.br>

* When type checking, uses the latest (V2) version of Pydantic

Signed-off-by: Fabio Batista <fabio@atelie.dev.br>

---------

Signed-off-by: Fabio Batista <fabio@atelie.dev.br>
Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>
2024-05-26 21:51:36 +03:00
Yurii Serhiichuk eedc61e9b0
Update CI and tooling (#236)
* Update pre-commit hooks

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Add Python 3.12

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Drop python 3.7 and add 3.12 to TOX

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Migrate to latest action versions. Drop v3.7 from CI and add 3.12

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Migrate to Python 3.8

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Fix changelog message.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

---------

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2024-05-26 21:49:35 +03:00
Yurii Serhiichuk 21572afb57
Fix Pydantic custom attributes (#229)
* Add custom extension attribute to the test set.

Replicates bug test data from the https://github.com/cloudevents/sdk-python/issues/228

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* use modern `super` syntax

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Fix `black` language version

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Fixes https://github.com/cloudevents/sdk-python/issues/228

Pydantic v2's `.__dict__` behaves differently from Pydantic v1's and no longer includes `extra` fields. On the other hand, iterating over the event does yield the extras.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Add missing EOF

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Add Pydantic fix to the changelog

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Add links to the changelog

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Bump version

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Update Black and MyPy versions

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

---------

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2023-10-30 06:44:36 +01:00
pre-commit-ci[bot] 8ada7d947b
[pre-commit.ci] pre-commit autoupdate (#224)
updates:
- [github.com/pre-commit/pre-commit-hooks: v4.4.0 → v4.5.0](https://github.com/pre-commit/pre-commit-hooks/compare/v4.4.0...v4.5.0)
- [github.com/pre-commit/mirrors-mypy: v1.5.1 → v1.6.0](https://github.com/pre-commit/mirrors-mypy/compare/v1.5.1...v1.6.0)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2023-10-23 15:24:12 +03:00
Doug Davis c5418b99a0
add link to our security mailing list (#226)
Signed-off-by: Doug Davis <dug@microsoft.com>
2023-10-16 19:14:38 +03:00
Yurii Serhiichuk d4873037e2
Release/v1.10.0 (#223)
* Bump version

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Update changelog

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

---------

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2023-09-25 08:00:00 -06:00
pre-commit-ci[bot] 66dcabb254
[pre-commit.ci] pre-commit autoupdate (#220)
updates:
- [github.com/psf/black: 23.7.0 → 23.9.1](https://github.com/psf/black/compare/23.7.0...23.9.1)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>
2023-09-25 12:29:56 +03:00
Doug Davis 252efdbbce
Governance docs per CE PR 1226 (#221)
Signed-off-by: Doug Davis <dug@microsoft.com>
2023-09-21 22:59:54 +03:00
Federico Busetti 5a1063e50d
Pydantic v2 native implementation (#219)
* Create stub pydantic v2 implementation and parametrize tests for both implementations

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Add default values to optional fields

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Adapt pydantic v1 serializer/deserializer logic

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Extract CloudEvent fields non functional data in separate module

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Fix lint

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Add missing Copyright

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Add missing docstring

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Remove test leftover

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Remove dependency on HTTP CloudEvent implementation

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Remove failing test for unsupported scenario

Fix typo

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Use SDK json serialization logic

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* No need to filter base64_data

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Use SDK json deserialization logic

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Fix imports

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Move docs after field declarations

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Add test for model_validate_json method

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Use fully qualified imports

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Ignore typing error

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

---------

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2023-09-20 22:59:13 +03:00
pre-commit-ci[bot] e5f76ed14c
[pre-commit.ci] pre-commit autoupdate (#212)
updates:
- [github.com/psf/black: 23.3.0 → 23.7.0](https://github.com/psf/black/compare/23.3.0...23.7.0)
- [github.com/pre-commit/mirrors-mypy: v1.2.0 → v1.5.1](https://github.com/pre-commit/mirrors-mypy/compare/v1.2.0...v1.5.1)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>
2023-08-28 20:29:25 +03:00
Federico Busetti 739c71e0b7
Adds a pydantic V2 compatibility layer (#218)
* feat: Pydantic V2 compatibility layer

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Ignore incompatible import

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

---------

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>
2023-08-28 20:09:53 +03:00
pre-commit-ci[bot] 8104ce1b68
[pre-commit.ci] pre-commit autoupdate (#205)
* [pre-commit.ci] pre-commit autoupdate

updates:
- [github.com/pycqa/isort: 5.11.4 → 5.12.0](https://github.com/pycqa/isort/compare/5.11.4...5.12.0)
- [github.com/psf/black: 22.12.0 → 23.3.0](https://github.com/psf/black/compare/22.12.0...23.3.0)
- [github.com/pre-commit/mirrors-mypy: v0.991 → v1.2.0](https://github.com/pre-commit/mirrors-mypy/compare/v0.991...v1.2.0)

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2023-05-14 20:53:02 +03:00
Yurii Serhiichuk ef982743b6
Add Python 3.11 support (#209)
* docs: add missing release notes

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: add Python3.11 support

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Bump version

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: create release section

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2023-01-04 11:33:33 -07:00
Yurii Serhiichuk 5e00c4f41f
Introduce typings (#207)
* chore: Add pre-commit hook

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: address typing issues

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: add py.typed meta

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Add Pydantic plugin

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Add Pydantic dependency

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Add MyPy best practices configs

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Add deprecation MyPy ignore

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: more typing fixes

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: more typings and explicit optionals

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Use lowest-supported Python version

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Fix silly `dict` and other MyPy-related issues.

We're now explicitly ensuring codebase supports Python3.7+

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: ignore typing limitation

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: `not` with `dict` returns `false` for an empty dict, so use `is None` check

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
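The pitfall named in the commit above is that an empty dict is falsy, so a truthiness check cannot distinguish "no value" from "empty mapping". A minimal sketch (`choose_data` is a hypothetical helper, not SDK code):

```python
from typing import Any, Dict, Optional


def choose_data(provided: Optional[Dict[str, Any]]) -> Dict[str, Any]:
    # A truthiness test (`if not provided:`) would wrongly treat an
    # empty dict as "absent", because `not {}` evaluates to True.
    # Checking `is None` keeps {} as a legitimate, present value.
    if provided is None:
        return {"default": True}
    return provided
```

With the `is None` check, `choose_data({})` returns `{}` rather than the default.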

* deps: Update hooks

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Make sure only non-callable unmarshallers are flagged

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Have some coverage slack

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* deps: bump pre-commit-hooks

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* ci: make sure py.typed is included into the bundle

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: improve setup.py setup and add missing package metadata

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2023-01-04 08:29:41 -07:00
Yurii Serhiichuk a02864eaab
Drop python36 (#208)
* chore: drop Python 3.6 official support

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: update docs regarding Python 3.6 being unsupported anymore

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* deps: drop Python3.6-only dependencies

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: drop extra `;`

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: try `setup.py` syntax

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2022-12-09 07:26:30 -07:00
Yurii Serhiichuk 119264cdfe
hotfix: Hotfix Pydantic dependency constraints. (#204)
* hotfix: Hotfix Pydantic dependency constraints.

docs: Add mention of the constraints fix

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

chore: bump version

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

fix: PyPi constraints for Pydantic

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

ci: add ability to release from tag branches.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: add missing links

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: fix release 1.6.3 link

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2022-11-22 08:03:03 -07:00
Yurii Serhiichuk 81f07b6d9f
ci: refine publishing WF (#202)
* ci: update CI workflow to use `buildwheel` action.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: Add pipeline change to the changelog

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: temporary add ability to build on PRs.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* ci: Do not try using cibuildwheels

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: Update changelog

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* ci: don't build on PRs

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* ci: don't fetch repo history on publish

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-11-21 07:20:09 -07:00
58 changed files with 1837 additions and 856 deletions


@ -7,28 +7,28 @@ jobs:
lint:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Setup Python
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: '3.10'
python-version: '3.12'
cache: 'pip'
cache-dependency-path: 'requirements/*.txt'
- name: Install dev dependencies
run: python -m pip install -r requirements/dev.txt
- name: Run linting
run: python -m tox -e lint
run: python -m tox -e lint,mypy,mypy-samples-image,mypy-samples-json
test:
strategy:
matrix:
python: ['3.6', '3.7', '3.8', '3.9', '3.10']
python: ['3.9', '3.10', '3.11', '3.12', '3.13']
os: [ubuntu-latest, windows-latest, macos-latest]
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Setup Python
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python }}
cache: 'pip'


@ -1,28 +1,58 @@
name: PyPI-Release
on:
workflow_dispatch:
push:
branches:
- main
- 'tag/v**'
jobs:
build-and-publish:
build_dist:
name: Build source distribution
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4
- uses: actions/checkout@v4
with:
python-version: "3.10"
fetch-depth: 0
- name: Build SDist and wheel
run: pipx run build
- uses: actions/upload-artifact@v4
with:
name: artifact
path: dist/*
- name: Check metadata
run: pipx run twine check dist/*
publish:
runs-on: ubuntu-latest
if: github.event_name == 'push'
needs: [ build_dist ]
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.12"
cache: 'pip'
- name: Install build dependencies
run: pip install -U setuptools wheel build
- name: Build
run: python -m build .
- uses: actions/download-artifact@v4
with:
# unpacks default artifact into dist/
# if `name: artifact` is omitted, the action will create extra parent dir
name: artifact
path: dist
- name: Publish
uses: pypa/gh-action-pypi-publish@release/v1
with:
user: __token__
password: ${{ secrets.pypi_password }}
attestations: false
- name: Install GitPython and cloudevents for pypi_packaging
run: pip install -U -r requirements/publish.txt
- name: Create Tag


@ -1,17 +1,27 @@
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.3.0
rev: v5.0.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-toml
- repo: https://github.com/pycqa/isort
rev: 5.10.1
rev: 6.0.1
hooks:
- id: isort
args: [ "--profile", "black", "--filter-files" ]
- repo: https://github.com/psf/black
rev: 22.10.0
rev: 25.1.0
hooks:
- id: black
language_version: python3.10
language_version: python3.11
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v1.16.0
hooks:
- id: mypy
files: ^(cloudevents/)
exclude: ^(cloudevents/tests/)
types: [ python ]
args: [ ]
additional_dependencies:
- "pydantic~=2.7"


@ -6,11 +6,66 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
## [Unreleased]
## [1.12.0]
### Changed
- Dropped Python 3.8 support as it has reached EOL. ([])
## [1.11.1]
### Fixed
- Kafka `conversion` marshaller and unmarshaller typings ([#240])
- Improved public API type annotations and fixed unit test type errors ([#248])
## [1.11.0]
### Fixed
- Pydantic v2 `examples` keyword usage and improved typings handling ([#235])
- Kafka `to_binary` check for invalid `content-type` attribute ([#232])
### Changed
- Dropped Python 3.7 from CI as it's EOL. ([#236])
## [1.10.1]
### Fixed
- Fixed Pydantic v2 `to_json` (and `to_structured`) conversion ([#229])
## [1.10.0] — 2023-09-25
### Added
- Pydantic v2 support. ([#219])
- Pydantic v2 to v1 compatibility layer. ([#218])
- Governance docs per main CE discussions. ([#221])
## [1.9.0] — 2023-01-04
### Added
- Added typings to the codebase. ([#207])
- Added Python3.11 support. ([#209])
## [1.8.0] — 2022-12-08
### Changed
- Dropped support of Python 3.6 that has reached EOL almost a year ago.
[v1.7.1](https://pypi.org/project/cloudevents/1.7.1/) is the last
one to support Python 3.6 ([#208])
## [1.7.1] — 2022-11-21
### Fixed
- Fixed Pydantic extras dependency constraint (backport of v1.6.3, [#204])
### Changed
- Refined build and publishing process. Added SDist to the released package ([#202])
## [1.7.0] — 2022-11-17
### Added
- Added [Kafka](https://github.com/cloudevents/spec/blob/v1.0.2/cloudevents/bindings/kafka-protocol-binding.md)
support ([#197], thanks [David Martines](https://github.com/davidwmartines))
## [1.6.3] — 2022-11-21
### Fixed
- Fixed Pydantic extras dependency constraint ([#204])
## [1.6.2] — 2022-10-18
### Added
- Added `get_attributes` API to the `CloudEvent` API. The method returns a read-only
@ -157,7 +212,14 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
### Added
- Initial release
[1.11.0]: https://github.com/cloudevents/sdk-python/compare/1.10.1...1.11.0
[1.10.1]: https://github.com/cloudevents/sdk-python/compare/1.10.0...1.10.1
[1.10.0]: https://github.com/cloudevents/sdk-python/compare/1.9.0...1.10.0
[1.9.0]: https://github.com/cloudevents/sdk-python/compare/1.8.0...1.9.0
[1.8.0]: https://github.com/cloudevents/sdk-python/compare/1.7.0...1.8.0
[1.7.1]: https://github.com/cloudevents/sdk-python/compare/1.7.0...1.7.1
[1.7.0]: https://github.com/cloudevents/sdk-python/compare/1.6.0...1.7.0
[1.6.3]: https://github.com/cloudevents/sdk-python/compare/1.6.2...1.6.3
[1.6.2]: https://github.com/cloudevents/sdk-python/compare/1.6.1...1.6.2
[1.6.1]: https://github.com/cloudevents/sdk-python/compare/1.6.0...1.6.1
[1.6.0]: https://github.com/cloudevents/sdk-python/compare/1.5.0...1.6.0
@ -225,3 +287,17 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
[#191]: https://github.com/cloudevents/sdk-python/pull/191
[#195]: https://github.com/cloudevents/sdk-python/pull/195
[#197]: https://github.com/cloudevents/sdk-python/pull/197
[#202]: https://github.com/cloudevents/sdk-python/pull/202
[#204]: https://github.com/cloudevents/sdk-python/pull/204
[#207]: https://github.com/cloudevents/sdk-python/pull/207
[#208]: https://github.com/cloudevents/sdk-python/pull/208
[#209]: https://github.com/cloudevents/sdk-python/pull/209
[#218]: https://github.com/cloudevents/sdk-python/pull/218
[#219]: https://github.com/cloudevents/sdk-python/pull/219
[#221]: https://github.com/cloudevents/sdk-python/pull/221
[#229]: https://github.com/cloudevents/sdk-python/pull/229
[#232]: https://github.com/cloudevents/sdk-python/pull/232
[#235]: https://github.com/cloudevents/sdk-python/pull/235
[#236]: https://github.com/cloudevents/sdk-python/pull/236
[#240]: https://github.com/cloudevents/sdk-python/pull/240
[#248]: https://github.com/cloudevents/sdk-python/pull/248

MAINTAINERS.md Normal file

@ -0,0 +1,9 @@
# Maintainers
Current active maintainers of this SDK:
- [Grant Timmerman](https://github.com/grant)
- [Denys Makogon](https://github.com/denismakogon)
- [Curtis Mason](https://github.com/cumason123)
- [Claudio Canales](https://github.com/Klaudioz)
- [Yurii Serhiichuk](https://github.com/xSAVIKx)

MANIFEST.in Normal file

@ -0,0 +1,4 @@
include README.md
include CHANGELOG.md
include LICENSE
include cloudevents/py.typed

OWNERS

@ -1,6 +0,0 @@
admins:
- grant
- denismakogon
- cumason123
- Klaudioz
- xSAVIKx


@ -149,6 +149,17 @@ for how PR reviews and approval, and our
[Code of Conduct](https://github.com/cloudevents/spec/blob/main/docs/GOVERNANCE.md#additional-information)
information.
If there is a security concern with one of the CloudEvents specifications, or
with one of the project's SDKs, please send an email to
[cncf-cloudevents-security@lists.cncf.io](mailto:cncf-cloudevents-security@lists.cncf.io).
## Additional SDK Resources
- [List of current active maintainers](MAINTAINERS.md)
- [How to contribute to the project](CONTRIBUTING.md)
- [SDK's License](LICENSE)
- [SDK's Release process](RELEASING.md)
## Maintenance
We use [black][black] and [isort][isort] for autoformatting. We set up a [tox][tox]


@ -12,4 +12,4 @@
# License for the specific language governing permissions and limitations
# under the License.
__version__ = "1.7.0"
__version__ = "1.12.0"


@ -14,4 +14,4 @@
from cloudevents.abstract.event import AnyCloudEvent, CloudEvent
__all__ = [AnyCloudEvent, CloudEvent]
__all__ = ["AnyCloudEvent", "CloudEvent"]


@ -17,6 +17,8 @@ from abc import abstractmethod
from types import MappingProxyType
from typing import Mapping
AnyCloudEvent = typing.TypeVar("AnyCloudEvent", bound="CloudEvent")
class CloudEvent:
"""
@ -29,10 +31,10 @@ class CloudEvent:
@classmethod
def create(
cls,
attributes: typing.Dict[str, typing.Any],
cls: typing.Type[AnyCloudEvent],
attributes: typing.Mapping[str, typing.Any],
data: typing.Optional[typing.Any],
) -> "AnyCloudEvent":
) -> AnyCloudEvent:
"""
Creates a new instance of the CloudEvent using supplied `attributes`
and `data`.
@ -70,7 +72,7 @@ class CloudEvent:
raise NotImplementedError()
@abstractmethod
def _get_data(self) -> typing.Optional[typing.Any]:
def get_data(self) -> typing.Optional[typing.Any]:
"""
Returns the data of the event.
@ -85,7 +87,7 @@ class CloudEvent:
def __eq__(self, other: typing.Any) -> bool:
if isinstance(other, CloudEvent):
same_data = self._get_data() == other._get_data()
same_data = self.get_data() == other.get_data()
same_attributes = self._get_attributes() == other._get_attributes()
return same_data and same_attributes
return False
@ -140,7 +142,4 @@ class CloudEvent:
return key in self._get_attributes()
def __repr__(self) -> str:
return str({"attributes": self._get_attributes(), "data": self._get_data()})
AnyCloudEvent = typing.TypeVar("AnyCloudEvent", bound=CloudEvent)
return str({"attributes": self._get_attributes(), "data": self.get_data()})


@ -23,7 +23,7 @@ from cloudevents.sdk.converters import is_binary
from cloudevents.sdk.event import v1, v03
def _best_effort_serialize_to_json(
def _best_effort_serialize_to_json( # type: ignore[no-untyped-def]
value: typing.Any, *args, **kwargs
) -> typing.Optional[typing.Union[bytes, str, typing.Any]]:
"""
@ -43,18 +43,18 @@ def _best_effort_serialize_to_json(
return value
_default_marshaller_by_format = {
_default_marshaller_by_format: typing.Dict[str, types.MarshallerType] = {
converters.TypeStructured: lambda x: x,
converters.TypeBinary: _best_effort_serialize_to_json,
} # type: typing.Dict[str, types.MarshallerType]
}
_obj_by_version = {"1.0": v1.Event, "0.3": v03.Event}
def to_json(
event: AnyCloudEvent,
data_marshaller: types.MarshallerType = None,
) -> typing.Union[str, bytes]:
data_marshaller: typing.Optional[types.MarshallerType] = None,
) -> bytes:
"""
Converts given `event` to a JSON string.
@ -69,7 +69,7 @@ def to_json(
def from_json(
event_type: typing.Type[AnyCloudEvent],
data: typing.Union[str, bytes],
data_unmarshaller: types.UnmarshallerType = None,
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> AnyCloudEvent:
"""
Parses JSON string `data` into a CloudEvent.
@ -91,9 +91,11 @@ def from_json(
def from_http(
event_type: typing.Type[AnyCloudEvent],
headers: typing.Dict[str, str],
data: typing.Union[str, bytes, None],
data_unmarshaller: types.UnmarshallerType = None,
headers: typing.Union[
typing.Mapping[str, str], types.SupportsDuplicateItems[str, str]
],
data: typing.Optional[typing.Union[str, bytes]],
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> AnyCloudEvent:
"""
Parses CloudEvent `data` and `headers` into an instance of a given `event_type`.
@@ -133,14 +135,14 @@ def from_http(
except json.decoder.JSONDecodeError:
raise cloud_exceptions.MissingRequiredFields(
"Failed to read specversion from both headers and data. "
f"The following can not be parsed as json: {data}"
"The following can not be parsed as json: {!r}".format(data)
)
if hasattr(raw_ce, "get"):
specversion = raw_ce.get("specversion", None)
else:
raise cloud_exceptions.MissingRequiredFields(
"Failed to read specversion from both headers and data. "
f"The following deserialized data has no 'get' method: {raw_ce}"
"The following deserialized data has no 'get' method: {}".format(raw_ce)
)
if specversion is None:
@@ -152,7 +154,7 @@
if event_handler is None:
raise cloud_exceptions.InvalidRequiredFields(
f"Found invalid specversion {specversion}"
"Found invalid specversion {}".format(specversion)
)
event = marshall.FromRequest(
@@ -163,20 +165,19 @@
attrs.pop("extensions", None)
attrs.update(**event.extensions)
result_data: typing.Optional[typing.Any] = event.data
if event.data == "" or event.data == b"":
# TODO: Check binary unmarshallers to debug why setting data to ""
# returns an event with data set to None, but structured will return ""
data = None
else:
data = event.data
return event_type.create(attrs, data)
# returns an event with data set to None, but structured will return ""
result_data = None
return event_type.create(attrs, result_data)
def _to_http(
event: AnyCloudEvent,
format: str = converters.TypeStructured,
data_marshaller: types.MarshallerType = None,
) -> typing.Tuple[dict, typing.Union[bytes, str]]:
data_marshaller: typing.Optional[types.MarshallerType] = None,
) -> typing.Tuple[typing.Dict[str, str], bytes]:
"""
Returns a tuple of HTTP headers/body dicts representing this Cloud Event.
@@ -196,7 +197,7 @@ def _to_http(
event_handler = _obj_by_version[event["specversion"]]()
for attribute_name in event:
event_handler.Set(attribute_name, event[attribute_name])
event_handler.data = event.data
event_handler.data = event.get_data()
return marshaller.NewDefaultHTTPMarshaller().ToRequest(
event_handler, format, data_marshaller=data_marshaller
@@ -205,8 +206,8 @@ def _to_http(
def to_structured(
event: AnyCloudEvent,
data_marshaller: types.MarshallerType = None,
) -> typing.Tuple[dict, typing.Union[bytes, str]]:
data_marshaller: typing.Optional[types.MarshallerType] = None,
) -> typing.Tuple[typing.Dict[str, str], bytes]:
"""
Returns a tuple of HTTP headers/body dicts representing this Cloud Event.
@@ -222,8 +223,8 @@ def to_structured(
def to_binary(
event: AnyCloudEvent, data_marshaller: types.MarshallerType = None
) -> typing.Tuple[dict, typing.Union[bytes, str]]:
event: AnyCloudEvent, data_marshaller: typing.Optional[types.MarshallerType] = None
) -> typing.Tuple[typing.Dict[str, str], bytes]:
"""
Returns a tuple of HTTP headers/body dicts representing this Cloud Event.
@@ -261,7 +262,7 @@ def best_effort_encode_attribute_value(value: typing.Any) -> typing.Any:
def from_dict(
event_type: typing.Type[AnyCloudEvent],
event: typing.Dict[str, typing.Any],
event: typing.Mapping[str, typing.Any],
) -> AnyCloudEvent:
"""
Constructs an Event object of a given `event_type` from
@@ -287,19 +288,13 @@ def to_dict(event: AnyCloudEvent) -> typing.Dict[str, typing.Any]:
:returns: The canonical dict representation of the event.
"""
result = {attribute_name: event.get(attribute_name) for attribute_name in event}
result["data"] = event.data
result["data"] = event.get_data()
return result
def _json_or_string(
content: typing.Optional[typing.AnyStr],
) -> typing.Optional[
typing.Union[
typing.Dict[typing.Any, typing.Any],
typing.List[typing.Any],
typing.AnyStr,
]
]:
content: typing.Optional[typing.Union[str, bytes]],
) -> typing.Any:
"""
Returns a JSON-decoded dictionary or a list of dictionaries if
a valid JSON string is provided.
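The loosened `_json_or_string` signature above reflects best-effort decoding along these lines — a hypothetical stdlib re-implementation for illustration, not the SDK's verbatim code:

```python
import json
import typing


def json_or_string(
    content: typing.Optional[typing.Union[str, bytes]],
) -> typing.Any:
    """Best-effort JSON decode: fall back to the raw content on failure."""
    if content is None:
        return None
    try:
        return json.loads(content)
    except (json.JSONDecodeError, TypeError, UnicodeDecodeError):
        return content
```

A `typing.Any` return is the honest annotation here: the result may be a dict, a list, a scalar, or the untouched input.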


@@ -25,15 +25,15 @@ from cloudevents.http.http_methods import ( # deprecated
from cloudevents.http.json_methods import to_json # deprecated
__all__ = [
to_binary,
to_structured,
from_json,
from_http,
from_dict,
CloudEvent,
is_binary,
is_structured,
to_binary_http,
to_structured_http,
to_json,
"to_binary",
"to_structured",
"from_json",
"from_http",
"from_dict",
"CloudEvent",
"is_binary",
"is_structured",
"to_binary_http",
"to_structured_http",
"to_json",
]
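The quoting fix above matters because star-imports read `__all__` entries as attribute names; with bare objects in the list, `from module import *` raises a `TypeError`. A minimal demonstration with throwaway module names (hypothetical, not SDK modules):

```python
import sys
import types

# A module whose __all__ wrongly holds the function object itself.
broken = types.ModuleType("demo_broken")
exec("def to_binary(): pass\n__all__ = [to_binary]", broken.__dict__)
sys.modules["demo_broken"] = broken

try:
    exec("from demo_broken import *", {})
except TypeError as exc:
    star_import_error = exc  # non-str __all__ entries raise TypeError

# The corrected form exports by name.
fixed = types.ModuleType("demo_fixed")
exec("def to_binary(): pass\n__all__ = ['to_binary']", fixed.__dict__)
sys.modules["demo_fixed"] = fixed
namespace = {}
exec("from demo_fixed import *", namespace)
```

Plain `import module` is unaffected either way, which is why the bug could sit unnoticed until someone star-imported.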


@@ -23,7 +23,7 @@ from cloudevents.sdk import types
def from_json(
data: typing.Union[str, bytes],
data_unmarshaller: types.UnmarshallerType = None,
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> CloudEvent:
"""
Parses JSON string `data` into a CloudEvent.
@@ -37,9 +37,11 @@ def from_json(
def from_http(
headers: typing.Dict[str, str],
data: typing.Union[str, bytes, None],
data_unmarshaller: types.UnmarshallerType = None,
headers: typing.Union[
typing.Mapping[str, str], types.SupportsDuplicateItems[str, str]
],
data: typing.Optional[typing.Union[str, bytes]],
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> CloudEvent:
"""
Parses CloudEvent `data` and `headers` into a `CloudEvent`.
@@ -58,7 +60,7 @@ def from_http(
def from_dict(
event: typing.Dict[str, typing.Any],
event: typing.Mapping[str, typing.Any],
) -> CloudEvent:
"""
Constructs a CloudEvent from a dict `event` representation.


@@ -34,11 +34,13 @@ class CloudEvent(abstract.CloudEvent):
@classmethod
def create(
cls, attributes: typing.Dict[str, typing.Any], data: typing.Optional[typing.Any]
cls,
attributes: typing.Mapping[str, typing.Any],
data: typing.Optional[typing.Any],
) -> "CloudEvent":
return cls(attributes, data)
def __init__(self, attributes: typing.Dict[str, str], data: typing.Any = None):
def __init__(self, attributes: typing.Mapping[str, str], data: typing.Any = None):
"""
Event Constructor
:param attributes: a dict with cloudevent attributes. Minimally
@@ -82,7 +84,7 @@ class CloudEvent(abstract.CloudEvent):
def _get_attributes(self) -> typing.Dict[str, typing.Any]:
return self._attributes
def _get_data(self) -> typing.Optional[typing.Any]:
def get_data(self) -> typing.Optional[typing.Any]:
return self.data
def __setitem__(self, key: str, value: typing.Any) -> None:


@@ -31,8 +31,8 @@ from cloudevents.sdk import types
details="Use cloudevents.conversion.to_binary function instead",
)
def to_binary(
event: AnyCloudEvent, data_marshaller: types.MarshallerType = None
) -> typing.Tuple[dict, typing.Union[bytes, str]]:
event: AnyCloudEvent, data_marshaller: typing.Optional[types.MarshallerType] = None
) -> typing.Tuple[typing.Dict[str, str], bytes]:
return _moved_to_binary(event, data_marshaller)
@@ -42,8 +42,8 @@ def to_binary(
)
def to_structured(
event: AnyCloudEvent,
data_marshaller: types.MarshallerType = None,
) -> typing.Tuple[dict, typing.Union[bytes, str]]:
data_marshaller: typing.Optional[types.MarshallerType] = None,
) -> typing.Tuple[typing.Dict[str, str], bytes]:
return _moved_to_structured(event, data_marshaller)
@@ -53,21 +53,21 @@ def to_structured(
)
def from_http(
headers: typing.Dict[str, str],
data: typing.Union[str, bytes, None],
data_unmarshaller: types.UnmarshallerType = None,
data: typing.Optional[typing.AnyStr],
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> CloudEvent:
return _moved_from_http(headers, data, data_unmarshaller)
@deprecated(deprecated_in="1.0.2", details="Use to_binary function instead")
def to_binary_http(
event: CloudEvent, data_marshaller: types.MarshallerType = None
) -> typing.Tuple[dict, typing.Union[bytes, str]]:
event: CloudEvent, data_marshaller: typing.Optional[types.MarshallerType] = None
) -> typing.Tuple[typing.Dict[str, str], bytes]:
return _moved_to_binary(event, data_marshaller)
@deprecated(deprecated_in="1.0.2", details="Use to_structured function instead")
def to_structured_http(
event: CloudEvent, data_marshaller: types.MarshallerType = None
) -> typing.Tuple[dict, typing.Union[bytes, str]]:
event: CloudEvent, data_marshaller: typing.Optional[types.MarshallerType] = None
) -> typing.Tuple[typing.Dict[str, str], bytes]:
return _moved_to_structured(event, data_marshaller)


@@ -31,8 +31,8 @@ from cloudevents.sdk import types
)
def to_json(
event: AnyCloudEvent,
data_marshaller: types.MarshallerType = None,
) -> typing.Union[str, bytes]:
data_marshaller: typing.Optional[types.MarshallerType] = None,
) -> bytes:
return _moved_to_json(event, data_marshaller)
@@ -42,6 +42,6 @@ def to_json(
)
def from_json(
data: typing.Union[str, bytes],
data_unmarshaller: types.UnmarshallerType = None,
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> CloudEvent:
return _moved_from_json(data, data_unmarshaller)


@@ -11,6 +11,8 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import typing
from deprecation import deprecated
from cloudevents.conversion import (
@@ -24,5 +26,7 @@ from cloudevents.conversion import (
deprecated_in="1.6.0",
details="You SHOULD NOT use the default marshaller",
)
def default_marshaller(content: any):
def default_marshaller(
content: typing.Any,
) -> typing.Optional[typing.Union[bytes, str, typing.Any]]:
return _moved_default_marshaller(content)


@@ -22,10 +22,10 @@ from cloudevents.kafka.conversion import (
)
__all__ = [
KafkaMessage,
KeyMapper,
from_binary,
from_structured,
to_binary,
to_structured,
"KafkaMessage",
"KeyMapper",
"from_binary",
"from_structured",
"to_binary",
"to_structured",
]


@@ -21,9 +21,14 @@ from cloudevents.abstract import AnyCloudEvent
from cloudevents.kafka.exceptions import KeyMapperError
from cloudevents.sdk import types
DEFAULT_MARSHALLER: types.MarshallerType = json.dumps
DEFAULT_UNMARSHALLER: types.MarshallerType = json.loads
DEFAULT_EMBEDDED_DATA_MARSHALLER: types.MarshallerType = lambda x: x
JSON_MARSHALLER: types.MarshallerType = json.dumps
JSON_UNMARSHALLER: types.UnmarshallerType = json.loads
IDENTITY_MARSHALLER = IDENTITY_UNMARSHALLER = lambda x: x
DEFAULT_MARSHALLER: types.MarshallerType = JSON_MARSHALLER
DEFAULT_UNMARSHALLER: types.UnmarshallerType = JSON_UNMARSHALLER
DEFAULT_EMBEDDED_DATA_MARSHALLER: types.MarshallerType = IDENTITY_MARSHALLER
DEFAULT_EMBEDDED_DATA_UNMARSHALLER: types.UnmarshallerType = IDENTITY_UNMARSHALLER
class KafkaMessage(typing.NamedTuple):
@@ -38,12 +43,12 @@ class KafkaMessage(typing.NamedTuple):
The dictionary of message headers key/values.
"""
key: typing.Optional[typing.AnyStr]
key: typing.Optional[typing.Union[str, bytes]]
"""
The message key.
"""
value: typing.AnyStr
value: typing.Union[str, bytes]
"""
The message value.
"""
@@ -87,15 +92,15 @@ def to_binary(
)
headers = {}
if event["content-type"]:
headers["content-type"] = event["content-type"].encode("utf-8")
if event["datacontenttype"]:
headers["content-type"] = event["datacontenttype"].encode("utf-8")
for attr, value in event.get_attributes().items():
if attr not in ["data", "partitionkey", "content-type"]:
if attr not in ["data", "partitionkey", "datacontenttype"]:
if value is not None:
headers["ce_{0}".format(attr)] = value.encode("utf-8")
try:
data = data_marshaller(event.data)
data = data_marshaller(event.get_data())
except Exception as e:
raise cloud_exceptions.DataMarshallerError(
f"Failed to marshall data with error: {type(e).__name__}('{e}')"
@@ -106,11 +111,29 @@ def to_binary(
return KafkaMessage(headers, message_key, data)
@typing.overload
def from_binary(
message: KafkaMessage,
event_type: None = None,
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> http.CloudEvent:
pass
@typing.overload
def from_binary(
message: KafkaMessage,
event_type: typing.Type[AnyCloudEvent],
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> AnyCloudEvent:
pass
def from_binary(
message: KafkaMessage,
event_type: typing.Optional[typing.Type[AnyCloudEvent]] = None,
data_unmarshaller: typing.Optional[types.MarshallerType] = None,
) -> AnyCloudEvent:
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> typing.Union[http.CloudEvent, AnyCloudEvent]:
"""
Returns a CloudEvent from a KafkaMessage in binary format.
@@ -121,14 +144,12 @@ def from_binary(
"""
data_unmarshaller = data_unmarshaller or DEFAULT_UNMARSHALLER
event_type = event_type or http.CloudEvent
attributes = {}
attributes: typing.Dict[str, typing.Any] = {}
for header, value in message.headers.items():
header = header.lower()
if header == "content-type":
attributes["content-type"] = value.decode()
attributes["datacontenttype"] = value.decode()
elif header.startswith("ce_"):
attributes[header[3:]] = value.decode()
@@ -141,8 +162,12 @@
raise cloud_exceptions.DataUnmarshallerError(
f"Failed to unmarshall data with error: {type(e).__name__}('{e}')"
)
return event_type.create(attributes, data)
result: typing.Union[http.CloudEvent, AnyCloudEvent]
if event_type:
result = event_type.create(attributes, data)
else:
result = http.CloudEvent.create(attributes, data)
return result
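The pair of `@typing.overload` stubs above is what lets a checker infer `http.CloudEvent` when `event_type` is omitted, removing the `var-annotated` errors mentioned in the commit message. The same technique in a self-contained sketch (names are illustrative, not SDK API):

```python
import typing

T = typing.TypeVar("T", bound="DefaultEvent")


class DefaultEvent:
    def __init__(self, payload: typing.Any) -> None:
        self.payload = payload


class CustomEvent(DefaultEvent):
    pass


@typing.overload
def parse(payload: typing.Any, event_type: None = None) -> "DefaultEvent": ...
@typing.overload
def parse(payload: typing.Any, event_type: typing.Type[T]) -> T: ...
def parse(
    payload: typing.Any,
    event_type: typing.Optional[typing.Type[T]] = None,
) -> typing.Union["DefaultEvent", T]:
    # Runtime behavior is unchanged: fall back to the default class.
    return (event_type or DefaultEvent)(payload)
```

A checker now sees `parse("x")` as `DefaultEvent` and `parse("x", CustomEvent)` as `CustomEvent`, without callers annotating anything.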
def to_structured(
@@ -174,10 +199,10 @@ def to_structured(
f"Failed to map message key with error: {type(e).__name__}('{e}')"
)
attrs: dict[str, typing.Any] = dict(event.get_attributes())
attrs: typing.Dict[str, typing.Any] = dict(event.get_attributes())
try:
data = data_marshaller(event.data)
data = data_marshaller(event.get_data())
except Exception as e:
raise cloud_exceptions.DataMarshallerError(
f"Failed to marshall data with error: {type(e).__name__}('{e}')"
@@ -188,8 +213,8 @@ def to_structured(
attrs["data"] = data
headers = {}
if "content-type" in attrs:
headers["content-type"] = attrs.pop("content-type").encode("utf-8")
if "datacontenttype" in attrs:
headers["content-type"] = attrs.pop("datacontenttype").encode("utf-8")
try:
value = envelope_marshaller(attrs)
@@ -204,12 +229,32 @@ def to_structured(
return KafkaMessage(headers, message_key, value)
@typing.overload
def from_structured(
message: KafkaMessage,
event_type: None = None,
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
envelope_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> http.CloudEvent:
pass
@typing.overload
def from_structured(
message: KafkaMessage,
event_type: typing.Type[AnyCloudEvent],
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
envelope_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> AnyCloudEvent:
pass
def from_structured(
message: KafkaMessage,
event_type: typing.Optional[typing.Type[AnyCloudEvent]] = None,
data_unmarshaller: typing.Optional[types.MarshallerType] = None,
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
envelope_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> AnyCloudEvent:
) -> typing.Union[http.CloudEvent, AnyCloudEvent]:
"""
Returns a CloudEvent from a KafkaMessage in structured format.
@@ -221,10 +266,8 @@ def from_structured(
:returns: CloudEvent
"""
data_unmarshaller = data_unmarshaller or DEFAULT_EMBEDDED_DATA_MARSHALLER
data_unmarshaller = data_unmarshaller or DEFAULT_EMBEDDED_DATA_UNMARSHALLER
envelope_unmarshaller = envelope_unmarshaller or DEFAULT_UNMARSHALLER
event_type = event_type or http.CloudEvent
try:
structure = envelope_unmarshaller(message.value)
except Exception as e:
@@ -232,7 +275,7 @@ def from_structured(
"Failed to unmarshall message with error: " f"{type(e).__name__}('{e}')"
)
attributes: dict[str, typing.Any] = {}
attributes: typing.Dict[str, typing.Any] = {}
if message.key is not None:
attributes["partitionkey"] = message.key
@@ -256,6 +299,13 @@ def from_structured(
attributes[name] = decoded_value
for header, val in message.headers.items():
attributes[header.lower()] = val.decode()
return event_type.create(attributes, data)
if header.lower() == "content-type":
attributes["datacontenttype"] = val.decode()
else:
attributes[header.lower()] = val.decode()
result: typing.Union[AnyCloudEvent, http.CloudEvent]
if event_type:
result = event_type.create(attributes, data)
else:
result = http.CloudEvent.create(attributes, data)
return result
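The new branch above stops the transport-level `content-type` header from leaking into the attributes dict under the wrong name; CloudEvents calls this attribute `datacontenttype`. The normalization in isolation (plain-dict sketch, not the SDK call):

```python
# Kafka headers arrive as bytes values; hypothetical sample input.
headers = {"Content-Type": b"application/json", "ce_traceid": b"abc123"}

attributes = {}
for header, val in headers.items():
    if header.lower() == "content-type":
        # CloudEvents names this attribute "datacontenttype".
        attributes["datacontenttype"] = val.decode()
    else:
        attributes[header.lower()] = val.decode()
```

Without the special case, the event would carry a `content-type` attribute that no CloudEvents consumer looks for.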

cloudevents/py.typed Normal file


@@ -11,7 +11,37 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from cloudevents.pydantic.conversion import from_dict, from_http, from_json
from cloudevents.pydantic.event import CloudEvent
__all__ = [CloudEvent, from_json, from_dict, from_http]
from typing import TYPE_CHECKING
from cloudevents.exceptions import PydanticFeatureNotInstalled
try:
if TYPE_CHECKING:
from cloudevents.pydantic.v2 import CloudEvent, from_dict, from_http, from_json
else:
from pydantic import VERSION as PYDANTIC_VERSION
pydantic_major_version = PYDANTIC_VERSION.split(".")[0]
if pydantic_major_version == "1":
from cloudevents.pydantic.v1 import (
CloudEvent,
from_dict,
from_http,
from_json,
)
else:
from cloudevents.pydantic.v2 import (
CloudEvent,
from_dict,
from_http,
from_json,
)
except ImportError: # pragma: no cover # hard to test
raise PydanticFeatureNotInstalled(
"CloudEvents pydantic feature is not installed. "
"Install it using pip install cloudevents[pydantic]"
)
__all__ = ["CloudEvent", "from_json", "from_dict", "from_http"]
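The `TYPE_CHECKING` guard keeps static analysis pinned to one implementation while the runtime picks a compat layer by installed pydantic major version. The version split itself is simple enough to isolate (hypothetical helper, not SDK API):

```python
def pydantic_compat_module(version: str) -> str:
    """Return which compat layer to load for a pydantic VERSION string."""
    major = version.split(".")[0]
    return "cloudevents.pydantic.v1" if major == "1" else "cloudevents.pydantic.v2"
```

Comparing only the major component is deliberate: minor and patch releases of pydantic keep the API the code depends on.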


@@ -1,303 +0,0 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import datetime
import json
import typing
from cloudevents.exceptions import PydanticFeatureNotInstalled
try:
import pydantic
except ImportError: # pragma: no cover # hard to test
raise PydanticFeatureNotInstalled(
"CloudEvents pydantic feature is not installed. "
"Install it using pip install cloudevents[pydantic]"
)
from cloudevents import abstract, conversion, http
from cloudevents.exceptions import IncompatibleArgumentsError
from cloudevents.sdk.event import attribute
def _ce_json_dumps(obj: typing.Dict[str, typing.Any], *args, **kwargs) -> str:
"""
Needed by the pydantic base-model to serialize the event correctly to json.
Without this function the data will be incorrectly serialized.
:param obj: CloudEvent represented as a dict.
:param args: User arguments which will be passed to json.dumps function.
:param kwargs: User arguments which will be passed to json.dumps function.
:return: Event serialized as a standard JSON CloudEvent with user specific
parameters.
"""
# Using HTTP from dict due to performance issues.
# Pydantic is known for initialization time lagging.
return json.dumps(
# We SHOULD de-serialize the value, to serialize it back with
# the correct json args and kwargs passed by the user.
# This MAY cause performance issues in the future.
# When that issue will cause real problem you MAY add a special keyword
# argument that disabled this conversion
json.loads(
conversion.to_json(
http.from_dict(obj),
).decode("utf-8")
),
*args,
**kwargs
)
def _ce_json_loads(
data: typing.Union[str, bytes], *args, **kwargs # noqa
) -> typing.Dict[typing.Any, typing.Any]:
"""
Needed by the pydantic base-model to de-serialize the event correctly from json.
Without this function the data will be incorrectly de-serialized.
:param obj: CloudEvent encoded as a json string.
:param args: These arguments SHOULD NOT be passed by pydantic.
Located here for fail-safe reasons, in-case it does.
:param kwargs: These arguments SHOULD NOT be passed by pydantic.
Located here for fail-safe reasons, in-case it does.
:return: CloudEvent in a dict representation.
"""
# Using HTTP from dict due to performance issues.
# Pydantic is known for initialization time lagging.
return conversion.to_dict(http.from_json(data))
class CloudEvent(abstract.CloudEvent, pydantic.BaseModel):
"""
A Python-friendly CloudEvent representation backed by Pydantic-modeled fields.
Supports both binary and structured modes of the CloudEvents v1 specification.
"""
@classmethod
def create(
cls, attributes: typing.Dict[str, typing.Any], data: typing.Optional[typing.Any]
) -> "CloudEvent":
return cls(attributes, data)
data: typing.Optional[typing.Any] = pydantic.Field(
title="Event Data",
description=(
"CloudEvents MAY include domain-specific information about the occurrence."
" When present, this information will be encapsulated within data.It is"
" encoded into a media format which is specified by the datacontenttype"
" attribute (e.g. application/json), and adheres to the dataschema format"
" when those respective attributes are present."
),
)
source: str = pydantic.Field(
title="Event Source",
description=(
"Identifies the context in which an event happened. Often this will include"
" information such as the type of the event source, the organization"
" publishing the event or the process that produced the event. The exact"
" syntax and semantics behind the data encoded in the URI is defined by the"
" event producer.\n"
"\n"
"Producers MUST ensure that source + id is unique for"
" each distinct event.\n"
"\n"
"An application MAY assign a unique source to each"
" distinct producer, which makes it easy to produce unique IDs since no"
" other producer will have the same source. The application MAY use UUIDs,"
" URNs, DNS authorities or an application-specific scheme to create unique"
" source identifiers.\n"
"\n"
"A source MAY include more than one producer. In"
" that case the producers MUST collaborate to ensure that source + id is"
" unique for each distinct event."
),
example="https://github.com/cloudevents",
)
id: str = pydantic.Field(
default_factory=attribute.default_id_selection_algorithm,
title="Event ID",
description=(
"Identifies the event. Producers MUST ensure that source + id is unique for"
" each distinct event. If a duplicate event is re-sent (e.g. due to a"
" network error) it MAY have the same id. Consumers MAY assume that Events"
" with identical source and id are duplicates. MUST be unique within the"
" scope of the producer"
),
example="A234-1234-1234",
)
type: str = pydantic.Field(
title="Event Type",
description=(
"This attribute contains a value describing the type of event related to"
" the originating occurrence. Often this attribute is used for routing,"
" observability, policy enforcement, etc. The format of this is producer"
" defined and might include information such as the version of the type"
),
example="com.github.pull_request.opened",
)
specversion: attribute.SpecVersion = pydantic.Field(
default=attribute.DEFAULT_SPECVERSION,
title="Specification Version",
description=(
"The version of the CloudEvents specification which the event uses. This"
" enables the interpretation of the context.\n"
"\n"
"Currently, this attribute will only have the 'major'"
" and 'minor' version numbers included in it. This allows for 'patch'"
" changes to the specification to be made without changing this property's"
" value in the serialization."
),
example=attribute.DEFAULT_SPECVERSION,
)
time: typing.Optional[datetime.datetime] = pydantic.Field(
default_factory=attribute.default_time_selection_algorithm,
title="Occurrence Time",
description=(
" Timestamp of when the occurrence happened. If the time of the occurrence"
" cannot be determined then this attribute MAY be set to some other time"
" (such as the current time) by the CloudEvents producer, however all"
" producers for the same source MUST be consistent in this respect. In"
" other words, either they all use the actual time of the occurrence or"
" they all use the same algorithm to determine the value used."
),
example="2018-04-05T17:31:00Z",
)
subject: typing.Optional[str] = pydantic.Field(
title="Event Subject",
description=(
"This describes the subject of the event in the context of the event"
" producer (identified by source). In publish-subscribe scenarios, a"
" subscriber will typically subscribe to events emitted by a source, but"
" the source identifier alone might not be sufficient as a qualifier for"
" any specific event if the source context has internal"
" sub-structure.\n"
"\n"
"Identifying the subject of the event in context"
" metadata (opposed to only in the data payload) is particularly helpful in"
" generic subscription filtering scenarios where middleware is unable to"
" interpret the data content. In the above example, the subscriber might"
" only be interested in blobs with names ending with '.jpg' or '.jpeg' and"
" the subject attribute allows for constructing a simple and efficient"
" string-suffix filter for that subset of events."
),
example="123",
)
datacontenttype: typing.Optional[str] = pydantic.Field(
title="Event Data Content Type",
description=(
"Content type of data value. This attribute enables data to carry any type"
" of content, whereby format and encoding might differ from that of the"
" chosen event format."
),
example="text/xml",
)
dataschema: typing.Optional[str] = pydantic.Field(
title="Event Data Schema",
description=(
"Identifies the schema that data adheres to. "
"Incompatible changes to the schema SHOULD be reflected by a different URI"
),
)
def __init__(
self,
attributes: typing.Optional[typing.Dict[str, typing.Any]] = None,
data: typing.Optional[typing.Any] = None,
**kwargs
):
"""
:param attributes: A dict with CloudEvent attributes.
Minimally expects the attributes 'type' and 'source'. If not given the
attributes 'specversion', 'id' or 'time', this will create
those attributes with default values.
If no attribute is given the class MUST use the kwargs as the attributes.
Example Attributes:
{
"specversion": "1.0",
"type": "com.github.pull_request.opened",
"source": "https://github.com/cloudevents/spec/pull",
"id": "A234-1234-1234",
"time": "2018-04-05T17:31:00Z",
}
:param data: Domain-specific information about the occurrence.
"""
if attributes:
if len(kwargs) != 0:
# To prevent API complexity and confusion.
raise IncompatibleArgumentsError(
"Attributes dict and kwargs are incompatible."
)
attributes = {k.lower(): v for k, v in attributes.items()}
kwargs.update(attributes)
super(CloudEvent, self).__init__(data=data, **kwargs)
class Config:
extra: str = "allow" # this is the way we implement extensions
schema_extra = {
"example": {
"specversion": "1.0",
"type": "com.github.pull_request.opened",
"source": "https://github.com/cloudevents/spec/pull",
"subject": "123",
"id": "A234-1234-1234",
"time": "2018-04-05T17:31:00Z",
"comexampleextension1": "value",
"comexampleothervalue": 5,
"datacontenttype": "text/xml",
"data": '<much wow="xml"/>',
}
}
json_dumps = _ce_json_dumps
json_loads = _ce_json_loads
def _get_attributes(self) -> typing.Dict[str, typing.Any]:
return {
key: conversion.best_effort_encode_attribute_value(value)
for key, value in self.__dict__.items()
if key != "data"
}
def _get_data(self) -> typing.Optional[typing.Any]:
return self.data
def __setitem__(self, key: str, value: typing.Any) -> None:
"""
Set event attribute value
MUST NOT set event data with this method, use `.data` member instead
Method SHOULD mimic `cloudevents.http.event.CloudEvent` interface
:param key: Event attribute name
:param value: New event attribute value
"""
if key != "data": # to mirror the behaviour of the http event
setattr(self, key, value)
else:
pass # It is de-facto ignored by the http event
def __delitem__(self, key: str) -> None:
"""
SHOULD raise `KeyError` if no event attribute for the given key exists.
Method SHOULD mimic `cloudevents.http.event.CloudEvent` interface
:param key: The event attribute name.
"""
if key == "data":
raise KeyError(key) # to mirror the behaviour of the http event
delattr(self, key)


@@ -0,0 +1,142 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from cloudevents.sdk.event import attribute
FIELD_DESCRIPTIONS = {
"data": {
"title": "Event Data",
"description": (
"CloudEvents MAY include domain-specific information about the occurrence."
" When present, this information will be encapsulated within data.It is"
" encoded into a media format which is specified by the datacontenttype"
" attribute (e.g. application/json), and adheres to the dataschema format"
" when those respective attributes are present."
),
},
"source": {
"title": "Event Source",
"description": (
"Identifies the context in which an event happened. Often this will include"
" information such as the type of the event source, the organization"
" publishing the event or the process that produced the event. The exact"
" syntax and semantics behind the data encoded in the URI is defined by the"
" event producer.\n"
"\n"
"Producers MUST ensure that source + id is unique for"
" each distinct event.\n"
"\n"
"An application MAY assign a unique source to each"
" distinct producer, which makes it easy to produce unique IDs since no"
" other producer will have the same source. The application MAY use UUIDs,"
" URNs, DNS authorities or an application-specific scheme to create unique"
" source identifiers.\n"
"\n"
"A source MAY include more than one producer. In"
" that case the producers MUST collaborate to ensure that source + id is"
" unique for each distinct event."
),
"example": "https://github.com/cloudevents",
},
"id": {
"title": "Event ID",
"description": (
"Identifies the event. Producers MUST ensure that source + id is unique for"
" each distinct event. If a duplicate event is re-sent (e.g. due to a"
" network error) it MAY have the same id. Consumers MAY assume that Events"
" with identical source and id are duplicates. MUST be unique within the"
" scope of the producer"
),
"example": "A234-1234-1234",
},
"type": {
"title": "Event Type",
"description": (
"This attribute contains a value describing the type of event related to"
" the originating occurrence. Often this attribute is used for routing,"
" observability, policy enforcement, etc. The format of this is producer"
" defined and might include information such as the version of the type"
),
"example": "com.github.pull_request.opened",
},
"specversion": {
"title": "Specification Version",
"description": (
"The version of the CloudEvents specification which the event uses. This"
" enables the interpretation of the context.\n"
"\n"
"Currently, this attribute will only have the 'major'"
" and 'minor' version numbers included in it. This allows for 'patch'"
" changes to the specification to be made without changing this property's"
" value in the serialization."
),
"example": attribute.DEFAULT_SPECVERSION,
},
"time": {
"title": "Occurrence Time",
"description": (
" Timestamp of when the occurrence happened. If the time of the occurrence"
" cannot be determined then this attribute MAY be set to some other time"
" (such as the current time) by the CloudEvents producer, however all"
" producers for the same source MUST be consistent in this respect. In"
" other words, either they all use the actual time of the occurrence or"
" they all use the same algorithm to determine the value used."
),
"example": "2018-04-05T17:31:00Z",
},
"subject": {
"title": "Event Subject",
"description": (
"This describes the subject of the event in the context of the event"
" producer (identified by source). In publish-subscribe scenarios, a"
" subscriber will typically subscribe to events emitted by a source, but"
" the source identifier alone might not be sufficient as a qualifier for"
" any specific event if the source context has internal"
" sub-structure.\n"
"\n"
"Identifying the subject of the event in context"
" metadata (opposed to only in the data payload) is particularly helpful in"
" generic subscription filtering scenarios where middleware is unable to"
" interpret the data content. In the above example, the subscriber might"
" only be interested in blobs with names ending with '.jpg' or '.jpeg' and"
" the subject attribute allows for constructing a simple and efficient"
" string-suffix filter for that subset of events."
),
"example": "123",
},
"datacontenttype": {
"title": "Event Data Content Type",
"description": (
"Content type of data value. This attribute enables data to carry any type"
" of content, whereby format and encoding might differ from that of the"
" chosen event format."
),
"example": "text/xml",
},
"dataschema": {
"title": "Event Data Schema",
"description": (
"Identifies the schema that data adheres to. "
"Incompatible changes to the schema SHOULD be reflected by a different URI"
),
},
}
"""
The dictionary above contains title, description, example and other
NON-FUNCTIONAL data for pydantic fields. It could be potentially.
used across all the SDK.
Functional field configurations (e.g. defaults) are still defined
in the pydantic model classes.
"""

View File

@ -0,0 +1,18 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from cloudevents.pydantic.v1.conversion import from_dict, from_http, from_json
from cloudevents.pydantic.v1.event import CloudEvent
__all__ = ["CloudEvent", "from_json", "from_dict", "from_http"]

View File

@ -16,13 +16,15 @@ import typing
from cloudevents.conversion import from_dict as _abstract_from_dict
from cloudevents.conversion import from_http as _abstract_from_http
from cloudevents.conversion import from_json as _abstract_from_json
from cloudevents.pydantic.event import CloudEvent
from cloudevents.pydantic.v1.event import CloudEvent
from cloudevents.sdk import types
def from_http(
headers: typing.Dict[str, str],
data: typing.Union[str, bytes, None],
headers: typing.Union[
typing.Mapping[str, str], types.SupportsDuplicateItems[str, str]
],
data: typing.Optional[typing.AnyStr],
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> CloudEvent:
"""
@ -47,7 +49,7 @@ def from_http(
def from_json(
data: typing.AnyStr,
data_unmarshaller: types.UnmarshallerType = None,
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> CloudEvent:
"""
Parses JSON string `data` into a CloudEvent.
@ -63,7 +65,7 @@ def from_json(
def from_dict(
event: typing.Dict[str, typing.Any],
event: typing.Mapping[str, typing.Any],
) -> CloudEvent:
"""
Construct a CloudEvent from a dict `event` representation.

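For orientation, a structured-mode JSON body of the kind `from_json` above accepts looks like the following. This is a stdlib sketch of the envelope only, not a call into the SDK; the attribute values are the example values used throughout this file.

```python
import json

# Structured mode: the whole event, context attributes plus data,
# travels as one JSON object.
payload = json.dumps(
    {
        "specversion": "1.0",
        "type": "com.github.pull_request.opened",
        "source": "https://github.com/cloudevents/spec/pull",
        "id": "A234-1234-1234",
        "data": {"action": "opened"},
    }
)

event = json.loads(payload)
# 'type' and 'source' are the minimally required attributes; the SDK
# fills 'specversion', 'id' and 'time' with defaults when absent.
assert {"type", "source"} <= event.keys()
print(event["id"])  # → A234-1234-1234
```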
View File

@ -0,0 +1,247 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import datetime
import json
import typing
from cloudevents.exceptions import PydanticFeatureNotInstalled
from cloudevents.pydantic.fields_docs import FIELD_DESCRIPTIONS
try:
from pydantic import VERSION as PYDANTIC_VERSION
pydantic_major_version = PYDANTIC_VERSION.split(".")[0]
if pydantic_major_version == "2":
from pydantic.v1 import BaseModel, Field
else:
from pydantic import BaseModel, Field # type: ignore
except ImportError: # pragma: no cover # hard to test
raise PydanticFeatureNotInstalled(
"CloudEvents pydantic feature is not installed. "
"Install it using pip install cloudevents[pydantic]"
)
from cloudevents import abstract, conversion, http
from cloudevents.exceptions import IncompatibleArgumentsError
from cloudevents.sdk.event import attribute
def _ce_json_dumps( # type: ignore[no-untyped-def]
obj: typing.Dict[str, typing.Any],
*args,
**kwargs,
) -> str:
"""Performs Pydantic-specific serialization of the event.
Needed by the pydantic base-model to serialize the event correctly to json.
Without this function the data will be incorrectly serialized.
:param obj: CloudEvent represented as a dict.
:param args: User arguments which will be passed to json.dumps function.
:param kwargs: User arguments which will be passed to json.dumps function.
:return: Event serialized as a standard JSON CloudEvent with user specific
parameters.
"""
# Using HTTP from dict due to performance issues.
event = http.from_dict(obj)
event_json = conversion.to_json(event)
# Pydantic is known for initialization time lagging.
return json.dumps(
# We SHOULD de-serialize the value, to serialize it back with
# the correct json args and kwargs passed by the user.
# This MAY cause performance issues in the future.
# When that issue causes a real problem you MAY add a special keyword
# argument that disables this conversion
json.loads(event_json),
*args,
**kwargs,
)
def _ce_json_loads( # type: ignore[no-untyped-def]
data: typing.AnyStr, *args, **kwargs # noqa
) -> typing.Dict[typing.Any, typing.Any]:
"""Performs Pydantic-specific deserialization of the event.
Needed by the pydantic base-model to de-serialize the event correctly from json.
Without this function the data will be incorrectly de-serialized.
:param data: CloudEvent encoded as a JSON string.
:param args: These arguments SHOULD NOT be passed by pydantic.
Located here for fail-safe reasons, in case it does.
:param kwargs: These arguments SHOULD NOT be passed by pydantic.
Located here for fail-safe reasons, in case it does.
:return: CloudEvent in a dict representation.
"""
# Using HTTP from dict due to performance issues.
# Pydantic is known for initialization time lagging.
return conversion.to_dict(http.from_json(data))
class CloudEvent(abstract.CloudEvent, BaseModel): # type: ignore
"""
A Python-friendly CloudEvent representation backed by Pydantic-modeled fields.
Supports both binary and structured modes of the CloudEvents v1 specification.
"""
@classmethod
def create(
cls,
attributes: typing.Mapping[str, typing.Any],
data: typing.Optional[typing.Any],
) -> "CloudEvent":
return cls(attributes, data)
data: typing.Optional[typing.Any] = Field(
title=FIELD_DESCRIPTIONS["data"].get("title"),
description=FIELD_DESCRIPTIONS["data"].get("description"),
example=FIELD_DESCRIPTIONS["data"].get("example"),
)
source: str = Field(
title=FIELD_DESCRIPTIONS["source"].get("title"),
description=FIELD_DESCRIPTIONS["source"].get("description"),
example=FIELD_DESCRIPTIONS["source"].get("example"),
)
id: str = Field(
title=FIELD_DESCRIPTIONS["id"].get("title"),
description=FIELD_DESCRIPTIONS["id"].get("description"),
example=FIELD_DESCRIPTIONS["id"].get("example"),
default_factory=attribute.default_id_selection_algorithm,
)
type: str = Field(
title=FIELD_DESCRIPTIONS["type"].get("title"),
description=FIELD_DESCRIPTIONS["type"].get("description"),
example=FIELD_DESCRIPTIONS["type"].get("example"),
)
specversion: attribute.SpecVersion = Field(
title=FIELD_DESCRIPTIONS["specversion"].get("title"),
description=FIELD_DESCRIPTIONS["specversion"].get("description"),
example=FIELD_DESCRIPTIONS["specversion"].get("example"),
default=attribute.DEFAULT_SPECVERSION,
)
time: typing.Optional[datetime.datetime] = Field(
title=FIELD_DESCRIPTIONS["time"].get("title"),
description=FIELD_DESCRIPTIONS["time"].get("description"),
example=FIELD_DESCRIPTIONS["time"].get("example"),
default_factory=attribute.default_time_selection_algorithm,
)
subject: typing.Optional[str] = Field(
title=FIELD_DESCRIPTIONS["subject"].get("title"),
description=FIELD_DESCRIPTIONS["subject"].get("description"),
example=FIELD_DESCRIPTIONS["subject"].get("example"),
)
datacontenttype: typing.Optional[str] = Field(
title=FIELD_DESCRIPTIONS["datacontenttype"].get("title"),
description=FIELD_DESCRIPTIONS["datacontenttype"].get("description"),
example=FIELD_DESCRIPTIONS["datacontenttype"].get("example"),
)
dataschema: typing.Optional[str] = Field(
title=FIELD_DESCRIPTIONS["dataschema"].get("title"),
description=FIELD_DESCRIPTIONS["dataschema"].get("description"),
example=FIELD_DESCRIPTIONS["dataschema"].get("example"),
)
def __init__( # type: ignore[no-untyped-def]
self,
attributes: typing.Optional[typing.Mapping[str, typing.Any]] = None,
data: typing.Optional[typing.Any] = None,
**kwargs,
):
"""
:param attributes: A dict with CloudEvent attributes.
Minimally expects the attributes 'type' and 'source'. If the
attributes 'specversion', 'id' or 'time' are not given, they will
be created with default values.
If no attributes dict is given, the class MUST use the kwargs as
the attributes.
Example Attributes:
{
"specversion": "1.0",
"type": "com.github.pull_request.opened",
"source": "https://github.com/cloudevents/spec/pull",
"id": "A234-1234-1234",
"time": "2018-04-05T17:31:00Z",
}
:param data: Domain-specific information about the occurrence.
"""
if attributes:
if len(kwargs) != 0:
# To prevent API complexity and confusion.
raise IncompatibleArgumentsError(
"Attributes dict and kwargs are incompatible."
)
attributes = {k.lower(): v for k, v in attributes.items()}
kwargs.update(attributes)
super().__init__(data=data, **kwargs)
class Config:
extra: str = "allow" # this is the way we implement extensions
schema_extra = {
"example": {
"specversion": "1.0",
"type": "com.github.pull_request.opened",
"source": "https://github.com/cloudevents/spec/pull",
"subject": "123",
"id": "A234-1234-1234",
"time": "2018-04-05T17:31:00Z",
"comexampleextension1": "value",
"comexampleothervalue": 5,
"datacontenttype": "text/xml",
"data": '<much wow="xml"/>',
}
}
json_dumps = _ce_json_dumps
json_loads = _ce_json_loads
def _get_attributes(self) -> typing.Dict[str, typing.Any]:
return {
key: conversion.best_effort_encode_attribute_value(value)
for key, value in self.__dict__.items()
if key != "data"
}
def get_data(self) -> typing.Optional[typing.Any]:
return self.data
def __setitem__(self, key: str, value: typing.Any) -> None:
"""
Set event attribute value
MUST NOT set event data with this method, use `.data` member instead
Method SHOULD mimic `cloudevents.http.event.CloudEvent` interface
:param key: Event attribute name
:param value: New event attribute value
"""
if key != "data": # to mirror the behaviour of the http event
setattr(self, key, value)
else:
pass # It is de-facto ignored by the http event
def __delitem__(self, key: str) -> None:
"""
SHOULD raise `KeyError` if no event attribute for the given key exists.
Method SHOULD mimic `cloudevents.http.event.CloudEvent` interface
:param key: The event attribute name.
"""
if key == "data":
raise KeyError(key) # to mirror the behaviour of the http event
delattr(self, key)

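The double conversion inside `_ce_json_dumps` above can be sketched with the stdlib alone: serialize through a canonical converter first, then re-dump with the caller's `json` kwargs. `canonical_to_json` below is a hypothetical stand-in for `cloudevents.conversion.to_json`, not the real function.

```python
import json


def canonical_to_json(event: dict) -> str:
    # Stand-in for cloudevents.conversion.to_json: produces the
    # canonical compact JSON form of the event.
    return json.dumps(event, separators=(",", ":"))


def ce_json_dumps(obj: dict, *args, **kwargs) -> str:
    # De-serialize the canonical form, then serialize it back so the
    # user's args/kwargs (indent, sort_keys, ...) take effect.
    return json.dumps(json.loads(canonical_to_json(obj)), *args, **kwargs)


print(ce_json_dumps({"type": "t", "id": "A234"}, sort_keys=True))
# → {"id": "A234", "type": "t"}
```

The extra loads/dumps round trip is exactly the cost the code comment above flags as a potential future performance issue.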
View File

@ -0,0 +1,18 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from cloudevents.pydantic.v2.conversion import from_dict, from_http, from_json
from cloudevents.pydantic.v2.event import CloudEvent
__all__ = ["CloudEvent", "from_json", "from_dict", "from_http"]

View File

@ -0,0 +1,77 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import typing
from cloudevents.conversion import from_dict as _abstract_from_dict
from cloudevents.conversion import from_http as _abstract_from_http
from cloudevents.conversion import from_json as _abstract_from_json
from cloudevents.pydantic.v2.event import CloudEvent
from cloudevents.sdk import types
def from_http(
headers: typing.Union[
typing.Mapping[str, str], types.SupportsDuplicateItems[str, str]
],
data: typing.Optional[typing.AnyStr],
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> CloudEvent:
"""
Parses CloudEvent `data` and `headers` into a CloudEvent.
The method supports both binary and structured representations.
:param headers: The HTTP request headers.
:param data: The HTTP request body. If set to None, "" or b'', the returned
event's `data` field will be set to None.
:param data_unmarshaller: Callable function to map data to a python object
e.g. lambda x: x or lambda x: json.loads(x)
:returns: A CloudEvent parsed from the passed HTTP parameters
"""
return _abstract_from_http(
headers=headers,
data=data,
data_unmarshaller=data_unmarshaller,
event_type=CloudEvent,
)
def from_json(
data: typing.AnyStr,
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> CloudEvent:
"""
Parses JSON string `data` into a CloudEvent.
:param data: JSON string representation of a CloudEvent.
:param data_unmarshaller: Callable function that casts `data` to a
Python object.
:returns: A CloudEvent parsed from the given JSON representation.
"""
return _abstract_from_json(
data=data, data_unmarshaller=data_unmarshaller, event_type=CloudEvent
)
def from_dict(
event: typing.Mapping[str, typing.Any],
) -> CloudEvent:
"""
Construct a CloudEvent from a dict `event` representation.
:param event: The event represented as a dict.
:returns: A CloudEvent parsed from the given dict representation.
"""
return _abstract_from_dict(CloudEvent, event)

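In binary mode, the context attributes consumed by `from_http` above travel as `ce-` prefixed headers while the body carries the bare data. A stdlib sketch of that header-to-attribute mapping (illustrative only, not the SDK implementation):

```python
# Hypothetical binary-mode request headers.
headers = {
    "ce-specversion": "1.0",
    "ce-type": "com.github.pull_request.opened",
    "ce-source": "https://github.com/cloudevents/spec/pull",
    "ce-id": "A234-1234-1234",
    "content-type": "application/json",
}

# Strip the ce- prefix to recover context attributes; plain headers
# such as content-type map to datacontenttype in the real SDK.
attributes = {
    key[len("ce-"):]: value
    for key, value in headers.items()
    if key.lower().startswith("ce-")
}
print(sorted(attributes))  # → ['id', 'source', 'specversion', 'type']
```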
View File

@ -0,0 +1,248 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import datetime
import json
import typing
from typing import Any
from pydantic.deprecated import parse as _deprecated_parse
from cloudevents.exceptions import PydanticFeatureNotInstalled
from cloudevents.pydantic.fields_docs import FIELD_DESCRIPTIONS
try:
from pydantic import BaseModel, ConfigDict, Field, model_serializer
except ImportError: # pragma: no cover # hard to test
raise PydanticFeatureNotInstalled(
"CloudEvents pydantic feature is not installed. "
"Install it using pip install cloudevents[pydantic]"
)
from cloudevents import abstract, conversion
from cloudevents.exceptions import IncompatibleArgumentsError
from cloudevents.sdk.event import attribute
class CloudEvent(abstract.CloudEvent, BaseModel): # type: ignore
"""
A Python-friendly CloudEvent representation backed by Pydantic-modeled fields.
Supports both binary and structured modes of the CloudEvents v1 specification.
"""
@classmethod
def create(
cls,
attributes: typing.Mapping[str, typing.Any],
data: typing.Optional[typing.Any],
) -> "CloudEvent":
return cls(attributes, data)
data: typing.Optional[typing.Any] = Field(
title=FIELD_DESCRIPTIONS["data"].get("title"),
description=FIELD_DESCRIPTIONS["data"].get("description"),
examples=[FIELD_DESCRIPTIONS["data"].get("example")],
default=None,
)
source: str = Field(
title=FIELD_DESCRIPTIONS["source"].get("title"),
description=FIELD_DESCRIPTIONS["source"].get("description"),
examples=[FIELD_DESCRIPTIONS["source"].get("example")],
)
id: str = Field(
title=FIELD_DESCRIPTIONS["id"].get("title"),
description=FIELD_DESCRIPTIONS["id"].get("description"),
examples=[FIELD_DESCRIPTIONS["id"].get("example")],
default_factory=attribute.default_id_selection_algorithm,
)
type: str = Field(
title=FIELD_DESCRIPTIONS["type"].get("title"),
description=FIELD_DESCRIPTIONS["type"].get("description"),
examples=[FIELD_DESCRIPTIONS["type"].get("example")],
)
specversion: attribute.SpecVersion = Field(
title=FIELD_DESCRIPTIONS["specversion"].get("title"),
description=FIELD_DESCRIPTIONS["specversion"].get("description"),
examples=[FIELD_DESCRIPTIONS["specversion"].get("example")],
default=attribute.DEFAULT_SPECVERSION,
)
time: typing.Optional[datetime.datetime] = Field(
title=FIELD_DESCRIPTIONS["time"].get("title"),
description=FIELD_DESCRIPTIONS["time"].get("description"),
examples=[FIELD_DESCRIPTIONS["time"].get("example")],
default_factory=attribute.default_time_selection_algorithm,
)
subject: typing.Optional[str] = Field(
title=FIELD_DESCRIPTIONS["subject"].get("title"),
description=FIELD_DESCRIPTIONS["subject"].get("description"),
examples=[FIELD_DESCRIPTIONS["subject"].get("example")],
default=None,
)
datacontenttype: typing.Optional[str] = Field(
title=FIELD_DESCRIPTIONS["datacontenttype"].get("title"),
description=FIELD_DESCRIPTIONS["datacontenttype"].get("description"),
examples=[FIELD_DESCRIPTIONS["datacontenttype"].get("example")],
default=None,
)
dataschema: typing.Optional[str] = Field(
title=FIELD_DESCRIPTIONS["dataschema"].get("title"),
description=FIELD_DESCRIPTIONS["dataschema"].get("description"),
examples=[FIELD_DESCRIPTIONS["dataschema"].get("example")],
default=None,
)
def __init__( # type: ignore[no-untyped-def]
self,
attributes: typing.Optional[typing.Mapping[str, typing.Any]] = None,
data: typing.Optional[typing.Any] = None,
**kwargs,
):
"""
:param attributes: A dict with CloudEvent attributes.
Minimally expects the attributes 'type' and 'source'. If the
attributes 'specversion', 'id' or 'time' are not given, they will
be created with default values.
If no attributes dict is given, the class MUST use the kwargs as
the attributes.
Example Attributes:
{
"specversion": "1.0",
"type": "com.github.pull_request.opened",
"source": "https://github.com/cloudevents/spec/pull",
"id": "A234-1234-1234",
"time": "2018-04-05T17:31:00Z",
}
:param data: Domain-specific information about the occurrence.
"""
if attributes:
if len(kwargs) != 0:
# To prevent API complexity and confusion.
raise IncompatibleArgumentsError(
"Attributes dict and kwargs are incompatible."
)
attributes = {k.lower(): v for k, v in attributes.items()}
kwargs.update(attributes)
super().__init__(data=data, **kwargs)
model_config = ConfigDict(
extra="allow", # this is the way we implement extensions
json_schema_extra={
"example": {
"specversion": "1.0",
"type": "com.github.pull_request.opened",
"source": "https://github.com/cloudevents/spec/pull",
"subject": "123",
"id": "A234-1234-1234",
"time": "2018-04-05T17:31:00Z",
"comexampleextension1": "value",
"comexampleothervalue": 5,
"datacontenttype": "text/xml",
"data": '<much wow="xml"/>',
}
},
)
"""
We should use a @model_validator decorator to handle JSON deserialisation,
however it's not possible to completely bypass the internal pydantic logic
and still use the CloudEvents shared conversion logic.
Same issue applies to the multiple from/to JSON conversion logic in the
@model_serializer implemented after
To remove the need for the multiple from/to JSON transformation we need
major refactor in the SDK conversion logic.
"""
@classmethod
def model_validate_json(
cls,
json_data: typing.Union[str, bytes, bytearray],
*,
strict: typing.Optional[bool] = None,
context: typing.Optional[typing.Dict[str, Any]] = None,
by_alias: typing.Optional[bool] = None,
by_name: typing.Optional[bool] = None,
) -> "CloudEvent":
return conversion.from_json(cls, json_data)
@classmethod
def parse_raw(
cls,
b: typing.Union[str, bytes],
*,
content_type: typing.Optional[str] = None,
encoding: str = "utf8",
proto: typing.Optional[_deprecated_parse.Protocol] = None,
allow_pickle: bool = False,
) -> "CloudEvent":
return conversion.from_json(cls, b)
@model_serializer(when_used="json")
def _ce_json_dumps(self) -> typing.Dict[str, typing.Any]:
"""Performs Pydantic-specific serialization of the event when
serializing the model using `.model_dump_json()` method.
Needed by the pydantic base-model to serialize the event correctly to json.
Without this function the data will be incorrectly serialized.
:param self: CloudEvent.
:return: Event serialized as a standard CloudEvent dict with user specific
parameters.
"""
# Here mypy complains about json.loads returning Any
# which is incompatible with this method return type
# but we know it's always a dictionary in this case
return json.loads(conversion.to_json(self)) # type: ignore
def _get_attributes(self) -> typing.Dict[str, typing.Any]:
return {
key: conversion.best_effort_encode_attribute_value(value)
for key, value in dict(BaseModel.__iter__(self)).items()
if key not in ["data"]
}
def get_data(self) -> typing.Optional[typing.Any]:
return self.data
def __setitem__(self, key: str, value: typing.Any) -> None:
"""
Set event attribute value
MUST NOT set event data with this method, use `.data` member instead
Method SHOULD mimic `cloudevents.http.event.CloudEvent` interface
:param key: Event attribute name
:param value: New event attribute value
"""
if key != "data": # to mirror the behaviour of the http event
setattr(self, key, value)
else:
pass # It is de-facto ignored by the http event
def __delitem__(self, key: str) -> None:
"""
SHOULD raise `KeyError` if no event attribute for the given key exists.
Method SHOULD mimic `cloudevents.http.event.CloudEvent` interface
:param key: The event attribute name.
"""
if key == "data":
raise KeyError(key) # to mirror the behaviour of the http event
delattr(self, key)

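The `__setitem__`/`__delitem__` contract above (attribute writes go through `setattr`, writes to `"data"` are silently ignored, deleting `"data"` raises `KeyError`) can be exercised with a minimal stdlib sketch. `EventSketch` is a hypothetical stand-in, not an SDK class:

```python
class EventSketch:
    # Hypothetical stand-in mirroring the item-access contract of the
    # CloudEvent classes above.
    def __init__(self, **attrs):
        self.data = None
        self.__dict__.update(attrs)

    def __setitem__(self, key, value):
        if key != "data":  # writes to "data" are silently ignored
            setattr(self, key, value)

    def __delitem__(self, key):
        if key == "data":  # "data" can never be deleted via []
            raise KeyError(key)
        delattr(self, key)


event = EventSketch(type="t", source="s")
event["subject"] = "123"
event["data"] = "ignored"
assert event.subject == "123" and event.data is None
try:
    del event["data"]
except KeyError:
    print("KeyError, as the contract requires")
```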
View File

@ -16,7 +16,14 @@ from cloudevents.sdk.converters import binary, structured
from cloudevents.sdk.converters.binary import is_binary
from cloudevents.sdk.converters.structured import is_structured
TypeBinary = binary.BinaryHTTPCloudEventConverter.TYPE
TypeStructured = structured.JSONHTTPCloudEventConverter.TYPE
TypeBinary: str = binary.BinaryHTTPCloudEventConverter.TYPE
TypeStructured: str = structured.JSONHTTPCloudEventConverter.TYPE
__all__ = [binary, structured, is_binary, is_structured, TypeBinary, TypeStructured]
__all__ = [
"binary",
"structured",
"is_binary",
"is_structured",
"TypeBinary",
"TypeStructured",
]

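The `__all__` fix above matters because star-imports resolve names as strings; a list of objects silently breaks `from ... import *`. A quick stdlib demonstration of the corrected form:

```python
import types

# Build a throwaway module mirroring the corrected __all__ above.
mod = types.ModuleType("converters_sketch")
mod.TypeBinary = "binary"
mod.TypeStructured = "structured"
mod.__all__ = ["TypeBinary", "TypeStructured"]  # names, not objects

# Star-import machinery looks each __all__ entry up by string:
exported = {name: getattr(mod, name) for name in mod.__all__}
print(exported["TypeBinary"])  # → binary
```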
View File

@ -18,14 +18,13 @@ from cloudevents.sdk.event import base
class Converter(object):
TYPE = None
TYPE: str = ""
def read(
self,
event,
headers: dict,
body: typing.IO,
event: typing.Any,
headers: typing.Mapping[str, str],
body: typing.Union[str, bytes],
data_unmarshaller: typing.Callable,
) -> base.BaseEvent:
raise Exception("not implemented")
@ -33,10 +32,14 @@ class Converter(object):
def event_supported(self, event: object) -> bool:
raise Exception("not implemented")
def can_read(self, content_type: str) -> bool:
def can_read(
self,
content_type: typing.Optional[str],
headers: typing.Optional[typing.Mapping[str, str]] = None,
) -> bool:
raise Exception("not implemented")
def write(
self, event: base.BaseEvent, data_marshaller: typing.Callable
) -> (dict, object):
self, event: base.BaseEvent, data_marshaller: typing.Optional[typing.Callable]
) -> typing.Tuple[typing.Dict[str, str], bytes]:
raise Exception("not implemented")

View File

@ -22,16 +22,16 @@ from cloudevents.sdk.event import v1, v03
class BinaryHTTPCloudEventConverter(base.Converter):
TYPE = "binary"
TYPE: str = "binary"
SUPPORTED_VERSIONS = [v03.Event, v1.Event]
def can_read(
self,
content_type: str = None,
headers: typing.Dict[str, str] = {"ce-specversion": None},
content_type: typing.Optional[str] = None,
headers: typing.Optional[typing.Mapping[str, str]] = None,
) -> bool:
if headers is None:
headers = {"ce-specversion": ""}
return has_binary_headers(headers)
def event_supported(self, event: object) -> bool:
@ -40,8 +40,8 @@ class BinaryHTTPCloudEventConverter(base.Converter):
def read(
self,
event: event_base.BaseEvent,
headers: dict,
body: typing.IO,
headers: typing.Mapping[str, str],
body: typing.Union[str, bytes],
data_unmarshaller: types.UnmarshallerType,
) -> event_base.BaseEvent:
if type(event) not in self.SUPPORTED_VERSIONS:
@ -50,8 +50,10 @@ class BinaryHTTPCloudEventConverter(base.Converter):
return event
def write(
self, event: event_base.BaseEvent, data_marshaller: types.MarshallerType
) -> typing.Tuple[dict, bytes]:
self,
event: event_base.BaseEvent,
data_marshaller: typing.Optional[types.MarshallerType],
) -> typing.Tuple[typing.Dict[str, str], bytes]:
return event.MarshalBinary(data_marshaller)
@ -59,7 +61,7 @@ def NewBinaryHTTPCloudEventConverter() -> BinaryHTTPCloudEventConverter:
return BinaryHTTPCloudEventConverter()
def is_binary(headers: typing.Dict[str, str]) -> bool:
def is_binary(headers: typing.Mapping[str, str]) -> bool:
"""
Determines whether an event with the supplied `headers` is in binary format.

View File

@ -22,11 +22,16 @@ from cloudevents.sdk.event import base as event_base
# TODO: Singleton?
class JSONHTTPCloudEventConverter(base.Converter):
TYPE: str = "structured"
MIME_TYPE: str = "application/cloudevents+json"
TYPE = "structured"
MIME_TYPE = "application/cloudevents+json"
def can_read(self, content_type: str, headers: typing.Dict[str, str] = {}) -> bool:
def can_read(
self,
content_type: typing.Optional[str] = None,
headers: typing.Optional[typing.Mapping[str, str]] = None,
) -> bool:
if headers is None:
headers = {}
return (
isinstance(content_type, str)
and content_type.startswith(self.MIME_TYPE)
@ -40,16 +45,18 @@ class JSONHTTPCloudEventConverter(base.Converter):
def read(
self,
event: event_base.BaseEvent,
headers: dict,
body: typing.IO,
headers: typing.Mapping[str, str],
body: typing.Union[str, bytes],
data_unmarshaller: types.UnmarshallerType,
) -> event_base.BaseEvent:
event.UnmarshalJSON(body, data_unmarshaller)
return event
def write(
self, event: event_base.BaseEvent, data_marshaller: types.MarshallerType
) -> typing.Tuple[dict, bytes]:
self,
event: event_base.BaseEvent,
data_marshaller: typing.Optional[types.MarshallerType],
) -> typing.Tuple[typing.Dict[str, str], bytes]:
http_headers = {"content-type": self.MIME_TYPE}
return http_headers, event.MarshalJSON(data_marshaller).encode("utf-8")
@ -58,7 +65,7 @@ def NewJSONHTTPCloudEventConverter() -> JSONHTTPCloudEventConverter:
return JSONHTTPCloudEventConverter()
def is_structured(headers: typing.Dict[str, str]) -> bool:
def is_structured(headers: typing.Mapping[str, str]) -> bool:
"""
Determines whether an event with the supplied `headers` is in a structured format.

View File

@ -15,7 +15,7 @@
import typing
def has_binary_headers(headers: typing.Dict[str, str]) -> bool:
def has_binary_headers(headers: typing.Mapping[str, str]) -> bool:
"""Determines if all CloudEvents required headers are presents
in the `headers`.

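The two detection helpers can be sketched in plain Python: binary mode is signalled by the required `ce-*` context headers, structured mode by the `application/cloudevents+json` content type. This is an illustration of the logic, not the SDK functions themselves:

```python
MIME_STRUCTURED = "application/cloudevents+json"
REQUIRED_BINARY = {"ce-specversion", "ce-source", "ce-type", "ce-id"}


def looks_binary(headers) -> bool:
    # All required ce-* context headers must be present.
    return REQUIRED_BINARY <= {key.lower() for key in headers}


def looks_structured(headers) -> bool:
    # The content type may carry parameters, e.g. "; charset=utf-8".
    return headers.get("content-type", "").startswith(MIME_STRUCTURED)


assert looks_structured({"content-type": MIME_STRUCTURED + "; charset=utf-8"})
assert not looks_binary({"content-type": MIME_STRUCTURED})
print("ok")
```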
View File

@ -34,7 +34,7 @@ class SpecVersion(str, Enum):
DEFAULT_SPECVERSION = SpecVersion.v1_0
def default_time_selection_algorithm() -> datetime:
def default_time_selection_algorithm() -> datetime.datetime:
"""
:return: A time value which will be used as CloudEvent time attribute value.
"""

View File

@ -15,6 +15,7 @@
import base64
import json
import typing
from typing import Set
import cloudevents.exceptions as cloud_exceptions
from cloudevents.sdk import types
@ -23,112 +24,111 @@ from cloudevents.sdk import types
class EventGetterSetter(object): # pragma: no cover
# ce-specversion
def CloudEventVersion(self) -> str:
raise Exception("not implemented")
@property
def specversion(self):
def specversion(self) -> str:
return self.CloudEventVersion()
@specversion.setter
def specversion(self, value: str) -> None:
self.SetCloudEventVersion(value)
def SetCloudEventVersion(self, specversion: str) -> object:
raise Exception("not implemented")
@specversion.setter
def specversion(self, value: str):
self.SetCloudEventVersion(value)
# ce-type
def EventType(self) -> str:
raise Exception("not implemented")
@property
def type(self):
def type(self) -> str:
return self.EventType()
@type.setter
def type(self, value: str) -> None:
self.SetEventType(value)
def SetEventType(self, eventType: str) -> object:
raise Exception("not implemented")
@type.setter
def type(self, value: str):
self.SetEventType(value)
# ce-source
def Source(self) -> str:
raise Exception("not implemented")
@property
def source(self):
def source(self) -> str:
return self.Source()
@source.setter
def source(self, value: str) -> None:
self.SetSource(value)
def SetSource(self, source: str) -> object:
raise Exception("not implemented")
@source.setter
def source(self, value: str):
self.SetSource(value)
# ce-id
def EventID(self) -> str:
raise Exception("not implemented")
@property
def id(self):
def id(self) -> str:
return self.EventID()
@id.setter
def id(self, value: str) -> None:
self.SetEventID(value)
def SetEventID(self, eventID: str) -> object:
raise Exception("not implemented")
@id.setter
def id(self, value: str):
self.SetEventID(value)
# ce-time
def EventTime(self) -> str:
def EventTime(self) -> typing.Optional[str]:
raise Exception("not implemented")
@property
def time(self):
def time(self) -> typing.Optional[str]:
return self.EventTime()
def SetEventTime(self, eventTime: str) -> object:
raise Exception("not implemented")
@time.setter
def time(self, value: str):
def time(self, value: typing.Optional[str]) -> None:
self.SetEventTime(value)
def SetEventTime(self, eventTime: typing.Optional[str]) -> object:
raise Exception("not implemented")
# ce-schema
def SchemaURL(self) -> str:
def SchemaURL(self) -> typing.Optional[str]:
raise Exception("not implemented")
@property
def schema(self) -> str:
def schema(self) -> typing.Optional[str]:
return self.SchemaURL()
def SetSchemaURL(self, schemaURL: str) -> object:
raise Exception("not implemented")
@schema.setter
def schema(self, value: str):
def schema(self, value: typing.Optional[str]) -> None:
self.SetSchemaURL(value)
def SetSchemaURL(self, schemaURL: typing.Optional[str]) -> object:
raise Exception("not implemented")
# data
def Data(self) -> object:
def Data(self) -> typing.Optional[object]:
raise Exception("not implemented")
@property
def data(self) -> object:
def data(self) -> typing.Optional[object]:
return self.Data()
def SetData(self, data: object) -> object:
raise Exception("not implemented")
@data.setter
def data(self, value: object):
def data(self, value: typing.Optional[object]) -> None:
self.SetData(value)
def SetData(self, data: typing.Optional[object]) -> object:
raise Exception("not implemented")
# ce-extensions
def Extensions(self) -> dict:
raise Exception("not implemented")
@ -137,34 +137,38 @@ class EventGetterSetter(object): # pragma: no cover
def extensions(self) -> dict:
return self.Extensions()
@extensions.setter
def extensions(self, value: dict) -> None:
self.SetExtensions(value)
def SetExtensions(self, extensions: dict) -> object:
raise Exception("not implemented")
@extensions.setter
def extensions(self, value: dict):
self.SetExtensions(value)
# Content-Type
def ContentType(self) -> str:
def ContentType(self) -> typing.Optional[str]:
raise Exception("not implemented")
@property
def content_type(self) -> str:
def content_type(self) -> typing.Optional[str]:
return self.ContentType()
def SetContentType(self, contentType: str) -> object:
raise Exception("not implemented")
@content_type.setter
def content_type(self, value: str):
def content_type(self, value: typing.Optional[str]) -> None:
self.SetContentType(value)
def SetContentType(self, contentType: typing.Optional[str]) -> object:
raise Exception("not implemented")
class BaseEvent(EventGetterSetter):
_ce_required_fields = set()
_ce_optional_fields = set()
"""Base implementation of the CloudEvent."""
def Properties(self, with_nullable=False) -> dict:
_ce_required_fields: Set[str] = set()
"""A set of required CloudEvent field names."""
_ce_optional_fields: Set[str] = set()
"""A set of optional CloudEvent field names."""
def Properties(self, with_nullable: bool = False) -> dict:
props = dict()
for name, value in self.__dict__.items():
if str(name).startswith("ce__"):
@@ -174,19 +178,18 @@ class BaseEvent(EventGetterSetter):
return props
def Get(self, key: str) -> typing.Tuple[object, bool]:
formatted_key = "ce__{0}".format(key.lower())
ok = hasattr(self, formatted_key)
value = getattr(self, formatted_key, None)
if not ok:
def Get(self, key: str) -> typing.Tuple[typing.Optional[object], bool]:
formatted_key: str = "ce__{0}".format(key.lower())
key_exists: bool = hasattr(self, formatted_key)
if not key_exists:
exts = self.Extensions()
return exts.get(key), key in exts
value: typing.Any = getattr(self, formatted_key)
return value.get(), key_exists
return value.get(), ok
def Set(self, key: str, value: object):
formatted_key = "ce__{0}".format(key)
key_exists = hasattr(self, formatted_key)
def Set(self, key: str, value: typing.Optional[object]) -> None:
formatted_key: str = "ce__{0}".format(key)
key_exists: bool = hasattr(self, formatted_key)
if key_exists:
attr = getattr(self, formatted_key)
attr.set(value)
@@ -196,19 +199,20 @@ class BaseEvent(EventGetterSetter):
exts.update({key: value})
self.Set("extensions", exts)
def MarshalJSON(self, data_marshaller: types.MarshallerType) -> str:
if data_marshaller is None:
data_marshaller = lambda x: x # noqa: E731
def MarshalJSON(
self, data_marshaller: typing.Optional[types.MarshallerType]
) -> str:
props = self.Properties()
if "data" in props:
data = props.pop("data")
try:
data = data_marshaller(data)
if data_marshaller:
data = data_marshaller(data)
except Exception as e:
raise cloud_exceptions.DataMarshallerError(
f"Failed to marshall data with error: {type(e).__name__}('{e}')"
)
if isinstance(data, (bytes, bytes, memoryview)):
if isinstance(data, (bytes, bytearray, memoryview)):
props["data_base64"] = base64.b64encode(data).decode("ascii")
else:
props["data"] = data
@@ -221,7 +225,7 @@ class BaseEvent(EventGetterSetter):
self,
b: typing.Union[str, bytes],
data_unmarshaller: types.UnmarshallerType,
):
) -> None:
raw_ce = json.loads(b)
missing_fields = self._ce_required_fields - raw_ce.keys()
@@ -231,30 +235,27 @@ class BaseEvent(EventGetterSetter):
)
for name, value in raw_ce.items():
decoder = lambda x: x
if name == "data":
# Use the user-provided serializer, which may have customized
# JSON decoding
decoder = lambda v: data_unmarshaller(json.dumps(v))
if name == "data_base64":
decoder = lambda v: data_unmarshaller(base64.b64decode(v))
name = "data"
try:
set_value = decoder(value)
if name == "data":
decoded_value = data_unmarshaller(json.dumps(value))
elif name == "data_base64":
decoded_value = data_unmarshaller(base64.b64decode(value))
name = "data"
else:
decoded_value = value
except Exception as e:
raise cloud_exceptions.DataUnmarshallerError(
"Failed to unmarshall data with error: "
f"{type(e).__name__}('{e}')"
)
self.Set(name, set_value)
self.Set(name, decoded_value)
def UnmarshalBinary(
self,
headers: dict,
body: typing.Union[bytes, str],
headers: typing.Mapping[str, str],
body: typing.Union[str, bytes],
data_unmarshaller: types.UnmarshallerType,
):
) -> None:
required_binary_fields = {f"ce-{field}" for field in self._ce_required_fields}
missing_fields = required_binary_fields - headers.keys()
@@ -279,20 +280,25 @@ class BaseEvent(EventGetterSetter):
self.Set("data", raw_ce)
def MarshalBinary(
self, data_marshaller: types.MarshallerType
) -> typing.Tuple[dict, bytes]:
if data_marshaller is None:
self, data_marshaller: typing.Optional[types.MarshallerType]
) -> typing.Tuple[typing.Dict[str, str], bytes]:
if not data_marshaller:
data_marshaller = json.dumps
headers = {}
if self.ContentType():
headers["content-type"] = self.ContentType()
props = self.Properties()
headers: typing.Dict[str, str] = {}
content_type = self.ContentType()
if content_type:
headers["content-type"] = content_type
props: typing.Dict = self.Properties()
for key, value in props.items():
if key not in ["data", "extensions", "datacontenttype"]:
if value is not None:
headers["ce-{0}".format(key)] = value
for key, value in props.get("extensions").items():
extensions = props.get("extensions")
if extensions is None or not isinstance(extensions, typing.Mapping):
raise cloud_exceptions.DataMarshallerError(
"No extensions are available in the binary event."
)
for key, value in extensions.items():
headers["ce-{0}".format(key)] = value
data, _ = self.Get("data")

View File
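The MarshalJSON changes above (an optional marshaller, and the `bytearray` fix in the `isinstance` check) can be sketched standalone. `marshal_json` is a hypothetical helper written for illustration, not part of the SDK:

```python
import base64
import json
import typing


def marshal_json(
    props: typing.Dict[str, typing.Any],
    data_marshaller: typing.Optional[typing.Callable[[typing.Any], typing.Any]] = None,
) -> str:
    """Serialize event properties, base64-encoding binary data payloads."""
    props = dict(props)  # avoid mutating the caller's dict
    if "data" in props:
        data = props.pop("data")
        if data_marshaller:  # only marshal when a marshaller was supplied
            data = data_marshaller(data)
        if isinstance(data, (bytes, bytearray, memoryview)):
            # binary payloads go under data_base64, per the JSON event format
            props["data_base64"] = base64.b64encode(data).decode("ascii")
        else:
            props["data"] = data
    return json.dumps(props)
```

With this shape, `marshal_json({"id": "1", "data": b"\x00\x01"})` emits a `data_base64` field, while non-binary data stays under `data`.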

@ -11,29 +11,36 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import typing
from typing import Any
class Option(object):
def __init__(self, name, value, is_required):
self.name = name
self.value = value
self.is_required = is_required
class Option:
"""A value holder of CloudEvents extensions."""
def set(self, new_value):
def __init__(self, name: str, value: typing.Optional[Any], is_required: bool):
self.name: str = name
"""The name of the option."""
self.value: Any = value
"""The value of the option."""
self.is_required: bool = is_required
"""Determines if the option value must be present."""
def set(self, new_value: typing.Optional[Any]) -> None:
"""Sets given new value as the value of this option."""
is_none = new_value is None
if self.is_required and is_none:
raise ValueError(
"Attribute value error: '{0}', "
""
"invalid new value.".format(self.name)
"Attribute value error: '{0}', invalid new value.".format(self.name)
)
self.value = new_value
def get(self):
def get(self) -> typing.Optional[Any]:
"""Returns the value of this option."""
return self.value
def required(self):
"""Determines if the option value must be present."""
return self.is_required
def __eq__(self, obj):

View File
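The `Option` holder documented above enforces its `is_required` contract at `set()` time. A condensed re-implementation, so the sketch runs standalone:

```python
import typing


class Option:
    """Condensed sketch of the Option value holder from the diff above."""

    def __init__(
        self, name: str, value: typing.Optional[typing.Any], is_required: bool
    ):
        self.name = name
        self.value = value
        self.is_required = is_required

    def set(self, new_value: typing.Optional[typing.Any]) -> None:
        # Required options reject None; optional ones may be cleared.
        if self.is_required and new_value is None:
            raise ValueError(
                "Attribute value error: '{0}', invalid new value.".format(self.name)
            )
        self.value = new_value

    def get(self) -> typing.Optional[typing.Any]:
        return self.value


event_id = Option("id", None, is_required=True)
event_id.set("abc-123")  # fine
subject = Option("subject", None, is_required=False)
subject.set(None)  # fine: optional attributes may be cleared
```

Attempting `event_id.set(None)` raises `ValueError`, since `id` is required.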

@@ -11,6 +11,7 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import typing
from cloudevents.sdk.event import base, opt
@@ -41,37 +42,55 @@ class Event(base.BaseEvent):
self.ce__extensions = opt.Option("extensions", dict(), False)
def CloudEventVersion(self) -> str:
return self.ce__specversion.get()
return str(self.ce__specversion.get())
def EventType(self) -> str:
return self.ce__type.get()
return str(self.ce__type.get())
def Source(self) -> str:
return self.ce__source.get()
return str(self.ce__source.get())
def EventID(self) -> str:
return self.ce__id.get()
return str(self.ce__id.get())
def EventTime(self) -> str:
return self.ce__time.get()
def EventTime(self) -> typing.Optional[str]:
result = self.ce__time.get()
if result is None:
return None
return str(result)
def Subject(self) -> str:
return self.ce__subject.get()
def Subject(self) -> typing.Optional[str]:
result = self.ce__subject.get()
if result is None:
return None
return str(result)
def SchemaURL(self) -> str:
return self.ce__schemaurl.get()
def SchemaURL(self) -> typing.Optional[str]:
result = self.ce__schemaurl.get()
if result is None:
return None
return str(result)
def Data(self) -> object:
def Data(self) -> typing.Optional[object]:
return self.ce__data.get()
def Extensions(self) -> dict:
return self.ce__extensions.get()
result = self.ce__extensions.get()
if result is None:
return {}
return dict(result)
def ContentType(self) -> str:
return self.ce__datacontenttype.get()
def ContentType(self) -> typing.Optional[str]:
result = self.ce__datacontenttype.get()
if result is None:
return None
return str(result)
def ContentEncoding(self) -> str:
return self.ce__datacontentencoding.get()
def ContentEncoding(self) -> typing.Optional[str]:
result = self.ce__datacontentencoding.get()
if result is None:
return None
return str(result)
def SetEventType(self, eventType: str) -> base.BaseEvent:
self.Set("type", eventType)
@@ -85,54 +104,56 @@ class Event(base.BaseEvent):
self.Set("id", eventID)
return self
def SetEventTime(self, eventTime: str) -> base.BaseEvent:
def SetEventTime(self, eventTime: typing.Optional[str]) -> base.BaseEvent:
self.Set("time", eventTime)
return self
def SetSubject(self, subject: str) -> base.BaseEvent:
def SetSubject(self, subject: typing.Optional[str]) -> base.BaseEvent:
self.Set("subject", subject)
return self
def SetSchemaURL(self, schemaURL: str) -> base.BaseEvent:
def SetSchemaURL(self, schemaURL: typing.Optional[str]) -> base.BaseEvent:
self.Set("schemaurl", schemaURL)
return self
def SetData(self, data: object) -> base.BaseEvent:
def SetData(self, data: typing.Optional[object]) -> base.BaseEvent:
self.Set("data", data)
return self
def SetExtensions(self, extensions: dict) -> base.BaseEvent:
def SetExtensions(self, extensions: typing.Optional[dict]) -> base.BaseEvent:
self.Set("extensions", extensions)
return self
def SetContentType(self, contentType: str) -> base.BaseEvent:
def SetContentType(self, contentType: typing.Optional[str]) -> base.BaseEvent:
self.Set("datacontenttype", contentType)
return self
def SetContentEncoding(self, contentEncoding: str) -> base.BaseEvent:
def SetContentEncoding(
self, contentEncoding: typing.Optional[str]
) -> base.BaseEvent:
self.Set("datacontentencoding", contentEncoding)
return self
@property
def datacontentencoding(self):
def datacontentencoding(self) -> typing.Optional[str]:
return self.ContentEncoding()
@datacontentencoding.setter
def datacontentencoding(self, value: str):
def datacontentencoding(self, value: typing.Optional[str]) -> None:
self.SetContentEncoding(value)
@property
def subject(self) -> str:
def subject(self) -> typing.Optional[str]:
return self.Subject()
@subject.setter
def subject(self, value: str):
def subject(self, value: typing.Optional[str]) -> None:
self.SetSubject(value)
@property
def schema_url(self) -> str:
def schema_url(self) -> typing.Optional[str]:
return self.SchemaURL()
@schema_url.setter
def schema_url(self, value: str):
def schema_url(self, value: typing.Optional[str]) -> None:
self.SetSchemaURL(value)

View File
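The getters above repeat one pattern: fetch the stored value, return `None` unchanged, coerce everything else to `str`. Factored into a tiny helper (hypothetical, shown only to make the pattern explicit):

```python
import typing


def as_optional_str(value: typing.Optional[object]) -> typing.Optional[str]:
    """None stays None; any other stored value is coerced to str."""
    if value is None:
        return None
    return str(value)
```

This is exactly the body of `EventTime`, `Subject`, `SchemaURL`, and the other `Optional[str]` getters, which keeps the declared return type honest without asserting non-None values.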

@@ -11,9 +11,15 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from __future__ import annotations
import typing
from cloudevents.sdk.event import base, opt
if typing.TYPE_CHECKING:
from typing_extensions import Self
class Event(base.BaseEvent):
_ce_required_fields = {"id", "source", "type", "specversion"}
@@ -34,83 +40,98 @@ class Event(base.BaseEvent):
self.ce__extensions = opt.Option("extensions", dict(), False)
def CloudEventVersion(self) -> str:
return self.ce__specversion.get()
return str(self.ce__specversion.get())
def EventType(self) -> str:
return self.ce__type.get()
return str(self.ce__type.get())
def Source(self) -> str:
return self.ce__source.get()
return str(self.ce__source.get())
def EventID(self) -> str:
return self.ce__id.get()
return str(self.ce__id.get())
def EventTime(self) -> str:
return self.ce__time.get()
def EventTime(self) -> typing.Optional[str]:
result = self.ce__time.get()
if result is None:
return None
return str(result)
def Subject(self) -> str:
return self.ce__subject.get()
def Subject(self) -> typing.Optional[str]:
result = self.ce__subject.get()
if result is None:
return None
return str(result)
def Schema(self) -> str:
return self.ce__dataschema.get()
def Schema(self) -> typing.Optional[str]:
result = self.ce__dataschema.get()
if result is None:
return None
return str(result)
def ContentType(self) -> str:
return self.ce__datacontenttype.get()
def ContentType(self) -> typing.Optional[str]:
result = self.ce__datacontenttype.get()
if result is None:
return None
return str(result)
def Data(self) -> object:
def Data(self) -> typing.Optional[object]:
return self.ce__data.get()
def Extensions(self) -> dict:
return self.ce__extensions.get()
result = self.ce__extensions.get()
if result is None:
return {}
return dict(result)
def SetEventType(self, eventType: str) -> base.BaseEvent:
def SetEventType(self, eventType: str) -> Self:
self.Set("type", eventType)
return self
def SetSource(self, source: str) -> base.BaseEvent:
def SetSource(self, source: str) -> Self:
self.Set("source", source)
return self
def SetEventID(self, eventID: str) -> base.BaseEvent:
def SetEventID(self, eventID: str) -> Self:
self.Set("id", eventID)
return self
def SetEventTime(self, eventTime: str) -> base.BaseEvent:
def SetEventTime(self, eventTime: typing.Optional[str]) -> Self:
self.Set("time", eventTime)
return self
def SetSubject(self, subject: str) -> base.BaseEvent:
def SetSubject(self, subject: typing.Optional[str]) -> Self:
self.Set("subject", subject)
return self
def SetSchema(self, schema: str) -> base.BaseEvent:
def SetSchema(self, schema: typing.Optional[str]) -> Self:
self.Set("dataschema", schema)
return self
def SetContentType(self, contentType: str) -> base.BaseEvent:
def SetContentType(self, contentType: typing.Optional[str]) -> Self:
self.Set("datacontenttype", contentType)
return self
def SetData(self, data: object) -> base.BaseEvent:
def SetData(self, data: typing.Optional[object]) -> Self:
self.Set("data", data)
return self
def SetExtensions(self, extensions: dict) -> base.BaseEvent:
def SetExtensions(self, extensions: typing.Optional[dict]) -> Self:
self.Set("extensions", extensions)
return self
@property
def schema(self) -> str:
def schema(self) -> typing.Optional[str]:
return self.Schema()
@schema.setter
def schema(self, value: str):
def schema(self, value: typing.Optional[str]) -> None:
self.SetSchema(value)
@property
def subject(self) -> str:
def subject(self) -> typing.Optional[str]:
return self.Subject()
@subject.setter
def subject(self, value: str):
def subject(self, value: typing.Optional[str]) -> None:
self.SetSubject(value)

View File
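Why the `Self` return type matters for the chainable setters above: when a base-class method declares `-> base.BaseEvent`, a checker loses the subclass after the first call in a chain. A minimal sketch under simplified class names (not the SDK's actual hierarchy):

```python
from __future__ import annotations

import typing

if typing.TYPE_CHECKING:
    # Self is in typing from Python 3.11; typing_extensions backports it.
    from typing_extensions import Self


class Event:
    def __init__(self) -> None:
        self.attributes: typing.Dict[str, object] = {}

    def SetEventType(self, event_type: str) -> Self:
        self.attributes["type"] = event_type
        return self


class V1Event(Event):
    def SetSource(self, source: str) -> Self:
        self.attributes["source"] = source
        return self


# Because the base method returns Self rather than the base class, chaining
# through it preserves the subclass type for static checkers:
event = V1Event().SetEventType("com.example.test").SetSource("pytest")
```

The `TYPE_CHECKING` guard mirrors the diff: `typing_extensions` is only needed by type checkers, so it never becomes a runtime dependency.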

@@ -26,36 +26,34 @@ class HTTPMarshaller(object):
API of this class designed to work with CloudEvent (upstream and v0.1)
"""
def __init__(self, converters: typing.List[base.Converter]):
def __init__(self, converters: typing.Sequence[base.Converter]):
"""
CloudEvent HTTP marshaller constructor
:param converters: a list of HTTP-to-CloudEvent-to-HTTP constructors
:type converters: typing.List[base.Converter]
"""
self.http_converters = [c for c in converters]
self.http_converters_by_type = {c.TYPE: c for c in converters}
self.http_converters: typing.List[base.Converter] = [c for c in converters]
self.http_converters_by_type: typing.Dict[str, base.Converter] = {
c.TYPE: c for c in converters
}
def FromRequest(
self,
event: event_base.BaseEvent,
headers: dict,
headers: typing.Mapping[str, str],
body: typing.Union[str, bytes],
data_unmarshaller: types.UnmarshallerType = json.loads,
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> event_base.BaseEvent:
"""
Reads a CloudEvent from an HTTP headers and request body
:param event: CloudEvent placeholder
:type event: cloudevents.sdk.event.base.BaseEvent
:param headers: a dict-like HTTP headers
:type headers: dict
:param body: an HTTP request body as a string or bytes
:type body: typing.Union[str, bytes]
:param data_unmarshaller: a callable-like
unmarshaller the CloudEvent data
:param data_unmarshaller: a callable-like unmarshaller the CloudEvent data
:return: a CloudEvent
:rtype: event_base.BaseEvent
"""
if not isinstance(data_unmarshaller, typing.Callable):
if not data_unmarshaller:
data_unmarshaller = json.loads
if not callable(data_unmarshaller):
raise exceptions.InvalidDataUnmarshaller()
# Lower all header keys
@@ -77,23 +75,17 @@ class HTTPMarshaller(object):
def ToRequest(
self,
event: event_base.BaseEvent,
converter_type: str = None,
data_marshaller: types.MarshallerType = None,
) -> (dict, bytes):
converter_type: typing.Optional[str] = None,
data_marshaller: typing.Optional[types.MarshallerType] = None,
) -> typing.Tuple[typing.Dict[str, str], bytes]:
"""
Writes a CloudEvent into a HTTP-ready form of headers and request body
:param event: CloudEvent
:type event: event_base.BaseEvent
:param converter_type: a type of CloudEvent-to-HTTP converter
:type converter_type: str
:param data_marshaller: a callable-like marshaller CloudEvent data
:type data_marshaller: typing.Callable
:return: dict of HTTP headers and stream of HTTP request body
:rtype: tuple
"""
if data_marshaller is not None and not isinstance(
data_marshaller, typing.Callable
):
if data_marshaller is not None and not callable(data_marshaller):
raise exceptions.InvalidDataMarshaller()
if converter_type is None:
@@ -108,10 +100,9 @@ class HTTPMarshaller(object):
def NewDefaultHTTPMarshaller() -> HTTPMarshaller:
"""
Creates the default HTTP marshaller with both structured
and binary converters
Creates the default HTTP marshaller with both structured and binary converters.
:return: an instance of HTTP marshaller
:rtype: cloudevents.sdk.marshaller.HTTPMarshaller
"""
return HTTPMarshaller(
[
@@ -122,14 +113,13 @@ def NewDefaultHTTPMarshaller() -> HTTPMarshaller:
def NewHTTPMarshaller(
converters: typing.List[base.Converter],
converters: typing.Sequence[base.Converter],
) -> HTTPMarshaller:
"""
Creates the default HTTP marshaller with both
structured and binary converters
Creates the default HTTP marshaller with both structured and binary converters.
:param converters: a list of CloudEvent-to-HTTP-to-CloudEvent converters
:type converters: typing.List[base.Converter]
:return: an instance of HTTP marshaller
:rtype: cloudevents.sdk.marshaller.HTTPMarshaller
"""
return HTTPMarshaller(converters)

View File
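`FromRequest` above now takes an `Optional` unmarshaller (defaulting to `json.loads`) and validates it with `callable()` instead of `isinstance(..., typing.Callable)`. The same logic in isolation; `resolve_unmarshaller` is a hypothetical name, and `TypeError` stands in for the SDK's `InvalidDataUnmarshaller`:

```python
import json
import typing

UnmarshallerType = typing.Callable[[typing.Union[bytes, str]], typing.Any]


def resolve_unmarshaller(
    data_unmarshaller: typing.Optional[UnmarshallerType] = None,
) -> UnmarshallerType:
    """Default to json.loads when no unmarshaller is given; reject non-callables."""
    if not data_unmarshaller:
        data_unmarshaller = json.loads
    if not callable(data_unmarshaller):
        # the SDK raises exceptions.InvalidDataUnmarshaller here
        raise TypeError("invalid data unmarshaller")
    return data_unmarshaller
```

`callable()` is the idiomatic runtime check; `isinstance(x, typing.Callable)` happens to work but conflates a typing construct with a runtime test.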

@@ -14,12 +14,25 @@
import typing
_K_co = typing.TypeVar("_K_co", covariant=True)
_V_co = typing.TypeVar("_V_co", covariant=True)
# Use consistent types for marshal and unmarshal functions across
# both JSON and Binary format.
MarshallerType = typing.Optional[
typing.Callable[[typing.Any], typing.Union[bytes, str]]
]
UnmarshallerType = typing.Optional[
typing.Callable[[typing.Union[bytes, str]], typing.Any]
]
MarshallerType = typing.Callable[[typing.Any], typing.AnyStr]
UnmarshallerType = typing.Callable[[typing.AnyStr], typing.Any]
class SupportsDuplicateItems(typing.Protocol[_K_co, _V_co]):
"""
Dict-like objects with an items() method that may produce duplicate keys.
"""
# This is wider than _typeshed.SupportsItems, which expects items() to
# return type an AbstractSet. werkzeug's Headers class satisfies this type,
# but not _typeshed.SupportsItems.
def items(self) -> typing.Iterable[typing.Tuple[_K_co, _V_co]]:
pass

View File
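The `SupportsDuplicateItems` protocol above is structural: any object with a compatible `items()` satisfies it, including containers that repeat keys. A runnable sketch; `runtime_checkable` is added here only so the `isinstance` check works, and `MultiHeaders` is a made-up stand-in for werkzeug's `Headers`:

```python
import typing

_K_co = typing.TypeVar("_K_co", covariant=True)
_V_co = typing.TypeVar("_V_co", covariant=True)


@typing.runtime_checkable
class SupportsDuplicateItems(typing.Protocol[_K_co, _V_co]):
    """Dict-like objects whose items() may yield duplicate keys."""

    def items(self) -> typing.Iterable[typing.Tuple[_K_co, _V_co]]:
        ...


class MultiHeaders:
    """Hypothetical header container allowing repeated keys."""

    def __init__(self, pairs: typing.List[typing.Tuple[str, str]]) -> None:
        self._pairs = list(pairs)

    def items(self) -> typing.Iterator[typing.Tuple[str, str]]:
        return iter(self._pairs)


headers = MultiHeaders([("ce-type", "a"), ("ce-type", "b")])
```

A plain `Mapping` could not model this, since its `items()` is an `AbstractSet` and cannot repeat keys; the wider protocol accepts both.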

@@ -21,7 +21,7 @@ from cloudevents.sdk.converters import base, binary
def test_binary_converter_raise_unsupported():
with pytest.raises(exceptions.UnsupportedEvent):
cnvtr = binary.BinaryHTTPCloudEventConverter()
cnvtr.read(None, {}, None, None)
cnvtr.read(None, {}, None, None) # type: ignore[arg-type] # intentionally wrong type # noqa: E501
def test_base_converters_raise_exceptions():
@@ -35,8 +35,8 @@ def test_base_converters_raise_exceptions():
with pytest.raises(Exception):
cnvtr = base.Converter()
cnvtr.write(None, None)
cnvtr.write(None, None) # type: ignore[arg-type] # intentionally wrong type
with pytest.raises(Exception):
cnvtr = base.Converter()
cnvtr.read(None, None, None, None)
cnvtr.read(None, None, None, None) # type: ignore[arg-type] # intentionally wrong type # noqa: E501

View File

@@ -25,7 +25,7 @@ from cloudevents.tests import data
@pytest.mark.parametrize("event_class", [v03.Event, v1.Event])
def test_binary_converter_upstream(event_class):
m = marshaller.NewHTTPMarshaller([binary.NewBinaryHTTPCloudEventConverter()])
event = m.FromRequest(event_class(), data.headers[event_class], None, lambda x: x)
event = m.FromRequest(event_class(), data.headers[event_class], b"", lambda x: x)
assert event is not None
assert event.EventType() == data.ce_type
assert event.EventID() == data.ce_id

View File

@@ -77,7 +77,7 @@ def test_object_event_v1():
_, structured_body = m.ToRequest(event)
assert isinstance(structured_body, bytes)
structured_obj = json.loads(structured_body)
error_msg = f"Body was {structured_body}, obj is {structured_obj}"
error_msg = f"Body was {structured_body!r}, obj is {structured_obj}"
assert isinstance(structured_obj, dict), error_msg
assert isinstance(structured_obj["data"], dict), error_msg
assert len(structured_obj["data"]) == 1, error_msg

View File

@@ -11,6 +11,7 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from __future__ import annotations
import bz2
import io
@@ -241,11 +242,11 @@ def test_structured_to_request(specversion):
assert headers["content-type"] == "application/cloudevents+json"
for key in attributes:
assert body[key] == attributes[key]
assert body["data"] == data, f"|{body_bytes}|| {body}"
assert body["data"] == data, f"|{body_bytes!r}|| {body}"
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_attributes_view_accessor(specversion: str):
def test_attributes_view_accessor(specversion: str) -> None:
attributes: dict[str, typing.Any] = {
"specversion": specversion,
"type": "word.found.name",
@@ -333,7 +334,7 @@ def test_valid_structured_events(specversion):
events_queue = []
num_cloudevents = 30
for i in range(num_cloudevents):
event = {
raw_event = {
"id": f"id{i}",
"source": f"source{i}.com.test",
"type": "cloudevent.test.type",
@@ -343,7 +344,7 @@ def test_valid_structured_events(specversion):
events_queue.append(
from_http(
{"content-type": "application/cloudevents+json"},
json.dumps(event),
json.dumps(raw_event),
)
)
@@ -454,7 +455,7 @@ def test_invalid_data_format_structured_from_http():
headers = {"Content-Type": "application/cloudevents+json"}
data = 20
with pytest.raises(cloud_exceptions.InvalidStructuredJSON) as e:
from_http(headers, data)
from_http(headers, data) # type: ignore[arg-type] # intentionally wrong type
assert "Expected json of type (str, bytes, bytearray)" in str(e.value)
@@ -526,7 +527,7 @@ def test_generic_exception():
e.errisinstance(cloud_exceptions.MissingRequiredFields)
with pytest.raises(cloud_exceptions.GenericException) as e:
from_http({}, 123)
from_http({}, 123) # type: ignore[arg-type] # intentionally wrong type
e.errisinstance(cloud_exceptions.InvalidStructuredJSON)
with pytest.raises(cloud_exceptions.GenericException) as e:

View File

@@ -19,6 +19,7 @@ import json
import pytest
from cloudevents import exceptions as cloud_exceptions
from cloudevents.abstract.event import AnyCloudEvent
from cloudevents.http import CloudEvent
from cloudevents.kafka.conversion import (
KafkaMessage,
@ -36,7 +37,9 @@ def simple_serialize(data: dict) -> bytes:
def simple_deserialize(data: bytes) -> dict:
return json.loads(data.decode())
value = json.loads(data.decode())
assert isinstance(value, dict)
return value
def failing_func(*args):
@@ -44,11 +47,10 @@ def failing_func(*args):
class KafkaConversionTestBase:
expected_data = {"name": "test", "amount": 1}
expected_custom_mapped_key = "custom-key"
def custom_key_mapper(self, _) -> str:
def custom_key_mapper(self, _: AnyCloudEvent) -> str:
return self.expected_custom_mapped_key
@pytest.fixture
@@ -60,7 +62,7 @@ class KafkaConversionTestBase:
"source": "pytest",
"type": "com.pytest.test",
"time": datetime.datetime(2000, 1, 1, 6, 42, 33).isoformat(),
"content-type": "foo",
"datacontenttype": "foo",
"partitionkey": "test_key_123",
},
data=self.expected_data,
@@ -124,7 +126,7 @@ class TestToBinary(KafkaConversionTestBase):
assert result.headers["ce_source"] == source_event["source"].encode("utf-8")
assert result.headers["ce_type"] == source_event["type"].encode("utf-8")
assert result.headers["ce_time"] == source_event["time"].encode("utf-8")
assert result.headers["content-type"] == source_event["content-type"].encode(
assert result.headers["content-type"] == source_event["datacontenttype"].encode(
"utf-8"
)
assert "data" not in result.headers
@@ -164,7 +166,7 @@ class TestFromBinary(KafkaConversionTestBase):
"ce_time": datetime.datetime(2000, 1, 1, 6, 42, 33)
.isoformat()
.encode("utf-8"),
"content-type": "foo".encode("utf-8"),
"datacontenttype": "foo".encode("utf-8"),
},
value=simple_serialize(self.expected_data),
key="test_key_123",
@@ -206,7 +208,7 @@ class TestFromBinary(KafkaConversionTestBase):
assert result["type"] == source_binary_json_message.headers["ce_type"].decode()
assert result["time"] == source_binary_json_message.headers["ce_time"].decode()
assert (
result["content-type"]
result["datacontenttype"]
== source_binary_json_message.headers["content-type"].decode()
)
@@ -329,7 +331,7 @@ class TestToStructured(KafkaConversionTestBase):
def test_sets_headers(self, source_event):
result = to_structured(source_event)
assert len(result.headers) == 1
assert result.headers["content-type"] == source_event["content-type"].encode(
assert result.headers["content-type"] == source_event["datacontenttype"].encode(
"utf-8"
)
@@ -475,7 +477,7 @@ class TestFromStructured(KafkaConversionTestBase):
):
result = from_structured(source_structured_json_message)
assert (
result["content-type"]
result["datacontenttype"]
== source_structured_json_message.headers["content-type"].decode()
)
@@ -488,7 +490,7 @@ class TestFromStructured(KafkaConversionTestBase):
envelope_unmarshaller=custom_unmarshaller,
)
assert (
result["content-type"]
result["datacontenttype"]
== source_structured_bytes_bytes_message.headers["content-type"].decode()
)

View File
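One commit in this range notes that kafka's `from_binary()`/`from_structured()` now use a `typing.overload` so the return type is `http.CloudEvent` when `event_type` is `None`, instead of an unsolved type variable. The shape of that fix, heavily simplified; every name here is a stand-in, not the SDK's real signature:

```python
import typing


class CloudEvent:
    """Simplified stand-in for http.CloudEvent (illustration only)."""

    def __init__(self, attributes: typing.Dict[str, str]) -> None:
        self.attributes = attributes


E = typing.TypeVar("E")


@typing.overload
def from_binary(
    headers: typing.Mapping[str, str], event_type: None = None
) -> CloudEvent: ...


@typing.overload
def from_binary(
    headers: typing.Mapping[str, str],
    event_type: typing.Callable[[typing.Dict[str, str]], E],
) -> E: ...


def from_binary(headers, event_type=None):
    # Strip the kafka "ce_" prefix from attribute headers.
    attributes = {k[3:]: v for k, v in headers.items() if k.startswith("ce_")}
    factory = event_type if event_type is not None else CloudEvent
    return factory(attributes)


# Without an event_type, checkers infer CloudEvent rather than a bare TypeVar:
event = from_binary({"ce_id": "1", "ce_type": "t"})
```

Previously, callers without an `event_type` had to annotate the result by hand, which is why the test file above needed so many `var-annotated` fixes.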

@@ -49,13 +49,15 @@ def structured_data():
def test_from_request_wrong_unmarshaller():
with pytest.raises(exceptions.InvalidDataUnmarshaller):
m = marshaller.NewDefaultHTTPMarshaller()
_ = m.FromRequest(v1.Event(), {}, "", None)
_ = m.FromRequest(
event=v1.Event(), headers={}, body="", data_unmarshaller=object() # type: ignore[arg-type] # intentionally wrong type # noqa: E501
)
def test_to_request_wrong_marshaller():
with pytest.raises(exceptions.InvalidDataMarshaller):
m = marshaller.NewDefaultHTTPMarshaller()
_ = m.ToRequest(v1.Event(), data_marshaller="")
_ = m.ToRequest(v1.Event(), data_marshaller="") # type: ignore[arg-type] # intentionally wrong type # noqa: E501
def test_from_request_cannot_read(binary_headers):

View File

@@ -15,11 +15,13 @@ import datetime
from json import loads
import pytest
from pydantic import ValidationError
from pydantic import ValidationError as PydanticV2ValidationError
from pydantic.v1 import ValidationError as PydanticV1ValidationError
from cloudevents.conversion import _json_or_string
from cloudevents.exceptions import IncompatibleArgumentsError
from cloudevents.pydantic import CloudEvent
from cloudevents.pydantic.v1.event import CloudEvent as PydanticV1CloudEvent
from cloudevents.pydantic.v2.event import CloudEvent as PydanticV2CloudEvent
from cloudevents.sdk.event.attribute import SpecVersion
_DUMMY_SOURCE = "dummy:source"
@@ -33,6 +35,25 @@ def specversion(request):
return request.param
_pydantic_implementation = {
"v1": {
"event": PydanticV1CloudEvent,
"validation_error": PydanticV1ValidationError,
"pydantic_version": "v1",
},
"v2": {
"event": PydanticV2CloudEvent,
"validation_error": PydanticV2ValidationError,
"pydantic_version": "v2",
},
}
@pytest.fixture(params=["v1", "v2"])
def cloudevents_implementation(request):
return _pydantic_implementation[request.param]
@pytest.fixture()
def dummy_attributes(specversion):
return {
@@ -58,8 +79,10 @@ def your_dummy_data():
@pytest.fixture()
def dummy_event(dummy_attributes, my_dummy_data):
return CloudEvent(attributes=dummy_attributes, data=my_dummy_data)
def dummy_event(dummy_attributes, my_dummy_data, cloudevents_implementation):
return cloudevents_implementation["event"](
attributes=dummy_attributes, data=my_dummy_data
)
@pytest.fixture()
@@ -69,10 +92,12 @@ def non_exiting_attribute_name(dummy_event):
return result
def test_pydantic_cloudevent_equality(dummy_attributes, my_dummy_data, your_dummy_data):
def test_pydantic_cloudevent_equality(
dummy_attributes, my_dummy_data, your_dummy_data, cloudevents_implementation
):
data = my_dummy_data
event1 = CloudEvent(dummy_attributes, data)
event2 = CloudEvent(dummy_attributes, data)
event1 = cloudevents_implementation["event"](dummy_attributes, data)
event2 = cloudevents_implementation["event"](dummy_attributes, data)
assert event1 == event2
# Test different attributes
for key in dummy_attributes:
@@ -80,15 +105,15 @@ def test_pydantic_cloudevent_equality(dummy_attributes, my_dummy_data, your_dumm
continue
else:
dummy_attributes[key] = f"noise-{key}"
event3 = CloudEvent(dummy_attributes, data)
event2 = CloudEvent(dummy_attributes, data)
event3 = cloudevents_implementation["event"](dummy_attributes, data)
event2 = cloudevents_implementation["event"](dummy_attributes, data)
assert event2 == event3
assert event1 != event2 and event3 != event1
# Test different data
data = your_dummy_data
event3 = CloudEvent(dummy_attributes, data)
event2 = CloudEvent(dummy_attributes, data)
event3 = cloudevents_implementation["event"](dummy_attributes, data)
event2 = cloudevents_implementation["event"](dummy_attributes, data)
assert event2 == event3
assert event1 != event2 and event3 != event1
@@ -109,12 +134,12 @@ def test_http_cloudevent_must_not_equal_to_non_cloudevent_value(
def test_http_cloudevent_mutates_equality(
dummy_attributes, my_dummy_data, your_dummy_data
dummy_attributes, my_dummy_data, your_dummy_data, cloudevents_implementation
):
data = my_dummy_data
event1 = CloudEvent(dummy_attributes, data)
event2 = CloudEvent(dummy_attributes, data)
event3 = CloudEvent(dummy_attributes, data)
event1 = cloudevents_implementation["event"](dummy_attributes, data)
event2 = cloudevents_implementation["event"](dummy_attributes, data)
event3 = cloudevents_implementation["event"](dummy_attributes, data)
assert event1 == event2
# Test different attributes
@@ -134,29 +159,40 @@ def test_http_cloudevent_mutates_equality(
assert event1 != event2 and event3 != event1
def test_cloudevent_missing_specversion():
def test_cloudevent_missing_specversion(cloudevents_implementation):
errors = {
"v1": "value is not a valid enumeration member; permitted: '0.3', '1.0'",
"v2": "Input should be '0.3' or '1.0'",
}
attributes = {"specversion": "0.2", "source": "s", "type": "t"}
with pytest.raises(ValidationError) as e:
_ = CloudEvent(attributes, None)
assert "value is not a valid enumeration member; permitted: '0.3', '1.0'" in str(
e.value
)
with pytest.raises(cloudevents_implementation["validation_error"]) as e:
_ = cloudevents_implementation["event"](attributes, None)
assert errors[cloudevents_implementation["pydantic_version"]] in str(e.value)
def test_cloudevent_missing_minimal_required_fields():
def test_cloudevent_missing_minimal_required_fields(cloudevents_implementation):
attributes = {"type": "t"}
with pytest.raises(ValidationError) as e:
_ = CloudEvent(attributes, None)
assert "\nsource\n field required " in str(e.value)
errors = {
"v1": "\nsource\n field required ",
"v2": "\nsource\n Field required ",
}
with pytest.raises(cloudevents_implementation["validation_error"]) as e:
_ = cloudevents_implementation["event"](attributes, None)
assert errors[cloudevents_implementation["pydantic_version"]] in str(e.value)
attributes = {"source": "s"}
with pytest.raises(ValidationError) as e:
_ = CloudEvent(attributes, None)
assert "\ntype\n field required " in str(e.value)
errors = {
"v1": "\ntype\n field required ",
"v2": "\ntype\n Field required ",
}
with pytest.raises(cloudevents_implementation["validation_error"]) as e:
_ = cloudevents_implementation["event"](attributes, None)
assert errors[cloudevents_implementation["pydantic_version"]] in str(e.value)
def test_cloudevent_general_overrides():
event = CloudEvent(
def test_cloudevent_general_overrides(cloudevents_implementation):
event = cloudevents_implementation["event"](
{
"source": "my-source",
"type": "com.test.overrides",
@ -217,9 +253,9 @@ def test_get_operation_on_non_existing_attribute_should_not_copy_default_value(
@pytest.mark.xfail() # https://github.com/cloudevents/sdk-python/issues/185
def test_json_data_serialization_without_explicit_type():
def test_json_data_serialization_without_explicit_type(cloudevents_implementation):
assert loads(
CloudEvent(
cloudevents_implementation["event"](
source=_DUMMY_SOURCE, type=_DUMMY_TYPE, data='{"hello": "world"}'
).json()
)["data"] == {"hello": "world"}
@ -236,12 +272,15 @@ def test_json_data_serialization_without_explicit_type():
],
)
def test_json_data_serialization_with_explicit_json_content_type(
dummy_attributes, json_content_type
dummy_attributes, json_content_type, cloudevents_implementation
):
dummy_attributes["datacontenttype"] = json_content_type
assert loads(CloudEvent(dummy_attributes, data='{"hello": "world"}',).json())[
"data"
] == {"hello": "world"}
assert loads(
cloudevents_implementation["event"](
dummy_attributes,
data='{"hello": "world"}',
).json()
)["data"] == {"hello": "world"}
_NON_JSON_CONTENT_TYPES = [
@ -264,10 +303,10 @@ _NON_JSON_CONTENT_TYPES = [
@pytest.mark.parametrize("datacontenttype", _NON_JSON_CONTENT_TYPES)
def test_json_data_serialization_with_explicit_non_json_content_type(
dummy_attributes, datacontenttype
dummy_attributes, datacontenttype, cloudevents_implementation
):
dummy_attributes["datacontenttype"] = datacontenttype
event = CloudEvent(
event = cloudevents_implementation["event"](
dummy_attributes,
data='{"hello": "world"}',
).json()
@ -275,18 +314,20 @@ def test_json_data_serialization_with_explicit_non_json_content_type(
@pytest.mark.parametrize("datacontenttype", _NON_JSON_CONTENT_TYPES)
def test_binary_data_serialization(dummy_attributes, datacontenttype):
def test_binary_data_serialization(
dummy_attributes, datacontenttype, cloudevents_implementation
):
dummy_attributes["datacontenttype"] = datacontenttype
event = CloudEvent(
event = cloudevents_implementation["event"](
dummy_attributes,
data=b"\x00\x00\x11Hello World",
).json()
result_json = loads(event)
assert result_json["data_base64"] == "AAARSGVsbG8gV29ybGQ="
assert "daata" not in result_json
assert "data" not in result_json
def test_binary_data_deserialization():
def test_binary_data_deserialization(cloudevents_implementation):
given = (
b'{"source": "dummy:source", "id": "11775cb2-fd00-4487-a18b-30c3600eaa5f",'
b' "type": "dummy.type", "specversion": "1.0", "time":'
@ -307,7 +348,12 @@ def test_binary_data_deserialization():
),
"type": "dummy.type",
}
assert CloudEvent.parse_raw(given).dict() == expected
assert cloudevents_implementation["event"].parse_raw(given).dict() == expected
if cloudevents_implementation["pydantic_version"] == "v2":
assert (
cloudevents_implementation["event"].model_validate_json(given).dict()
== expected
)
def test_access_data_event_attribute_should_raise_key_error(dummy_event):
@ -344,6 +390,6 @@ def test_data_must_never_exist_as_an_attribute_name(dummy_event):
assert "data" not in dummy_event
def test_attributes_and_kwards_are_incompatible():
def test_attributes_and_kwards_are_incompatible(cloudevents_implementation):
with pytest.raises(IncompatibleArgumentsError):
CloudEvent({"a": "b"}, other="hello world")
cloudevents_implementation["event"]({"a": "b"}, other="hello world")
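The rewritten assertions in this file select the expected pydantic error substring by major version instead of hard-coding the v1 wording. The lookup can be sketched without pydantic installed; the substrings below are the ones quoted in the diff:

```python
# Version-keyed error-message lookup, as used by the parametrized tests.
# No pydantic required; the substrings come straight from the diff above.
errors = {
    "v1": "value is not a valid enumeration member; permitted: '0.3', '1.0'",
    "v2": "Input should be '0.3' or '1.0'",
}

def expected_fragment(pydantic_version: str) -> str:
    # Each parametrized run looks up the message for its pydantic
    # major version rather than asserting the v1 wording everywhere.
    return errors[pydantic_version]

assert expected_fragment("v2") in (
    "1 validation error for CloudEvent\nspecversion\n"
    "  Input should be '0.3' or '1.0'"
)
```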


@ -17,21 +17,51 @@ import datetime
import json
import pytest
from pydantic import ValidationError as PydanticV2ValidationError
from pydantic.v1 import ValidationError as PydanticV1ValidationError
from cloudevents.conversion import to_json
from cloudevents.pydantic import CloudEvent, from_dict, from_json
from cloudevents.pydantic.v1.conversion import from_dict as pydantic_v1_from_dict
from cloudevents.pydantic.v1.conversion import from_json as pydantic_v1_from_json
from cloudevents.pydantic.v1.event import CloudEvent as PydanticV1CloudEvent
from cloudevents.pydantic.v2.conversion import from_dict as pydantic_v2_from_dict
from cloudevents.pydantic.v2.conversion import from_json as pydantic_v2_from_json
from cloudevents.pydantic.v2.event import CloudEvent as PydanticV2CloudEvent
from cloudevents.sdk.event.attribute import SpecVersion
test_data = json.dumps({"data-key": "val"})
test_attributes = {
"type": "com.example.string",
"source": "https://example.com/event-producer",
"extension-attribute": "extension-attribute-test-value",
}
_pydantic_implementation = {
"v1": {
"event": PydanticV1CloudEvent,
"validation_error": PydanticV1ValidationError,
"from_dict": pydantic_v1_from_dict,
"from_json": pydantic_v1_from_json,
"pydantic_version": "v1",
},
"v2": {
"event": PydanticV2CloudEvent,
"validation_error": PydanticV2ValidationError,
"from_dict": pydantic_v2_from_dict,
"from_json": pydantic_v2_from_json,
"pydantic_version": "v2",
},
}
@pytest.fixture(params=["v1", "v2"])
def cloudevents_implementation(request):
return _pydantic_implementation[request.param]
@pytest.mark.parametrize("specversion", ["0.3", "1.0"])
def test_to_json(specversion):
event = CloudEvent(test_attributes, test_data)
def test_to_json(specversion, cloudevents_implementation):
event = cloudevents_implementation["event"](test_attributes, test_data)
event_json = to_json(event)
event_dict = json.loads(event_json)
@ -42,10 +72,10 @@ def test_to_json(specversion):
@pytest.mark.parametrize("specversion", ["0.3", "1.0"])
def test_to_json_base64(specversion):
def test_to_json_base64(specversion, cloudevents_implementation):
data = b"test123"
event = CloudEvent(test_attributes, data)
event = cloudevents_implementation["event"](test_attributes, data)
event_json = to_json(event)
event_dict = json.loads(event_json)
@ -60,7 +90,7 @@ def test_to_json_base64(specversion):
@pytest.mark.parametrize("specversion", ["0.3", "1.0"])
def test_from_json(specversion):
def test_from_json(specversion, cloudevents_implementation):
payload = {
"type": "com.example.string",
"source": "https://example.com/event-producer",
@ -68,7 +98,7 @@ def test_from_json(specversion):
"specversion": specversion,
"data": {"data-key": "val"},
}
event = from_json(json.dumps(payload))
event = cloudevents_implementation["from_json"](json.dumps(payload))
for key, val in payload.items():
if key == "data":
@ -78,7 +108,7 @@ def test_from_json(specversion):
@pytest.mark.parametrize("specversion", ["0.3", "1.0"])
def test_from_json_base64(specversion):
def test_from_json_base64(specversion, cloudevents_implementation):
# Create base64 encoded data
raw_data = {"data-key": "val"}
data = json.dumps(raw_data).encode()
@ -95,7 +125,7 @@ def test_from_json_base64(specversion):
payload_json = json.dumps(payload)
# Create event
event = from_json(payload_json)
event = cloudevents_implementation["from_json"](payload_json)
# Test fields were marshalled properly
for key, val in payload.items():
@ -107,11 +137,11 @@ def test_from_json_base64(specversion):
@pytest.mark.parametrize("specversion", ["0.3", "1.0"])
def test_json_can_talk_to_itself(specversion):
event = CloudEvent(test_attributes, test_data)
def test_json_can_talk_to_itself(specversion, cloudevents_implementation):
event = cloudevents_implementation["event"](test_attributes, test_data)
event_json = to_json(event)
event = from_json(event_json)
event = cloudevents_implementation["from_json"](event_json)
for key, val in test_attributes.items():
assert event[key] == val
@ -119,20 +149,20 @@ def test_json_can_talk_to_itself(specversion):
@pytest.mark.parametrize("specversion", ["0.3", "1.0"])
def test_json_can_talk_to_itself_base64(specversion):
def test_json_can_talk_to_itself_base64(specversion, cloudevents_implementation):
data = b"test123"
event = CloudEvent(test_attributes, data)
event = cloudevents_implementation["event"](test_attributes, data)
event_json = to_json(event)
event = from_json(event_json)
event = cloudevents_implementation["from_json"](event_json)
for key, val in test_attributes.items():
assert event[key] == val
assert event.data == data
def test_from_dict():
def test_from_dict(cloudevents_implementation):
given = {
"data": b"\x00\x00\x11Hello World",
"datacontenttype": "application/octet-stream",
@ -146,12 +176,4 @@ def test_from_dict():
),
"type": "dummy.type",
}
assert from_dict(given).dict() == given
@pytest.mark.parametrize("specversion", ["0.3", "1.0"])
def test_pydantic_json_function_parameters_must_affect_output(specversion):
event = CloudEvent(test_attributes, test_data)
v1 = event.json(indent=2, sort_keys=True)
v2 = event.json(indent=4, sort_keys=True)
assert v1 != v2
assert cloudevents_implementation["from_dict"](given).dict() == given
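The dispatch-table-plus-fixture pattern this file adopts can be exercised without pytest or pydantic; `FakeV1Event` and `FakeV2Event` below are hypothetical stand-ins for the two CloudEvent classes:

```python
# Hedged stand-ins for PydanticV1CloudEvent / PydanticV2CloudEvent so the
# dispatch pattern runs without pydantic installed.
class FakeV1Event(dict):
    pass

class FakeV2Event(dict):
    pass

_pydantic_implementation = {
    "v1": {"event": FakeV1Event, "pydantic_version": "v1"},
    "v2": {"event": FakeV2Event, "pydantic_version": "v2"},
}

def run_with(version, attributes):
    # A test body receives one of these dicts via the fixture's
    # request.param lookup and builds events through impl["event"].
    impl = _pydantic_implementation[version]
    return impl["event"](attributes)

event = run_with("v2", {"source": "s", "type": "t"})
```

The `@pytest.fixture(params=["v1", "v2"])` in the diff does the same lookup, so every test runs once per pydantic major version.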


@ -11,6 +11,7 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from __future__ import annotations
import bz2
import io
@ -18,15 +19,23 @@ import json
import typing
import pytest
from pydantic import ValidationError as PydanticV2ValidationError
from pydantic.v1 import ValidationError as PydanticV1ValidationError
from sanic import Sanic, response
import cloudevents.exceptions as cloud_exceptions
from cloudevents.conversion import to_binary, to_structured
from cloudevents.pydantic import CloudEvent, from_http
from cloudevents.sdk import converters
from cloudevents.pydantic.v1.conversion import from_http as pydantic_v1_from_http
from cloudevents.pydantic.v1.event import CloudEvent as PydanticV1CloudEvent
from cloudevents.pydantic.v2.conversion import from_http as pydantic_v2_from_http
from cloudevents.pydantic.v2.event import CloudEvent as PydanticV2CloudEvent
from cloudevents.sdk import converters, types
from cloudevents.sdk.converters.binary import is_binary
from cloudevents.sdk.converters.structured import is_structured
if typing.TYPE_CHECKING:
from typing_extensions import TypeAlias
invalid_test_headers = [
{
"ce-source": "<event-source>",
@ -66,12 +75,59 @@ test_data = {"payload-content": "Hello World!"}
app = Sanic("test_pydantic_http_events")
@app.route("/event", ["POST"])
async def echo(request):
AnyPydanticCloudEvent: TypeAlias = typing.Union[
PydanticV1CloudEvent, PydanticV2CloudEvent
]
class FromHttpFn(typing.Protocol):
def __call__(
self,
headers: typing.Dict[str, str],
data: typing.Optional[typing.AnyStr],
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> AnyPydanticCloudEvent:
pass
class PydanticImplementation(typing.TypedDict):
event: typing.Type[AnyPydanticCloudEvent]
validation_error: typing.Type[Exception]
from_http: FromHttpFn
pydantic_version: typing.Literal["v1", "v2"]
_pydantic_implementation: typing.Mapping[str, PydanticImplementation] = {
"v1": {
"event": PydanticV1CloudEvent,
"validation_error": PydanticV1ValidationError,
"from_http": pydantic_v1_from_http,
"pydantic_version": "v1",
},
"v2": {
"event": PydanticV2CloudEvent,
"validation_error": PydanticV2ValidationError,
"from_http": pydantic_v2_from_http,
"pydantic_version": "v2",
},
}
@pytest.fixture(params=["v1", "v2"])
def cloudevents_implementation(
request: pytest.FixtureRequest,
) -> PydanticImplementation:
return _pydantic_implementation[request.param]
@app.route("/event/<pydantic_version>", ["POST"])
async def echo(request, pydantic_version):
decoder = None
if "binary-payload" in request.headers:
decoder = lambda x: x
event = from_http(dict(request.headers), request.body, data_unmarshaller=decoder)
event = _pydantic_implementation[pydantic_version]["from_http"](
dict(request.headers), request.body, data_unmarshaller=decoder
)
data = (
event.data
if isinstance(event.data, (bytes, bytearray, memoryview))
@ -81,28 +137,36 @@ async def echo(request):
@pytest.mark.parametrize("body", invalid_cloudevent_request_body)
def test_missing_required_fields_structured(body):
def test_missing_required_fields_structured(
body: dict, cloudevents_implementation: PydanticImplementation
) -> None:
with pytest.raises(cloud_exceptions.MissingRequiredFields):
_ = from_http(
_ = cloudevents_implementation["from_http"](
{"Content-Type": "application/cloudevents+json"}, json.dumps(body)
)
@pytest.mark.parametrize("headers", invalid_test_headers)
def test_missing_required_fields_binary(headers):
def test_missing_required_fields_binary(
headers: dict, cloudevents_implementation: PydanticImplementation
) -> None:
with pytest.raises(cloud_exceptions.MissingRequiredFields):
_ = from_http(headers, json.dumps(test_data))
_ = cloudevents_implementation["from_http"](headers, json.dumps(test_data))
@pytest.mark.parametrize("headers", invalid_test_headers)
def test_missing_required_fields_empty_data_binary(headers):
def test_missing_required_fields_empty_data_binary(
headers: dict, cloudevents_implementation: PydanticImplementation
) -> None:
# Test for issue #115
with pytest.raises(cloud_exceptions.MissingRequiredFields):
_ = from_http(headers, None)
_ = cloudevents_implementation["from_http"](headers, None)
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_emit_binary_event(specversion):
def test_emit_binary_event(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
headers = {
"ce-id": "my-id",
"ce-source": "<event-source>",
@ -111,7 +175,11 @@ def test_emit_binary_event(specversion):
"Content-Type": "text/plain",
}
data = json.dumps(test_data)
_, r = app.test_client.post("/event", headers=headers, data=data)
_, r = app.test_client.post(
f"/event/{cloudevents_implementation['pydantic_version']}",
headers=headers,
data=data,
)
# Convert byte array to dict
# e.g. r.body = b'{"payload-content": "Hello World!"}'
@ -128,7 +196,9 @@ def test_emit_binary_event(specversion):
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_emit_structured_event(specversion):
def test_emit_structured_event(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
headers = {"Content-Type": "application/cloudevents+json"}
body = {
"id": "my-id",
@ -137,7 +207,11 @@ def test_emit_structured_event(specversion):
"specversion": specversion,
"data": test_data,
}
_, r = app.test_client.post("/event", headers=headers, data=json.dumps(body))
_, r = app.test_client.post(
f"/event/{cloudevents_implementation['pydantic_version']}",
headers=headers,
data=json.dumps(body),
)
# Convert byte array to dict
# e.g. r.body = b'{"payload-content": "Hello World!"}'
@ -153,7 +227,11 @@ def test_emit_structured_event(specversion):
"converter", [converters.TypeBinary, converters.TypeStructured]
)
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_roundtrip_non_json_event(converter, specversion):
def test_roundtrip_non_json_event(
converter: str,
specversion: str,
cloudevents_implementation: PydanticImplementation,
) -> None:
input_data = io.BytesIO()
for _ in range(100):
for j in range(20):
@ -161,7 +239,7 @@ def test_roundtrip_non_json_event(converter, specversion):
compressed_data = bz2.compress(input_data.getvalue())
attrs = {"source": "test", "type": "t"}
event = CloudEvent(attrs, compressed_data)
event = cloudevents_implementation["event"](attrs, compressed_data)
if converter == converters.TypeStructured:
headers, data = to_structured(event, data_marshaller=lambda x: x)
@ -169,7 +247,11 @@ def test_roundtrip_non_json_event(converter, specversion):
headers, data = to_binary(event, data_marshaller=lambda x: x)
headers["binary-payload"] = "true" # Decoding hint for server
_, r = app.test_client.post("/event", headers=headers, data=data)
_, r = app.test_client.post(
f"/event/{cloudevents_implementation['pydantic_version']}",
headers=headers,
data=data,
)
assert r.status_code == 200
for key in attrs:
@ -178,7 +260,9 @@ def test_roundtrip_non_json_event(converter, specversion):
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_missing_ce_prefix_binary_event(specversion):
def test_missing_ce_prefix_binary_event(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
prefixed_headers = {}
headers = {
"ce-id": "my-id",
@ -195,13 +279,17 @@ def test_missing_ce_prefix_binary_event(specversion):
# and NotImplementedError because structured calls aren't
# implemented. In this instance one of the required keys should have
# prefix e-id instead of ce-id therefore it should throw
_ = from_http(prefixed_headers, json.dumps(test_data))
_ = cloudevents_implementation["from_http"](
prefixed_headers, json.dumps(test_data)
)
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_valid_binary_events(specversion):
def test_valid_binary_events(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
# Test creating multiple cloud events
events_queue = []
events_queue: list[AnyPydanticCloudEvent] = []
headers = {}
num_cloudevents = 30
for i in range(num_cloudevents):
@ -212,10 +300,12 @@ def test_valid_binary_events(specversion):
"ce-specversion": specversion,
}
data = {"payload": f"payload-{i}"}
events_queue.append(from_http(headers, json.dumps(data)))
events_queue.append(
cloudevents_implementation["from_http"](headers, json.dumps(data))
)
for i, event in enumerate(events_queue):
data = event.data
assert isinstance(event.data, dict)
assert event["id"] == f"id{i}"
assert event["source"] == f"source{i}.com.test"
assert event["specversion"] == specversion
@ -223,7 +313,9 @@ def test_valid_binary_events(specversion):
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_structured_to_request(specversion):
def test_structured_to_request(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
attributes = {
"specversion": specversion,
"type": "word.found.name",
@ -232,7 +324,7 @@ def test_structured_to_request(specversion):
}
data = {"message": "Hello World!"}
event = CloudEvent(attributes, data)
event = cloudevents_implementation["event"](attributes, data)
headers, body_bytes = to_structured(event)
assert isinstance(body_bytes, bytes)
body = json.loads(body_bytes)
@ -240,11 +332,13 @@ def test_structured_to_request(specversion):
assert headers["content-type"] == "application/cloudevents+json"
for key in attributes:
assert body[key] == attributes[key]
assert body["data"] == data, f"|{body_bytes}|| {body}"
assert body["data"] == data, f"|{body_bytes!r}|| {body}"
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_attributes_view_accessor(specversion: str):
def test_attributes_view_accessor(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
attributes: dict[str, typing.Any] = {
"specversion": specversion,
"type": "word.found.name",
@ -253,7 +347,7 @@ def test_attributes_view_accessor(specversion: str):
}
data = {"message": "Hello World!"}
event: CloudEvent = CloudEvent(attributes, data)
event = cloudevents_implementation["event"](attributes, data)
event_attributes: typing.Mapping[str, typing.Any] = event.get_attributes()
assert event_attributes["specversion"] == attributes["specversion"]
assert event_attributes["type"] == attributes["type"]
@ -263,7 +357,9 @@ def test_attributes_view_accessor(specversion: str):
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_binary_to_request(specversion):
def test_binary_to_request(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
attributes = {
"specversion": specversion,
"type": "word.found.name",
@ -271,7 +367,7 @@ def test_binary_to_request(specversion):
"source": "pytest",
}
data = {"message": "Hello World!"}
event = CloudEvent(attributes, data)
event = cloudevents_implementation["event"](attributes, data)
headers, body_bytes = to_binary(event)
body = json.loads(body_bytes)
@ -282,7 +378,9 @@ def test_binary_to_request(specversion):
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_empty_data_structured_event(specversion):
def test_empty_data_structured_event(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
# Testing if cloudevent breaks when no structured data field present
attributes = {
"specversion": specversion,
@ -293,21 +391,23 @@ def test_empty_data_structured_event(specversion):
"source": "<source-url>",
}
event = from_http(
event = cloudevents_implementation["from_http"](
{"content-type": "application/cloudevents+json"}, json.dumps(attributes)
)
assert event.data is None
attributes["data"] = ""
# Data of empty string will be marshalled into None
event = from_http(
event = cloudevents_implementation["from_http"](
{"content-type": "application/cloudevents+json"}, json.dumps(attributes)
)
assert event.data is None
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_empty_data_binary_event(specversion):
def test_empty_data_binary_event(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
# Testing if cloudevent breaks when no structured data field present
headers = {
"Content-Type": "application/octet-stream",
@ -317,22 +417,24 @@ def test_empty_data_binary_event(specversion):
"ce-time": "2018-10-23T12:28:22.4579346Z",
"ce-source": "<source-url>",
}
event = from_http(headers, None)
event = cloudevents_implementation["from_http"](headers, None)
assert event.data is None
data = ""
# Data of empty string will be marshalled into None
event = from_http(headers, data)
event = cloudevents_implementation["from_http"](headers, data)
assert event.data is None
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_valid_structured_events(specversion):
def test_valid_structured_events(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
# Test creating multiple cloud events
events_queue = []
events_queue: list[AnyPydanticCloudEvent] = []
num_cloudevents = 30
for i in range(num_cloudevents):
event = {
raw_event = {
"id": f"id{i}",
"source": f"source{i}.com.test",
"type": "cloudevent.test.type",
@ -340,13 +442,14 @@ def test_valid_structured_events(specversion):
"data": {"payload": f"payload-{i}"},
}
events_queue.append(
from_http(
cloudevents_implementation["from_http"](
{"content-type": "application/cloudevents+json"},
json.dumps(event),
json.dumps(raw_event),
)
)
for i, event in enumerate(events_queue):
assert isinstance(event.data, dict)
assert event["id"] == f"id{i}"
assert event["source"] == f"source{i}.com.test"
assert event["specversion"] == specversion
@ -354,7 +457,9 @@ def test_valid_structured_events(specversion):
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_structured_no_content_type(specversion):
def test_structured_no_content_type(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
# Test creating multiple cloud events
data = {
"id": "id",
@ -363,8 +468,9 @@ def test_structured_no_content_type(specversion):
"specversion": specversion,
"data": test_data,
}
event = from_http({}, json.dumps(data))
event = cloudevents_implementation["from_http"]({}, json.dumps(data))
assert isinstance(event.data, dict)
assert event["id"] == "id"
assert event["source"] == "source.com.test"
assert event["specversion"] == specversion
@ -392,7 +498,9 @@ def test_is_binary():
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_cloudevent_repr(specversion):
def test_cloudevent_repr(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
headers = {
"Content-Type": "application/octet-stream",
"ce-specversion": specversion,
@ -401,7 +509,7 @@ def test_cloudevent_repr(specversion):
"ce-time": "2018-10-23T12:28:22.4579346Z",
"ce-source": "<source-url>",
}
event = from_http(headers, "")
event = cloudevents_implementation["from_http"](headers, "")
# Testing to make sure event is printable. I could run event. __repr__() but
# we had issues in the past where event.__repr__() could run but
# print(event) would fail.
@ -409,8 +517,10 @@ def test_cloudevent_repr(specversion):
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_none_data_cloudevent(specversion):
event = CloudEvent(
def test_none_data_cloudevent(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
event = cloudevents_implementation["event"](
{
"source": "<my-url>",
"type": "issue.example",
@ -421,7 +531,7 @@ def test_none_data_cloudevent(specversion):
to_structured(event)
def test_wrong_specversion():
def test_wrong_specversion(cloudevents_implementation: PydanticImplementation) -> None:
headers = {"Content-Type": "application/cloudevents+json"}
data = json.dumps(
{
@ -432,20 +542,24 @@ def test_wrong_specversion():
}
)
with pytest.raises(cloud_exceptions.InvalidRequiredFields) as e:
from_http(headers, data)
cloudevents_implementation["from_http"](headers, data)
assert "Found invalid specversion 0.2" in str(e.value)
def test_invalid_data_format_structured_from_http():
def test_invalid_data_format_structured_from_http(
cloudevents_implementation: PydanticImplementation,
) -> None:
headers = {"Content-Type": "application/cloudevents+json"}
data = 20
with pytest.raises(cloud_exceptions.InvalidStructuredJSON) as e:
from_http(headers, data)
cloudevents_implementation["from_http"](headers, data) # type: ignore[type-var] # intentionally wrong type # noqa: E501
assert "Expected json of type (str, bytes, bytearray)" in str(e.value)
def test_wrong_specversion_to_request():
event = CloudEvent({"source": "s", "type": "t"}, None)
def test_wrong_specversion_to_request(
cloudevents_implementation: PydanticImplementation,
) -> None:
event = cloudevents_implementation["event"]({"source": "s", "type": "t"}, None)
with pytest.raises(cloud_exceptions.InvalidRequiredFields) as e:
event["specversion"] = "0.2"
to_binary(event)
@ -468,22 +582,26 @@ def test_is_structured():
assert not is_structured(headers)
def test_empty_json_structured():
def test_empty_json_structured(
cloudevents_implementation: PydanticImplementation,
) -> None:
headers = {"Content-Type": "application/cloudevents+json"}
data = ""
with pytest.raises(cloud_exceptions.MissingRequiredFields) as e:
from_http(headers, data)
cloudevents_implementation["from_http"](headers, data)
assert "Failed to read specversion from both headers and data" in str(e.value)
def test_uppercase_headers_with_none_data_binary():
def test_uppercase_headers_with_none_data_binary(
cloudevents_implementation: PydanticImplementation,
) -> None:
headers = {
"Ce-Id": "my-id",
"Ce-Source": "<event-source>",
"Ce-Type": "cloudevent.event.type",
"Ce-Specversion": "1.0",
}
event = from_http(headers, None)
event = cloudevents_implementation["from_http"](headers, None)
for key in headers:
assert event[key.lower()[3:]] == headers[key]
@ -493,7 +611,7 @@ def test_uppercase_headers_with_none_data_binary():
assert new_data is None
def test_generic_exception():
def test_generic_exception(cloudevents_implementation: PydanticImplementation) -> None:
headers = {"Content-Type": "application/cloudevents+json"}
data = json.dumps(
{
@ -505,28 +623,32 @@ def test_generic_exception():
}
)
with pytest.raises(cloud_exceptions.GenericException) as e:
from_http({}, None)
cloudevents_implementation["from_http"]({}, None)
e.errisinstance(cloud_exceptions.MissingRequiredFields)
with pytest.raises(cloud_exceptions.GenericException) as e:
from_http({}, 123)
cloudevents_implementation["from_http"]({}, 123) # type: ignore[type-var] # intentionally wrong type # noqa: E501
e.errisinstance(cloud_exceptions.InvalidStructuredJSON)
with pytest.raises(cloud_exceptions.GenericException) as e:
from_http(headers, data, data_unmarshaller=lambda x: 1 / 0)
cloudevents_implementation["from_http"](
headers, data, data_unmarshaller=lambda x: 1 / 0
)
e.errisinstance(cloud_exceptions.DataUnmarshallerError)
with pytest.raises(cloud_exceptions.GenericException) as e:
event = from_http(headers, data)
event = cloudevents_implementation["from_http"](headers, data)
to_binary(event, data_marshaller=lambda x: 1 / 0)
e.errisinstance(cloud_exceptions.DataMarshallerError)
def test_non_dict_data_no_headers_bug():
def test_non_dict_data_no_headers_bug(
cloudevents_implementation: PydanticImplementation,
) -> None:
# Test for issue #116
headers = {"Content-Type": "application/cloudevents+json"}
data = "123"
with pytest.raises(cloud_exceptions.MissingRequiredFields) as e:
from_http(headers, data)
cloudevents_implementation["from_http"](headers, data)
assert "Failed to read specversion from both headers and data" in str(e.value)
assert "The following deserialized data has no 'get' method" in str(e.value)
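The `FromHttpFn` Protocol introduced in this file gives the v1/v2 `from_http` callables a common structural type. A simplified standalone illustration (`fake_from_http` is a hypothetical stand-in for the real converters, and the return type is reduced to `dict`):

```python
import typing

class FromHttpFn(typing.Protocol):
    # Structural type mirroring the shape used in the test module,
    # simplified: data is Optional[str], return type is a plain dict.
    def __call__(
        self,
        headers: typing.Dict[str, str],
        data: typing.Optional[str],
    ) -> dict: ...

def fake_from_http(headers, data):
    # Hypothetical stand-in; the real functions parse a CloudEvent.
    return {"headers": headers, "data": data}

# Assignable because Protocols match structurally, not by inheritance.
fn: FromHttpFn = fake_from_http
result = fn({"ce-id": "1"}, None)
```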

16
mypy.ini Normal file

@ -0,0 +1,16 @@
[mypy]
plugins = pydantic.mypy
python_version = 3.8
pretty = True
show_error_context = True
follow_imports_for_stubs = True
# subset of mypy --strict
# https://mypy.readthedocs.io/en/stable/config_file.html
check_untyped_defs = True
disallow_incomplete_defs = True
warn_return_any = True
strict_equality = True
[mypy-deprecation.*]
ignore_missing_imports = True
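The config above enables a subset of `--strict`; for instance, `disallow_incomplete_defs` rejects a signature where only some parameters are annotated. A minimal illustration (the `add` function is hypothetical, not from the repo):

```python
# Rejected under disallow_incomplete_defs (y lacks an annotation):
#     def add(x: int, y):
#         return x + y

def add(x: int, y: int) -> int:
    # Fully annotated, so it also satisfies warn_return_any.
    return x + y

print(add(2, 3))
```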


@ -5,3 +5,4 @@ pep8-naming
flake8-print
tox
pre-commit
mypy

5
requirements/mypy.txt Normal file

@ -0,0 +1,5 @@
mypy
# mypy has the pydantic plugin enabled
pydantic>=2.0.0,<3.0
types-requests
deprecation>=2.0,<3.0


@ -4,12 +4,10 @@ flake8-print
pytest
pytest-cov
# web app tests
sanic<=20.12.7; python_version <= '3.6'
sanic; python_version > '3.6'
sanic-testing; python_version > '3.6'
sanic
sanic-testing
aiohttp
Pillow
requests
flask
pydantic>=1.0.0<1.9.0; python_version <= '3.6'
pydantic>=1.0.0<2.0; python_version > '3.6'
pydantic>=2.0.0,<3.0


@ -25,7 +25,7 @@ resp = requests.get(
image_bytes = resp.content
def send_binary_cloud_event(url: str):
def send_binary_cloud_event(url: str) -> None:
# Create cloudevent
attributes = {
"type": "com.example.string",
@ -42,7 +42,7 @@ def send_binary_cloud_event(url: str):
print(f"Sent {event['id']} of type {event['type']}")
def send_structured_cloud_event(url: str):
def send_structured_cloud_event(url: str) -> None:
# Create cloudevent
attributes = {
"type": "com.example.base64",


@ -46,9 +46,11 @@ long_description = (here / "README.md").read_text(encoding="utf-8")
if __name__ == "__main__":
setup(
name=pypi_config["package_name"],
summary="CloudEvents SDK Python",
summary="CloudEvents Python SDK",
long_description_content_type="text/markdown",
long_description=long_description,
description="CloudEvents Python SDK",
url="https://github.com/cloudevents/sdk-python",
author="The Cloud Events Contributors",
author_email="cncfcloudevents@gmail.com",
home_page="https://cloudevents.io",
@ -58,21 +60,25 @@ if __name__ == "__main__":
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Development Status :: 5 - Production/Stable",
"Operating System :: POSIX :: Linux",
"Operating System :: OS Independent",
"Natural Language :: English",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.6",
"Programming Language :: Python :: 3.7",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Typing :: Typed",
],
keywords="CloudEvents Eventing Serverless",
license="https://www.apache.org/licenses/LICENSE-2.0",
license_file="LICENSE",
packages=find_packages(exclude=["cloudevents.tests"]),
include_package_data=True,
version=pypi_config["version_target"],
install_requires=["deprecation>=2.0,<3.0"],
extras_require={
"pydantic": [
"pydantic>=1.0.0<1.9.0; python_version <= '3.6'",
"pydantic>=1.0.0<2.0; python_version > '3.6'",
],
},
extras_require={"pydantic": "pydantic>=1.0.0,<3.0"},
zip_safe=True,
)

26
tox.ini

@ -1,5 +1,5 @@
[tox]
envlist = py{36,37,38,39,310},lint
envlist = py{39,310,311,312,313},lint,mypy,mypy-samples-{image,json}
skipsdist = True
[testenv]
@ -8,11 +8,11 @@ deps =
-r{toxinidir}/requirements/test.txt
-r{toxinidir}/requirements/publish.txt
setenv =
PYTESTARGS = -v -s --tb=long --cov=cloudevents --cov-report term-missing --cov-fail-under=100
PYTESTARGS = -v -s --tb=long --cov=cloudevents --cov-report term-missing --cov-fail-under=95
commands = pytest {env:PYTESTARGS} {posargs}
[testenv:reformat]
basepython = python3.10
basepython = python3.12
deps =
black
isort
@ -21,7 +21,7 @@ commands =
isort cloudevents samples
[testenv:lint]
basepython = python3.10
basepython = python3.12
deps =
black
isort
@ -30,3 +30,21 @@ commands =
black --check .
isort -c cloudevents samples
flake8 cloudevents samples --ignore W503,E731 --extend-ignore E203 --max-line-length 88
[testenv:mypy]
basepython = python3.12
deps =
-r{toxinidir}/requirements/mypy.txt
# mypy needs test dependencies to check test modules
-r{toxinidir}/requirements/test.txt
commands = mypy cloudevents
[testenv:mypy-samples-{image,json}]
basepython = python3.12
setenv =
mypy-samples-image: SAMPLE_DIR={toxinidir}/samples/http-image-cloudevents
mypy-samples-json: SAMPLE_DIR={toxinidir}/samples/http-json-cloudevents
deps =
-r{toxinidir}/requirements/mypy.txt
-r{env:SAMPLE_DIR}/requirements.txt
commands = mypy {env:SAMPLE_DIR}