Compare commits

...

60 Commits
1.2.0 ... main

Author SHA1 Message Date
Yurii Serhiichuk a38933d7ab
Drop EOL Python 3.8 support (#249)
* chore: add missing changelog items

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: drop Python 3.8 support

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: add a changelog item on Python 3.8 removal

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: remove mypy-constrains reference as we don't need it anymore

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Update pre-commit check versions.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: fix isort pre-commit

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* chore: Use Python 3.12 as base version

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

---------

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-06-02 14:58:00 -04:00
Hal Blackburn 37ae369ced
Improve public API type annotations & fix unit test type errors (#248)
* chore: improve typing of functions returning AnyCloudEvent

kafka.conversion.from_binary() and from_structured() return the
AnyCloudEvent type var according to their event_type argument, but when
event_type is None, type checkers cannot infer the return type. We now
use an overload to declare that the return type is http.CloudEvent when
event_type is None.

Previously, users had to explicitly annotate this type when calling
without event_type. This happens often in this repo's
test_kafka_conversions.py; this fixes quite a few type errors like:

> error: Need type annotation for "result"  [var-annotated]

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>
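
The overload pattern described above can be sketched as follows; the class names here are stand-ins, since the real signatures live in cloudevents.kafka.conversion:

```python
from typing import Optional, Type, TypeVar, overload

class CloudEvent:
    """Stand-in for http.CloudEvent, the default event class."""

class MyEvent(CloudEvent):
    """Stand-in for a user-defined event class."""

AnyCloudEvent = TypeVar("AnyCloudEvent", bound=CloudEvent)

@overload
def from_binary(data: bytes, event_type: None = None) -> CloudEvent: ...
@overload
def from_binary(data: bytes, event_type: Type[AnyCloudEvent]) -> AnyCloudEvent: ...

def from_binary(
    data: bytes, event_type: Optional[Type[CloudEvent]] = None
) -> CloudEvent:
    # Runtime behaviour is unchanged; only the declared types differ.
    # Construct the requested class, defaulting to CloudEvent.
    return (event_type or CloudEvent)()
```

With the first overload, `from_binary(payload)` is inferred as `CloudEvent` rather than an unsolved type var, so callers no longer need a manual annotation.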

* chore: type v1.Event chainable Set*() methods

The v1.Event self-returning Set*() methods like SetData() were returning
BaseEvent, which doesn't declare the same Set* methods. As a result,
chaining more than one Set* method would make the return type unknown.

This was causing type errors in test_event_pipeline.py.

The Set*() methods now return the Self type.

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>
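
The self-returning pattern can be sketched with a bound type var (the pre-3.11 spelling of `typing.Self`); the class and method names below are illustrative, not the SDK's actual ones:

```python
from typing import TypeVar

E = TypeVar("E", bound="Event")  # typing.Self does the same on Python 3.11+

class Event:
    def SetData(self: E, data: object) -> E:
        self.data = data
        return self

    def SetSource(self: E, source: str) -> E:
        self.source = source
        return self

class V1Event(Event):
    pass

# Chaining keeps the concrete type: checkers infer V1Event, not Event,
# so further Set*() calls on the result still type check.
event = V1Event().SetData({"x": 1}).SetSource("/demo")
```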

* chore: fix type errors in tests

mypy was failing with lots of type errors in test modules. I've not
annotated all fixtures, mostly fixed existing type errors.

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>

* chore: allow non-dict headers types in from_http()

The from_http() conversion function required its headers argument to
be a typing.Dict, making it incompatible with the header types of HTTP
libraries, which support features like multiple values per key.
typing.Mapping and even _typeshed.SupportsItems do not cover these
types. For example,
samples/http-image-cloudevents/image_sample_server.py was failing to
type check where it calls `from_http(request.headers, ...)`.

To support these kinds of header types in from_http(), we now define our
own SupportsDuplicateItems protocol, which is broader than
_typeshed.SupportsItems.

I've only applied this to from_http(), as typing.Mapping is OK for most
other methods that accept dict-like objects, and using this more lenient
interface everywhere would impose restrictions on our implementation,
even though it might be more flexible for users.

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>
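
A protocol of this shape can be sketched as below; the protocol body and the `MultiHeaders` container are illustrative stand-ins for the SDK's actual definitions:

```python
from typing import Dict, Iterable, Protocol, Tuple

class SupportsDuplicateItems(Protocol):
    # Broader than Mapping: items() may yield the same key more than
    # once, as multi-value HTTP header containers do.
    def items(self) -> Iterable[Tuple[str, str]]: ...

def headers_to_dict(headers: SupportsDuplicateItems) -> Dict[str, str]:
    # Illustrative policy only: the last value wins when a key repeats.
    return {key.lower(): value for key, value in headers.items()}

class MultiHeaders:
    """A container that allows repeated keys, as HTTP headers do."""

    def __init__(self, pairs: Iterable[Tuple[str, str]]) -> None:
        self._pairs = list(pairs)

    def items(self) -> Iterable[Tuple[str, str]]:
        return iter(self._pairs)

headers = MultiHeaders([("Accept", "text/html"), ("Accept", "application/json")])
```

Because `MultiHeaders` is not a `Mapping` (repeated keys), it would fail a `Dict`/`Mapping` annotation, but it structurally satisfies the protocol.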

* build: run mypy via tox

Tox now runs mypy on cloudevents itself, and the samples.

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>

* build(ci): run mypy in CI alongside linting

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>

* chore: fix minor mypy type complaint in samples

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>

* feat: use Mapping, not Dict for input arguments

Mapping imposes fewer restrictions on callers because it's read-only:
non-dict types can be passed directly, with no need to copy them to
dict() or to pass dict-like values and ignore the resulting type error.

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>
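
The difference for callers can be sketched as follows (the function name is a hypothetical example, not an SDK API):

```python
from types import MappingProxyType
from typing import Mapping

def get_attribute(attributes: Mapping[str, str], name: str) -> str:
    # Mapping is read-only, so any dict-like view type checks here;
    # a Dict annotation would reject read-only or proxy types.
    return attributes[name]

# A read-only proxy is accepted as-is; a plain dict works too.
attrs = MappingProxyType({"source": "/demo"})
```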

* chore: fix tests on py3.8

Tests were failing because the sanic dependency dropped support for
py3.8 in its current release. sanic is now pinned to the last compatible
version for py3.8 only.

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>

* feat: support new model_validate_json() kwargs

Pydantic added by_alias and by_name keyword arguments to
BaseModel.model_validate_json in 2.11.1:

acb0f10fda

This caused mypy to report that the Pydantic v2 CloudEvent did not
override model_validate_json() correctly. Our override now accepts these
newly-added arguments. They have no effect, as the implementation does
not use Pydantic to validate the JSON, but we also don't use field
aliases, so the only effect they could have in the superclass would be
to raise an error if they're both False.

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>
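
The override pattern can be sketched without Pydantic itself; `BaseModel` below is a stand-in for pydantic.BaseModel, and only the `by_alias`/`by_name` names come from Pydantic 2.11:

```python
import json
from typing import Any, Mapping, Optional

class BaseModel:
    """Stand-in for pydantic.BaseModel; only the override shape matters."""

    @classmethod
    def model_validate_json(
        cls,
        json_data: str,
        *,
        strict: Optional[bool] = None,
        context: Optional[Any] = None,
        by_alias: Optional[bool] = None,
        by_name: Optional[bool] = None,
    ) -> "BaseModel":
        raise NotImplementedError

class CloudEvent(BaseModel):
    def __init__(self, attributes: Mapping[str, Any], data: Any) -> None:
        self.attributes = dict(attributes)
        self.data = data

    @classmethod
    def model_validate_json(
        cls,
        json_data: str,
        *,
        strict: Optional[bool] = None,
        context: Optional[Any] = None,
        by_alias: Optional[bool] = None,
        by_name: Optional[bool] = None,
    ) -> "CloudEvent":
        # Accept the newly-added kwargs so the override matches the
        # superclass signature; they are ignored because the JSON is
        # parsed right here rather than by Pydantic.
        obj = json.loads(json_data)
        return cls(
            attributes={k: v for k, v in obj.items() if k != "data"},
            data=obj.get("data"),
        )
```

Without the extra keyword arguments in the subclass, mypy flags the override as an incompatible signature (Liskov violation) once the superclass gains them.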

* chore: accept Mapping as well as SupportsDuplicateItems

Although our types.SupportsDuplicateItems type is wider than Dict and
Mapping, it's not a familiar type to users, so explicitly accepting
Mapping in the from_http() functions should make it clearer to users
that a dict-like object is required for the headers argument.

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>

* chore: constrain deps to maintain py 3.8 support

Python 3.8 is unsupported, and dependencies (such as pydantic) are now
shipping releases that fail to type check with mypy running in 3.8
compatibility mode. We run mypy in 3.8 compatibility mode, so the
mypy tox environments must use only deps that support 3.8, and unit
tests run on Python 3.8 must likewise use only deps that support 3.8.

To constrain the deps for 3.8 support, we use two constraint files: one
for general environments, which only constrains the dependencies that
Python 3.8 interpreters use, and another for mypy, which constrains the
dependencies that all interpreters use.

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>
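
The two-file setup can be sketched like this; the file names and the version pin are illustrative, not the repo's actual ones:

```shell
# constraints.txt — pins apply only when the interpreter itself is 3.8,
# via an environment marker:
#     pydantic<2.6; python_version < "3.9"
#
# mypy-constraints.txt — pins apply on every interpreter, because mypy
# checks in 3.8 compatibility mode regardless of the Python running it:
#     pydantic<2.6

# A -c file caps versions without adding new requirements; packages not
# otherwise installed are unaffected.
pip install -c constraints.txt cloudevents
```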

---------

Signed-off-by: Hal Blackburn <hwtb2@cam.ac.uk>
2025-05-23 22:26:18 +03:00
Yurii Serhiichuk c5645d8fcf
chore: disable attestations while we're not using trusted publishing (#243)
Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2024-11-09 20:27:52 +02:00
Yurii Serhiichuk 96cfaa6529
chore: release 1.11.1 (#241)
Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2024-10-30 11:54:36 +02:00
Christoph Hösler efca352e21
fix kafka unmarshaller args typing and defaults (#240)
* fix kafka unmarshaller args typing and defaults

Signed-off-by: Christoph Hösler <christoph.hoesler@inovex.de>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Signed-off-by: Christoph Hösler <christoph.hoesler@inovex.de>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-10-30 11:41:03 +02:00
Yurii Serhiichuk c6c7e8c2f9
Release/v1.11.0 (#237)
* Add missing changelog items

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Bump version

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

---------

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2024-06-20 09:31:13 +03:00
Vivian 16441d79f4
Modified content-type to abide by attribute naming conventions for cloudevents (#232)
* fix: changed content-type to a valid attribute

Signed-off-by: vivjd <vivjdeng@hotmail.com>

* fix: changed headers back to content-type

Signed-off-by: Vivian <118199397+vivjd@users.noreply.github.com>
Signed-off-by: vivjd <vivjdeng@hotmail.com>

* modified kafka test cases to match datacontenttype

Signed-off-by: vivjd <vivjdeng@hotmail.com>

* fix: updated kafka/conversion.py and test cases to check for valid attributes

Signed-off-by: vivjd <vivjdeng@hotmail.com>

---------

Signed-off-by: vivjd <vivjdeng@hotmail.com>
Signed-off-by: Vivian <118199397+vivjd@users.noreply.github.com>
Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>
2024-05-26 21:56:16 +03:00
Fábio D. Batista 11520e35e1
Pydantic v2 (#235)
* Fixes examples when using Pydantic V2

Signed-off-by: Fabio Batista <fabio@atelie.dev.br>

* When type checking, uses the latest (V2) version of Pydantic

Signed-off-by: Fabio Batista <fabio@atelie.dev.br>

---------

Signed-off-by: Fabio Batista <fabio@atelie.dev.br>
Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>
2024-05-26 21:51:36 +03:00
Yurii Serhiichuk eedc61e9b0
Update CI and tooling (#236)
* Update pre-commit hooks

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Add Python 3.12

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Drop python 3.7 and add 3.12 to TOX

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Migrate to latest action versions. Drop v3.7 from CI and add 3.12

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Migrate to Python 3.8

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Fix changelog message.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

---------

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2024-05-26 21:49:35 +03:00
Yurii Serhiichuk 21572afb57
Fix Pydantic custom attributes (#229)
* Add custom extension attribute to the test set.

Replicates bug test data from the https://github.com/cloudevents/sdk-python/issues/228

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* use modern `super` syntax

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Fix `black` language version

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Fixes https://github.com/cloudevents/sdk-python/issues/228

Pydantic v2's .__dict__ behaves differently from Pydantic v1's and no longer gives us the `extra` fields. On the other hand, iterating over the event does include the extras.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Add missing EOF

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Add Pydantic fix to the changelog

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Add links to the changelog

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Bump version

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Update Black and MyPy versions

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

---------

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2023-10-30 06:44:36 +01:00
pre-commit-ci[bot] 8ada7d947b
[pre-commit.ci] pre-commit autoupdate (#224)
updates:
- [github.com/pre-commit/pre-commit-hooks: v4.4.0 → v4.5.0](https://github.com/pre-commit/pre-commit-hooks/compare/v4.4.0...v4.5.0)
- [github.com/pre-commit/mirrors-mypy: v1.5.1 → v1.6.0](https://github.com/pre-commit/mirrors-mypy/compare/v1.5.1...v1.6.0)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2023-10-23 15:24:12 +03:00
Doug Davis c5418b99a0
add link to our security mailing list (#226)
Signed-off-by: Doug Davis <dug@microsoft.com>
2023-10-16 19:14:38 +03:00
Yurii Serhiichuk d4873037e2
Release/v1.10.0 (#223)
* Bump version

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Update changelog

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

---------

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2023-09-25 08:00:00 -06:00
pre-commit-ci[bot] 66dcabb254
[pre-commit.ci] pre-commit autoupdate (#220)
updates:
- [github.com/psf/black: 23.7.0 → 23.9.1](https://github.com/psf/black/compare/23.7.0...23.9.1)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>
2023-09-25 12:29:56 +03:00
Doug Davis 252efdbbce
Governance docs per CE PR 1226 (#221)
Signed-off-by: Doug Davis <dug@microsoft.com>
2023-09-21 22:59:54 +03:00
Federico Busetti 5a1063e50d
Pydantic v2 native implementation (#219)
* Create stub pydantic v2 implementation and parametrize tests for both implementations

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Add default values to optional fields

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Adapt pydantic v1 serializer/deserializer logic

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Extract CloudEvent fields non functional data in separate module

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Fix lint

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Add missing Copyright

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Add missing docstring

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Remove test leftover

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Remove dependency on HTTP CloudEvent implementation

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Remove failing test for unsupported scenario

Fix typo

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Use SDK json serialization logic

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* No need to filter base64_data

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Use SDK json deserialization logic

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Fix imports

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Move docs after field declarations

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Add test for model_validate_json method

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Use fully qualified imports

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Ignore typing error

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

---------

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2023-09-20 22:59:13 +03:00
pre-commit-ci[bot] e5f76ed14c
[pre-commit.ci] pre-commit autoupdate (#212)
updates:
- [github.com/psf/black: 23.3.0 → 23.7.0](https://github.com/psf/black/compare/23.3.0...23.7.0)
- [github.com/pre-commit/mirrors-mypy: v1.2.0 → v1.5.1](https://github.com/pre-commit/mirrors-mypy/compare/v1.2.0...v1.5.1)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>
2023-08-28 20:29:25 +03:00
Federico Busetti 739c71e0b7
Adds a pydantic V2 compatibility layer (#218)
* feat: Pydantic V2 compatibility layer

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

* Ignore incompatible import

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>

---------

Signed-off-by: Federico Busetti <729029+febus982@users.noreply.github.com>
2023-08-28 20:09:53 +03:00
pre-commit-ci[bot] 8104ce1b68
[pre-commit.ci] pre-commit autoupdate (#205)
* [pre-commit.ci] pre-commit autoupdate

updates:
- [github.com/pycqa/isort: 5.11.4 → 5.12.0](https://github.com/pycqa/isort/compare/5.11.4...5.12.0)
- [github.com/psf/black: 22.12.0 → 23.3.0](https://github.com/psf/black/compare/22.12.0...23.3.0)
- [github.com/pre-commit/mirrors-mypy: v0.991 → v1.2.0](https://github.com/pre-commit/mirrors-mypy/compare/v0.991...v1.2.0)

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2023-05-14 20:53:02 +03:00
Yurii Serhiichuk ef982743b6
Add Python 3.11 support (#209)
* docs: add missing release notes

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: add Python3.11 support

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Bump version

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: create release section

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2023-01-04 11:33:33 -07:00
Yurii Serhiichuk 5e00c4f41f
Introduce typings (#207)
* chore: Add pre-commit hook

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: address typing issues

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: add py.typed meta

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Add Pydantic plugin

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Add Pydantic dependency

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Add MyPy best practices configs

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Add deprecation MyPy ignore

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: more typing fixes

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: more typings and explicit optionals

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Use lowest-supported Python version

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Fix silly `dict` and other MyPy-related issues.

We're now explicitly ensuring the codebase supports Python 3.7+

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: ignore typing limitation

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: an empty `dict` is falsy, so a truthiness check can't distinguish it from `None`; use an `is None` check

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
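
The pitfall is easy to demonstrate; the function names here are illustrative:

```python
def bad_check(attributes):
    # `not {}` is True: an empty dict is falsy, so this wrongly treats
    # "present but empty" the same as "absent".
    return "missing" if not attributes else "present"

def good_check(attributes):
    # `is None` matches only a genuinely absent value; an empty dict
    # is still reported as present.
    return "missing" if attributes is None else "present"
```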

* deps: Update hooks

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Make sure only non-callable unmarshallers are flagged

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Have some coverage slack

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* deps: bump pre-commit-hooks

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* ci: make sure py.typed is included into the bundle

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: improve setup.py setup and add missing package metadata

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2023-01-04 08:29:41 -07:00
Yurii Serhiichuk a02864eaab
Drop python36 (#208)
* chore: drop Python 3.6 official support

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: update docs to reflect that Python 3.6 is no longer supported

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* deps: drop Python3.6-only dependencies

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: drop extra `;`

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: try `setup.py` syntax

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2022-12-09 07:26:30 -07:00
Yurii Serhiichuk 119264cdfe
hotfix: Hotfix Pydantic dependency constraints. (#204)
* hotfix: Hotfix Pydantic dependency constraints.

docs: Add mention of the constraints fix

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

chore: bump version

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

fix: PyPi constraints for Pydantic

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

ci: add ability to release from tag branches.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: add missing links

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: fix release 1.6.3 link

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2022-11-22 08:03:03 -07:00
Yurii Serhiichuk 81f07b6d9f
ci: refine publishing WF (#202)
* ci: update CI workflow to use `buildwheel` action.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: Add pipeline change to the changelog

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: temporary add ability to build on PRs.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* ci: Do not try using cibuildwheels

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: Update changelog

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* ci: don't build on PRs

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* ci: don't fetch repo history on publish

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-11-21 07:20:09 -07:00
Yurii Serhiichuk cf5616be42
Release/v1.7.0 (#201)
* chore: Fix typings errors and cleanup code a bit

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Use `AnyStr` shortcut instead of `Union[bytes, str]`

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Bump version.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Update the changelog

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-11-17 21:47:29 -07:00
David W Martines de61dd9fd2
feat: Kafka Protocol (#197)
* Add kafka event and conversions.

Signed-off-by: davidwmartines <d5172@yahoo.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Remove kafka CloudEvent class

Signed-off-by: davidwmartines <d5172@yahoo.com>

* Update conversion and init

Signed-off-by: davidwmartines <d5172@yahoo.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Fix formatting.

Signed-off-by: davidwmartines <d5172@yahoo.com>

* Add tests for kafka binary conversion.

Signed-off-by: davidwmartines <d5172@yahoo.com>

* Catch marshalling errors, raise cloud_exceptions.

Signed-off-by: davidwmartines <d5172@yahoo.com>

* Add tests for to/from structured.

Signed-off-by: davidwmartines <d5172@yahoo.com>

* Fix spacing issues.

Signed-off-by: davidwmartines <d5172@yahoo.com>

* Rename ProtocolMessage to KafkaMessage.

Signed-off-by: davidwmartines <d5172@yahoo.com>

* Correct type annotations.

Signed-off-by: davidwmartines <d5172@yahoo.com>

* Use .create function.

Signed-off-by: davidwmartines <d5172@yahoo.com>

* Simplify failing serdes function.

Signed-off-by: davidwmartines <d5172@yahoo.com>

* Organize tests into classes.

Signed-off-by: davidwmartines <d5172@yahoo.com>

* Fix partitionkey attribute name and logic.

Signed-off-by: davidwmartines <d5172@yahoo.com>

* Add key_mapper option.

Signed-off-by: davidwmartines <d5172@yahoo.com>

* Refactor tests, raise KeyMapperError

Signed-off-by: davidwmartines <d5172@yahoo.com>

* Add copyright.

Signed-off-by: davidwmartines <d5172@yahoo.com>

* Remove optional typing.

Signed-off-by: davidwmartines <d5172@yahoo.com>

Signed-off-by: davidwmartines <d5172@yahoo.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-11-17 10:29:13 +02:00
Yurii Serhiichuk 6648eb52aa
Feat/expose event attributes (#195)
* feat: Add an API to read all event attributes

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* deps: update black version

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: update version to v1.6.2

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: update changelog

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: fix the release number link

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2022-10-19 09:21:28 -07:00
pre-commit-ci[bot] 60f848a204
[pre-commit.ci] pre-commit autoupdate (#192)
updates:
- [github.com/psf/black: 22.6.0 → 22.8.0](https://github.com/psf/black/compare/22.6.0...22.8.0)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-09-05 20:26:21 +03:00
Alexander Tkachev eba24db1b9
fix: to_json breaking change (#191)
* fix: missing to_json import #190

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: backwards compatibility import from http module #190

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* docs: update changelog

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* docs: update changelog

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* feat: bump version

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-08-25 08:58:51 +03:00
Yurii Serhiichuk 5e64e3fea1
release: v1.6.0 (#189)
* chore: bump version.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: Update changelog with the release

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: Use new `conversion` module over deprecated APIs.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* docs: Also sort imports in README

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: clean up README and reference latest Flask

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-08-18 07:24:20 -07:00
Yurii Serhiichuk 8a88ffee10
chore: cleanup codebase and fix flake errors (#188)
* deps: `flake8-strict` and `flake8-import-order` are no longer compatible with Black and modern Python

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Cleanup imports and remove obsolete `#noqa`.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: sort imports.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Define `__all__`

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Fix licenses and add __all__ to imports.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Fix formatting

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Export `from_http`

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* fix: Do not export functions of other modules from this one.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Resolve more flake8 errors

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* chore: Fix more warnings

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: add a note in the changelog about the fixes.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* fix: imports in tests.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* fix: more import fixes.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* fix: use proper implementations as replacements.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-08-14 15:47:38 -07:00
Alexander Tkachev f5bb285d96
feat: pydantic (#182)
* feat: pydantic

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

Squashed commit of the following:

commit f7cdffc2c124d1f2a4517588364b818795bc729d
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Aug 7 22:32:27 2022 +0300

    docs: canonical representation

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit f0bffb4118d2936fa2f7ff759d218f706168fd61
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 12 22:04:33 2022 +0300

    docs: remove duplicate deprecated module warnings

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit a9bc2cee634503d41ee257c039817fca0de164d8
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 12 22:02:54 2022 +0300

    docs: fix grammar

    Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>

commit 8b4f3db9e2c23c3d1ba68c0b3b1f0ea55e2972f5
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Fri Aug 12 15:43:02 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit 685e43d77d23e20f9f8272aefe29405d3249ef68
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 12 18:35:59 2022 +0300

    test: exclude import testing

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit f69bcd2759df7fc3ea16421947316191832fcfcb
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 12 18:33:48 2022 +0300

    docs: simplify specversion documentation

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 6199278600d60ab3f36dd45f93e8cc3ca03f88b5
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 12 18:33:14 2022 +0300

    docs: specversion

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 44de28b6d2ce9ae4c0cfff47967a86d9e2da36af
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 12 18:30:45 2022 +0300

    refactor: optimize imports

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 4a6be338cc29e86cde7c2ce224d5b0127e142af9
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 12 18:29:28 2022 +0300

    refactor: optimize imports

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 8615073ee4617895c41e097bdc4ecb868f8d0eb5
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 12 18:24:03 2022 +0300

    refactor: remove anyt

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit f03d23b39b2a8554321c9b71cc2a988a7c26d1f6
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 12 18:22:15 2022 +0300

    feat: import is_binary and is_structured from converts module

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit b920645df88676a74341ba32ec4dd914855b5aa2
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 12 18:21:49 2022 +0300

    style: formatting

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 0dbd63e713cb26fc951c205ad740f166d76df84d
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 12 18:18:50 2022 +0300

    docs: cleanup license

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 9fdef9480af3e3af277af6df4ea7ccff6a98a02a
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 12 18:41:52 2022 +0300

    build: fixate python version

    Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>

commit de47cc8412984cf22a75044ef63daa1c23cb4b18
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Fri Aug 12 15:23:31 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit 7be086530bd19748867a221313a221284b1679bb
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 12 18:23:24 2022 +0300

    docs: improve best effort serialization docs

    Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>

commit a55d60676e15ce83867be9f8c72f44d03d559773
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 12 18:22:49 2022 +0300

    docs: fix grammar

    Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>

commit 4d68ec402dbe3e4bac08fcdf821e07b49b321541
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 12 18:22:36 2022 +0300

    docs: remove unneeded spacing

    Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>

commit 9b3537e89f2bd3cabab21373266fc7c3f113afcf
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Fri Aug 12 15:17:32 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit 49635fe180b9ebdf49d77536869ee2d3601c8324
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 12 18:15:37 2022 +0300

    docs: incompatible arguments error

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 909b72e612cbabe0bbf104a36df8d98b475bff30
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 12 18:14:24 2022 +0300

    docs: pydantic not installed exception

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 141f9090f490757dec6453aa22f207329a616877
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Fri Aug 12 13:57:31 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit d487124a648bd9b6bdb50f81794f2fff63e01016
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 12 16:56:46 2022 +0300

    build: pin pydantic version on python 3.6

    Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>

commit a46feba840f99c5a86575d7df074798126b66ef3
Merge: 21368b5 47818a9
Author: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>
Date:   Thu Aug 11 12:28:57 2022 +0300

    Merge branch 'main' into feature/pydantic

commit 21368b5e123664810a03f19f06d3255be79b9e2e
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Wed Aug 10 20:26:52 2022 +0300

    feat: raise indicative error on non-installed pydantic feature

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 65745f351856b82fc9e0781307cb2d597bea7f26
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Wed Aug 10 20:26:36 2022 +0300

    feat: pydantic feature not installed exception

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit ab218e7568d9c9ed51e74edfc30f2f820d9eb4cf
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Mon Aug 8 22:10:56 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit a026d319daa39fad7621affb1deeef6b6d7793e1
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Aug 9 01:10:16 2022 +0300

    fix: test int correctly

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit c49afe41c071be8f6052b6198b419bb57609e26c
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Aug 9 01:08:57 2022 +0300

    test: incompatible types

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit fb74ae39a255adf0f23fe4d0920d902aedf8dd11
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Mon Aug 8 21:38:12 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit 9300c005a6647704601a48b92e591e371c2f3737
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Aug 9 00:37:05 2022 +0300

    test: backwards compatibility with calling

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 15ccc350b5d8154dd3bce1af9de2a2fa9a803996
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Aug 9 00:25:53 2022 +0300

    test: test is_structured backwards compatibility

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit bfe441866a4a9371516114214f19649d445756ef
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Aug 9 00:24:42 2022 +0300

    test: improve is binary test

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit aa9a69dd1690d3f02a9fb7932a23756874548702
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Aug 9 00:13:51 2022 +0300

    style: formatting

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit fb81f310124a7711a3145df0a69282441f7c1e7c
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Aug 9 00:13:00 2022 +0300

    fix: remove code duplication

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 650dd1634cd3df74d56cd35faac0528067245832
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Aug 9 00:11:56 2022 +0300

    docs: explain why the dependency is what it is

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit b2780791314e46a918848de2aae9e778927a5441
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Aug 9 00:10:15 2022 +0300

    build: explicitly specify pydantic version

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 29e13ca9a67f39eefaad6ed1ca82317927ad8123
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Aug 9 00:05:54 2022 +0300

    docs: update example

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 42a4f016e5377041ba60bf631f4c413793fcf188
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Aug 9 00:04:59 2022 +0300

    docs: init function

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit e01c2b707473cf7fe1c56124d97cbd95da3ef10e
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 23:58:10 2022 +0300

    docs: explain why we ignore the data

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 5ddadf4e5bd158a93bdd1a2037a66e629c530126
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 23:53:32 2022 +0300

    refactor: use custom exception type

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 8889abbcd233d4a244ccae4a3b56c42a1e31b24a
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 23:51:38 2022 +0300

    feat: incompatible arguments error

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit a4dda34d41338cd80b3b821c9c3f5c5f5bcd5d2f
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 23:46:41 2022 +0300

    refactor: use value error instead of assertion

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 61f68a5f5c3ff81b46c05204af67a6fcf5a1f873
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Mon Aug 8 20:43:10 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit 1630fc36dbf161d8a0767a332f88606cd66bc394
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 23:41:37 2022 +0300

    feat: add examples to field values

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit e019c42194b8f07f45e84d49f8e463ff3c6a6faa
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 23:38:37 2022 +0300

    fix: example data

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 9b48f6e7270eb253cce7b8d24561f608a717c911
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 23:04:48 2022 +0300

    docs: improve pydantic cloudevent base class

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 6605fa822540e0291da221fba128dc7db9c54e8b
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 23:04:22 2022 +0300

    style: formatting

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 39a3ba22c0bde0c5dba919ead1f3ba82f09df033
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 23:02:47 2022 +0300

    docs: dumps and loads functions

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 6d59e2902ed46cc1cdca8886e2f615d85a1b629b
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 22:46:17 2022 +0300

    fix: pydantic dumps bugs

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 614496f5875b35e0e103a9b4f3df7e6a4a53c7cb
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 22:39:15 2022 +0300

    Revert "refactor: make best effort serialize to json public"

    This reverts commit cdf7e2ebb5c92c9a7d362a5d6b2fb16aab0461a3.

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit cdf7e2ebb5c92c9a7d362a5d6b2fb16aab0461a3
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 22:35:31 2022 +0300

    refactor: make best effort serialize to json public

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 75aa8436c3e6bd1865b326c5168c4e2e8ba4be27
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 22:33:49 2022 +0300

    feat: add args and kwargs to best effort serialize to json

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit e74ae8149280cbe7d56f11d1458af8bec5a9e37e
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 22:32:14 2022 +0300

    test: pydantic json event regression bug

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 9f2e0c6e962b55f8a0683ee936b8a443ddb533c3
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 22:23:46 2022 +0300

    perf: use http event for ce_json_* functions

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 8af3ed1c48b278b14cdd127ba06c1f653bd3c4ba
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 22:20:01 2022 +0300

    refactor: _best_effort_serialize_to_json type information

    also includes docs

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 20a4e0a1fabbd6d59d371d7340d93d1c01f732b0
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 22:13:35 2022 +0300

    refactor: rename marshaller functions

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 9d7da629b64d84b0e99fffe306680ec023b1c39b
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 22:06:20 2022 +0300

    fix: bad type information

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit b3f5bbc573baea1127c1390b1291956f43fba183
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 22:05:03 2022 +0300

    docs: add module deprecation comments

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 6882ada4f2dec848c521eda3e41f72290b80748d
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 22:04:03 2022 +0300

    docs: add module deprecation comments

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 53e6dec5c1ab8161049ad185b5fedc82090c670f
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 22:03:32 2022 +0300

    docs: add module deprecation comments

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 169d024cfb2372003b93e7ac33c409aef5f06759
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 22:02:44 2022 +0300

    docs: add module deprecation comments

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 3d4b0c94d7182ac444cabf85b3ccda23c7afa813
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 22:01:42 2022 +0300

    refactor: use deprecation function

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 5c39cc0add47806e5bffb6550f2a762c484672ba
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 21:59:54 2022 +0300

    refactor: use deprecation functions

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 064e2e8cef0c0cb41c837bfb018c037a2f83185b
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 21:57:17 2022 +0300

    refactor: use deprecation functions

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 6ea1e54f8ea13b3a520e83991c9b129ef47b272e
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 21:54:44 2022 +0300

    refactor: deprecation functions

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 71a06b6179b8d7142f4bd5c7690c2119d4448cb5
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 21:46:18 2022 +0300

    docs: default time selection algorithm

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 3fcd085ff4ab6ec289f7c5f80ff369e03784c20e
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 21:46:04 2022 +0300

    docs: default id selection algorithm

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 3fdef87ef11d36945b527ad083409b895d249993
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 21:41:24 2022 +0300

    docs: license

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 363679837cc7153b5cfdcb9b4aefa16d21e2c9fa
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Aug 8 21:32:39 2022 +0300

    docs: improve documentation

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 53d1931387bb0b565cb1e76f5ddd5b25b0fdf002
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Aug 7 23:21:45 2022 +0300

    docs: conversion documentation

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 050ed7536b8797ae9f752715006bdc9d59d9b767
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Aug 7 23:19:37 2022 +0300

    docs: fix line length

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit bd70199a02551490f4533e773d7434af22daa711
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Aug 7 23:15:05 2022 +0300

    refactor: add best_effort suffix for clarification

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 14ed5616b25a0fcf4498a5b6347865327cf66762
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Aug 7 23:14:18 2022 +0300

    docs: encode_attribute value

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 6baf7d0726aed09b1394b8e4b36bbecafafa82d9
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Aug 7 23:09:10 2022 +0300

    refactor: move attributes to variable

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 3a77b1e446973d43e46db58e421323a11dde26f6
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Sun Aug 7 20:10:03 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit 8ab108ac7221fdf1561965d37f21264558cb53da
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Aug 7 23:09:54 2022 +0300

    docs: _json_or_string

    Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>

commit 4778c109543b7419fd443e436e32eb2d8ced4f1a
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Sun Aug 7 20:06:11 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit 4809c75578e6b1058a69368fc8066a9056161b7a
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Aug 7 23:06:03 2022 +0300

    docs: from_dict better description

    Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>

commit a538834fc5b49c34246c27637dd68afe1895a06b
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Sun Aug 7 20:04:20 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit f1d09a2dd2f1922b1226d31d6fefb6b9bdbc1d68
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Aug 7 23:04:11 2022 +0300

    docs: is_structured better description

    Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>

commit 4cf7559aec29d77d4aa4bb29dd7b705a4e01ad56
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Sun Aug 7 20:01:56 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit 1efab9149991adf2afa42bcd8a38d62c932827e0
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Aug 7 23:01:48 2022 +0300

    docs: is_binary

    Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>

commit 8e44b2462226e24fe28837758a808b68c73a91ec
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Sun Aug 7 19:32:36 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit f9956d4d2d9935ee4e1a5f0f96bbd87a25044120
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Aug 7 22:32:27 2022 +0300

    docs: canonical representation

    Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>

commit 42578aff4d07c2e4fc5030c57077b96c72eee3a7
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Aug 6 15:11:45 2022 +0300

    fix: circular dependency

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 6b90af97f077d1cfae9912754092b0b6354a3a5b
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Sat Aug 6 12:01:59 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit 789fb64fcff83593ba3c73104f2a08620b26962e
Merge: 4e60121 785bfe7
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Aug 6 15:02:07 2022 +0300

    Merge branch 'main' into feature/pydantic

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

    # Conflicts:
    #	cloudevents/abstract/event.py
    #	cloudevents/conversion.py
    #	cloudevents/http/event.py
    #	cloudevents/http/http_methods.py
    #	cloudevents/http/json_methods.py

commit 4e60121514f31fdc538ae45a9ca00c2651334e4d
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Fri Aug 5 14:18:33 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit 58c18f2237efc8765a12d7183a5889739cb7f9e7
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 5 17:14:39 2022 +0300

    refactor: convert get_data and get_attributes to private member

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit c1e9105dea7ce9ea1a715d8583c32bfdc55afe2f
Merge: d73311e 96c41a1
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 5 17:12:59 2022 +0300

    Merge branch 'feature/abstract-cloudevent' into feature/pydantic
    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 96c41a15ca
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 5 17:11:12 2022 +0300

    build: ignore not-implemented functions in coverage

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 4e00b55062
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Aug 5 17:09:17 2022 +0300

    refactor: convert get_data and get_attributes to private member functions

    instead of classmethods

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit d73311e44203d9d2aabbb378a131da2f7941deb7
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 02:30:55 2022 +0300

    test: remove unused variable

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 82aa0d41f727c61f0ec4b8cb72f08c34166653d8
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 02:30:24 2022 +0300

    style: formatting

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit f376bb51e6c70b0f2827775adaf5865d0b2ed789
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 02:29:42 2022 +0300

    style: formatting

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 5c6a511e2e234097b1b9ae782e7010c587d1f8a9
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 02:26:56 2022 +0300

    style: formatting

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit dbb8263e28ae2725773e7e6225a68f4aa8c30dcc
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 02:25:33 2022 +0300

    test: add backwards compatibility tests

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 7eb8c9991cad818d282380e44a9107dc732298ca
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 02:22:25 2022 +0300

    refactor: use direct imports

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 175084a01a851e5237413bdbed482087ee752515
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 02:21:51 2022 +0300

    test: http event dict serialization

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit dec8244fb9d22a1b18dccde0b229c3fec6760775
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 02:19:49 2022 +0300

    refactor: use direct imports

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit fdf4e8124eb1b35784c74f79e8e0ace6a613be9e
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 02:16:47 2022 +0300

    test: fix to_dict bug

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit adfbd40a92ccb7dd2f83472c79ef8216f548bb47
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 02:16:10 2022 +0300

    refactor: gut util module

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 9024c83a7897e655ad363bb8ce6a9679707c9faf
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 02:13:07 2022 +0300

    refactor: remove problematic mappings module

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit ee34c0e744d0d263efbd69750c72386db477d194
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 02:05:18 2022 +0300

    style: formatting

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 73d35da6229d6ab3243685c2775e34abbadf3098
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 02:03:06 2022 +0300

    fix: order confusion

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 8ef16850d291f72c8f4e4aa90364a0feef491304
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 02:01:45 2022 +0300

    fix: remove unneeded symbol

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 7238465ecd282ba63d3fa9a2b70f5a0118599771
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 02:00:34 2022 +0300

    fix: circular imports

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 618d2182aa9fba80a8dc9e88aff9612360014b76
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 01:59:38 2022 +0300

    fix: from_dict order confusion

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit f4c7f729db256d403b7943e2a7a2b62a69ffdc70
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 01:58:42 2022 +0300

    refactor: move is structured to sdk

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit e11913bfcdf2900c3045c109ee576b1a090bf5c9
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 01:57:25 2022 +0300

    refactor: move is_binary to sdk

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 067e046204c16878e31a4f213ae4402866fc2415
Merge: 48d7d68 0c2bafc
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 01:55:32 2022 +0300

    Merge branch 'feature/abstract-cloudevent' into feature/pydantic

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

    # Conflicts:
    #	cloudevents/http/http_methods.py
    #	cloudevents/http/json_methods.py

commit 0c2bafc423
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 01:53:52 2022 +0300

    refactor: remove optional type

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 48d7d68686f630ee0f1f31283a33900b4174878e
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 01:50:22 2022 +0300

    refactor: move all methods to conversion

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 81905e73050f0ba89ff5ba4aa6a47257aad7aadb
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 01:43:46 2022 +0300

    refactor: move json methods to conversion

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 474bf4368d0e540fee0bdfa632d01c81a16223d1
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 01:42:25 2022 +0300

    refactor: merge conversion logic under conversion

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit a8156274a8fc5ebe9af45a0b25bf9f78b10273e6
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 01:37:28 2022 +0300

    feat: init default cloudevent

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 523e1cb331f1131390581389ded2e6de762087e6
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 01:37:12 2022 +0300

    docs: dict conversion functions

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 88c168932b97e3a73d02238e81a2e87328f69469
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 01:35:20 2022 +0300

    refactor: move dict methods to conversion

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit b6e008a338b1e4fd5a1d805792a12131a88ce99a
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 01:30:38 2022 +0300

    fix: broken merge

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 2e9e255322064001e04c91fba6d96d89c2da1859
Merge: 316a9fc fbc0632
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 01:27:27 2022 +0300

    Merge branch 'feature/abstract-cloudevent' into feature/pydantic

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

    # Conflicts:
    #	cloudevents/abstract/json_methods.py
    #	cloudevents/conversion.py
    #	cloudevents/http/event.py
    #	cloudevents/http/http_methods.py
    #	cloudevents/http/json_methods.py
    #	cloudevents/http/util.py

commit fbc063244b
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 01:26:10 2022 +0300

    refactor: use classmethods

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit a8872b9808
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 01:06:06 2022 +0300

    test: remove broken tests

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 065ef91277
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 01:02:17 2022 +0300

    refactor: expose data and attributes in class

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit c0b54130c6
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 00:56:01 2022 +0300

    refactor: remove mutation variables from contract

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 1109bc5b76
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 00:55:34 2022 +0300

    docs: remove inconsistent types

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 6a9201647c
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 00:54:22 2022 +0300

    refactor: add default value for conversions

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 5d0882d8b9
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 00:50:04 2022 +0300

    test: rename badly named test

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 41c5f5984b
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 00:48:37 2022 +0300

    refactor: move all abstract conversion logic under conversion

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit f47087d490
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 25 00:32:16 2022 +0300

    Revert "refactor: rename abstract to generic"

    This reverts commit 89d30eb23d.

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit ea19f7dbd6
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 24 23:10:53 2022 +0300

    test: fix broken test

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit ba16cdd3ac
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 24 23:10:43 2022 +0300

    refactor: cloudevent is no longer abstract

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit d303eaecab
Merge: 89d30eb 61c8657
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 24 23:06:37 2022 +0300

    Merge branch 'main' into feature/abstract-cloudevent

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

    # Conflicts:
    #	CHANGELOG.md
    #	cloudevents/http/event.py
    #	cloudevents/tests/test_http_cloudevent.py

commit 89d30eb23d
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 24 23:04:58 2022 +0300

    refactor: rename abstract to generic

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit a22efbde37
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 24 23:00:36 2022 +0300

    test: add abstract cloudevent coverage tests

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 2b3c0f1292
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 24 22:04:25 2022 +0300

    docs: add missing comment to from_http

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 62595ffc3b
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 24 22:02:48 2022 +0300

    docs: explain why impl has no public attributes property

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit b9e8763594
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 24 21:59:53 2022 +0300

    docs: not implemented errors

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit ecf9418a1b
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 24 21:56:02 2022 +0300

    docs: explain read model

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 1187600b1b
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 24 21:51:32 2022 +0300

    docs: better cloudevent explanation

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit fb4f993536
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 24 21:50:22 2022 +0300

    docs: getitem documentation

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 3845aa7295
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 24 21:48:38 2022 +0300

    refactor: use anycloudevent for generics

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 316a9fca85a16f5771cf1cac7723d8711f3ada87
Merge: 8072e61 a96bd6c
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 23 01:22:39 2022 +0300

    Merge branch 'feature/abstract-cloudevent' into feature/pydantic
    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit a96bd6cdde
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 23 01:22:31 2022 +0300

    feat: define abstract methods

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 8072e6110cbca2206e72a267f007e1e28f564c3c
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 23 01:18:30 2022 +0300

    docs: wording

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit e6b5c9c66d7774f9b993164e96b98dba1eed07b6
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 23 01:17:51 2022 +0300

    refactor: explicit optional

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit e51926c4d2e05c620f964b4cb5047bd5dec19dd7
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 23 01:16:27 2022 +0300

    refactor: use anystr

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 115c7f5223c4d4346c23786df7b0303a3b30ab4e
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Fri Jul 22 22:14:15 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit 60c00065679ddbd285898ada54a63459c514caa2
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 23 01:14:02 2022 +0300

    test: remove pytest fixture parameterization

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 88f7ae58e7828c5b71b92e3cc3005a8a9ee2632e
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 23 01:08:46 2022 +0300

    feat: remove strict event

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 982436c65b72ec46112645ede6fc9cdbe56ea6e4
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 23 01:08:07 2022 +0300

    Revert "fix: strict event did not inherit descriptions"

    This reverts commit 63975cd67e5bdbc6889327914c1b78d3cd430aa7.

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

    # Conflicts:
    #	cloudevents/pydantic/event.py
    #	cloudevents/pydantic/strict_event.py

commit f569c541cf3f4d1850f5841504a90c087283766a
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Fri Jul 22 21:59:25 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit 43ecfeea816b2a98b6d2087e6c7d327817baed11
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 23 00:58:05 2022 +0300

    refactor: remove unneeded code

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 154f7674533fa32f1789ed157353cc5d4ee1bceb
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 23 00:43:43 2022 +0300

    refactor: integrate abstract event

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 896299b66df63791258a4dc5594c30843ec76dae
Merge: d034677 09062e3
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 23 00:40:46 2022 +0300

    Merge branch 'feature/abstract-cloudevent' into feature/pydantic
    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 09062e35ff
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 23 00:40:40 2022 +0300

    fix: integrate data read model

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit d034677da266080c49a91cb857d9b660cb508111
Merge: fb5165e 5648968
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 23 00:39:03 2022 +0300

    Merge branch 'feature/abstract-cloudevent' into feature/pydantic
    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 56489682c5
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 23 00:38:56 2022 +0300

    feat: simplify data attributes

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit fb5165eb6c980fa4091dae66871e719e0b2a5aec
Merge: af83fb0 01041e7
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 23 00:28:21 2022 +0300

    Merge branch 'feature/abstract-cloudevent' into feature/pydantic

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

    # Conflicts:
    #	CHANGELOG.md
    #	cloudevents/http/event.py
    #	cloudevents/tests/test_http_cloudevent.py

commit 01041e7cd5
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 23 00:23:39 2022 +0300

    docs: abstract cloudevent

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 6588577ffc
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 23 00:17:07 2022 +0300

    refactor: create abstract cloudevent package

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit c747f59a29
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Jul 22 23:31:06 2022 +0300

    refactor: integrate abstract event

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit f1ff00908e
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Jul 22 23:58:52 2022 +0300

    refactor: move to abstract

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 4488201812
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Jul 22 23:40:05 2022 +0300

    feat: any cloud event

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 2b6483046a
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Jul 22 23:38:49 2022 +0300

    feat: create function

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 5f8399fa09
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Jul 22 23:31:55 2022 +0300

    feat: add missing return type

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 41a9af2874
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Jul 22 23:30:57 2022 +0300

    feat: abstract event

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit af83fb084cdd882a607982ad6352446804f45252
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Jul 22 23:08:55 2022 +0300

    fix: use python 3 type hints

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 771d2ab147e1755feb5cc0c2ee36edabb076e5e1
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Jul 22 23:07:44 2022 +0300

    test: explicit value names

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 899e81b670719a45bfc3fa2ff673da4ce90a46a5
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Jul 22 23:04:53 2022 +0300

    fix: make specversion comparable to strings

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 476d8226cf1b1ca6c6bd9e12cb9b380084f259ae
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Jul 22 23:02:29 2022 +0300

    docs: make return value more precise

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 9896252a7b999d199c58d788fbc6e4bedb3aac53
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Jul 22 23:00:00 2022 +0300

    refactor: merge attributes into a single module

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 4be431f47fb3a06febe1bf73807a4ff754d722f7
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Jul 22 22:53:30 2022 +0300

    build: explicit pydantic version

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit e44e99687d03b717de0a9fe3abe43d4bdbf02c6f
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Jul 22 22:46:51 2022 +0300

    feat: remove content type from strict event

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit eeb608cbfdbb23740cc90c701d9d4d3c20b8d5e4
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Fri Jul 22 22:46:22 2022 +0300

    build: move pydantic tox deps to test.txt

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 63975cd67e5bdbc6889327914c1b78d3cd430aa7
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 02:40:09 2022 +0300

    fix: strict event did not inherit descriptions

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 53ab87b817ce995894ce5b41cb6b775491e87105
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Mon Jul 18 23:20:43 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit 50a7fb506eecaba04434519eac49cfd5927d0929
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 02:20:31 2022 +0300

    style: formatting

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit a7af0363228bab5309258ec720fda6bf21fe0ddf
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 02:19:39 2022 +0300

    test: strict cloudevent

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit bdfb997e7fa5a5e00ba442fc2d3251c8c05aebf5
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 02:14:47 2022 +0300

    test: pydantic json methods

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 1f580ecefbaf529a00da7a60820fab7e63de5da1
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 02:14:29 2022 +0300

    fix: use correct import

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 030e7c0daa74592dfe32689c85c2f9fa8171f6b9
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 02:11:09 2022 +0300

    test: pydantic events integration

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 92cb622bfe2f6230c9184fed05843cfda544bcc2
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 02:06:48 2022 +0300

    fix: encode attribute access

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 9d334563c2febdeda2776a7f02e8ed8278b1e96d
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 02:05:45 2022 +0300

    feat: make encode attribute value public

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 100c78905ecf96c9afc01702f524426f77d882ff
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 01:57:38 2022 +0300

    feat: strict event

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 703fe1a78f5bb024d2b0d9e6cdc099e42c493d00
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 01:57:34 2022 +0300

    feat: relax event requirements

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit f2c9bc4af56b243e62949a99bbe890f069833fcc
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 01:50:48 2022 +0300

    feat: add more proxy imports

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit e8163a9bc1e1a3cff3b03ff20cb41a868c8d283e
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 01:48:25 2022 +0300

    test: data not in dummy event

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit c3c9c7d1d3bfa56750da99f79a1c18d5d1efc105
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 01:46:55 2022 +0300

    test: fix broken dummy values

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit bac4f19e6289137da53618476005985c4276cefe
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Mon Jul 18 22:42:35 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit 5f7c7b0b9620fbc841856fb43bfff4de7ca9ac95
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 01:37:28 2022 +0300

    test: repr

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 593fa84428c5f0238cbce22461b85ea4eb62a664
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 01:35:36 2022 +0300

    test: event length

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 0259e46aa4df676c015cf666bae7e5577c8be803
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 01:35:21 2022 +0300

    fix: incorrect iteration

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit bafcec8c2923e3f02a1138578dd04cb35673a36a
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 01:30:56 2022 +0300

    Revert "refactor: better iter type signature"

    This reverts commit 8bb3e76bf15d925ee5b5ac80e045d320f0bfbaa3.

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 8bec7b3dd014e0849a128c3ef5865f9b11bc94d5
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 01:28:53 2022 +0300

    test: item access

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 8bb3e76bf15d925ee5b5ac80e045d320f0bfbaa3
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 01:23:14 2022 +0300

    refactor: better iter type signature

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 37fdeec23bf136e771dc30195564a4bc77860a2f
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 01:15:27 2022 +0300

    docs: cloudevent methods

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit e0ad1ae47261e7276f086fb06aa1730b055d72d4
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 01:09:37 2022 +0300

    docs: fix typo

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 0095236d29e46adef34e1a80a1deb9deeb471557
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 01:09:23 2022 +0300

    docs: fix typo

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 3eb1fe165527fdbc77b33b01ed8090f701022a51
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Mon Jul 18 22:04:19 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit 96d5b66b082b962b35895d48a073567d607d9ed2
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 01:03:13 2022 +0300

    test: add xfail on the json content type

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 643ed7692184dc0cebb04ba92350779ffd15c66c
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Mon Jul 18 21:19:50 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit efb5950b30129b78726bc601fae81c15225fdf97
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 00:18:52 2022 +0300

    test: json or string

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 3307e2df6b6b21f6a37c29baa9829246ea4d7d3c
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 00:13:05 2022 +0300

    refactor: better type information

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 6f336804dc33e844d54aed1385e3f2db516401da
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 00:10:16 2022 +0300

    fix: add optional to signature

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit cb29c54effbf1e0dde28b08d426c67c67c58e705
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 18 23:55:25 2022 +0300

    fix: add missing decode exception

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 412d1912c133b52851061646c9cf765b63c1c0e1
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Tue Jul 19 00:02:14 2022 +0300

    fix: return str not bytes

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 00cc4e3ed232354a518887eeb2e998a7b021acbf
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 18 23:56:58 2022 +0300

    fix: use correct iteration

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit c7693a1066a7bed4939d7f9fd23f80054d1f630e
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 18 23:47:26 2022 +0300

    fix: normalize datetime

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 0adbc5e08d752a8ec0a1c72e9d3f9b5e95f2092f
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 18 23:45:54 2022 +0300

    refactor: simplify ce json

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 61025385ec677d61790716a4040094c83104d382
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 18 23:44:08 2022 +0300

    refactor: simplify http adapter

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit f3f22f175821560b3fc5681120e61e1e1d0a30e4
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 18 23:42:58 2022 +0300

    feat: dict methods

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 08ab2ce7a61023069c6cbdc2f66d20c033e693c4
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 18 23:31:44 2022 +0300

    feat: add type information for init

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 914bbcc18c296fcdf924b11442c21d8208f579d4
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 18 23:29:56 2022 +0300

    fix: normalize enums

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit aeddc2e120a82a83dbb9adbad72614a9bc00b9b8
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 18 23:22:34 2022 +0300

    fix: remove *args

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 50f985d36f822295cb8c73e8a9eb0e5f5b93fe22
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 18 23:21:55 2022 +0300

    refactor: move json format methods to event module

    to prevent confusion

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 73c0ada30fc7b037aca1fafd54bf4f7908e9ccd2
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 18 23:12:11 2022 +0300

    feat: http methods

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 016a3d63a65f7e7f25121401bd2a875daf005fb6
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 18 23:11:59 2022 +0300

    docs: license

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 388b27837adc3cba781a3accdd546ef5350d404b
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 18 23:06:32 2022 +0300

    refactor: json methods to use http json methods

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 41a653937db75f6044e0e358c4228fea8561f6ee
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 18 23:05:48 2022 +0300

    style: formatting

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 03fcc8df2661c8d9969b701b7affbc13e5e175f3
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 18 22:57:49 2022 +0300

    feat: simplify json functions

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit cb88107c9c2bbd81e0ab5c372b5777faddf2eb4e
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 18 22:57:36 2022 +0300

    feat: from http event

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit cabcf2a02fb1d7debb635818a8bf74207078a94f
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Mon Jul 18 22:50:24 2022 +0300

    feat: http adapter

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 09fd02f727cd639ca6d5c7f3b0c579fe627ea5c5
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 17 22:01:57 2022 +0300

    test: fix tests to adjust to specversion changes

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit c3c6f63a15d549aa24449b96248d957afa7a9c81
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 17 21:59:05 2022 +0300

    fix: imports

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit d0253111eda0425df2779ad61777f5093c9c3437
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 17 21:56:26 2022 +0300

    feat: spec version enum

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit dcd3871f502fe69293407ad97eb2ec5946334819
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 17 21:50:57 2022 +0300

    refactor: split defaults module to attribute modules
    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit fc0d718bcac9ec155a8d290fbfae21a4bd04bb82
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 17 21:45:44 2022 +0300

    fix: every absolute uri is a uri reference

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 82e3439b8efb8a478d10f7425062a02f1bef7d07
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 17 21:44:42 2022 +0300

    docs: explain why cannot use pydantic

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit fbdf8fd5c48449bb6fead21ad1dfd7ec5f335a8a
Merge: eb32f0a 3bcf126
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 17 21:38:16 2022 +0300

    Merge remote-tracking branch 'origin/feature/pydantic' into feature/pydantic
    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit eb32f0a910e8baded4549af6e07cf21538938470
Merge: 81935fc 0a95e63
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 17 21:38:03 2022 +0300

    Merge remote-tracking branch 'upstream/main' into feature/pydantic
    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 3bcf126a46857a27d46aefba2d456d853a18cde8
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Sun Jul 17 18:36:12 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

commit 81935fcdf760222483f23728ce83be388974a623
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 17 21:30:14 2022 +0300

    test: remove unused import

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 5452151b330d463f4eaf6d91ffc77e6c9d031db7
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Sun Jul 17 18:16:39 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit a849f536283836d2b66aa951b9fefce18999415a
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 17 21:12:28 2022 +0300

    build: add missing pydantic dep

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit ce2526522b2e8f84e82e326ab744858179bf93eb
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sun Jul 17 21:09:10 2022 +0300

    style: formatting

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 9870c3c90a6f978d2137374aafb3b477ad9e2378
Author: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>
Date:   Fri Jul 15 11:22:29 2022 +0300

    ci: migrate to `main` branch (#180)

    * ci: migrate to `main` branch

    Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

    * docs: mentioned default branch change in the changelog

    Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit be213912bcb8f5d308a8748442f7990d479672db
Author: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>
Date:   Thu Jul 14 12:11:16 2022 +0300

    release: v1.4.0 (#179)

    Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 84b488ac8a50131dd82c618cee6869d7be231366
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Thu Jul 14 00:10:08 2022 +0300

    fix __eq__ operator raises attribute error on non-cloudevent values  (#172)

    * fix: non-cloudevents values must not equal to cloudevents values (#171)

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

    * test: refactor move fixtures to beginning

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

    * test: cloudevent equality bug regression (#171)

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

    * style: remove redundant else

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

    * test: remove redundant test

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

    * test: refactor non_cloudevent_value into a parameterization

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

    * docs: update changelog

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

    * [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

    * docs: fix bad merge

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

    * [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

    * [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

    Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 396c011a24964398e7d885bd13b441bb75b3a8e2
Author: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>
Date:   Mon Jul 11 20:05:45 2022 +0300

    chore: drop `docs` and related files (#168)

    * chore: drop `docs` and related files

    Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

    * docs: update changelog

    Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit faff6dca07eec7f4e7bfbf5b5308c440e8424f65
Author: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date:   Sat Jul 16 12:24:07 2022 +0000

    [pre-commit.ci] auto fixes from pre-commit.com hooks

    for more information, see https://pre-commit.ci

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 9d8b6df94fa4ccbf70d060d9531a3830a101a196
Author: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>
Date:   Fri Jul 15 11:22:29 2022 +0300

    ci: migrate to `main` branch (#180)

    * ci: migrate to `main` branch

    Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

    * docs: mentioned default branch change in the changelog

    Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit fa540c714781f641615282a57cca369d89f456d9
Author: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>
Date:   Thu Jul 14 12:11:16 2022 +0300

    release: v1.4.0 (#179)

    Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

commit 573098232524d9dbb627615cdd0cdd42834dbed0
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 16 15:16:38 2022 +0300

    style: sort imports

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 14fdbfcc760ea6a0c2e00c8760eecc4132942685
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 16 15:14:34 2022 +0300

    feat: add more examples

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 49bd752b1efac4ba25826beb1ff3e09642f40352
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 16 15:04:51 2022 +0300

    test: binary data deserialization

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit c5a8b8668029a68dbe3e6d27b2f876da2ee566c0
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 16 15:02:30 2022 +0300

    fix: raise correct exception type to prevent confusion

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 0e075ae22531c042d89874c56e8d5076f81d8894
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 16 14:57:42 2022 +0300

    test: binary data serialization

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit b325caeec49fcb1d2cd0e125881bec49e137e0a7
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 16 14:57:23 2022 +0300

    fix: forbid api mixing

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit f07169dff83dd9d830cf9f927d0c922a8c5aaefa
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 16 14:47:06 2022 +0300

    test: json content type serialization

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 525dee0ddeb2bf035e13383e29994e3ef785e761
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 16 14:39:16 2022 +0300

    fix: incorrect behaviour for mirroring

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

commit 29a48598877562a5f8ad392bea51ceb4c4815343
Author: Alexander Tkachev <sasha64sasha@gmail.com>
Date:   Sat Jul 16 14:33:37 2022 +0300

    test: pydantic cloudevent

    Signed-off-by: Alexander Tkachev <sasha64sasha@gmai…

* docs: include pydantic feature to changelog

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* docs: add deprecations to changelog

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-08-13 21:35:53 +03:00
Yurii Serhiichuk 47818a980d
release: v1.5.0 (#187)
* chore: bump version.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: Update the Changelog.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* deps: fix `sanic` vulnerability.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2022-08-07 09:32:57 -07:00
Alexander Tkachev 785bfe731b
refactor: create abstract cloudevent (#186)
* fix: non-cloudevents values must not equal to cloudevents values (#171)

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: refactor move fixtures to beginning

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: cloudevent equality bug regression (#171)

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* style: remove redundant else

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: remove redundant test

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: refactor non_cloudevent_value into a parameterization

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* docs: update changelog

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* docs: fix bad merge

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* feat: abstract event

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* feat: add missing return type

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* feat: create function

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* feat: any cloud event

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* refactor: move to abstract

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* refactor: integrate abstract event

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* refactor: create abstract cloudevent package

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* docs: abstract cloudevent

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* feat: simplify data attributes

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* fix: integrate data read model

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* feat: define abstract methods

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* refactor: use anycloudevent for generics

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* docs: getitem documentation

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* docs: better cloudevent explanation

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* docs: explain read model

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* docs: not implemented errors

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* docs: explain why impl has no public attributes property

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* docs: add missing comment to from_http

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: add abstract cloudevent coverage tests

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* refactor: rename abstract to generic

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* refactor: cloudevent is no longer abstract

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: fix broken test

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* Revert "refactor: rename abstract to generic"

This reverts commit 89d30eb23d.

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* refactor: move all abstract conversion logic under conversion

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: rename badly named test

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* refactor: add default value for conversions

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* docs: remove inconsistent types

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* refactor: remove mutation variables from contract

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* refactor: expose data and attributes in class

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: remove broken tests

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* refactor: use classmethods

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* refactor: remove optional type

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* refactor: convert get_data and get_attributes to private member functions

instead of classmethods

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* build: ignore not-implemented functions in coverage

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* docs: mentioned default branch change in the changelog

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Yurii Serhiichuk <savik.ne@gmail.com>
2022-08-06 14:52:22 +03:00
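The commit series above distills into one pattern: an abstract event type that declares two private accessors, with the shared read model (`__getitem__`, `.get`) implemented once on top of them. A minimal stdlib sketch of that shape (class and method names here are illustrative, not the SDK's actual API):

```python
from abc import ABC, abstractmethod
from typing import Any, Dict, Optional


class AbstractCloudEvent(ABC):
    """Sketch of an abstract event contract: concrete frameworks
    implement the two private accessors, and the read model
    (__getitem__, get) is defined once on top of them."""

    @abstractmethod
    def _get_attributes(self) -> Dict[str, Any]:
        raise NotImplementedError

    @abstractmethod
    def _get_data(self) -> Optional[Any]:
        raise NotImplementedError

    def __getitem__(self, key: str) -> Any:
        # Attribute access delegates to the implementation's storage.
        return self._get_attributes()[key]

    def get(self, key: str, default: Optional[Any] = None) -> Optional[Any]:
        return self._get_attributes().get(key, default)


class DictEvent(AbstractCloudEvent):
    """Trivial dict-backed implementation, for illustration only."""

    def __init__(self, attributes: Dict[str, Any], data: Any = None) -> None:
        self._attributes = attributes
        self._data = data

    def _get_attributes(self) -> Dict[str, Any]:
        return self._attributes

    def _get_data(self) -> Optional[Any]:
        return self._data


event = DictEvent({"type": "com.example.ping", "source": "/demo"})
print(event["type"])         # com.example.ping
print(event.get("subject"))  # None
```

Concrete wrappers (HTTP, Pydantic, Kafka) then only implement the two accessors and inherit the read model.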
Alexander Tkachev 61c8657025
fix: `_json_or_string` no longer fails on malformed unicode buffers (#184)
* fix: add missing decode exception

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* fix: add optional to signature

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* refactor: better type information

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: json or string

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* docs: update changelog

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* refactor: use anystr

Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>
Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* refactor: use anystr instead of custom type var

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* docs: _json_or_string

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>
2022-07-24 21:49:19 +03:00
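The fix referenced here guards a json-or-string helper against byte payloads that are neither valid JSON nor valid UTF-8. A self-contained sketch of the idea (the function name and exact exception set are assumptions, not the SDK's internals):

```python
import json
import typing


def json_or_string(
    content: typing.Optional[typing.AnyStr],
) -> typing.Optional[typing.Any]:
    """Return parsed JSON when possible, otherwise the raw value.

    Sketch of the #184 fix: a bytes payload that is not valid UTF-8
    must fall through to the raw value instead of letting
    UnicodeDecodeError escape from json.loads().
    """
    if content is None:
        return None
    try:
        return json.loads(content)
    except (json.JSONDecodeError, TypeError, UnicodeDecodeError):
        return content


print(json_or_string('{"a": 1}'))   # {'a': 1}
print(json_or_string(b"\x80abc"))   # b'\x80abc' (malformed unicode, returned as-is)
```

Before the fix, the malformed-bytes case would raise instead of degrading gracefully to the raw buffer.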
Yurii Serhiichuk 0a95e63776
ci: migrate to `main` branch (#180)
* ci: migrate to `main` branch

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: mentioned default branch change in the changelog

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2022-07-15 11:22:29 +03:00
Yurii Serhiichuk 86e6002d25
release: v1.4.0 (#179)
Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2022-07-14 12:11:16 +03:00
Alexander Tkachev ad111ae89a
fix __eq__ operator raises attribute error on non-cloudevent values (#172)
* fix: non-cloudevents values must not equal to cloudevents values (#171)

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: refactor move fixtures to beginning

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: cloudevent equality bug regression (#171)

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* style: remove redundant else

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: remove redundant test

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: refactor non_cloudevent_value into a parameterization

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* docs: update changelog

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* docs: fix bad merge

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2022-07-14 00:10:08 +03:00
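The underlying pattern of this fix: `__eq__` must check the operand's type before touching its attributes, instead of assuming it is another event. A stand-alone illustration (the class here is a stand-in, not the SDK's `CloudEvent`):

```python
from typing import Any


class CloudEventLike:
    """Minimal stand-in illustrating the #172 fix: __eq__ must not
    assume the other operand has .data or attribute storage."""

    def __init__(self, attributes: dict, data: Any = None) -> None:
        self._attributes = attributes
        self.data = data

    def __eq__(self, other: Any) -> bool:
        # Guard first; comparing against a non-event is simply False,
        # rather than an AttributeError on other.data.
        if isinstance(other, CloudEventLike):
            return self.data == other.data and self._attributes == other._attributes
        return False


e = CloudEventLike({"type": "t", "source": "s"})
print(e == CloudEventLike({"type": "t", "source": "s"}))  # True
print(e == "not an event")                                # False, no AttributeError
```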
Alexander Tkachev f39b964209
feat: add type information for all cloudevent member functions (#173)
* feat: add type information for all cloudevent member functions

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* docs: update changelog

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>
2022-07-12 22:44:43 +03:00
Yurii Serhiichuk 18951808b1
chore: unify copyright with other SDKs and update/add it where needed. (#170)
* chore: unify copyright with other SDKs and update/add it where needed.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: update changelog

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* style: Add missing empty line.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2022-07-11 12:56:07 -07:00
Yurii Serhiichuk 1cdd2542ba
ci: cleanup CI config and update setup (#169)
* ci: Run tests on multiple OS. Use latest action versions.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* ci: use fixed `pypi-publish` action version and update others.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: update changelog

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Upgrade python setup action to the latest v4

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2022-07-11 22:38:34 +03:00
Yurii Serhiichuk a61b84b1be
chore: drop `docs` and related files (#168)
* chore: drop `docs` and related files

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: update changelog

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2022-07-11 20:05:45 +03:00
Lucas Bickel 2896d04c79
fix: merge strings on same line into single string (#153)
* fix: merge strings on same line into single string

Signed-off-by: Lucas Bickel <hairmare@rabe.ch>

* chore: blacken example

Signed-off-by: Lucas <lucas.bickel@adfinis.com>
2022-07-10 16:04:23 +03:00
dependabot[bot] ae3099de60
chore: bump sanic from 20.12.3 to 20.12.6 in /requirements (#155)
Bumps [sanic](https://github.com/sanic-org/sanic) from 20.12.3 to 20.12.6.
- [Release notes](https://github.com/sanic-org/sanic/releases)
- [Changelog](https://github.com/sanic-org/sanic/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/sanic-org/sanic/compare/v20.12.3...v20.12.6)

---
updated-dependencies:
- dependency-name: sanic
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Grant Timmerman <744973+grant@users.noreply.github.com>
Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>
2022-07-10 13:46:52 +03:00
Yurii Serhiichuk 885d365dd2
Feat/dev env cleanup (#167)
* build: Update pre-commit config versions and setup.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* build: Migrate isort config to `pyproject`

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* style: Use recommended black-compatible flake8 options

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* build: Add standard pre-commit hooks.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: Add a note about this PR to the changelog.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: Cleanup docs, fix links. Add links to respective tooling.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* build: add dev-only dependencies.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* style: reformat using new style/format configs

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* build: add pre-commit to dev dependencies

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* style: run pre-commit hooks on all the files

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: Add dev status to the classifier.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: add missing links and dates for releases and PRs.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* docs: Add latest PR to the changelog

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* ci: Add new maintainers

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2022-07-10 12:53:37 +03:00
Alexander Tkachev 8483e8e310
feat: event attribute get operation support (#165)
* feat: get operation support

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* docs: event get operation

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: extract dummy attributes into a fixture

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: extract common dummy data into consts

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: event get operation

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* docs: return value

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: remove assertion

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: move dummy data into fixtures

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* style: black formatting

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* style: black formatting

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* docs: fix bad grammar

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* test: style fix line too long

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>

* style: fix line too long

Signed-off-by: Alexander Tkachev <sasha64sasha@gmail.com>
2022-07-10 12:44:52 +03:00
Yurii Serhiichuk aee384bf43
Release v1.3.0 (#166)
* Bump version

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Add v1.3.0 changelog.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

* Fix MD language highlight

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
2022-07-09 09:24:28 -07:00
Dustin Ingram 900e315d36
Improve redistribute-ability (#151)
* Move functions needed to build project into setup.py

Signed-off-by: Dustin Ingram <di@users.noreply.github.com>

* Only execute setup() in __main__

Signed-off-by: Dustin Ingram <di@users.noreply.github.com>

Co-authored-by: Yurii Serhiichuk <xSAVIKx@users.noreply.github.com>
2022-07-09 12:25:50 +03:00
Doug Davis 1a0d48eb0f
rename (#160)
Signed-off-by: Doug Davis <dug@microsoft.com>
2022-04-27 12:48:44 -04:00
Doug Davis 6c182e0b1c
add .clomonitor.yaml (#159)
Signed-off-by: Doug Davis <dug@microsoft.com>
2022-04-27 10:15:15 -04:00
Doug Davis d3b8892da7
Add some CLO stuff (#158)
Signed-off-by: Doug Davis <dug@microsoft.com>
2022-04-26 14:38:38 -04:00
Dustin Ingram 2e5b96be7e
Support Python 3.10 (#150)
* ci: test python3.10

Signed-off-by: Grant Timmerman <timmerman+devrel@google.com>

Signed-off-by: Dustin Ingram <di@users.noreply.github.com>

* Remove hard pins in requirements

Signed-off-by: Dustin Ingram <di@users.noreply.github.com>

* Add sanic_testing dependency

Signed-off-by: Dustin Ingram <di@users.noreply.github.com>

* Constrain sanic/sanic-testing for 3.6

Signed-off-by: Dustin Ingram <di@users.noreply.github.com>

Co-authored-by: Grant Timmerman <timmerman+devrel@google.com>
2022-04-08 19:22:12 -04:00
Grant Timmerman 6f27322146
ci: use valid sanic instance name (#157)
* ci: use valid sanic instance name

Signed-off-by: Grant Timmerman <744973+grant@users.noreply.github.com>

* ci: use simple sanic name

Signed-off-by: Grant Timmerman <744973+grant@users.noreply.github.com>
2022-04-08 19:19:50 -04:00
jiashuChen 43659228ae
fix: link to flask server sample file in README.md (#154)
Signed-off-by: Jiashu Chen <cjs20080808@hotmail.com>

Co-authored-by: Grant Timmerman <744973+grant@users.noreply.github.com>
2022-04-07 18:09:22 -07:00
Grant Timmerman da47910770
Add correct type annotations for tuple return types (#149)
* style: fix some tuple type style lint issues

Signed-off-by: Grant Timmerman <timmerman+devrel@google.com>

* ci: remove other files

Signed-off-by: Grant Timmerman <timmerman+devrel@google.com>
2022-04-07 17:22:49 -07:00
Graham Campbell 705e8b4100
Added support for Python 3.9 (#144)
Signed-off-by: Graham Campbell <hello@gjcampbell.co.uk>
2021-09-02 17:58:52 -05:00
Xin Yang a5fc827513
ignore datacontenttype when using to_binary() (#138)
* ignore datacontenttype when using to_binary()

Signed-off-by: XinYang <xinydev@gmail.com>

* fix tests

Signed-off-by: XinYang <xinydev@gmail.com>

* fix tests. sanic>20.12 does not support py3.6 any more

Signed-off-by: XinYang <xinydev@gmail.com>
2021-06-01 08:53:47 -05:00
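For context on this change: in the binary content mode, the `datacontenttype` attribute maps to the plain HTTP `Content-Type` header rather than to a `ce-datacontenttype` header. A small sketch of that mapping (helper name and exact serialization are illustrative, not the SDK's implementation):

```python
from typing import Any, Dict


def to_binary_headers(attributes: Dict[str, Any]) -> Dict[str, str]:
    """Sketch of the binary-mode header mapping fixed in #138:
    every attribute becomes a ce-* header, except datacontenttype,
    which maps to the standard Content-Type header instead."""
    headers: Dict[str, str] = {}
    for name, value in attributes.items():
        if name == "datacontenttype":
            headers["content-type"] = str(value)
        else:
            headers[f"ce-{name}"] = str(value)
    return headers


headers = to_binary_headers(
    {"id": "1", "source": "/demo", "type": "t", "datacontenttype": "application/json"}
)
print(headers["content-type"])          # application/json
print("ce-datacontenttype" in headers)  # False
```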
Grant Timmerman b83bfc58eb
docs: add cloudevents module requirement in samples (#129)
Signed-off-by: Grant Timmerman <timmerman+devrel@google.com>

Co-authored-by: Dustin Ingram <di@users.noreply.github.com>
2020-10-23 02:05:28 -05:00
Grant Timmerman c61c3c2ce6
docs: add quick section on installing (#127)
Co-authored-by: Dustin Ingram <di@users.noreply.github.com>
2020-10-23 02:03:45 -05:00
Dustin Ingram 8773319279
Fix formatting (#131)
* Fix formatting for latest black

Signed-off-by: Dustin Ingram <di@users.noreply.github.com>

* Add flake8 for linting

Signed-off-by: Dustin Ingram <di@users.noreply.github.com>

* Fix flake8 lint errors

Signed-off-by: Dustin Ingram <di@users.noreply.github.com>
2020-10-20 11:31:02 -05:00
140 changed files with 5177 additions and 16715 deletions

.clomonitor.yml Normal file

@ -0,0 +1,3 @@
exemptions:
  - check: recent_release
    reason: no new release needed

.coveragerc Normal file

@ -0,0 +1,7 @@
[report]
exclude_lines =
    # Have to re-enable the standard pragma
    pragma: no cover
    # Don't complain if tests don't hit defensive assertion code:
    raise NotImplementedError


@ -7,28 +7,33 @@ jobs:
lint:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4
- name: Setup Python
uses: actions/setup-python@v1
uses: actions/setup-python@v5
with:
python-version: 3.8
- name: Install tox
run: python -m pip install tox
python-version: '3.12'
cache: 'pip'
cache-dependency-path: 'requirements/*.txt'
- name: Install dev dependencies
run: python -m pip install -r requirements/dev.txt
- name: Run linting
run: python -m tox -e lint
run: python -m tox -e lint,mypy,mypy-samples-image,mypy-samples-json
test:
runs-on: ubuntu-latest
strategy:
matrix:
python: [3.6, 3.7, 3.8]
python: ['3.9', '3.10', '3.11', '3.12', '3.13']
os: [ubuntu-latest, windows-latest, macos-latest]
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4
- name: Setup Python
uses: actions/setup-python@v1
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python }}
- name: Install tox
run: python -m pip install tox
cache: 'pip'
cache-dependency-path: 'requirements/*.txt'
- name: Install dev dependencies
run: python -m pip install -r requirements/dev.txt
- name: Run tests
run: python -m tox -e py # Run tox using the version of Python in `PATH`


@ -1,27 +1,58 @@
name: PyPI-Release
on:
workflow_dispatch:
push:
branches:
- master
- main
- 'tag/v**'
jobs:
build-and-publish:
build_dist:
name: Build source distribution
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Set up Python
uses: actions/setup-python@v1
- uses: actions/checkout@v4
with:
python-version: "3.x"
fetch-depth: 0
- name: Build SDist and wheel
run: pipx run build
- uses: actions/upload-artifact@v4
with:
name: artifact
path: dist/*
- name: Check metadata
run: pipx run twine check dist/*
publish:
runs-on: ubuntu-latest
if: github.event_name == 'push'
needs: [ build_dist ]
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.12"
cache: 'pip'
- name: Install build dependencies
run: pip install -U setuptools wheel build
- name: Build
run: python -m build .
- name: Publish
uses: pypa/gh-action-pypi-publish@master
- uses: actions/download-artifact@v4
with:
# unpacks default artifact into dist/
# if `name: artifact` is omitted, the action will create extra parent dir
name: artifact
path: dist
- name: Publish
uses: pypa/gh-action-pypi-publish@release/v1
with:
user: __token__
password: ${{ secrets.pypi_password }}
attestations: false
- name: Install GitPython and cloudevents for pypi_packaging
run: pip install -U -r requirements/publish.txt
- name: Create Tag


@ -1,4 +0,0 @@
[settings]
line_length = 80
multi_line_output = 3
include_trailing_comma = True


@ -1,10 +1,27 @@
repos:
- repo: https://github.com/timothycrosley/isort/
rev: 5.0.4
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v5.0.0
hooks:
- id: isort
- repo: https://github.com/psf/black
rev: 19.10b0
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-toml
- repo: https://github.com/pycqa/isort
rev: 6.0.1
hooks:
- id: black
language_version: python3.8
- id: isort
args: [ "--profile", "black", "--filter-files" ]
- repo: https://github.com/psf/black
rev: 25.1.0
hooks:
- id: black
language_version: python3.11
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v1.16.0
hooks:
- id: mypy
files: ^(cloudevents/)
exclude: ^(cloudevents/tests/)
types: [ python ]
args: [ ]
additional_dependencies:
- "pydantic~=2.7"


@ -4,11 +4,144 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [1.2.0]
## [Unreleased]
## [1.12.0]
### Changed
- Dropped Python 3.8 support as it has reached EOL. ([])
## [1.11.1]
### Fixed
- Kafka `conversion` marshaller and unmarshaller typings ([#240])
- Improved public API type annotations and fixed unit test type errors ([#248])
## [1.11.0]
### Fixed
- Pydantic v2 `examples` keyword usage and improved typings handling ([#235])
- Kafka `to_binary` check for invalid `content-type` attribute ([#232])
### Changed
- Dropped Python 3.7 from CI as it has reached EOL. ([#236])
## [1.10.1]
### Fixed
- Fixed Pydantic v2 `to_json` (and `to_structured`) conversion ([#229])
## [1.10.0] — 2023-09-25
### Added
- Pydantic v2 support. ([#219])
- Pydantic v2 to v1 compatibility layer. ([#218])
- Governance docs per main CE discussions. ([#221])
## [1.9.0] — 2023-01-04
### Added
- Added typings to the codebase. ([#207])
- Added Python3.11 support. ([#209])
## [1.8.0] — 2022-12-08
### Changed
- Dropped support for Python 3.6, which reached EOL almost a year ago.
  [v1.7.1](https://pypi.org/project/cloudevents/1.7.1/) is the last
  version to support Python 3.6 ([#208])
## [1.7.1] — 2022-11-21
### Fixed
- Fixed Pydantic extras dependency constraint (backport of v1.6.3, [#204])
### Changed
- Refined build and publishing process. Added SDist to the released package ([#202])
## [1.7.0] — 2022-11-17
### Added
- Added [Kafka](https://github.com/cloudevents/spec/blob/v1.0.2/cloudevents/bindings/kafka-protocol-binding.md)
support ([#197], thanks [David Martines](https://github.com/davidwmartines))
## [1.6.3] — 2022-11-21
### Fixed
- Fixed Pydantic extras dependency constraint ([#204])
## [1.6.2] — 2022-10-18
### Added
- Added `get_attributes` API to the `CloudEvent` API. The method returns a read-only
view on the event attributes. ([#195])
## [1.6.1] — 2022-08-18
### Fixed
- Missing `to_json` import. ([#191])
## [1.6.0] — 2022-08-17
### Added
- A new `CloudEvent` optional `pydantic` model class is available in the
`cloudevents.pydantic.event` module. The new model enables the integration of
CloudEvents in your existing pydantic models or integration with pydantic
dependent systems such as FastAPI. ([#182])
### Changed
- Deprecated `cloudevents.http.event_type` module,
moved under `cloudevents.sdk.converters`.
- Deprecated `cloudevents.http.json_methods` module,
moved under `cloudevents.http.conversion`.
- Deprecated `cloudevents.http.http_methods` module,
moved under `cloudevents.http.conversion`.
- Deprecated `cloudevents.http.util` module.
### Fixed
- Multiple PEP issues, license headers, module-level exports. ([#188])
## [1.5.0] — 2022-08-06
### Added
- A new `CloudEvent` abstract class is available in the `cloudevents.abstract.event`
module. The new abstraction simplifies creation of custom framework-specific
implementations of `CloudEvents` wrappers ([#186])
### Fixed
- A malformed unicode buffer encoded in a `base_64` JSON field no longer fails
  CloudEvent class construction ([#184])
### Changed
- Default branch changed from `master` to `main` ([#180])
## [1.4.0] — 2022-07-14
### Added
- Added `.get` accessor for event properties ([#165])
- Added type information for all event member functions ([#173])
### Fixed
- Fixed event `__eq__` operator raising `AttributeError` on non-CloudEvent values ([#172])
### Changed
- Code quality and styling tooling is unified and configs compatibility is ensured ([#167])
- CI configurations updated and added macOS and Windows tests ([#169])
- Copyright is unified with the other SDKs and updated/added where needed. ([#170])
### Removed
- `docs` folder and related unused tooling ([#168])
## [1.3.0] — 2022-07-09
### Added
- Python 3.9 support ([#144])
- Python 3.10 support ([#150])
- Automatic CLO checks ([#158], [#159], [#160])
### Fixed
- `ce-datacontenttype` is no longer generated for the binary representation ([#138])
- Fixed typings issues ([#149])
- The package's redistributability, by inlining the required `pypi-packaging.py` functions ([#151])
## [1.2.0] — 2020-08-20
### Added
- Added GenericException, DataMarshallingError and DataUnmarshallingError ([#120])
## [1.1.0]
## [1.1.0] — 2020-08-18
### Changed
- Changed from_http to now expect headers argument before data ([#110])
- Renamed exception names ([#111])
@ -19,12 +152,12 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
### Deprecated
- Renamed to_binary_http and to_structured_http. ([#108])
## [1.0.1]
## [1.0.1] — 2020-08-14
### Added
- CloudEvent exceptions and event type checking in http module ([#96])
- CloudEvent equality override ([#98])
## [1.0.0]
## [1.0.0] — 2020-08-11
### Added
- Update types and handle data_base64 structured ([#34])
- Added a user friendly CloudEvent class with data validation ([#36])
@ -38,7 +171,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
### Removed
- Removed support for Cloudevents V0.2 and V0.1 ([#43])
## [0.3.0]
## [0.3.0] — 2020-07-11
### Added
- Added Cloudevents V0.3 and V1 implementations ([#22])
- Add helpful text to README ([#23])
@ -79,7 +212,25 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
### Added
- Initial release
[0.3.0]: https://github.com/cloudevents/sdk-python/compare/0.2.4...HEAD
[1.11.0]: https://github.com/cloudevents/sdk-python/compare/1.10.1...1.11.0
[1.10.1]: https://github.com/cloudevents/sdk-python/compare/1.10.0...1.10.1
[1.10.0]: https://github.com/cloudevents/sdk-python/compare/1.9.0...1.10.0
[1.9.0]: https://github.com/cloudevents/sdk-python/compare/1.8.0...1.9.0
[1.8.0]: https://github.com/cloudevents/sdk-python/compare/1.7.0...1.8.0
[1.7.1]: https://github.com/cloudevents/sdk-python/compare/1.7.0...1.7.1
[1.7.0]: https://github.com/cloudevents/sdk-python/compare/1.6.0...1.7.0
[1.6.3]: https://github.com/cloudevents/sdk-python/compare/1.6.2...1.6.3
[1.6.2]: https://github.com/cloudevents/sdk-python/compare/1.6.1...1.6.2
[1.6.1]: https://github.com/cloudevents/sdk-python/compare/1.6.0...1.6.1
[1.6.0]: https://github.com/cloudevents/sdk-python/compare/1.5.0...1.6.0
[1.5.0]: https://github.com/cloudevents/sdk-python/compare/1.4.0...1.5.0
[1.4.0]: https://github.com/cloudevents/sdk-python/compare/1.3.0...1.4.0
[1.3.0]: https://github.com/cloudevents/sdk-python/compare/1.2.0...1.3.0
[1.2.0]: https://github.com/cloudevents/sdk-python/compare/1.1.0...1.2.0
[1.1.0]: https://github.com/cloudevents/sdk-python/compare/1.0.1...1.1.0
[1.0.1]: https://github.com/cloudevents/sdk-python/compare/1.0.0...1.0.1
[1.0.0]: https://github.com/cloudevents/sdk-python/compare/0.3.0...1.0.0
[0.3.0]: https://github.com/cloudevents/sdk-python/compare/0.2.4...0.3.0
[0.2.4]: https://github.com/cloudevents/sdk-python/compare/0.2.3...0.2.4
[0.2.3]: https://github.com/cloudevents/sdk-python/compare/0.2.2...0.2.3
[0.2.2]: https://github.com/cloudevents/sdk-python/compare/0.2.1...0.2.2
@ -113,4 +264,40 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
[#110]: https://github.com/cloudevents/sdk-python/pull/110
[#111]: https://github.com/cloudevents/sdk-python/pull/111
[#119]: https://github.com/cloudevents/sdk-python/pull/119
[#120]: https://github.com/cloudevents/sdk-python/pull/120
[#120]: https://github.com/cloudevents/sdk-python/pull/120
[#144]: https://github.com/cloudevents/sdk-python/pull/144
[#149]: https://github.com/cloudevents/sdk-python/pull/149
[#150]: https://github.com/cloudevents/sdk-python/pull/150
[#151]: https://github.com/cloudevents/sdk-python/pull/151
[#158]: https://github.com/cloudevents/sdk-python/pull/158
[#159]: https://github.com/cloudevents/sdk-python/pull/159
[#160]: https://github.com/cloudevents/sdk-python/pull/160
[#165]: https://github.com/cloudevents/sdk-python/pull/165
[#167]: https://github.com/cloudevents/sdk-python/pull/167
[#168]: https://github.com/cloudevents/sdk-python/pull/168
[#169]: https://github.com/cloudevents/sdk-python/pull/169
[#170]: https://github.com/cloudevents/sdk-python/pull/170
[#172]: https://github.com/cloudevents/sdk-python/pull/172
[#173]: https://github.com/cloudevents/sdk-python/pull/173
[#180]: https://github.com/cloudevents/sdk-python/pull/180
[#182]: https://github.com/cloudevents/sdk-python/pull/182
[#184]: https://github.com/cloudevents/sdk-python/pull/184
[#186]: https://github.com/cloudevents/sdk-python/pull/186
[#188]: https://github.com/cloudevents/sdk-python/pull/188
[#191]: https://github.com/cloudevents/sdk-python/pull/191
[#195]: https://github.com/cloudevents/sdk-python/pull/195
[#197]: https://github.com/cloudevents/sdk-python/pull/197
[#202]: https://github.com/cloudevents/sdk-python/pull/202
[#204]: https://github.com/cloudevents/sdk-python/pull/204
[#207]: https://github.com/cloudevents/sdk-python/pull/207
[#208]: https://github.com/cloudevents/sdk-python/pull/208
[#209]: https://github.com/cloudevents/sdk-python/pull/209
[#218]: https://github.com/cloudevents/sdk-python/pull/218
[#219]: https://github.com/cloudevents/sdk-python/pull/219
[#221]: https://github.com/cloudevents/sdk-python/pull/221
[#229]: https://github.com/cloudevents/sdk-python/pull/229
[#232]: https://github.com/cloudevents/sdk-python/pull/232
[#235]: https://github.com/cloudevents/sdk-python/pull/235
[#236]: https://github.com/cloudevents/sdk-python/pull/236
[#240]: https://github.com/cloudevents/sdk-python/pull/240
[#248]: https://github.com/cloudevents/sdk-python/pull/248

CONTRIBUTING.md Normal file

@ -0,0 +1,23 @@
# Contributing to CloudEvents sdk-python
:+1::tada: First off, thanks for taking the time to contribute! :tada::+1:
We welcome contributions from the community! Please take some time to become
acquainted with the process before submitting a pull request. There are just
a few things to keep in mind.
## Pull Requests
Typically a pull request should relate to an existing issue. If you have
found a bug, want to add an improvement, or suggest an API change, please
create an issue before proceeding with a pull request. For very minor changes
such as typos in the documentation this isn't really necessary.
### Sign your work
Each PR must be signed. Be sure your `git` `user.name` and `user.email` are configured,
then use the `--signoff` flag for your commits.
```console
git commit --signoff
```

MAINTAINERS.md Normal file

@ -0,0 +1,9 @@
# Maintainers
Current active maintainers of this SDK:
- [Grant Timmerman](https://github.com/grant)
- [Denys Makogon](https://github.com/denismakogon)
- [Curtis Mason](https://github.com/cumason123)
- [Claudio Canales](https://github.com/Klaudioz)
- [Yurii Serhiichuk](https://github.com/xSAVIKx)

MANIFEST.in Normal file

@ -0,0 +1,4 @@
include README.md
include CHANGELOG.md
include LICENSE
include cloudevents/py.typed


@ -1,19 +0,0 @@
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
SOURCEDIR = etc/docs_conf
BUILDDIR = docs
# Put it first so that "make" without argument is like "make help".
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
.PHONY: help Makefile
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)


@ -14,7 +14,16 @@ This SDK current supports the following versions of CloudEvents:
## Python SDK
Package **cloudevents** provides primitives to work with CloudEvents specification: https://github.com/cloudevents/spec.
Package **cloudevents** provides primitives to work with CloudEvents specification:
https://github.com/cloudevents/spec.
### Installing
The CloudEvents SDK can be installed with pip:
```
pip install cloudevents
```
## Sending CloudEvents
@ -24,7 +33,8 @@ Below we will provide samples on how to send cloudevents using the popular
### Binary HTTP CloudEvent
```python
from cloudevents.http import CloudEvent, to_binary
from cloudevents.http import CloudEvent
from cloudevents.conversion import to_binary
import requests
# Create a CloudEvent
@ -46,7 +56,8 @@ requests.post("<some-url>", data=body, headers=headers)
### Structured HTTP CloudEvent
```python
from cloudevents.http import CloudEvent, to_structured
from cloudevents.conversion import to_structured
from cloudevents.http import CloudEvent
import requests
# Create a CloudEvent
@ -65,12 +76,13 @@ headers, body = to_structured(event)
requests.post("<some-url>", data=body, headers=headers)
```
You can find a complete example of turning a CloudEvent into a HTTP request [in the samples directory](samples/http-json-cloudevents/client.py).
You can find a complete example of turning a CloudEvent into an HTTP request
[in the samples directory](samples/http-json-cloudevents/client.py).
## Receiving CloudEvents
The code below shows how to consume a cloudevent using the popular python web framework
[flask](https://flask.palletsprojects.com/en/1.1.x/quickstart/):
[flask](https://flask.palletsprojects.com/en/2.2.x/quickstart/):
```python
from flask import Flask, request
@ -99,15 +111,18 @@ if __name__ == "__main__":
app.run(port=3000)
```
You can find a complete example of turning a CloudEvent into a HTTP request [in the samples directory](samples/http-json-cloudevents/server.py).
You can find a complete example of consuming a CloudEvent from an HTTP request
[in the samples directory](samples/http-json-cloudevents/json_sample_server.py).
## SDK versioning
The goal of this package is to provide support for all released versions of CloudEvents, ideally while maintaining
the same API. It will use semantic versioning with following rules:
The goal of this package is to provide support for all released versions of CloudEvents,
ideally while maintaining the same API. It will use semantic versioning
with the following rules:
- MAJOR version increments when a backwards incompatible change is introduced.
- MINOR version increments when backwards compatible feature is introduced INCLUDING support for new CloudEvents version.
- MINOR version increments when a backwards compatible feature is introduced
INCLUDING support for a new CloudEvents version.
- PATCH version increments when a backwards compatible bug fix is introduced.
## Community
@@ -125,25 +140,40 @@ the same API. It will use semantic versioning with following rules:
Each SDK may have its own unique processes, tooling and guidelines, common
governance related material can be found in the
[CloudEvents `community`](https://github.com/cloudevents/spec/tree/master/community)
[CloudEvents `docs`](https://github.com/cloudevents/spec/tree/main/docs)
directory. In particular, there you will find information concerning
how SDK projects are
[managed](https://github.com/cloudevents/spec/blob/master/community/SDK-GOVERNANCE.md),
[guidelines](https://github.com/cloudevents/spec/blob/master/community/SDK-maintainer-guidelines.md)
[managed](https://github.com/cloudevents/spec/blob/main/docs/GOVERNANCE.md),
[guidelines](https://github.com/cloudevents/spec/blob/main/docs/SDK-maintainer-guidelines.md)
for how PR reviews and approvals are handled, and our
[Code of Conduct](https://github.com/cloudevents/spec/blob/master/community/GOVERNANCE.md#additional-information)
[Code of Conduct](https://github.com/cloudevents/spec/blob/main/docs/GOVERNANCE.md#additional-information)
information.
If there is a security concern with one of the CloudEvents specifications, or
with one of the project's SDKs, please send an email to
[cncf-cloudevents-security@lists.cncf.io](mailto:cncf-cloudevents-security@lists.cncf.io).
## Additional SDK Resources
- [List of current active maintainers](MAINTAINERS.md)
- [How to contribute to the project](CONTRIBUTING.md)
- [SDK's License](LICENSE)
- [SDK's Release process](RELEASING.md)
## Maintenance
We use black and isort for autoformatting. We setup a tox environment to reformat
the codebase.
We use [black][black] and [isort][isort] for autoformatting. We set up a [tox][tox]
environment to reformat the codebase.
e.g.
```python
```bash
pip install tox
tox -e reformat
```
For information on releasing version bumps see [RELEASING.md](RELEASING.md)
[black]: https://black.readthedocs.io/
[isort]: https://pycqa.github.io/isort/
[tox]: https://tox.wiki/

View File

@@ -7,11 +7,11 @@ To release a new CloudEvents SDK, contributors should bump `__version__` in
[cloudevents](cloudevents/__init__.py) to reflect the new release version. On merge, the action
will automatically build and release to PyPI using
[this PyPI GitHub Action](https://github.com/pypa/gh-action-pypi-publish). This
action gets called on all pushes to master (such as a version branch being merged
into master), but only releases a new version when the version number has changed. Note,
this action assumes pushes to master are version updates. Consequently,
action gets called on all pushes to main (such as a version branch being merged
into main), but only releases a new version when the version number has changed. Note,
this action assumes pushes to main are version updates. Consequently,
[pypi-release.yml](.github/workflows/pypi-release.yml) will fail if you attempt to
push to master without updating `__version__` in
push to main without updating `__version__` in
[cloudevents](cloudevents/__init__.py) so don't forget to do so.
After a version update is merged, the script [pypi_packaging.py](pypi_packaging.py)

View File

@@ -1 +1,15 @@
__version__ = "1.2.0"
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
__version__ = "1.12.0"

View File

@@ -0,0 +1,17 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from cloudevents.abstract.event import AnyCloudEvent, CloudEvent
__all__ = ["AnyCloudEvent", "CloudEvent"]

View File

@@ -0,0 +1,145 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import typing
from abc import abstractmethod
from types import MappingProxyType
from typing import Mapping
AnyCloudEvent = typing.TypeVar("AnyCloudEvent", bound="CloudEvent")
class CloudEvent:
"""
The CloudEvent Python wrapper contract exposing generically-available
properties and APIs.
Implementations might handle fields and have other APIs exposed but are
obliged to follow this contract.
"""
@classmethod
def create(
cls: typing.Type[AnyCloudEvent],
attributes: typing.Mapping[str, typing.Any],
data: typing.Optional[typing.Any],
) -> AnyCloudEvent:
"""
Creates a new instance of the CloudEvent using supplied `attributes`
and `data`.
This method should preferably be used over the constructor to create events,
since custom framework-specific implementations may require or assume
different constructor arguments.
:param attributes: The attributes of the CloudEvent instance.
:param data: The payload of the CloudEvent instance.
:returns: A new instance of the CloudEvent created from the passed arguments.
"""
raise NotImplementedError()
def get_attributes(self) -> Mapping[str, typing.Any]:
"""
Returns a read-only view on the attributes of the event.
:returns: Read-only view on the attributes of the event.
"""
return MappingProxyType(self._get_attributes())
@abstractmethod
def _get_attributes(self) -> typing.Dict[str, typing.Any]:
"""
Returns the attributes of the event.
The implementation MUST assume that the returned value MAY be mutated.
Having a function over a property simplifies integration for custom
framework-specific implementations.
:returns: Attributes of the event.
"""
raise NotImplementedError()
@abstractmethod
def get_data(self) -> typing.Optional[typing.Any]:
"""
Returns the data of the event.
The implementation MUST assume that the returned value MAY be mutated.
Having a function over a property simplifies integration for custom
framework-specific implementations.
:returns: Data of the event.
"""
raise NotImplementedError()
def __eq__(self, other: typing.Any) -> bool:
if isinstance(other, CloudEvent):
same_data = self.get_data() == other.get_data()
same_attributes = self._get_attributes() == other._get_attributes()
return same_data and same_attributes
return False
def __getitem__(self, key: str) -> typing.Any:
"""
Returns a value of an attribute of the event denoted by the given `key`.
The `data` of the event should be accessed by the `.data` accessor rather
than this mapping.
:param key: The name of the event attribute to retrieve the value for.
:returns: The event attribute value.
"""
return self._get_attributes()[key]
def get(
self, key: str, default: typing.Optional[typing.Any] = None
) -> typing.Optional[typing.Any]:
"""
Retrieves an event attribute value for the given `key`.
Returns the `default` value if the attribute for the given key does not exist.
The implementation MUST NOT throw an error when the key does not exist, but
rather should return `None` or the configured `default`.
:param key: The name of the event attribute to retrieve the value for.
:param default: The default value to be returned when
no attribute with the given key exists.
:returns: The event attribute value if exists, default value or None otherwise.
"""
return self._get_attributes().get(key, default)
def __iter__(self) -> typing.Iterator[typing.Any]:
"""
Returns an iterator over the event attributes.
"""
return iter(self._get_attributes())
def __len__(self) -> int:
"""
Returns the number of the event attributes.
"""
return len(self._get_attributes())
def __contains__(self, key: str) -> bool:
"""
Determines if an attribute with a given `key` is present
in the event attributes.
"""
return key in self._get_attributes()
def __repr__(self) -> str:
return str({"attributes": self._get_attributes(), "data": self.get_data()})
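A concrete implementation of this contract only has to supply `create`, `_get_attributes`, and `get_data`; indexing, `get`, and membership tests then come from the base class. A standalone sketch of that shape (the `DictBackedEvent` name is hypothetical, and the lookup helpers are inlined so the example runs on its own):

```python
from types import MappingProxyType


class DictBackedEvent:
    """Minimal concrete event honoring the abstract contract.

    In the SDK only create(), _get_attributes() and get_data() would be
    written here; the helpers below are inherited from the base class.
    """

    @classmethod
    def create(cls, attributes, data):
        return cls(attributes, data)

    def __init__(self, attributes, data=None):
        self._attributes = dict(attributes)
        self._data = data

    def _get_attributes(self):
        return self._attributes

    def get_data(self):
        return self._data

    # Behavior the abstract base class provides for free in the SDK:
    def get_attributes(self):
        return MappingProxyType(self._attributes)  # read-only view

    def __getitem__(self, key):
        return self._attributes[key]

    def get(self, key, default=None):
        return self._attributes.get(key, default)

    def __contains__(self, key):
        return key in self._attributes


event = DictBackedEvent.create(
    {"type": "com.example.demo", "source": "/demo"}, {"n": 1}
)
print(event["type"])         # com.example.demo
print(event.get("missing"))  # None
print(event.get_data())      # {'n': 1}
```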

cloudevents/conversion.py Normal file
View File

@@ -0,0 +1,309 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import datetime
import enum
import json
import typing
from cloudevents import exceptions as cloud_exceptions
from cloudevents.abstract import AnyCloudEvent
from cloudevents.sdk import converters, marshaller, types
from cloudevents.sdk.converters import is_binary
from cloudevents.sdk.event import v1, v03
def _best_effort_serialize_to_json( # type: ignore[no-untyped-def]
value: typing.Any, *args, **kwargs
) -> typing.Optional[typing.Union[bytes, str, typing.Any]]:
"""
Serializes the given value into a JSON-encoded string.
Given a None value returns None as is.
Given a non-JSON-serializable value returns the value as is.
:param value: The value to be serialized into a JSON string.
:returns: JSON string of the given value OR None OR given value.
"""
if value is None:
return None
try:
return json.dumps(value, *args, **kwargs)
except TypeError:
return value
_default_marshaller_by_format: typing.Dict[str, types.MarshallerType] = {
converters.TypeStructured: lambda x: x,
converters.TypeBinary: _best_effort_serialize_to_json,
}
_obj_by_version = {"1.0": v1.Event, "0.3": v03.Event}
def to_json(
event: AnyCloudEvent,
data_marshaller: typing.Optional[types.MarshallerType] = None,
) -> bytes:
"""
Converts given `event` to a JSON string.
:param event: A CloudEvent to be converted into a JSON string.
:param data_marshaller: Callable function which will cast `event.data`
into a JSON string.
:returns: A JSON string representing the given event.
"""
return to_structured(event, data_marshaller=data_marshaller)[1]
def from_json(
event_type: typing.Type[AnyCloudEvent],
data: typing.Union[str, bytes],
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> AnyCloudEvent:
"""
Parses JSON string `data` into a CloudEvent.
:param data: JSON string representation of a CloudEvent.
:param data_unmarshaller: Callable function that casts `data` to a
Python object.
:param event_type: A concrete type of the event into which the data is
deserialized.
:returns: A CloudEvent parsed from the given JSON representation.
"""
return from_http(
headers={},
data=data,
data_unmarshaller=data_unmarshaller,
event_type=event_type,
)
def from_http(
event_type: typing.Type[AnyCloudEvent],
headers: typing.Union[
typing.Mapping[str, str], types.SupportsDuplicateItems[str, str]
],
data: typing.Optional[typing.Union[str, bytes]],
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> AnyCloudEvent:
"""
Parses CloudEvent `data` and `headers` into an instance of a given `event_type`.
The method supports both binary and structured representations.
:param headers: The HTTP request headers.
:param data: The HTTP request body. If set to None, "" or b'', the returned
event's `data` field will be set to None.
:param data_unmarshaller: Callable function to map data to a python object
e.g. lambda x: x or lambda x: json.loads(x)
:param event_type: The actual type of CloudEvent to deserialize the event to.
:returns: A CloudEvent instance parsed from the passed HTTP parameters of
the specified type.
"""
if data is None or data == b"":
# Empty string will cause data to be marshalled into None
data = ""
if not isinstance(data, (str, bytes, bytearray)):
raise cloud_exceptions.InvalidStructuredJSON(
"Expected json of type (str, bytes, bytearray), "
f"but instead found type {type(data)}"
)
headers = {key.lower(): value for key, value in headers.items()}
if data_unmarshaller is None:
data_unmarshaller = _json_or_string
marshall = marshaller.NewDefaultHTTPMarshaller()
if is_binary(headers):
specversion = headers.get("ce-specversion", None)
else:
try:
raw_ce = json.loads(data)
except json.decoder.JSONDecodeError:
raise cloud_exceptions.MissingRequiredFields(
"Failed to read specversion from both headers and data. "
"The following can not be parsed as json: {!r}".format(data)
)
if hasattr(raw_ce, "get"):
specversion = raw_ce.get("specversion", None)
else:
raise cloud_exceptions.MissingRequiredFields(
"Failed to read specversion from both headers and data. "
"The following deserialized data has no 'get' method: {}".format(raw_ce)
)
if specversion is None:
raise cloud_exceptions.MissingRequiredFields(
"Failed to find specversion in HTTP request"
)
event_handler = _obj_by_version.get(specversion, None)
if event_handler is None:
raise cloud_exceptions.InvalidRequiredFields(
"Found invalid specversion {}".format(specversion)
)
event = marshall.FromRequest(
event_handler(), headers, data, data_unmarshaller=data_unmarshaller
)
attrs = event.Properties()
attrs.pop("data", None)
attrs.pop("extensions", None)
attrs.update(**event.extensions)
result_data: typing.Optional[typing.Any] = event.data
if event.data == "" or event.data == b"":
# TODO: Check binary unmarshallers to debug why setting data to ""
# returns an event with data set to None, but structured will return ""
result_data = None
return event_type.create(attrs, result_data)
def _to_http(
event: AnyCloudEvent,
format: str = converters.TypeStructured,
data_marshaller: typing.Optional[types.MarshallerType] = None,
) -> typing.Tuple[typing.Dict[str, str], bytes]:
"""
Returns a tuple of HTTP headers/body dicts representing this Cloud Event.
:param format: The encoding format of the event.
:param data_marshaller: Callable function that casts event.data into
either a string or bytes.
:returns: (http_headers: dict, http_body: bytes or str)
"""
if data_marshaller is None:
data_marshaller = _default_marshaller_by_format[format]
if event["specversion"] not in _obj_by_version:
raise cloud_exceptions.InvalidRequiredFields(
f"Unsupported specversion: {event['specversion']}"
)
event_handler = _obj_by_version[event["specversion"]]()
for attribute_name in event:
event_handler.Set(attribute_name, event[attribute_name])
event_handler.data = event.get_data()
return marshaller.NewDefaultHTTPMarshaller().ToRequest(
event_handler, format, data_marshaller=data_marshaller
)
def to_structured(
event: AnyCloudEvent,
data_marshaller: typing.Optional[types.MarshallerType] = None,
) -> typing.Tuple[typing.Dict[str, str], bytes]:
"""
Returns a tuple of HTTP headers/body dicts representing this Cloud Event.
If event.data is a byte object, body will have a `data_base64` field instead of
`data`.
:param event: The event to be converted.
:param data_marshaller: Callable function to cast event.data into
either a string or bytes
:returns: (http_headers: dict, http_body: bytes or str)
"""
return _to_http(event=event, data_marshaller=data_marshaller)
def to_binary(
event: AnyCloudEvent, data_marshaller: typing.Optional[types.MarshallerType] = None
) -> typing.Tuple[typing.Dict[str, str], bytes]:
"""
Returns a tuple of HTTP headers/body dicts representing this Cloud Event.
Uses Binary conversion format.
:param event: The event to be converted.
:param data_marshaller: Callable function to cast event.data into
either a string or bytes.
:returns: (http_headers: dict, http_body: bytes or str)
"""
return _to_http(
event=event,
format=converters.TypeBinary,
data_marshaller=data_marshaller,
)
def best_effort_encode_attribute_value(value: typing.Any) -> typing.Any:
"""
SHOULD convert any value into a JSON serialization friendly format.
This function acts in a best-effort manner and MAY not actually encode the value
if it does not know how to do that, or the value is already JSON-friendly.
:param value: Value which MAY or MAY NOT be JSON serializable.
:return: Possibly encoded value.
"""
if isinstance(value, enum.Enum):
return value.value
if isinstance(value, datetime.datetime):
return value.isoformat()
return value
def from_dict(
event_type: typing.Type[AnyCloudEvent],
event: typing.Mapping[str, typing.Any],
) -> AnyCloudEvent:
"""
Constructs an Event object of a given `event_type` from
a dict `event` representation.
:param event: The event represented as a dict.
:param event_type: The type of the event to be constructed from the dict.
:returns: The event of the specified type backed by the given dict.
"""
attributes = {
attr_name: best_effort_encode_attribute_value(attr_value)
for attr_name, attr_value in event.items()
if attr_name != "data"
}
return event_type.create(attributes=attributes, data=event.get("data"))
def to_dict(event: AnyCloudEvent) -> typing.Dict[str, typing.Any]:
"""
Converts given `event` to its canonical dictionary representation.
:param event: The event to be converted into a dict.
:returns: The canonical dict representation of the event.
"""
result = {attribute_name: event.get(attribute_name) for attribute_name in event}
result["data"] = event.get_data()
return result
def _json_or_string(
content: typing.Optional[typing.Union[str, bytes]],
) -> typing.Any:
"""
Returns a JSON-decoded dictionary or a list of dictionaries if
a valid JSON string is provided.
Returns the same `content` in case of an error or `None` when no content provided.
"""
if content is None:
return None
try:
return json.loads(content)
except (json.JSONDecodeError, TypeError, UnicodeDecodeError):
return content

View File

@@ -1,4 +1,4 @@
# All Rights Reserved.
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@@ -11,6 +11,8 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
class GenericException(Exception):
pass
@@ -37,3 +39,16 @@ class DataMarshallerError(GenericException):
class DataUnmarshallerError(GenericException):
pass
class IncompatibleArgumentsError(GenericException):
"""
Raised when a user tries to call a function with arguments which are incompatible
with each other.
"""
class PydanticFeatureNotInstalled(GenericException):
"""
Raised when a user tries to use the pydantic feature but did not install it.
"""

View File

@@ -1,4 +1,4 @@
# All Rights Reserved.
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@@ -11,16 +11,29 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import json
import typing
from cloudevents.http.conversion import from_dict, from_http, from_json
from cloudevents.http.event import CloudEvent
from cloudevents.http.event_type import is_binary, is_structured
from cloudevents.http.http_methods import (
from_http,
from cloudevents.http.event_type import is_binary, is_structured # deprecated
from cloudevents.http.http_methods import ( # deprecated
to_binary,
to_binary_http,
to_structured,
to_structured_http,
)
from cloudevents.http.json_methods import from_json, to_json
from cloudevents.http.json_methods import to_json # deprecated
__all__ = [
"to_binary",
"to_structured",
"from_json",
"from_http",
"from_dict",
"CloudEvent",
"is_binary",
"is_structured",
"to_binary_http",
"to_structured_http",
"to_json",
]

View File

@@ -0,0 +1,71 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import typing
from cloudevents.conversion import from_dict as _abstract_from_dict
from cloudevents.conversion import from_http as _abstract_from_http
from cloudevents.conversion import from_json as _abstract_from_json
from cloudevents.http.event import CloudEvent
from cloudevents.sdk import types
def from_json(
data: typing.Union[str, bytes],
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> CloudEvent:
"""
Parses JSON string `data` into a CloudEvent.
:param data: JSON string representation of a CloudEvent.
:param data_unmarshaller: Callable function that casts `data` to a
Python object.
:returns: A CloudEvent parsed from the given JSON representation.
"""
return _abstract_from_json(CloudEvent, data, data_unmarshaller)
def from_http(
headers: typing.Union[
typing.Mapping[str, str], types.SupportsDuplicateItems[str, str]
],
data: typing.Optional[typing.Union[str, bytes]],
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> CloudEvent:
"""
Parses CloudEvent `data` and `headers` into a CloudEvent.
The method supports both binary and structured representations.
:param headers: The HTTP request headers.
:param data: The HTTP request body. If set to None, "" or b'', the returned
event's `data` field will be set to None.
:param data_unmarshaller: Callable function to map data to a python object
e.g. lambda x: x or lambda x: json.loads(x)
:returns: A CloudEvent instance parsed from the passed HTTP parameters of
the specified type.
"""
return _abstract_from_http(CloudEvent, headers, data, data_unmarshaller)
def from_dict(
event: typing.Mapping[str, typing.Any],
) -> CloudEvent:
"""
Constructs a CloudEvent from a dict `event` representation.
:param event: The event represented as a dict.
:returns: The event of the specified type backed by the given dict.
"""
return _abstract_from_dict(CloudEvent, event)
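The `from_dict` path shown above boils down to splitting the `data` key off from the attribute keys, exactly as `cloudevents.conversion.from_dict` does. Approximately, in plain dict terms:

```python
event_dict = {
    "specversion": "1.0",
    "type": "com.example.demo",
    "source": "/demo",
    "id": "A234-1234-1234",
    "data": {"n": 1},
}

# Everything except "data" is an attribute; "data" is the payload.
attributes = {k: v for k, v in event_dict.items() if k != "data"}
data = event_dict.get("data")

print(sorted(attributes))  # ['id', 'source', 'specversion', 'type']
print(data)                # {'n': 1}
```

The reverse direction, `to_dict`, merges the attributes back together with a `"data"` key, so the two functions round-trip.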

View File

@@ -1,4 +1,4 @@
# All Rights Reserved.
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@@ -17,18 +17,30 @@ import typing
import uuid
import cloudevents.exceptions as cloud_exceptions
from cloudevents.http.mappings import _required_by_version
from cloudevents import abstract
from cloudevents.sdk.event import v1, v03
_required_by_version = {
"1.0": v1.Event._ce_required_fields,
"0.3": v03.Event._ce_required_fields,
}
class CloudEvent:
class CloudEvent(abstract.CloudEvent):
"""
Python-friendly cloudevent class supporting v1 events
Supports both binary and structured mode CloudEvents
"""
def __init__(
self, attributes: typing.Dict[str, str], data: typing.Any = None
):
@classmethod
def create(
cls,
attributes: typing.Mapping[str, typing.Any],
data: typing.Optional[typing.Any],
) -> "CloudEvent":
return cls(attributes, data)
def __init__(self, attributes: typing.Mapping[str, str], data: typing.Any = None):
"""
Event Constructor
:param attributes: a dict with cloudevent attributes. Minimally
@@ -36,11 +48,11 @@ class CloudEvent:
attributes 'specversion', 'id' or 'time', this will create
those attributes with default values.
e.g. {
"content-type": "application/cloudevents+json",
"id": "16fb5f0b-211e-1102-3dfe-ea6e2806f124",
"source": "<event-source>",
"type": "cloudevent.event.type",
"specversion": "0.2"
"specversion": "1.0",
"type": "com.github.pull_request.opened",
"source": "https://github.com/cloudevents/spec/pull",
"id": "A234-1234-1234",
"time": "2018-04-05T17:31:00Z",
}
:type attributes: typing.Dict[str, str]
:param data: The payload of the event, as a python object
@@ -69,28 +81,14 @@ class CloudEvent:
f"Missing required keys: {required_set - self._attributes.keys()}"
)
def __eq__(self, other):
return self.data == other.data and self._attributes == other._attributes
def _get_attributes(self) -> typing.Dict[str, typing.Any]:
return self._attributes
# Data access is handled via `.data` member
# Attribute access is managed via Mapping type
def __getitem__(self, key):
return self._attributes[key]
def get_data(self) -> typing.Optional[typing.Any]:
return self.data
def __setitem__(self, key, value):
def __setitem__(self, key: str, value: typing.Any) -> None:
self._attributes[key] = value
def __delitem__(self, key):
def __delitem__(self, key: str) -> None:
del self._attributes[key]
def __iter__(self):
return iter(self._attributes)
def __len__(self):
return len(self._attributes)
def __contains__(self, key):
return key in self._attributes
def __repr__(self):
return str({"attributes": self._attributes, "data": self.data})

View File

@@ -1,29 +1,37 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import typing
from cloudevents.sdk.converters import binary, structured
from deprecation import deprecated
from cloudevents.sdk.converters import is_binary as _moved_is_binary
from cloudevents.sdk.converters import is_structured as _moved_is_structured
# THIS MODULE IS DEPRECATED, YOU SHOULD NOT ADD NEW FUNCTIONALITY HERE
@deprecated(
deprecated_in="1.6.0",
details="Use cloudevents.sdk.converters.is_binary function instead",
)
def is_binary(headers: typing.Dict[str, str]) -> bool:
"""Uses internal marshallers to determine whether this event is binary
:param headers: the HTTP headers
:type headers: typing.Dict[str, str]
:returns bool: returns a bool indicating whether the headers indicate a binary event type
"""
headers = {key.lower(): value for key, value in headers.items()}
content_type = headers.get("content-type", "")
binary_parser = binary.BinaryHTTPCloudEventConverter()
return binary_parser.can_read(content_type=content_type, headers=headers)
return _moved_is_binary(headers)
@deprecated(
deprecated_in="1.6.0",
details="Use cloudevents.sdk.converters.is_structured function instead",
)
def is_structured(headers: typing.Dict[str, str]) -> bool:
"""Uses internal marshallers to determine whether this event is structured
:param headers: the HTTP headers
:type headers: typing.Dict[str, str]
:returns bool: returns a bool indicating whether the headers indicate a structured event type
"""
headers = {key.lower(): value for key, value in headers.items()}
content_type = headers.get("content-type", "")
structured_parser = structured.JSONHTTPCloudEventConverter()
return structured_parser.can_read(
content_type=content_type, headers=headers
)
return _moved_is_structured(headers)
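The binary/structured distinction these deprecated predicates draw can be sketched with plain dict checks. This simplification omits the content-type edge cases the converters' `can_read` handles:

```python
def looks_binary(headers):
    # Binary mode carries attributes in ce-* headers, so ce-specversion is present.
    headers = {k.lower(): v for k, v in headers.items()}
    return "ce-specversion" in headers


def looks_structured(headers):
    # Structured mode signals itself through the content type.
    headers = {k.lower(): v for k, v in headers.items()}
    return headers.get("content-type", "").startswith("application/cloudevents+json")


print(looks_binary({"Ce-Specversion": "1.0", "ce-type": "t"}))             # True
print(looks_structured({"Content-Type": "application/cloudevents+json"}))  # True
print(looks_structured({"content-type": "application/json"}))              # False
```

Note the case-insensitive header handling: both the old and new implementations lower-case header names before inspecting them.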

View File

@@ -1,175 +1,73 @@
import json
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import typing
from deprecation import deprecated
import cloudevents.exceptions as cloud_exceptions
from cloudevents.abstract import AnyCloudEvent
from cloudevents.conversion import to_binary as _moved_to_binary
from cloudevents.conversion import to_structured as _moved_to_structured
from cloudevents.http.conversion import from_http as _moved_from_http
from cloudevents.http.event import CloudEvent
from cloudevents.http.event_type import is_binary, is_structured
from cloudevents.http.mappings import _marshaller_by_format, _obj_by_version
from cloudevents.http.util import _json_or_string
from cloudevents.sdk import converters, marshaller, types
from cloudevents.sdk import types
# THIS MODULE IS DEPRECATED, YOU SHOULD NOT ADD NEW FUNCTIONALITY HERE
@deprecated(
deprecated_in="1.6.0",
details="Use cloudevents.conversion.to_binary function instead",
)
def to_binary(
event: AnyCloudEvent, data_marshaller: typing.Optional[types.MarshallerType] = None
) -> typing.Tuple[typing.Dict[str, str], bytes]:
return _moved_to_binary(event, data_marshaller)
@deprecated(
deprecated_in="1.6.0",
details="Use cloudevents.conversion.to_structured function instead",
)
def to_structured(
event: AnyCloudEvent,
data_marshaller: typing.Optional[types.MarshallerType] = None,
) -> typing.Tuple[typing.Dict[str, str], bytes]:
return _moved_to_structured(event, data_marshaller)
@deprecated(
deprecated_in="1.6.0",
details="Use cloudevents.http.from_http function instead",
)
def from_http(
headers: typing.Dict[str, str],
data: typing.Union[str, bytes, None],
data_unmarshaller: types.UnmarshallerType = None,
):
"""
Unwrap a CloudEvent (binary or structured) from an HTTP request.
:param headers: the HTTP headers
:type headers: typing.Dict[str, str]
:param data: the HTTP request body. If set to None, "" or b'', the returned
event's data field will be set to None
:type data: typing.IO
:param data_unmarshaller: Callable function to map data to a python object
e.g. lambda x: x or lambda x: json.loads(x)
:type data_unmarshaller: types.UnmarshallerType
"""
if data is None or data == b"":
# Empty string will cause data to be marshalled into None
data = ""
if not isinstance(data, (str, bytes, bytearray)):
raise cloud_exceptions.InvalidStructuredJSON(
"Expected json of type (str, bytes, bytearray), "
f"but instead found type {type(data)}"
)
headers = {key.lower(): value for key, value in headers.items()}
if data_unmarshaller is None:
data_unmarshaller = _json_or_string
marshall = marshaller.NewDefaultHTTPMarshaller()
if is_binary(headers):
specversion = headers.get("ce-specversion", None)
else:
try:
raw_ce = json.loads(data)
except json.decoder.JSONDecodeError:
raise cloud_exceptions.MissingRequiredFields(
"Failed to read specversion from both headers and data. "
f"The following can not be parsed as json: {data}"
)
if hasattr(raw_ce, "get"):
specversion = raw_ce.get("specversion", None)
else:
raise cloud_exceptions.MissingRequiredFields(
"Failed to read specversion from both headers and data. "
f"The following deserialized data has no 'get' method: {raw_ce}"
)
if specversion is None:
raise cloud_exceptions.MissingRequiredFields(
"Failed to find specversion in HTTP request"
)
event_handler = _obj_by_version.get(specversion, None)
if event_handler is None:
raise cloud_exceptions.InvalidRequiredFields(
f"Found invalid specversion {specversion}"
)
event = marshall.FromRequest(
event_handler(), headers, data, data_unmarshaller=data_unmarshaller
)
attrs = event.Properties()
attrs.pop("data", None)
attrs.pop("extensions", None)
attrs.update(**event.extensions)
if event.data == "" or event.data == b"":
# TODO: Check binary unmarshallers to debug why setting data to ""
# returns an event with data set to None, but structured will return ""
data = None
else:
data = event.data
return CloudEvent(attrs, data)
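The branching above decides between binary and structured content mode before anything else. A stdlib-only sketch of that specversion detection follows; the helper name `detect_specversion` is illustrative, not part of the SDK:

```python
import json

# Binary content mode carries the spec version in a ce-specversion header;
# structured mode embeds it in the JSON request body.
def detect_specversion(headers, body):
    lowered = {key.lower(): value for key, value in headers.items()}
    if "ce-specversion" in lowered:  # binary content mode
        return lowered["ce-specversion"]
    return json.loads(body).get("specversion")  # structured content mode

assert detect_specversion({"Ce-Specversion": "1.0"}, b"") == "1.0"
assert detect_specversion({}, b'{"specversion": "0.3"}') == "0.3"
```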
def _to_http(
event: CloudEvent,
format: str = converters.TypeStructured,
data_marshaller: types.MarshallerType = None,
) -> (dict, typing.Union[bytes, str]):
"""
Returns a tuple of HTTP headers/body dicts representing this cloudevent
:param format: constant specifying an encoding format
:type format: str
:param data_marshaller: Callable function to cast event.data into
either a string or bytes
:type data_marshaller: types.MarshallerType
:returns: (http_headers: dict, http_body: bytes or str)
"""
if data_marshaller is None:
data_marshaller = _marshaller_by_format[format]
if event._attributes["specversion"] not in _obj_by_version:
raise cloud_exceptions.InvalidRequiredFields(
f"Unsupported specversion: {event._attributes['specversion']}"
)
event_handler = _obj_by_version[event._attributes["specversion"]]()
for k, v in event._attributes.items():
event_handler.Set(k, v)
event_handler.data = event.data
return marshaller.NewDefaultHTTPMarshaller().ToRequest(
event_handler, format, data_marshaller=data_marshaller
)
def to_structured(
event: CloudEvent, data_marshaller: types.MarshallerType = None,
) -> (dict, typing.Union[bytes, str]):
"""
Returns a tuple of HTTP headers/body dicts representing this cloudevent. If
event.data is a byte object, body will have a data_base64 field instead of
data.
:param event: CloudEvent to cast into http data
:type event: CloudEvent
:param data_marshaller: Callable function to cast event.data into
either a string or bytes
:type data_marshaller: types.MarshallerType
:returns: (http_headers: dict, http_body: bytes or str)
"""
return _to_http(event=event, data_marshaller=data_marshaller)
def to_binary(
event: CloudEvent, data_marshaller: types.MarshallerType = None,
) -> (dict, typing.Union[bytes, str]):
"""
Returns a tuple of HTTP headers/body dicts representing this cloudevent
:param event: CloudEvent to cast into http data
:type event: CloudEvent
:param data_marshaller: Callable function to cast event.data into
either a string or bytes
:type data_marshaller: types.MarshallerType
:returns: (http_headers: dict, http_body: bytes or str)
"""
return _to_http(
event=event,
format=converters.TypeBinary,
data_marshaller=data_marshaller,
)
@deprecated(
deprecated_in="1.6.0",
details="Use cloudevents.http.from_http function instead",
)
def from_http(
headers: typing.Dict[str, str],
data: typing.Optional[typing.AnyStr],
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> CloudEvent:
return _moved_from_http(headers, data, data_unmarshaller)
@deprecated(deprecated_in="1.0.2", details="Use to_binary function instead")
def to_binary_http(
event: CloudEvent, data_marshaller: types.MarshallerType = None,
) -> (dict, typing.Union[bytes, str]):
return to_binary(event, data_marshaller)
event: CloudEvent, data_marshaller: typing.Optional[types.MarshallerType] = None
) -> typing.Tuple[typing.Dict[str, str], bytes]:
return _moved_to_binary(event, data_marshaller)
@deprecated(deprecated_in="1.0.2", details="Use to_structured function instead")
def to_structured_http(
event: CloudEvent, data_marshaller: types.MarshallerType = None,
) -> (dict, typing.Union[bytes, str]):
return to_structured(event, data_marshaller)
event: CloudEvent, data_marshaller: typing.Optional[types.MarshallerType] = None
) -> typing.Tuple[typing.Dict[str, str], bytes]:
return _moved_to_structured(event, data_marshaller)


@ -1,36 +1,47 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import typing
from cloudevents.http.event import CloudEvent
from cloudevents.http.http_methods import from_http, to_structured
from deprecation import deprecated
from cloudevents.abstract import AnyCloudEvent
from cloudevents.conversion import to_json as _moved_to_json
from cloudevents.http import CloudEvent
from cloudevents.http.conversion import from_json as _moved_from_json
from cloudevents.sdk import types
# THIS MODULE IS DEPRECATED, YOU SHOULD NOT ADD NEW FUNCTIONALITY HERE
@deprecated(
deprecated_in="1.6.0",
details="Use cloudevents.conversion.to_json function instead",
)
def to_json(
event: CloudEvent, data_marshaller: types.MarshallerType = None
) -> typing.Union[str, bytes]:
"""
Cast a CloudEvent into a json object
:param event: CloudEvent which will be converted into a json object
:type event: CloudEvent
:param data_marshaller: Callable function which will cast event.data
into a json object
:type data_marshaller: typing.Callable
:returns: json object representing the given event
"""
return to_structured(event, data_marshaller=data_marshaller)[1]
event: AnyCloudEvent,
data_marshaller: typing.Optional[types.MarshallerType] = None,
) -> bytes:
return _moved_to_json(event, data_marshaller)
@deprecated(
deprecated_in="1.6.0",
details="Use cloudevents.http.from_json function instead",
)
def from_json(
data: typing.Union[str, bytes],
data_unmarshaller: types.UnmarshallerType = None,
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> CloudEvent:
"""
Cast json encoded data into a CloudEvent
:param data: json encoded cloudevent data
:type data: typing.Union[str, bytes]
:param data_unmarshaller: Callable function which will cast data to a
python object
:type data_unmarshaller: typing.Callable
:returns: CloudEvent representing given cloudevent json object
"""
return from_http(headers={}, data=data, data_unmarshaller=data_unmarshaller)
return _moved_from_json(data, data_unmarshaller)


@ -1,15 +0,0 @@
from cloudevents.http.util import default_marshaller
from cloudevents.sdk import converters
from cloudevents.sdk.event import v1, v03
_marshaller_by_format = {
converters.TypeStructured: lambda x: x,
converters.TypeBinary: default_marshaller,
}
_obj_by_version = {"1.0": v1.Event, "0.3": v03.Event}
_required_by_version = {
"1.0": v1.Event._ce_required_fields,
"0.3": v03.Event._ce_required_fields,
}


@ -1,20 +1,32 @@
import json
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import typing
from deprecation import deprecated
def default_marshaller(content: any):
if content is None:
return None
try:
return json.dumps(content)
except TypeError:
return content
from cloudevents.conversion import (
_best_effort_serialize_to_json as _moved_default_marshaller,
)
# THIS MODULE IS DEPRECATED, YOU SHOULD NOT ADD NEW FUNCTIONALITY HERE
def _json_or_string(content: typing.Union[str, bytes]):
if content is None:
return None
try:
return json.loads(content)
except (json.JSONDecodeError, TypeError) as e:
return content
@deprecated(
deprecated_in="1.6.0",
details="You SHOULD NOT use the default marshaller",
)
def default_marshaller(
content: typing.Any,
) -> typing.Optional[typing.Union[bytes, str, typing.Any]]:
return _moved_default_marshaller(content)


@ -0,0 +1,31 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from cloudevents.kafka.conversion import (
KafkaMessage,
KeyMapper,
from_binary,
from_structured,
to_binary,
to_structured,
)
__all__ = [
"KafkaMessage",
"KeyMapper",
"from_binary",
"from_structured",
"to_binary",
"to_structured",
]


@ -0,0 +1,311 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import base64
import json
import typing
from cloudevents import exceptions as cloud_exceptions
from cloudevents import http
from cloudevents.abstract import AnyCloudEvent
from cloudevents.kafka.exceptions import KeyMapperError
from cloudevents.sdk import types
JSON_MARSHALLER: types.MarshallerType = json.dumps
JSON_UNMARSHALLER: types.UnmarshallerType = json.loads
IDENTITY_MARSHALLER = IDENTITY_UNMARSHALLER = lambda x: x
DEFAULT_MARSHALLER: types.MarshallerType = JSON_MARSHALLER
DEFAULT_UNMARSHALLER: types.UnmarshallerType = JSON_UNMARSHALLER
DEFAULT_EMBEDDED_DATA_MARSHALLER: types.MarshallerType = IDENTITY_MARSHALLER
DEFAULT_EMBEDDED_DATA_UNMARSHALLER: types.UnmarshallerType = IDENTITY_UNMARSHALLER
class KafkaMessage(typing.NamedTuple):
"""
Represents the elements of a message sent or received through the Kafka protocol.
Callers can map their client-specific message representation to and from this
type in order to use the cloudevents.kafka conversion functions.
"""
headers: typing.Dict[str, bytes]
"""
The dictionary of message headers key/values.
"""
key: typing.Optional[typing.Union[str, bytes]]
"""
The message key.
"""
value: typing.Union[str, bytes]
"""
The message value.
"""
KeyMapper = typing.Callable[[AnyCloudEvent], typing.AnyStr]
"""
A callable function that creates a Kafka message key, given a CloudEvent instance.
"""
DEFAULT_KEY_MAPPER: KeyMapper = lambda event: event.get("partitionkey")
"""
The default KeyMapper which maps the user provided `partitionkey` attribute value
to the `key` of the Kafka message as-is, if present.
"""
def to_binary(
event: AnyCloudEvent,
data_marshaller: typing.Optional[types.MarshallerType] = None,
key_mapper: typing.Optional[KeyMapper] = None,
) -> KafkaMessage:
"""
Returns a KafkaMessage in binary format representing this Cloud Event.
:param event: The event to be converted. To specify the Kafka message key, set
the `partitionkey` attribute of the event, or provide a KeyMapper.
:param data_marshaller: Callable function to cast event.data into
either a string or bytes.
:param key_mapper: Callable function to get the Kafka message key.
:returns: KafkaMessage
"""
data_marshaller = data_marshaller or DEFAULT_MARSHALLER
key_mapper = key_mapper or DEFAULT_KEY_MAPPER
try:
message_key = key_mapper(event)
except Exception as e:
raise KeyMapperError(
f"Failed to map message key with error: {type(e).__name__}('{e}')"
)
headers = {}
if event["datacontenttype"]:
headers["content-type"] = event["datacontenttype"].encode("utf-8")
for attr, value in event.get_attributes().items():
if attr not in ["data", "partitionkey", "datacontenttype"]:
if value is not None:
headers["ce_{0}".format(attr)] = value.encode("utf-8")
try:
data = data_marshaller(event.get_data())
except Exception as e:
raise cloud_exceptions.DataMarshallerError(
f"Failed to marshall data with error: {type(e).__name__}('{e}')"
)
if isinstance(data, str):
data = data.encode("utf-8")
return KafkaMessage(headers, message_key, data)
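The header mapping inside `to_binary()` above can be sketched with the stdlib alone: `datacontenttype` becomes the `content-type` header, every other attribute gains a `ce_` prefix, and `data`/`partitionkey` stay out of the headers (the attribute values here are made-up examples):

```python
attributes = {
    "specversion": "1.0",
    "type": "com.example.order.created",
    "source": "/orders",
    "id": "A234-1234-1234",
    "datacontenttype": "application/json",
    "partitionkey": "order-42",
}
headers = {}
if attributes.get("datacontenttype"):
    headers["content-type"] = attributes["datacontenttype"].encode("utf-8")
for attr, value in attributes.items():
    if attr not in ("data", "partitionkey", "datacontenttype") and value is not None:
        headers["ce_{0}".format(attr)] = value.encode("utf-8")

assert headers["content-type"] == b"application/json"
assert headers["ce_type"] == b"com.example.order.created"
assert "ce_partitionkey" not in headers
```

`partitionkey` is excluded because it travels as the Kafka message key rather than as a header.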
@typing.overload
def from_binary(
message: KafkaMessage,
event_type: None = None,
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> http.CloudEvent:
pass
@typing.overload
def from_binary(
message: KafkaMessage,
event_type: typing.Type[AnyCloudEvent],
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> AnyCloudEvent:
pass
def from_binary(
message: KafkaMessage,
event_type: typing.Optional[typing.Type[AnyCloudEvent]] = None,
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> typing.Union[http.CloudEvent, AnyCloudEvent]:
"""
Returns a CloudEvent from a KafkaMessage in binary format.
:param message: The KafkaMessage to be converted.
:param event_type: The type of CloudEvent to create. Defaults to http.CloudEvent.
:param data_unmarshaller: Callable function to map data to a python object
:returns: CloudEvent
"""
data_unmarshaller = data_unmarshaller or DEFAULT_UNMARSHALLER
attributes: typing.Dict[str, typing.Any] = {}
for header, value in message.headers.items():
header = header.lower()
if header == "content-type":
attributes["datacontenttype"] = value.decode()
elif header.startswith("ce_"):
attributes[header[3:]] = value.decode()
if message.key is not None:
attributes["partitionkey"] = message.key
try:
data = data_unmarshaller(message.value)
except Exception as e:
raise cloud_exceptions.DataUnmarshallerError(
f"Failed to unmarshall data with error: {type(e).__name__}('{e}')"
)
result: typing.Union[http.CloudEvent, AnyCloudEvent]
if event_type:
result = event_type.create(attributes, data)
else:
result = http.CloudEvent.create(attributes, data)
return result
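The inverse mapping in `from_binary()` above strips the `ce_` prefix, maps `content-type` back to `datacontenttype`, and surfaces the message key as the `partitionkey` extension. A stdlib-only sketch with made-up header values:

```python
message_headers = {
    "ce_specversion": b"1.0",
    "ce_type": b"com.example.order.created",
    "ce_source": b"/orders",
    "ce_id": b"A234-1234-1234",
    "content-type": b"application/json",
}
attributes = {}
for header, value in message_headers.items():
    header = header.lower()
    if header == "content-type":
        attributes["datacontenttype"] = value.decode()
    elif header.startswith("ce_"):
        attributes[header[3:]] = value.decode()
attributes["partitionkey"] = "order-42"  # taken from message.key when present

assert attributes["type"] == "com.example.order.created"
assert attributes["id"] == "A234-1234-1234"
```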
def to_structured(
event: AnyCloudEvent,
data_marshaller: typing.Optional[types.MarshallerType] = None,
envelope_marshaller: typing.Optional[types.MarshallerType] = None,
key_mapper: typing.Optional[KeyMapper] = None,
) -> KafkaMessage:
"""
Returns a KafkaMessage in structured format representing this Cloud Event.
:param event: The event to be converted. To specify the Kafka message key, set
the `partitionkey` attribute of the event, or provide a KeyMapper.
:param data_marshaller: Callable function to cast event.data into
either a string or bytes.
:param envelope_marshaller: Callable function to cast event envelope into
either a string or bytes.
:param key_mapper: Callable function to get the Kafka message key.
:returns: KafkaMessage
"""
data_marshaller = data_marshaller or DEFAULT_EMBEDDED_DATA_MARSHALLER
envelope_marshaller = envelope_marshaller or DEFAULT_MARSHALLER
key_mapper = key_mapper or DEFAULT_KEY_MAPPER
try:
message_key = key_mapper(event)
except Exception as e:
raise KeyMapperError(
f"Failed to map message key with error: {type(e).__name__}('{e}')"
)
attrs: typing.Dict[str, typing.Any] = dict(event.get_attributes())
try:
data = data_marshaller(event.get_data())
except Exception as e:
raise cloud_exceptions.DataMarshallerError(
f"Failed to marshall data with error: {type(e).__name__}('{e}')"
)
if isinstance(data, (bytes, bytearray, memoryview)):
attrs["data_base64"] = base64.b64encode(data).decode("ascii")
else:
attrs["data"] = data
headers = {}
if "datacontenttype" in attrs:
headers["content-type"] = attrs.pop("datacontenttype").encode("utf-8")
try:
value = envelope_marshaller(attrs)
except Exception as e:
raise cloud_exceptions.DataMarshallerError(
f"Failed to marshall event with error: {type(e).__name__}('{e}')"
)
if isinstance(value, str):
value = value.encode("utf-8")
return KafkaMessage(headers, message_key, value)
@typing.overload
def from_structured(
message: KafkaMessage,
event_type: None = None,
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
envelope_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> http.CloudEvent:
pass
@typing.overload
def from_structured(
message: KafkaMessage,
event_type: typing.Type[AnyCloudEvent],
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
envelope_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> AnyCloudEvent:
pass
def from_structured(
message: KafkaMessage,
event_type: typing.Optional[typing.Type[AnyCloudEvent]] = None,
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
envelope_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> typing.Union[http.CloudEvent, AnyCloudEvent]:
"""
Returns a CloudEvent from a KafkaMessage in structured format.
:param message: The KafkaMessage to be converted.
:param event_type: The type of CloudEvent to create. Defaults to http.CloudEvent.
:param data_unmarshaller: Callable function to map the data to a python object.
:param envelope_unmarshaller: Callable function to map the envelope to a python
object.
:returns: CloudEvent
"""
data_unmarshaller = data_unmarshaller or DEFAULT_EMBEDDED_DATA_UNMARSHALLER
envelope_unmarshaller = envelope_unmarshaller or DEFAULT_UNMARSHALLER
try:
structure = envelope_unmarshaller(message.value)
except Exception as e:
raise cloud_exceptions.DataUnmarshallerError(
"Failed to unmarshall message with error: " f"{type(e).__name__}('{e}')"
)
attributes: typing.Dict[str, typing.Any] = {}
if message.key is not None:
attributes["partitionkey"] = message.key
data: typing.Optional[typing.Any] = None
for name, value in structure.items():
try:
if name == "data":
decoded_value = data_unmarshaller(value)
elif name == "data_base64":
decoded_value = data_unmarshaller(base64.b64decode(value))
name = "data"
else:
decoded_value = value
except Exception as e:
raise cloud_exceptions.DataUnmarshallerError(
"Failed to unmarshall data with error: " f"{type(e).__name__}('{e}')"
)
if name == "data":
data = decoded_value
else:
attributes[name] = decoded_value
for header, val in message.headers.items():
if header.lower() == "content-type":
attributes["datacontenttype"] = val.decode()
else:
attributes[header.lower()] = val.decode()
result: typing.Union[AnyCloudEvent, http.CloudEvent]
if event_type:
result = event_type.create(attributes, data)
else:
result = http.CloudEvent.create(attributes, data)
return result
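The structured round trip above embeds binary payloads as `data_base64` inside the JSON envelope and decodes them back on receipt. A stdlib-only sketch of that envelope handling, with made-up attribute values:

```python
import base64
import json

attrs = {"specversion": "1.0", "type": "com.example.order.created",
         "source": "/orders", "id": "A234-1234-1234"}
payload = b"\x00\x01binary-payload"
# Sending side, as in to_structured(): bytes become data_base64.
attrs["data_base64"] = base64.b64encode(payload).decode("ascii")
envelope = json.dumps(attrs).encode("utf-8")  # the KafkaMessage value

# Receiving side, as in from_structured(): parse the envelope, then decode.
structure = json.loads(envelope)
recovered = base64.b64decode(structure.pop("data_base64"))
assert recovered == payload
```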


@ -0,0 +1,20 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from cloudevents import exceptions as cloud_exceptions
class KeyMapperError(cloud_exceptions.GenericException):
"""
Raised when a KeyMapper fails.
"""

cloudevents/py.typed (new, empty marker file)


@ -0,0 +1,47 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from typing import TYPE_CHECKING
from cloudevents.exceptions import PydanticFeatureNotInstalled
try:
if TYPE_CHECKING:
from cloudevents.pydantic.v2 import CloudEvent, from_dict, from_http, from_json
else:
from pydantic import VERSION as PYDANTIC_VERSION
pydantic_major_version = PYDANTIC_VERSION.split(".")[0]
if pydantic_major_version == "1":
from cloudevents.pydantic.v1 import (
CloudEvent,
from_dict,
from_http,
from_json,
)
else:
from cloudevents.pydantic.v2 import (
CloudEvent,
from_dict,
from_http,
from_json,
)
except ImportError: # pragma: no cover # hard to test
raise PydanticFeatureNotInstalled(
"CloudEvents pydantic feature is not installed. "
"Install it using pip install cloudevents[pydantic]"
)
__all__ = ["CloudEvent", "from_json", "from_dict", "from_http"]


@ -0,0 +1,142 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from cloudevents.sdk.event import attribute
FIELD_DESCRIPTIONS = {
"data": {
"title": "Event Data",
"description": (
"CloudEvents MAY include domain-specific information about the occurrence."
" When present, this information will be encapsulated within data.It is"
" encoded into a media format which is specified by the datacontenttype"
" attribute (e.g. application/json), and adheres to the dataschema format"
" when those respective attributes are present."
),
},
"source": {
"title": "Event Source",
"description": (
"Identifies the context in which an event happened. Often this will include"
" information such as the type of the event source, the organization"
" publishing the event or the process that produced the event. The exact"
" syntax and semantics behind the data encoded in the URI is defined by the"
" event producer.\n"
"\n"
"Producers MUST ensure that source + id is unique for"
" each distinct event.\n"
"\n"
"An application MAY assign a unique source to each"
" distinct producer, which makes it easy to produce unique IDs since no"
" other producer will have the same source. The application MAY use UUIDs,"
" URNs, DNS authorities or an application-specific scheme to create unique"
" source identifiers.\n"
"\n"
"A source MAY include more than one producer. In"
" that case the producers MUST collaborate to ensure that source + id is"
" unique for each distinct event."
),
"example": "https://github.com/cloudevents",
},
"id": {
"title": "Event ID",
"description": (
"Identifies the event. Producers MUST ensure that source + id is unique for"
" each distinct event. If a duplicate event is re-sent (e.g. due to a"
" network error) it MAY have the same id. Consumers MAY assume that Events"
" with identical source and id are duplicates. MUST be unique within the"
" scope of the producer"
),
"example": "A234-1234-1234",
},
"type": {
"title": "Event Type",
"description": (
"This attribute contains a value describing the type of event related to"
" the originating occurrence. Often this attribute is used for routing,"
" observability, policy enforcement, etc. The format of this is producer"
" defined and might include information such as the version of the type"
),
"example": "com.github.pull_request.opened",
},
"specversion": {
"title": "Specification Version",
"description": (
"The version of the CloudEvents specification which the event uses. This"
" enables the interpretation of the context.\n"
"\n"
"Currently, this attribute will only have the 'major'"
" and 'minor' version numbers included in it. This allows for 'patch'"
" changes to the specification to be made without changing this property's"
" value in the serialization."
),
"example": attribute.DEFAULT_SPECVERSION,
},
"time": {
"title": "Occurrence Time",
"description": (
" Timestamp of when the occurrence happened. If the time of the occurrence"
" cannot be determined then this attribute MAY be set to some other time"
" (such as the current time) by the CloudEvents producer, however all"
" producers for the same source MUST be consistent in this respect. In"
" other words, either they all use the actual time of the occurrence or"
" they all use the same algorithm to determine the value used."
),
"example": "2018-04-05T17:31:00Z",
},
"subject": {
"title": "Event Subject",
"description": (
"This describes the subject of the event in the context of the event"
" producer (identified by source). In publish-subscribe scenarios, a"
" subscriber will typically subscribe to events emitted by a source, but"
" the source identifier alone might not be sufficient as a qualifier for"
" any specific event if the source context has internal"
" sub-structure.\n"
"\n"
"Identifying the subject of the event in context"
" metadata (opposed to only in the data payload) is particularly helpful in"
" generic subscription filtering scenarios where middleware is unable to"
" interpret the data content. In the above example, the subscriber might"
" only be interested in blobs with names ending with '.jpg' or '.jpeg' and"
" the subject attribute allows for constructing a simple and efficient"
" string-suffix filter for that subset of events."
),
"example": "123",
},
"datacontenttype": {
"title": "Event Data Content Type",
"description": (
"Content type of data value. This attribute enables data to carry any type"
" of content, whereby format and encoding might differ from that of the"
" chosen event format."
),
"example": "text/xml",
},
"dataschema": {
"title": "Event Data Schema",
"description": (
"Identifies the schema that data adheres to. "
"Incompatible changes to the schema SHOULD be reflected by a different URI"
),
},
}
"""
The dictionary above contains title, description, example and other
NON-FUNCTIONAL data for pydantic fields. It could potentially be
used across the whole SDK.
Functional field configurations (e.g. defaults) are still defined
in the pydantic model classes.
"""


@ -0,0 +1,18 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from cloudevents.pydantic.v1.conversion import from_dict, from_http, from_json
from cloudevents.pydantic.v1.event import CloudEvent
__all__ = ["CloudEvent", "from_json", "from_dict", "from_http"]


@ -0,0 +1,76 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import typing
from cloudevents.conversion import from_dict as _abstract_from_dict
from cloudevents.conversion import from_http as _abstract_from_http
from cloudevents.conversion import from_json as _abstract_from_json
from cloudevents.pydantic.v1.event import CloudEvent
from cloudevents.sdk import types
def from_http(
headers: typing.Union[
typing.Mapping[str, str], types.SupportsDuplicateItems[str, str]
],
data: typing.Optional[typing.AnyStr],
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> CloudEvent:
"""
Parses CloudEvent `data` and `headers` into a CloudEvent.
The method supports both binary and structured representations.
:param headers: The HTTP request headers.
:param data: The HTTP request body. If set to None, "" or b'', the returned
event's `data` field will be set to None.
:param data_unmarshaller: Callable function to map data to a python object
e.g. lambda x: x or lambda x: json.loads(x)
:returns: A CloudEvent parsed from the passed HTTP parameters
"""
return _abstract_from_http(
headers=headers,
data=data,
data_unmarshaller=data_unmarshaller,
event_type=CloudEvent,
)
def from_json(
data: typing.AnyStr,
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> CloudEvent:
"""
Parses JSON string `data` into a CloudEvent.
:param data: JSON string representation of a CloudEvent.
:param data_unmarshaller: Callable function that casts `data` to a
Python object.
:returns: A CloudEvent parsed from the given JSON representation.
"""
return _abstract_from_json(
data=data, data_unmarshaller=data_unmarshaller, event_type=CloudEvent
)
def from_dict(
event: typing.Mapping[str, typing.Any],
) -> CloudEvent:
"""
Construct a CloudEvent from a dict `event` representation.
:param event: The event represented as a dict.
:returns: A CloudEvent parsed from the given dict representation.
"""
return _abstract_from_dict(CloudEvent, event)


@ -0,0 +1,247 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import datetime
import json
import typing
from cloudevents.exceptions import PydanticFeatureNotInstalled
from cloudevents.pydantic.fields_docs import FIELD_DESCRIPTIONS
try:
from pydantic import VERSION as PYDANTIC_VERSION
pydantic_major_version = PYDANTIC_VERSION.split(".")[0]
if pydantic_major_version == "2":
from pydantic.v1 import BaseModel, Field
else:
from pydantic import BaseModel, Field # type: ignore
except ImportError: # pragma: no cover # hard to test
raise PydanticFeatureNotInstalled(
"CloudEvents pydantic feature is not installed. "
"Install it using pip install cloudevents[pydantic]"
)
from cloudevents import abstract, conversion, http
from cloudevents.exceptions import IncompatibleArgumentsError
from cloudevents.sdk.event import attribute
def _ce_json_dumps( # type: ignore[no-untyped-def]
obj: typing.Dict[str, typing.Any],
*args,
**kwargs,
) -> str:
"""Performs Pydantic-specific serialization of the event.
Needed by the pydantic base-model to serialize the event correctly to json.
Without this function the data will be incorrectly serialized.
:param obj: CloudEvent represented as a dict.
:param args: User arguments which will be passed to json.dumps function.
:param kwargs: User arguments which will be passed to json.dumps function.
:return: Event serialized as a standard JSON CloudEvent with user specific
parameters.
"""
# Using HTTP from dict due to performance issues.
event = http.from_dict(obj)
event_json = conversion.to_json(event)
# Pydantic is known for slow model initialization.
return json.dumps(
# We SHOULD de-serialize the value, to serialize it back with
# the correct json args and kwargs passed by the user.
# This MAY cause performance issues in the future.
# If that issue ever causes a real problem, a special keyword argument
# MAY be added to disable this conversion.
json.loads(event_json),
*args,
**kwargs,
)
def _ce_json_loads( # type: ignore[no-untyped-def]
data: typing.AnyStr, *args, **kwargs # noqa
) -> typing.Dict[typing.Any, typing.Any]:
"""Performs Pydantic-specific deserialization of the event.
Needed by the pydantic base-model to de-serialize the event correctly from json.
Without this function the data will be incorrectly de-serialized.
:param data: CloudEvent encoded as a json string.
:param args: These arguments SHOULD NOT be passed by pydantic.
Located here for fail-safe reasons, in case it does.
:param kwargs: These arguments SHOULD NOT be passed by pydantic.
Located here for fail-safe reasons, in case it does.
:return: CloudEvent in a dict representation.
"""
# Convert via the HTTP event due to performance issues:
# Pydantic is known for slow model initialization.
return conversion.to_dict(http.from_json(data))
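The double conversion performed by `_ce_json_dumps` can be sketched with the standard library alone: the event is first rendered to canonical JSON, then re-parsed so that the caller's `json.dumps` arguments still take effect. A minimal stand-alone sketch (the `canonical_json` input stands in for a hypothetical pre-serialized event):

```python
import json


def dumps_with_user_args(canonical_json: str, *args, **kwargs) -> str:
    """Re-apply user json.dumps options to an already-serialized payload."""
    # Parse the canonical JSON back into a dict, then serialize it again so
    # user-supplied options such as indent= or sort_keys= take effect.
    return json.dumps(json.loads(canonical_json), *args, **kwargs)


compact = dumps_with_user_args('{"b": 2, "a": 1}')
ordered = dumps_with_user_args('{"b": 2, "a": 1}', sort_keys=True)
```

The extra parse/serialize pass is the price paid for reusing the shared canonical serializer while still honouring user formatting options.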
class CloudEvent(abstract.CloudEvent, BaseModel): # type: ignore
"""
A Python-friendly CloudEvent representation backed by Pydantic-modeled fields.
Supports both binary and structured modes of the CloudEvents v1 specification.
"""
@classmethod
def create(
cls,
attributes: typing.Mapping[str, typing.Any],
data: typing.Optional[typing.Any],
) -> "CloudEvent":
return cls(attributes, data)
data: typing.Optional[typing.Any] = Field(
title=FIELD_DESCRIPTIONS["data"].get("title"),
description=FIELD_DESCRIPTIONS["data"].get("description"),
example=FIELD_DESCRIPTIONS["data"].get("example"),
)
source: str = Field(
title=FIELD_DESCRIPTIONS["source"].get("title"),
description=FIELD_DESCRIPTIONS["source"].get("description"),
example=FIELD_DESCRIPTIONS["source"].get("example"),
)
id: str = Field(
title=FIELD_DESCRIPTIONS["id"].get("title"),
description=FIELD_DESCRIPTIONS["id"].get("description"),
example=FIELD_DESCRIPTIONS["id"].get("example"),
default_factory=attribute.default_id_selection_algorithm,
)
type: str = Field(
title=FIELD_DESCRIPTIONS["type"].get("title"),
description=FIELD_DESCRIPTIONS["type"].get("description"),
example=FIELD_DESCRIPTIONS["type"].get("example"),
)
specversion: attribute.SpecVersion = Field(
title=FIELD_DESCRIPTIONS["specversion"].get("title"),
description=FIELD_DESCRIPTIONS["specversion"].get("description"),
example=FIELD_DESCRIPTIONS["specversion"].get("example"),
default=attribute.DEFAULT_SPECVERSION,
)
time: typing.Optional[datetime.datetime] = Field(
title=FIELD_DESCRIPTIONS["time"].get("title"),
description=FIELD_DESCRIPTIONS["time"].get("description"),
example=FIELD_DESCRIPTIONS["time"].get("example"),
default_factory=attribute.default_time_selection_algorithm,
)
subject: typing.Optional[str] = Field(
title=FIELD_DESCRIPTIONS["subject"].get("title"),
description=FIELD_DESCRIPTIONS["subject"].get("description"),
example=FIELD_DESCRIPTIONS["subject"].get("example"),
)
datacontenttype: typing.Optional[str] = Field(
title=FIELD_DESCRIPTIONS["datacontenttype"].get("title"),
description=FIELD_DESCRIPTIONS["datacontenttype"].get("description"),
example=FIELD_DESCRIPTIONS["datacontenttype"].get("example"),
)
dataschema: typing.Optional[str] = Field(
title=FIELD_DESCRIPTIONS["dataschema"].get("title"),
description=FIELD_DESCRIPTIONS["dataschema"].get("description"),
example=FIELD_DESCRIPTIONS["dataschema"].get("example"),
)
def __init__( # type: ignore[no-untyped-def]
self,
attributes: typing.Optional[typing.Mapping[str, typing.Any]] = None,
data: typing.Optional[typing.Any] = None,
**kwargs,
):
"""
:param attributes: A dict with CloudEvent attributes.
Minimally expects the attributes 'type' and 'source'. If the
attributes 'specversion', 'id' or 'time' are not given, they are
created with default values.
If no attributes dict is given, the class MUST use the kwargs as the attributes.
Example Attributes:
{
"specversion": "1.0",
"type": "com.github.pull_request.opened",
"source": "https://github.com/cloudevents/spec/pull",
"id": "A234-1234-1234",
"time": "2018-04-05T17:31:00Z",
}
:param data: Domain-specific information about the occurrence.
"""
if attributes:
if len(kwargs) != 0:
# To prevent API complexity and confusion.
raise IncompatibleArgumentsError(
"Attributes dict and kwargs are incompatible."
)
attributes = {k.lower(): v for k, v in attributes.items()}
kwargs.update(attributes)
super().__init__(data=data, **kwargs)
class Config:
extra: str = "allow" # this is the way we implement extensions
schema_extra = {
"example": {
"specversion": "1.0",
"type": "com.github.pull_request.opened",
"source": "https://github.com/cloudevents/spec/pull",
"subject": "123",
"id": "A234-1234-1234",
"time": "2018-04-05T17:31:00Z",
"comexampleextension1": "value",
"comexampleothervalue": 5,
"datacontenttype": "text/xml",
"data": '<much wow="xml"/>',
}
}
json_dumps = _ce_json_dumps
json_loads = _ce_json_loads
def _get_attributes(self) -> typing.Dict[str, typing.Any]:
return {
key: conversion.best_effort_encode_attribute_value(value)
for key, value in self.__dict__.items()
if key != "data"
}
def get_data(self) -> typing.Optional[typing.Any]:
return self.data
def __setitem__(self, key: str, value: typing.Any) -> None:
"""
Set event attribute value
MUST NOT set event data with this method, use `.data` member instead
Method SHOULD mimic `cloudevents.http.event.CloudEvent` interface
:param key: Event attribute name
:param value: New event attribute value
"""
if key != "data": # to mirror the behaviour of the http event
setattr(self, key, value)
else:
pass # It is de-facto ignored by the http event
def __delitem__(self, key: str) -> None:
"""
SHOULD raise `KeyError` if no event attribute for the given key exists.
Method SHOULD mimic `cloudevents.http.event.CloudEvent` interface
:param key: The event attribute name.
"""
if key == "data":
raise KeyError(key) # to mirror the behaviour of the http event
delattr(self, key)
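The constructor's attributes-vs-kwargs handling above reduces to a small amount of dict logic; here is a stand-alone sketch, with a local stand-in for `cloudevents.exceptions.IncompatibleArgumentsError`:

```python
class IncompatibleArgumentsError(ValueError):
    """Local stand-in for cloudevents.exceptions.IncompatibleArgumentsError."""


def merge_attributes(attributes=None, **kwargs):
    # Mixing an attributes dict with keyword attributes is rejected outright;
    # otherwise attribute names are lower-cased before use.
    if attributes:
        if kwargs:
            raise IncompatibleArgumentsError(
                "Attributes dict and kwargs are incompatible."
            )
        kwargs = {k.lower(): v for k, v in attributes.items()}
    return kwargs
```

Lower-casing keeps lookups case-insensitive, matching the CloudEvents rule that attribute names are lowercase on the wire.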


@ -0,0 +1,18 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from cloudevents.pydantic.v2.conversion import from_dict, from_http, from_json
from cloudevents.pydantic.v2.event import CloudEvent
__all__ = ["CloudEvent", "from_json", "from_dict", "from_http"]


@ -0,0 +1,77 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import typing
from cloudevents.conversion import from_dict as _abstract_from_dict
from cloudevents.conversion import from_http as _abstract_from_http
from cloudevents.conversion import from_json as _abstract_from_json
from cloudevents.pydantic.v2.event import CloudEvent
from cloudevents.sdk import types
def from_http(
headers: typing.Union[
typing.Mapping[str, str], types.SupportsDuplicateItems[str, str]
],
data: typing.Optional[typing.AnyStr],
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> CloudEvent:
"""
Parses CloudEvent `data` and `headers` into a CloudEvent.
The method supports both binary and structured representations.
:param headers: The HTTP request headers.
:param data: The HTTP request body. If set to None, "" or b'', the returned
event's `data` field will be set to None.
:param data_unmarshaller: Callable function to map data to a python object
e.g. lambda x: x or lambda x: json.loads(x)
:returns: A CloudEvent parsed from the passed HTTP parameters
"""
return _abstract_from_http(
headers=headers,
data=data,
data_unmarshaller=data_unmarshaller,
event_type=CloudEvent,
)
def from_json(
data: typing.AnyStr,
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> CloudEvent:
"""
Parses JSON string `data` into a CloudEvent.
:param data: JSON string representation of a CloudEvent.
:param data_unmarshaller: Callable function that casts `data` to a
Python object.
:returns: A CloudEvent parsed from the given JSON representation.
"""
return _abstract_from_json(
data=data, data_unmarshaller=data_unmarshaller, event_type=CloudEvent
)
def from_dict(
event: typing.Mapping[str, typing.Any],
) -> CloudEvent:
"""
Construct a CloudEvent from a dict `event` representation.
:param event: The event represented as a dict.
:returns: A CloudEvent parsed from the given dict representation.
"""
return _abstract_from_dict(CloudEvent, event)
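For orientation, the two HTTP encodings that `from_http` accepts can be sketched with plain dicts; the header and body values below are illustrative, not taken from a real request:

```python
import json

# Structured mode: the entire event, attributes and data, travels in the body.
structured_headers = {"content-type": "application/cloudevents+json"}
structured_body = json.dumps(
    {
        "specversion": "1.0",
        "type": "com.example.demo",
        "source": "https://example.com/source",
        "id": "A234-1234-1234",
        "data": {"msg": "hello"},
    }
)

# Binary mode: attributes travel as ce-* headers, only the data is the body.
binary_headers = {
    "ce-specversion": "1.0",
    "ce-type": "com.example.demo",
    "ce-source": "https://example.com/source",
    "ce-id": "A234-1234-1234",
    "content-type": "application/json",
}
binary_body = json.dumps({"msg": "hello"})
```

Either pair of headers and body would be handed to `from_http(headers, data)` unchanged; the converter machinery decides which mode applies.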


@ -0,0 +1,248 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import datetime
import json
import typing
from typing import Any
from pydantic.deprecated import parse as _deprecated_parse
from cloudevents.exceptions import PydanticFeatureNotInstalled
from cloudevents.pydantic.fields_docs import FIELD_DESCRIPTIONS
try:
from pydantic import BaseModel, ConfigDict, Field, model_serializer
except ImportError: # pragma: no cover # hard to test
raise PydanticFeatureNotInstalled(
"CloudEvents pydantic feature is not installed. "
"Install it using pip install cloudevents[pydantic]"
)
from cloudevents import abstract, conversion
from cloudevents.exceptions import IncompatibleArgumentsError
from cloudevents.sdk.event import attribute
class CloudEvent(abstract.CloudEvent, BaseModel): # type: ignore
"""
A Python-friendly CloudEvent representation backed by Pydantic-modeled fields.
Supports both binary and structured modes of the CloudEvents v1 specification.
"""
@classmethod
def create(
cls,
attributes: typing.Mapping[str, typing.Any],
data: typing.Optional[typing.Any],
) -> "CloudEvent":
return cls(attributes, data)
data: typing.Optional[typing.Any] = Field(
title=FIELD_DESCRIPTIONS["data"].get("title"),
description=FIELD_DESCRIPTIONS["data"].get("description"),
examples=[FIELD_DESCRIPTIONS["data"].get("example")],
default=None,
)
source: str = Field(
title=FIELD_DESCRIPTIONS["source"].get("title"),
description=FIELD_DESCRIPTIONS["source"].get("description"),
examples=[FIELD_DESCRIPTIONS["source"].get("example")],
)
id: str = Field(
title=FIELD_DESCRIPTIONS["id"].get("title"),
description=FIELD_DESCRIPTIONS["id"].get("description"),
examples=[FIELD_DESCRIPTIONS["id"].get("example")],
default_factory=attribute.default_id_selection_algorithm,
)
type: str = Field(
title=FIELD_DESCRIPTIONS["type"].get("title"),
description=FIELD_DESCRIPTIONS["type"].get("description"),
examples=[FIELD_DESCRIPTIONS["type"].get("example")],
)
specversion: attribute.SpecVersion = Field(
title=FIELD_DESCRIPTIONS["specversion"].get("title"),
description=FIELD_DESCRIPTIONS["specversion"].get("description"),
examples=[FIELD_DESCRIPTIONS["specversion"].get("example")],
default=attribute.DEFAULT_SPECVERSION,
)
time: typing.Optional[datetime.datetime] = Field(
title=FIELD_DESCRIPTIONS["time"].get("title"),
description=FIELD_DESCRIPTIONS["time"].get("description"),
examples=[FIELD_DESCRIPTIONS["time"].get("example")],
default_factory=attribute.default_time_selection_algorithm,
)
subject: typing.Optional[str] = Field(
title=FIELD_DESCRIPTIONS["subject"].get("title"),
description=FIELD_DESCRIPTIONS["subject"].get("description"),
examples=[FIELD_DESCRIPTIONS["subject"].get("example")],
default=None,
)
datacontenttype: typing.Optional[str] = Field(
title=FIELD_DESCRIPTIONS["datacontenttype"].get("title"),
description=FIELD_DESCRIPTIONS["datacontenttype"].get("description"),
examples=[FIELD_DESCRIPTIONS["datacontenttype"].get("example")],
default=None,
)
dataschema: typing.Optional[str] = Field(
title=FIELD_DESCRIPTIONS["dataschema"].get("title"),
description=FIELD_DESCRIPTIONS["dataschema"].get("description"),
examples=[FIELD_DESCRIPTIONS["dataschema"].get("example")],
default=None,
)
def __init__( # type: ignore[no-untyped-def]
self,
attributes: typing.Optional[typing.Mapping[str, typing.Any]] = None,
data: typing.Optional[typing.Any] = None,
**kwargs,
):
"""
:param attributes: A dict with CloudEvent attributes.
Minimally expects the attributes 'type' and 'source'. If the
attributes 'specversion', 'id' or 'time' are not given, they are
created with default values.
If no attributes dict is given, the class MUST use the kwargs as the attributes.
Example Attributes:
{
"specversion": "1.0",
"type": "com.github.pull_request.opened",
"source": "https://github.com/cloudevents/spec/pull",
"id": "A234-1234-1234",
"time": "2018-04-05T17:31:00Z",
}
:param data: Domain-specific information about the occurrence.
"""
if attributes:
if len(kwargs) != 0:
# To prevent API complexity and confusion.
raise IncompatibleArgumentsError(
"Attributes dict and kwargs are incompatible."
)
attributes = {k.lower(): v for k, v in attributes.items()}
kwargs.update(attributes)
super().__init__(data=data, **kwargs)
model_config = ConfigDict(
extra="allow", # this is the way we implement extensions
json_schema_extra={
"example": {
"specversion": "1.0",
"type": "com.github.pull_request.opened",
"source": "https://github.com/cloudevents/spec/pull",
"subject": "123",
"id": "A234-1234-1234",
"time": "2018-04-05T17:31:00Z",
"comexampleextension1": "value",
"comexampleothervalue": 5,
"datacontenttype": "text/xml",
"data": '<much wow="xml"/>',
}
},
)
"""
We should use a @model_validator decorator to handle JSON deserialisation,
however it's not possible to completely bypass the internal pydantic logic
and still use the CloudEvents shared conversion logic.
Same issue applies to the multiple from/to JSON conversion logic in the
@model_serializer implemented after
To remove the need for the multiple from/to JSON transformation we need
major refactor in the SDK conversion logic.
"""
@classmethod
def model_validate_json(
cls,
json_data: typing.Union[str, bytes, bytearray],
*,
strict: typing.Optional[bool] = None,
context: typing.Optional[typing.Dict[str, Any]] = None,
by_alias: typing.Optional[bool] = None,
by_name: typing.Optional[bool] = None,
) -> "CloudEvent":
return conversion.from_json(cls, json_data)
@classmethod
def parse_raw(
cls,
b: typing.Union[str, bytes],
*,
content_type: typing.Optional[str] = None,
encoding: str = "utf8",
proto: typing.Optional[_deprecated_parse.Protocol] = None,
allow_pickle: bool = False,
) -> "CloudEvent":
return conversion.from_json(cls, b)
@model_serializer(when_used="json")
def _ce_json_dumps(self) -> typing.Dict[str, typing.Any]:
"""Performs Pydantic-specific serialization of the event when
serializing the model using `.model_dump_json()` method.
Needed by the pydantic base-model to serialize the event correctly to json.
Without this function the data will be incorrectly serialized.
:param self: CloudEvent.
:return: Event serialized as a standard CloudEvent dict with user specific
parameters.
"""
# Here mypy complains about json.loads returning Any
# which is incompatible with this method return type
# but we know it's always a dictionary in this case
return json.loads(conversion.to_json(self)) # type: ignore
def _get_attributes(self) -> typing.Dict[str, typing.Any]:
return {
key: conversion.best_effort_encode_attribute_value(value)
for key, value in dict(BaseModel.__iter__(self)).items()
if key not in ["data"]
}
def get_data(self) -> typing.Optional[typing.Any]:
return self.data
def __setitem__(self, key: str, value: typing.Any) -> None:
"""
Set event attribute value
MUST NOT set event data with this method, use `.data` member instead
Method SHOULD mimic `cloudevents.http.event.CloudEvent` interface
:param key: Event attribute name
:param value: New event attribute value
"""
if key != "data": # to mirror the behaviour of the http event
setattr(self, key, value)
else:
pass # It is de-facto ignored by the http event
def __delitem__(self, key: str) -> None:
"""
SHOULD raise `KeyError` if no event attribute for the given key exists.
Method SHOULD mimic `cloudevents.http.event.CloudEvent` interface
:param key: The event attribute name.
"""
if key == "data":
raise KeyError(key) # to mirror the behaviour of the http event
delattr(self, key)
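The `__setitem__`/`__delitem__` behaviour above, which mirrors `cloudevents.http.event.CloudEvent`, can be isolated into a minimal stand-alone sketch:

```python
class AttributeItemAccess:
    """Minimal sketch of the data-excluding item access shown above."""

    def __setitem__(self, key, value):
        # Attribute writes go through setattr; writes to "data" are silently
        # ignored, mirroring the HTTP event's behaviour.
        if key != "data":
            setattr(self, key, value)

    def __delitem__(self, key):
        # Deleting "data" raises KeyError; other keys are removed via delattr,
        # which itself raises AttributeError for unknown attributes.
        if key == "data":
            raise KeyError(key)
        delattr(self, key)
```

The asymmetry (silent ignore on write, KeyError on delete) is deliberate: it reproduces the HTTP event's de-facto interface rather than a conventional mapping contract.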


@ -0,0 +1,13 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.


@ -1,4 +1,4 @@
# All Rights Reserved.
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@ -11,7 +11,19 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from cloudevents.sdk.converters import binary, structured
TypeBinary = binary.BinaryHTTPCloudEventConverter.TYPE
TypeStructured = structured.JSONHTTPCloudEventConverter.TYPE
from cloudevents.sdk.converters import binary, structured
from cloudevents.sdk.converters.binary import is_binary
from cloudevents.sdk.converters.structured import is_structured
TypeBinary: str = binary.BinaryHTTPCloudEventConverter.TYPE
TypeStructured: str = structured.JSONHTTPCloudEventConverter.TYPE
__all__ = [
"binary",
"structured",
"is_binary",
"is_structured",
"TypeBinary",
"TypeStructured",
]


@ -1,4 +1,4 @@
# All Rights Reserved.
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@ -18,14 +18,13 @@ from cloudevents.sdk.event import base
class Converter(object):
TYPE = None
TYPE: str = ""
def read(
self,
event,
headers: dict,
body: typing.IO,
event: typing.Any,
headers: typing.Mapping[str, str],
body: typing.Union[str, bytes],
data_unmarshaller: typing.Callable,
) -> base.BaseEvent:
raise Exception("not implemented")
@ -33,10 +32,14 @@ class Converter(object):
def event_supported(self, event: object) -> bool:
raise Exception("not implemented")
def can_read(self, content_type: str) -> bool:
def can_read(
self,
content_type: typing.Optional[str],
headers: typing.Optional[typing.Mapping[str, str]] = None,
) -> bool:
raise Exception("not implemented")
def write(
self, event: base.BaseEvent, data_marshaller: typing.Callable
) -> (dict, object):
self, event: base.BaseEvent, data_marshaller: typing.Optional[typing.Callable]
) -> typing.Tuple[typing.Dict[str, str], bytes]:
raise Exception("not implemented")


@ -1,4 +1,4 @@
# All Rights Reserved.
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@ -22,16 +22,16 @@ from cloudevents.sdk.event import v1, v03
class BinaryHTTPCloudEventConverter(base.Converter):
TYPE = "binary"
TYPE: str = "binary"
SUPPORTED_VERSIONS = [v03.Event, v1.Event]
def can_read(
self,
content_type: str = None,
headers: typing.Dict[str, str] = {"ce-specversion": None},
content_type: typing.Optional[str] = None,
headers: typing.Optional[typing.Mapping[str, str]] = None,
) -> bool:
if headers is None:
headers = {"ce-specversion": ""}
return has_binary_headers(headers)
def event_supported(self, event: object) -> bool:
@ -40,8 +40,8 @@ class BinaryHTTPCloudEventConverter(base.Converter):
def read(
self,
event: event_base.BaseEvent,
headers: dict,
body: typing.IO,
headers: typing.Mapping[str, str],
body: typing.Union[str, bytes],
data_unmarshaller: types.UnmarshallerType,
) -> event_base.BaseEvent:
if type(event) not in self.SUPPORTED_VERSIONS:
@ -50,10 +50,26 @@ class BinaryHTTPCloudEventConverter(base.Converter):
return event
def write(
self, event: event_base.BaseEvent, data_marshaller: types.MarshallerType
) -> (dict, bytes):
self,
event: event_base.BaseEvent,
data_marshaller: typing.Optional[types.MarshallerType],
) -> typing.Tuple[typing.Dict[str, str], bytes]:
return event.MarshalBinary(data_marshaller)
def NewBinaryHTTPCloudEventConverter() -> BinaryHTTPCloudEventConverter:
return BinaryHTTPCloudEventConverter()
def is_binary(headers: typing.Mapping[str, str]) -> bool:
"""
Determines whether an event with the supplied `headers` is in binary format.
:param headers: The HTTP headers of a potential event.
:returns: True if the headers indicate a binary-mode event,
False otherwise.
"""
headers = {key.lower(): value for key, value in headers.items()}
content_type = headers.get("content-type", "")
binary_parser = BinaryHTTPCloudEventConverter()
return binary_parser.can_read(content_type=content_type, headers=headers)
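The detection logic amounts to case-insensitive header inspection. A stand-alone sketch, assuming the required `ce-*` attribute headers checked by `has_binary_headers` are `specversion`, `source`, `type`, and `id`:

```python
def looks_binary(headers):
    # HTTP header names are case-insensitive, so normalise before checking.
    lowered = {key.lower() for key in headers}
    required = {"ce-specversion", "ce-source", "ce-type", "ce-id"}
    return required <= lowered
```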


@ -1,4 +1,4 @@
# All Rights Reserved.
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@ -22,13 +22,16 @@ from cloudevents.sdk.event import base as event_base
# TODO: Singleton?
class JSONHTTPCloudEventConverter(base.Converter):
TYPE = "structured"
MIME_TYPE = "application/cloudevents+json"
TYPE: str = "structured"
MIME_TYPE: str = "application/cloudevents+json"
def can_read(
self, content_type: str, headers: typing.Dict[str, str] = {},
self,
content_type: typing.Optional[str] = None,
headers: typing.Optional[typing.Mapping[str, str]] = None,
) -> bool:
if headers is None:
headers = {}
return (
isinstance(content_type, str)
and content_type.startswith(self.MIME_TYPE)
@ -42,19 +45,35 @@ class JSONHTTPCloudEventConverter(base.Converter):
def read(
self,
event: event_base.BaseEvent,
headers: dict,
body: typing.IO,
headers: typing.Mapping[str, str],
body: typing.Union[str, bytes],
data_unmarshaller: types.UnmarshallerType,
) -> event_base.BaseEvent:
event.UnmarshalJSON(body, data_unmarshaller)
return event
def write(
self, event: event_base.BaseEvent, data_marshaller: types.MarshallerType
) -> (dict, bytes):
self,
event: event_base.BaseEvent,
data_marshaller: typing.Optional[types.MarshallerType],
) -> typing.Tuple[typing.Dict[str, str], bytes]:
http_headers = {"content-type": self.MIME_TYPE}
return http_headers, event.MarshalJSON(data_marshaller).encode("utf-8")
def NewJSONHTTPCloudEventConverter() -> JSONHTTPCloudEventConverter:
return JSONHTTPCloudEventConverter()
def is_structured(headers: typing.Mapping[str, str]) -> bool:
"""
Determines whether an event with the supplied `headers` is in a structured format.
:param headers: The HTTP headers of a potential event.
:returns: True if the headers indicate a structured-mode event,
False otherwise.
"""
headers = {key.lower(): value for key, value in headers.items()}
content_type = headers.get("content-type", "")
structured_parser = JSONHTTPCloudEventConverter()
return structured_parser.can_read(content_type=content_type, headers=headers)
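Structured-mode detection boils down to a prefix check on the normalised `content-type` header, as a stand-alone sketch shows:

```python
MIME_TYPE = "application/cloudevents+json"


def looks_structured(headers):
    # Lower-case header names, then test the media type prefix so that
    # parameters such as "; charset=utf-8" do not affect the result.
    lowered = {key.lower(): value for key, value in headers.items()}
    return lowered.get("content-type", "").startswith(MIME_TYPE)
```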


@ -1,7 +1,26 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import typing
def has_binary_headers(headers: typing.Dict[str, str]) -> bool:
def has_binary_headers(headers: typing.Mapping[str, str]) -> bool:
"""Determines if all CloudEvents required headers are presents
in the `headers`.
:returns: True if all the headers are present, False otherwise.
"""
return (
"ce-specversion" in headers
and "ce-source" in headers


@ -0,0 +1,13 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.


@ -0,0 +1,48 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import datetime
import uuid
from enum import Enum
class SpecVersion(str, Enum):
"""
The version of the CloudEvents specification which an event uses.
This enables the interpretation of the context.
Currently, this attribute will only have the 'major' and 'minor' version numbers
included in it. This allows for 'patch' changes to the specification to be made
without changing this property's value in the serialization.
"""
v0_3 = "0.3"
v1_0 = "1.0"
DEFAULT_SPECVERSION = SpecVersion.v1_0
def default_time_selection_algorithm() -> datetime.datetime:
"""
:return: A time value which will be used as CloudEvent time attribute value.
"""
return datetime.datetime.now(datetime.timezone.utc)
def default_id_selection_algorithm() -> str:
"""
:return: Globally unique id to be used as a CloudEvent id attribute value.
"""
return str(uuid.uuid4())
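Because `SpecVersion` subclasses both `str` and `Enum`, members compare equal to their plain-string values, which keeps the serialized form as `"1.0"`. A stand-alone sketch of that, together with the default selection algorithms:

```python
import datetime
import enum
import uuid


class SpecVersion(str, enum.Enum):
    # A str-based Enum compares equal to its plain-string value, so the
    # serialized form stays "1.0" rather than "SpecVersion.v1_0".
    v0_3 = "0.3"
    v1_0 = "1.0"


default_id = str(uuid.uuid4())  # globally unique id
default_time = datetime.datetime.now(datetime.timezone.utc)  # timezone-aware
```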


@ -1,4 +1,4 @@
# All Rights Reserved.
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@ -15,6 +15,7 @@
import base64
import json
import typing
from typing import Set
import cloudevents.exceptions as cloud_exceptions
from cloudevents.sdk import types
@ -23,112 +24,111 @@ from cloudevents.sdk import types
class EventGetterSetter(object): # pragma: no cover
# ce-specversion
def CloudEventVersion(self) -> str:
raise Exception("not implemented")
@property
def specversion(self):
def specversion(self) -> str:
return self.CloudEventVersion()
@specversion.setter
def specversion(self, value: str) -> None:
self.SetCloudEventVersion(value)
def SetCloudEventVersion(self, specversion: str) -> object:
raise Exception("not implemented")
@specversion.setter
def specversion(self, value: str):
self.SetCloudEventVersion(value)
# ce-type
def EventType(self) -> str:
raise Exception("not implemented")
@property
def type(self):
def type(self) -> str:
return self.EventType()
@type.setter
def type(self, value: str) -> None:
self.SetEventType(value)
def SetEventType(self, eventType: str) -> object:
raise Exception("not implemented")
@type.setter
def type(self, value: str):
self.SetEventType(value)
# ce-source
def Source(self) -> str:
raise Exception("not implemented")
@property
def source(self):
def source(self) -> str:
return self.Source()
@source.setter
def source(self, value: str) -> None:
self.SetSource(value)
def SetSource(self, source: str) -> object:
raise Exception("not implemented")
@source.setter
def source(self, value: str):
self.SetSource(value)
# ce-id
def EventID(self) -> str:
raise Exception("not implemented")
@property
def id(self):
def id(self) -> str:
return self.EventID()
@id.setter
def id(self, value: str) -> None:
self.SetEventID(value)
def SetEventID(self, eventID: str) -> object:
raise Exception("not implemented")
@id.setter
def id(self, value: str):
self.SetEventID(value)
# ce-time
def EventTime(self) -> str:
def EventTime(self) -> typing.Optional[str]:
raise Exception("not implemented")
@property
def time(self):
def time(self) -> typing.Optional[str]:
return self.EventTime()
def SetEventTime(self, eventTime: str) -> object:
raise Exception("not implemented")
@time.setter
def time(self, value: str):
def time(self, value: typing.Optional[str]) -> None:
self.SetEventTime(value)
def SetEventTime(self, eventTime: typing.Optional[str]) -> object:
raise Exception("not implemented")
# ce-schema
def SchemaURL(self) -> str:
def SchemaURL(self) -> typing.Optional[str]:
raise Exception("not implemented")
@property
def schema(self) -> str:
def schema(self) -> typing.Optional[str]:
return self.SchemaURL()
def SetSchemaURL(self, schemaURL: str) -> object:
raise Exception("not implemented")
@schema.setter
def schema(self, value: str):
def schema(self, value: typing.Optional[str]) -> None:
self.SetSchemaURL(value)
def SetSchemaURL(self, schemaURL: typing.Optional[str]) -> object:
raise Exception("not implemented")
# data
def Data(self) -> object:
def Data(self) -> typing.Optional[object]:
raise Exception("not implemented")
@property
def data(self) -> object:
def data(self) -> typing.Optional[object]:
return self.Data()
def SetData(self, data: object) -> object:
raise Exception("not implemented")
@data.setter
def data(self, value: object):
def data(self, value: typing.Optional[object]) -> None:
self.SetData(value)
def SetData(self, data: typing.Optional[object]) -> object:
raise Exception("not implemented")
# ce-extensions
def Extensions(self) -> dict:
raise Exception("not implemented")
@ -137,34 +137,38 @@ class EventGetterSetter(object): # pragma: no cover
def extensions(self) -> dict:
return self.Extensions()
@extensions.setter
def extensions(self, value: dict) -> None:
self.SetExtensions(value)
def SetExtensions(self, extensions: dict) -> object:
raise Exception("not implemented")
@extensions.setter
def extensions(self, value: dict):
self.SetExtensions(value)
     # Content-Type
-    def ContentType(self) -> str:
+    def ContentType(self) -> typing.Optional[str]:
         raise Exception("not implemented")

     @property
-    def content_type(self) -> str:
+    def content_type(self) -> typing.Optional[str]:
         return self.ContentType()

-    def SetContentType(self, contentType: str) -> object:
-        raise Exception("not implemented")
-
     @content_type.setter
-    def content_type(self, value: str):
+    def content_type(self, value: typing.Optional[str]) -> None:
         self.SetContentType(value)

+    def SetContentType(self, contentType: typing.Optional[str]) -> object:
+        raise Exception("not implemented")
 class BaseEvent(EventGetterSetter):
-    _ce_required_fields = set()
-    _ce_optional_fields = set()
+    """Base implementation of the CloudEvent."""
+
+    _ce_required_fields: Set[str] = set()
+    """A set of required CloudEvent field names."""
+
+    _ce_optional_fields: Set[str] = set()
+    """A set of optional CloudEvent field names."""

-    def Properties(self, with_nullable=False) -> dict:
+    def Properties(self, with_nullable: bool = False) -> dict:
props = dict()
for name, value in self.__dict__.items():
if str(name).startswith("ce__"):
@ -174,19 +178,18 @@ class BaseEvent(EventGetterSetter):
return props
-    def Get(self, key: str) -> (object, bool):
-        formatted_key = "ce__{0}".format(key.lower())
-        ok = hasattr(self, formatted_key)
-        value = getattr(self, formatted_key, None)
-        if not ok:
+    def Get(self, key: str) -> typing.Tuple[typing.Optional[object], bool]:
+        formatted_key: str = "ce__{0}".format(key.lower())
+        key_exists: bool = hasattr(self, formatted_key)
+        if not key_exists:
             exts = self.Extensions()
             return exts.get(key), key in exts
-        return value.get(), ok
+        value: typing.Any = getattr(self, formatted_key)
+        return value.get(), key_exists
-    def Set(self, key: str, value: object):
-        formatted_key = "ce__{0}".format(key)
-        key_exists = hasattr(self, formatted_key)
+    def Set(self, key: str, value: typing.Optional[object]) -> None:
+        formatted_key: str = "ce__{0}".format(key)
+        key_exists: bool = hasattr(self, formatted_key)
         if key_exists:
             attr = getattr(self, formatted_key)
             attr.set(value)
@ -196,20 +199,20 @@ class BaseEvent(EventGetterSetter):
exts.update({key: value})
self.Set("extensions", exts)
-    def MarshalJSON(self, data_marshaller: types.MarshallerType) -> str:
-        if data_marshaller is None:
-            data_marshaller = lambda x: x  # noqa: E731
+    def MarshalJSON(
+        self, data_marshaller: typing.Optional[types.MarshallerType]
+    ) -> str:
         props = self.Properties()
         if "data" in props:
             data = props.pop("data")
             try:
-                data = data_marshaller(data)
+                if data_marshaller:
+                    data = data_marshaller(data)
             except Exception as e:
                 raise cloud_exceptions.DataMarshallerError(
-                    "Failed to marshall data with error: "
-                    f"{type(e).__name__}('{e}')"
+                    f"Failed to marshall data with error: {type(e).__name__}('{e}')"
                 )
-            if isinstance(data, (bytes, bytes, memoryview)):
+            if isinstance(data, (bytes, bytearray, memoryview)):
                 props["data_base64"] = base64.b64encode(data).decode("ascii")
             else:
                 props["data"] = data
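The hunk above also fixes the duplicate `bytes` in the isinstance check (`bytes, bytearray, memoryview`) so binary payloads are routed into `data_base64`. A standalone sketch of that encoding rule — `marshal_props` is an illustrative helper, not part of the SDK:

```python
import base64
import json
import typing


def marshal_props(
    props: dict, marshaller: typing.Optional[typing.Callable] = None
) -> str:
    """Serialize event properties, base64-encoding any binary data payload."""
    props = dict(props)
    data = props.pop("data", None)
    if data is not None:
        if marshaller:
            data = marshaller(data)
        if isinstance(data, (bytes, bytearray, memoryview)):
            # Binary payloads go into data_base64 per the JSON event format
            props["data_base64"] = base64.b64encode(data).decode("ascii")
        else:
            props["data"] = data
    return json.dumps(props)
```

With the corrected tuple, a `bytearray` payload now takes the base64 branch instead of falling through to plain `data`.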
@ -222,7 +225,7 @@ class BaseEvent(EventGetterSetter):
self,
b: typing.Union[str, bytes],
data_unmarshaller: types.UnmarshallerType,
):
) -> None:
raw_ce = json.loads(b)
missing_fields = self._ce_required_fields - raw_ce.keys()
@ -232,33 +235,28 @@ class BaseEvent(EventGetterSetter):
)
         for name, value in raw_ce.items():
-            decoder = lambda x: x
-            if name == "data":
-                # Use the user-provided serializer, which may have customized
-                # JSON decoding
-                decoder = lambda v: data_unmarshaller(json.dumps(v))
-            if name == "data_base64":
-                decoder = lambda v: data_unmarshaller(base64.b64decode(v))
-                name = "data"
             try:
-                set_value = decoder(value)
+                if name == "data":
+                    decoded_value = data_unmarshaller(json.dumps(value))
+                elif name == "data_base64":
+                    decoded_value = data_unmarshaller(base64.b64decode(value))
+                    name = "data"
+                else:
+                    decoded_value = value
             except Exception as e:
                 raise cloud_exceptions.DataUnmarshallerError(
                     "Failed to unmarshall data with error: "
                     f"{type(e).__name__}('{e}')"
                 )
-            self.Set(name, set_value)
+            self.Set(name, decoded_value)
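The rewritten loop above replaces the reassigned `decoder` lambdas with explicit branches: `data` is round-tripped through the user-supplied unmarshaller, while `data_base64` is base64-decoded first and stored under `data`. A minimal sketch of that branch logic (`decode_field` is a hypothetical helper, not the SDK's API):

```python
import base64
import json
import typing


def decode_field(name: str, value: typing.Any, data_unmarshaller=json.loads):
    """Decode one raw CloudEvent field, returning (field_name, decoded_value)."""
    if name == "data":
        # Round-trip through JSON so a custom unmarshaller sees a string
        return "data", data_unmarshaller(json.dumps(value))
    if name == "data_base64":
        # Base64 payloads decode to bytes before unmarshalling,
        # and are stored under "data"
        return "data", data_unmarshaller(base64.b64decode(value))
    # Every other attribute is passed through unchanged
    return name, value
```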
     def UnmarshalBinary(
         self,
-        headers: dict,
-        body: typing.Union[bytes, str],
+        headers: typing.Mapping[str, str],
+        body: typing.Union[str, bytes],
         data_unmarshaller: types.UnmarshallerType,
-    ):
-        required_binary_fields = {
-            f"ce-{field}" for field in self._ce_required_fields
-        }
+    ) -> None:
+        required_binary_fields = {f"ce-{field}" for field in self._ce_required_fields}
         missing_fields = required_binary_fields - headers.keys()

         if len(missing_fields) > 0:
@@ -277,26 +275,30 @@ class BaseEvent(EventGetterSetter):
             raw_ce = data_unmarshaller(body)
         except Exception as e:
             raise cloud_exceptions.DataUnmarshallerError(
-                "Failed to unmarshall data with error: "
-                f"{type(e).__name__}('{e}')"
+                f"Failed to unmarshall data with error: {type(e).__name__}('{e}')"
             )
self.Set("data", raw_ce)
     def MarshalBinary(
-        self, data_marshaller: types.MarshallerType
-    ) -> (dict, bytes):
-        if data_marshaller is None:
+        self, data_marshaller: typing.Optional[types.MarshallerType]
+    ) -> typing.Tuple[typing.Dict[str, str], bytes]:
+        if not data_marshaller:
             data_marshaller = json.dumps
-        headers = {}
-        if self.ContentType():
-            headers["content-type"] = self.ContentType()
-        props = self.Properties()
+        headers: typing.Dict[str, str] = {}
+        content_type = self.ContentType()
+        if content_type:
+            headers["content-type"] = content_type
+        props: typing.Dict = self.Properties()
         for key, value in props.items():
-            if key not in ["data", "extensions", "contenttype"]:
+            if key not in ["data", "extensions", "datacontenttype"]:
                 if value is not None:
                     headers["ce-{0}".format(key)] = value
-        for key, value in props.get("extensions").items():
+
+        extensions = props.get("extensions")
+        if extensions is None or not isinstance(extensions, typing.Mapping):
+            raise cloud_exceptions.DataMarshallerError(
+                "No extensions are available in the binary event."
+            )
+        for key, value in extensions.items():
             headers["ce-{0}".format(key)] = value

         data, _ = self.Get("data")
@@ -304,8 +306,7 @@ class BaseEvent(EventGetterSetter):
             data = data_marshaller(data)
         except Exception as e:
             raise cloud_exceptions.DataMarshallerError(
-                "Failed to marshall data with error: "
-                f"{type(e).__name__}('{e}')"
+                f"Failed to marshall data with error: {type(e).__name__}('{e}')"
             )
         if isinstance(data, str):  # Convenience method for json.dumps
             data = data.encode("utf-8")
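The binary-mode marshalling above maps every non-payload attribute to a `ce-`-prefixed header, sends the content type as a plain `content-type` header, and appends one `ce-` header per extension. A sketch of that mapping, with an assumed free-standing helper:

```python
import typing


def to_binary_headers(
    props: typing.Dict[str, typing.Any],
    content_type: typing.Optional[str] = None,
) -> typing.Dict[str, str]:
    """Map CloudEvent attributes to ce-prefixed binary-mode HTTP headers."""
    headers: typing.Dict[str, str] = {}
    if content_type:
        headers["content-type"] = content_type
    for key, value in props.items():
        # data travels in the body; datacontenttype becomes content-type
        if key not in ("data", "extensions", "datacontenttype") and value is not None:
            headers["ce-{0}".format(key)] = value
    for key, value in props.get("extensions", {}).items():
        headers["ce-{0}".format(key)] = value
    return headers
```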


@@ -1,4 +1,4 @@
-# All Rights Reserved.
+# Copyright 2018-Present The CloudEvents Authors
 #
 # Licensed under the Apache License, Version 2.0 (the "License"); you may
 # not use this file except in compliance with the License. You may obtain
@@ -11,29 +11,36 @@
 # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
 # License for the specific language governing permissions and limitations
 # under the License.
+import typing
+from typing import Any


-class Option(object):
-    def __init__(self, name, value, is_required):
-        self.name = name
-        self.value = value
-        self.is_required = is_required
+class Option:
+    """A value holder of CloudEvents extensions."""

-    def set(self, new_value):
+    def __init__(self, name: str, value: typing.Optional[Any], is_required: bool):
+        self.name: str = name
+        """The name of the option."""
+        self.value: Any = value
+        """The value of the option."""
+        self.is_required: bool = is_required
+        """Determines if the option value must be present."""
+
+    def set(self, new_value: typing.Optional[Any]) -> None:
+        """Sets given new value as the value of this option."""
         is_none = new_value is None
         if self.is_required and is_none:
             raise ValueError(
-                "Attribute value error: '{0}', "
-                ""
-                "invalid new value.".format(self.name)
+                "Attribute value error: '{0}', invalid new value.".format(self.name)
             )
         self.value = new_value

-    def get(self):
+    def get(self) -> typing.Optional[Any]:
+        """Returns the value of this option."""
         return self.value

     def required(self):
+        """Determines if the option value must be present."""
         return self.is_required
def __eq__(self, obj):
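The `Option` holder above guards required attributes against being cleared. A runnable reduction of the same idea (docstring-attribute lines omitted), with a usage example:

```python
import typing


class Option:
    """A value holder that refuses to clear a required attribute."""

    def __init__(
        self, name: str, value: typing.Optional[typing.Any], is_required: bool
    ):
        self.name = name
        self.value = value
        self.is_required = is_required

    def set(self, new_value: typing.Optional[typing.Any]) -> None:
        # Required options may change, but never to None
        if self.is_required and new_value is None:
            raise ValueError(
                "Attribute value error: '{0}', invalid new value.".format(self.name)
            )
        self.value = new_value

    def get(self) -> typing.Optional[typing.Any]:
        return self.value


# Required "id" can be replaced but not cleared; optional "subject" can be.
opt = Option("id", "abc", True)
opt.set("xyz")
```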


@@ -1,4 +1,4 @@
-# All Rights Reserved.
+# Copyright 2018-Present The CloudEvents Authors
 #
 # Licensed under the Apache License, Version 2.0 (the "License"); you may
 # not use this file except in compliance with the License. You may obtain
@@ -11,6 +11,7 @@
 # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
 # License for the specific language governing permissions and limitations
 # under the License.
+import typing

 from cloudevents.sdk.event import base, opt

@@ -33,9 +34,7 @@ class Event(base.BaseEvent):
         self.ce__type = opt.Option("type", None, True)
         self.ce__datacontenttype = opt.Option("datacontenttype", None, False)
-        self.ce__datacontentencoding = opt.Option(
-            "datacontentencoding", None, False
-        )
+        self.ce__datacontentencoding = opt.Option("datacontentencoding", None, False)
         self.ce__subject = opt.Option("subject", None, False)
         self.ce__time = opt.Option("time", None, False)
         self.ce__schemaurl = opt.Option("schemaurl", None, False)
@ -43,37 +42,55 @@ class Event(base.BaseEvent):
self.ce__extensions = opt.Option("extensions", dict(), False)
     def CloudEventVersion(self) -> str:
-        return self.ce__specversion.get()
+        return str(self.ce__specversion.get())

     def EventType(self) -> str:
-        return self.ce__type.get()
+        return str(self.ce__type.get())

     def Source(self) -> str:
-        return self.ce__source.get()
+        return str(self.ce__source.get())

     def EventID(self) -> str:
-        return self.ce__id.get()
+        return str(self.ce__id.get())
-    def EventTime(self) -> str:
-        return self.ce__time.get()
+    def EventTime(self) -> typing.Optional[str]:
+        result = self.ce__time.get()
+        if result is None:
+            return None
+        return str(result)

-    def Subject(self) -> str:
-        return self.ce__subject.get()
+    def Subject(self) -> typing.Optional[str]:
+        result = self.ce__subject.get()
+        if result is None:
+            return None
+        return str(result)

-    def SchemaURL(self) -> str:
-        return self.ce__schemaurl.get()
+    def SchemaURL(self) -> typing.Optional[str]:
+        result = self.ce__schemaurl.get()
+        if result is None:
+            return None
+        return str(result)

-    def Data(self) -> object:
+    def Data(self) -> typing.Optional[object]:
         return self.ce__data.get()

     def Extensions(self) -> dict:
-        return self.ce__extensions.get()
+        result = self.ce__extensions.get()
+        if result is None:
+            return {}
+        return dict(result)

-    def ContentType(self) -> str:
-        return self.ce__datacontenttype.get()
+    def ContentType(self) -> typing.Optional[str]:
+        result = self.ce__datacontenttype.get()
+        if result is None:
+            return None
+        return str(result)

-    def ContentEncoding(self) -> str:
-        return self.ce__datacontentencoding.get()
+    def ContentEncoding(self) -> typing.Optional[str]:
+        result = self.ce__datacontentencoding.get()
+        if result is None:
+            return None
+        return str(result)
def SetEventType(self, eventType: str) -> base.BaseEvent:
self.Set("type", eventType)
@ -87,54 +104,56 @@ class Event(base.BaseEvent):
self.Set("id", eventID)
return self
-    def SetEventTime(self, eventTime: str) -> base.BaseEvent:
+    def SetEventTime(self, eventTime: typing.Optional[str]) -> base.BaseEvent:
         self.Set("time", eventTime)
         return self

-    def SetSubject(self, subject: str) -> base.BaseEvent:
+    def SetSubject(self, subject: typing.Optional[str]) -> base.BaseEvent:
         self.Set("subject", subject)
         return self

-    def SetSchemaURL(self, schemaURL: str) -> base.BaseEvent:
+    def SetSchemaURL(self, schemaURL: typing.Optional[str]) -> base.BaseEvent:
         self.Set("schemaurl", schemaURL)
         return self

-    def SetData(self, data: object) -> base.BaseEvent:
+    def SetData(self, data: typing.Optional[object]) -> base.BaseEvent:
         self.Set("data", data)
         return self

-    def SetExtensions(self, extensions: dict) -> base.BaseEvent:
+    def SetExtensions(self, extensions: typing.Optional[dict]) -> base.BaseEvent:
         self.Set("extensions", extensions)
         return self

-    def SetContentType(self, contentType: str) -> base.BaseEvent:
+    def SetContentType(self, contentType: typing.Optional[str]) -> base.BaseEvent:
         self.Set("datacontenttype", contentType)
         return self

-    def SetContentEncoding(self, contentEncoding: str) -> base.BaseEvent:
+    def SetContentEncoding(
+        self, contentEncoding: typing.Optional[str]
+    ) -> base.BaseEvent:
         self.Set("datacontentencoding", contentEncoding)
         return self
     @property
-    def datacontentencoding(self):
+    def datacontentencoding(self) -> typing.Optional[str]:
         return self.ContentEncoding()

     @datacontentencoding.setter
-    def datacontentencoding(self, value: str):
+    def datacontentencoding(self, value: typing.Optional[str]) -> None:
         self.SetContentEncoding(value)

     @property
-    def subject(self) -> str:
+    def subject(self) -> typing.Optional[str]:
         return self.Subject()

     @subject.setter
-    def subject(self, value: str):
+    def subject(self, value: typing.Optional[str]) -> None:
         self.SetSubject(value)

     @property
-    def schema_url(self) -> str:
+    def schema_url(self) -> typing.Optional[str]:
         return self.SchemaURL()

     @schema_url.setter
-    def schema_url(self, value: str):
+    def schema_url(self, value: typing.Optional[str]) -> None:
         self.SetSchemaURL(value)


@@ -1,4 +1,4 @@
-# All Rights Reserved.
+# Copyright 2018-Present The CloudEvents Authors
 #
 # Licensed under the Apache License, Version 2.0 (the "License"); you may
 # not use this file except in compliance with the License. You may obtain
@@ -11,9 +11,15 @@
 # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
 # License for the specific language governing permissions and limitations
 # under the License.
+from __future__ import annotations
+
 import typing

 from cloudevents.sdk.event import base, opt

+if typing.TYPE_CHECKING:
+    from typing_extensions import Self
class Event(base.BaseEvent):
_ce_required_fields = {"id", "source", "type", "specversion"}
@ -34,83 +40,98 @@ class Event(base.BaseEvent):
self.ce__extensions = opt.Option("extensions", dict(), False)
     def CloudEventVersion(self) -> str:
-        return self.ce__specversion.get()
+        return str(self.ce__specversion.get())

     def EventType(self) -> str:
-        return self.ce__type.get()
+        return str(self.ce__type.get())

     def Source(self) -> str:
-        return self.ce__source.get()
+        return str(self.ce__source.get())

     def EventID(self) -> str:
-        return self.ce__id.get()
+        return str(self.ce__id.get())

-    def EventTime(self) -> str:
-        return self.ce__time.get()
+    def EventTime(self) -> typing.Optional[str]:
+        result = self.ce__time.get()
+        if result is None:
+            return None
+        return str(result)

-    def Subject(self) -> str:
-        return self.ce__subject.get()
+    def Subject(self) -> typing.Optional[str]:
+        result = self.ce__subject.get()
+        if result is None:
+            return None
+        return str(result)

-    def Schema(self) -> str:
-        return self.ce__dataschema.get()
+    def Schema(self) -> typing.Optional[str]:
+        result = self.ce__dataschema.get()
+        if result is None:
+            return None
+        return str(result)

-    def ContentType(self) -> str:
-        return self.ce__datacontenttype.get()
+    def ContentType(self) -> typing.Optional[str]:
+        result = self.ce__datacontenttype.get()
+        if result is None:
+            return None
+        return str(result)

-    def Data(self) -> object:
+    def Data(self) -> typing.Optional[object]:
         return self.ce__data.get()

     def Extensions(self) -> dict:
-        return self.ce__extensions.get()
+        result = self.ce__extensions.get()
+        if result is None:
+            return {}
+        return dict(result)
-    def SetEventType(self, eventType: str) -> base.BaseEvent:
+    def SetEventType(self, eventType: str) -> Self:
         self.Set("type", eventType)
         return self

-    def SetSource(self, source: str) -> base.BaseEvent:
+    def SetSource(self, source: str) -> Self:
         self.Set("source", source)
         return self

-    def SetEventID(self, eventID: str) -> base.BaseEvent:
+    def SetEventID(self, eventID: str) -> Self:
         self.Set("id", eventID)
         return self

-    def SetEventTime(self, eventTime: str) -> base.BaseEvent:
+    def SetEventTime(self, eventTime: typing.Optional[str]) -> Self:
         self.Set("time", eventTime)
         return self

-    def SetSubject(self, subject: str) -> base.BaseEvent:
+    def SetSubject(self, subject: typing.Optional[str]) -> Self:
         self.Set("subject", subject)
         return self

-    def SetSchema(self, schema: str) -> base.BaseEvent:
+    def SetSchema(self, schema: typing.Optional[str]) -> Self:
         self.Set("dataschema", schema)
         return self

-    def SetContentType(self, contentType: str) -> base.BaseEvent:
+    def SetContentType(self, contentType: typing.Optional[str]) -> Self:
         self.Set("datacontenttype", contentType)
         return self

-    def SetData(self, data: object) -> base.BaseEvent:
+    def SetData(self, data: typing.Optional[object]) -> Self:
         self.Set("data", data)
         return self

-    def SetExtensions(self, extensions: dict) -> base.BaseEvent:
+    def SetExtensions(self, extensions: typing.Optional[dict]) -> Self:
         self.Set("extensions", extensions)
         return self
     @property
-    def schema(self) -> str:
+    def schema(self) -> typing.Optional[str]:
         return self.Schema()

     @schema.setter
-    def schema(self, value: str):
+    def schema(self, value: typing.Optional[str]) -> None:
         self.SetSchema(value)

     @property
-    def subject(self) -> str:
+    def subject(self) -> typing.Optional[str]:
         return self.Subject()

     @subject.setter
-    def subject(self, value: str):
+    def subject(self, value: typing.Optional[str]) -> None:
         self.SetSubject(value)
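The `Self` return type used in the v1 setters is what lets subclasses keep their own type through a chain of `Set*()` calls; the same effect can be sketched on older Pythons with a bound TypeVar. All class and method names below are illustrative, not the SDK's:

```python
import typing

TEvent = typing.TypeVar("TEvent", bound="Event")


class Event:
    """Minimal chainable event; each Set*() returns the caller's own type."""

    def __init__(self) -> None:
        self.attributes: typing.Dict[str, object] = {}

    def SetEventType(self: TEvent, event_type: str) -> TEvent:
        self.attributes["type"] = event_type
        return self

    def SetSource(self: TEvent, source: str) -> TEvent:
        self.attributes["source"] = source
        return self


class TracedEvent(Event):
    """A subclass adding one extra chainable setter."""

    def SetTraceId(self, trace_id: str) -> "TracedEvent":
        self.attributes["traceid"] = trace_id
        return self


# With a BaseEvent-typed return, mixing base and subclass setters in one
# chain would fail type checking; the self-typed return keeps TracedEvent.
event = TracedEvent().SetEventType("demo.event").SetTraceId("abc123")
```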


@@ -1,4 +1,4 @@
-# All Rights Reserved.
+# Copyright 2018-Present The CloudEvents Authors
 #
 # Licensed under the Apache License, Version 2.0 (the "License"); you may
 # not use this file except in compliance with the License. You may obtain


@@ -1,4 +1,4 @@
-# All Rights Reserved.
+# Copyright 2018-Present The CloudEvents Authors
 #
 # Licensed under the Apache License, Version 2.0 (the "License"); you may
 # not use this file except in compliance with the License. You may obtain
@ -26,36 +26,34 @@ class HTTPMarshaller(object):
API of this class designed to work with CloudEvent (upstream and v0.1)
"""
-    def __init__(self, converters: typing.List[base.Converter]):
+    def __init__(self, converters: typing.Sequence[base.Converter]):
         """
         CloudEvent HTTP marshaller constructor

         :param converters: a list of HTTP-to-CloudEvent-to-HTTP constructors
         :type converters: typing.List[base.Converter]
         """
-        self.http_converters = [c for c in converters]
-        self.http_converters_by_type = {c.TYPE: c for c in converters}
+        self.http_converters: typing.List[base.Converter] = [c for c in converters]
+        self.http_converters_by_type: typing.Dict[str, base.Converter] = {
+            c.TYPE: c for c in converters
+        }
     def FromRequest(
         self,
         event: event_base.BaseEvent,
-        headers: dict,
+        headers: typing.Mapping[str, str],
         body: typing.Union[str, bytes],
-        data_unmarshaller: types.UnmarshallerType = json.loads,
+        data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
     ) -> event_base.BaseEvent:
         """
         Reads a CloudEvent from HTTP headers and a request body

         :param event: CloudEvent placeholder
         :type event: cloudevents.sdk.event.base.BaseEvent

         :param headers: dict-like HTTP headers
         :type headers: dict

         :param body: an HTTP request body as a string or bytes
         :type body: typing.Union[str, bytes]

-        :param data_unmarshaller: a callable-like
-            unmarshaller the CloudEvent data
+        :param data_unmarshaller: a callable-like unmarshaller for the CloudEvent data

         :return: a CloudEvent
         :rtype: event_base.BaseEvent
         """
-        if not isinstance(data_unmarshaller, typing.Callable):
+        if not data_unmarshaller:
+            data_unmarshaller = json.loads
+        if not callable(data_unmarshaller):
             raise exceptions.InvalidDataUnmarshaller()
# Lower all header keys
@ -77,23 +75,17 @@ class HTTPMarshaller(object):
     def ToRequest(
         self,
         event: event_base.BaseEvent,
-        converter_type: str = None,
-        data_marshaller: types.MarshallerType = None,
-    ) -> (dict, bytes):
+        converter_type: typing.Optional[str] = None,
+        data_marshaller: typing.Optional[types.MarshallerType] = None,
+    ) -> typing.Tuple[typing.Dict[str, str], bytes]:
         """
         Writes a CloudEvent into an HTTP-ready form of headers and request body

         :param event: CloudEvent
         :type event: event_base.BaseEvent

         :param converter_type: a type of CloudEvent-to-HTTP converter
         :type converter_type: str

         :param data_marshaller: a callable-like marshaller for CloudEvent data
         :type data_marshaller: typing.Callable

         :return: dict of HTTP headers and stream of HTTP request body
         :rtype: tuple
         """
-        if data_marshaller is not None and not isinstance(
-            data_marshaller, typing.Callable
-        ):
+        if data_marshaller is not None and not callable(data_marshaller):
             raise exceptions.InvalidDataMarshaller()

         if converter_type is None:
@ -108,10 +100,9 @@ class HTTPMarshaller(object):
 def NewDefaultHTTPMarshaller() -> HTTPMarshaller:
     """
-    Creates the default HTTP marshaller with both structured
-    and binary converters
+    Creates the default HTTP marshaller with both structured and binary converters.

     :return: an instance of HTTP marshaller
     :rtype: cloudevents.sdk.marshaller.HTTPMarshaller
     """
     return HTTPMarshaller(
         [
@ -122,14 +113,13 @@ def NewDefaultHTTPMarshaller() -> HTTPMarshaller:
 def NewHTTPMarshaller(
-    converters: typing.List[base.Converter],
+    converters: typing.Sequence[base.Converter],
 ) -> HTTPMarshaller:
     """
-    Creates the default HTTP marshaller with both
-    structured and binary converters
+    Creates the default HTTP marshaller with both structured and binary converters.

     :param converters: a list of CloudEvent-to-HTTP-to-CloudEvent converters
     :type converters: typing.List[base.Converter]

     :return: an instance of HTTP marshaller
     :rtype: cloudevents.sdk.marshaller.HTTPMarshaller
     """
     return HTTPMarshaller(converters)
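`HTTPMarshaller.__init__` above keeps the converters twice: as an ordered list, for content-based detection, and keyed by `TYPE`, for explicit lookup. A toy registry showing why both views are kept (all names here are illustrative, not the SDK's):

```python
import typing


class Converter:
    TYPE: str = ""


class BinaryConverter(Converter):
    TYPE = "binary"


class StructuredConverter(Converter):
    TYPE = "structured"


class Marshaller:
    """Keeps converters ordered for fallback detection and keyed by TYPE
    for explicit lookup, mirroring HTTPMarshaller's two attributes."""

    def __init__(self, converters: typing.Sequence[Converter]):
        self.converters: typing.List[Converter] = list(converters)
        self.by_type: typing.Dict[str, Converter] = {c.TYPE: c for c in converters}

    def select(self, converter_type: typing.Optional[str] = None) -> Converter:
        if converter_type is None:
            # No explicit type: fall back to the first registered converter
            return self.converters[0]
        if converter_type not in self.by_type:
            raise ValueError("unsupported converter type: " + converter_type)
        return self.by_type[converter_type]
```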


@@ -1,4 +1,4 @@
-# All Rights Reserved.
+# Copyright 2018-Present The CloudEvents Authors
 #
 # Licensed under the Apache License, Version 2.0 (the "License"); you may
 # not use this file except in compliance with the License. You may obtain
@@ -14,12 +14,25 @@
 import typing

+_K_co = typing.TypeVar("_K_co", covariant=True)
+_V_co = typing.TypeVar("_V_co", covariant=True)

-# Use consistent types for marshal and unmarshal functions across
-# both JSON and Binary format.
-
-MarshallerType = typing.Optional[
-    typing.Callable[[typing.Any], typing.Union[bytes, str]]
-]
-
-UnmarshallerType = typing.Optional[
-    typing.Callable[[typing.Union[bytes, str]], typing.Any]
-]
+MarshallerType = typing.Callable[[typing.Any], typing.AnyStr]
+UnmarshallerType = typing.Callable[[typing.AnyStr], typing.Any]


+class SupportsDuplicateItems(typing.Protocol[_K_co, _V_co]):
+    """
+    Dict-like objects with an items() method that may produce duplicate keys.
+    """
+
+    # This is wider than _typeshed.SupportsItems, which expects items() to
+    # return an AbstractSet. werkzeug's Headers class satisfies this type,
+    # but not _typeshed.SupportsItems.
+
+    def items(self) -> typing.Iterable[typing.Tuple[_K_co, _V_co]]:
+        pass
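`SupportsDuplicateItems` is deliberately wider than `_typeshed.SupportsItems`: any object whose `items()` yields key/value pairs qualifies, even when keys repeat, as in multi-value HTTP header maps. A sketch of a conforming class and a consumer (`MultiHeaders` and `flatten` are illustrative, not part of the SDK):

```python
import typing

_K_co = typing.TypeVar("_K_co", covariant=True)
_V_co = typing.TypeVar("_V_co", covariant=True)


class SupportsDuplicateItems(typing.Protocol[_K_co, _V_co]):
    """Dict-like objects whose items() may yield duplicate keys."""

    def items(self) -> typing.Iterable[typing.Tuple[_K_co, _V_co]]:
        ...


class MultiHeaders:
    """A list-backed header map; unlike dict, it can repeat a key."""

    def __init__(self) -> None:
        self._pairs: typing.List[typing.Tuple[str, str]] = []

    def add(self, key: str, value: str) -> None:
        self._pairs.append((key, value))

    def items(self) -> typing.Iterable[typing.Tuple[str, str]]:
        return list(self._pairs)


def flatten(headers: "SupportsDuplicateItems[str, str]") -> typing.Dict[str, str]:
    # Later duplicates win, as in a plain dict() construction
    return {k: v for k, v in headers.items()}
```

A plain `dict` also satisfies the protocol structurally, since it has a compatible `items()`.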


@ -0,0 +1,13 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.


@@ -1,4 +1,4 @@
-# All Rights Reserved.
+# Copyright 2018-Present The CloudEvents Authors
 #
 # Licensed under the Apache License, Version 2.0 (the "License"); you may
 # not use this file except in compliance with the License. You may obtain
@@ -14,11 +14,11 @@
 from cloudevents.sdk.event import v1, v03

-contentType = "application/json"
+content_type = "application/json"
 ce_type = "word.found.exclamation"
 ce_id = "16fb5f0b-211e-1102-3dfe-ea6e2806f124"
 source = "pytest"
-eventTime = "2018-10-23T12:28:23.3464579Z"
+event_time = "2018-10-23T12:28:23.3464579Z"
 body = '{"name":"john"}'

 headers = {
@@ -26,17 +26,17 @@ headers = {
         "ce-specversion": "1.0",
         "ce-type": ce_type,
         "ce-id": ce_id,
-        "ce-time": eventTime,
+        "ce-time": event_time,
         "ce-source": source,
-        "Content-Type": contentType,
+        "Content-Type": content_type,
     },
     v1.Event: {
         "ce-specversion": "1.0",
         "ce-type": ce_type,
         "ce-id": ce_id,
-        "ce-time": eventTime,
+        "ce-time": event_time,
         "ce-source": source,
-        "Content-Type": contentType,
+        "Content-Type": content_type,
     },
 }

@@ -45,16 +45,16 @@ json_ce = {
         "specversion": "1.0",
         "type": ce_type,
         "id": ce_id,
-        "time": eventTime,
+        "time": event_time,
         "source": source,
-        "datacontenttype": contentType,
+        "datacontenttype": content_type,
     },
     v1.Event: {
         "specversion": "1.0",
         "type": ce_type,
         "id": ce_id,
-        "time": eventTime,
+        "time": event_time,
         "source": source,
-        "datacontenttype": contentType,
+        "datacontenttype": content_type,
     },
 }


@ -0,0 +1,73 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import pytest
from cloudevents.conversion import _best_effort_serialize_to_json
from cloudevents.http import CloudEvent
@pytest.fixture()
def dummy_event():
return CloudEvent({"type": "dummy", "source": "dummy"})
def test_json_methods(dummy_event):
from cloudevents.conversion import to_json
from cloudevents.http.conversion import from_json
from cloudevents.http.json_methods import from_json as deprecated_from_json
from cloudevents.http.json_methods import to_json as deprecated_to_json
assert from_json(to_json(dummy_event)) == deprecated_from_json(
deprecated_to_json(dummy_event)
)
def test_http_methods(dummy_event):
from cloudevents.http import from_http, to_binary, to_structured
from cloudevents.http.http_methods import from_http as deprecated_from_http
from cloudevents.http.http_methods import to_binary as deprecated_to_binary
from cloudevents.http.http_methods import to_structured as deprecated_to_structured
assert from_http(*to_binary(dummy_event)) == deprecated_from_http(
*deprecated_to_binary(dummy_event)
)
assert from_http(*to_structured(dummy_event)) == deprecated_from_http(
*deprecated_to_structured(dummy_event)
)
def test_util():
from cloudevents.http.util import default_marshaller # noqa
assert _best_effort_serialize_to_json(None) == default_marshaller(None)
def test_event_type():
from cloudevents.http.event_type import is_binary, is_structured # noqa
def test_http_module_imports():
from cloudevents.http import ( # noqa
CloudEvent,
from_dict,
from_http,
from_json,
is_binary,
is_structured,
to_binary,
to_binary_http,
to_json,
to_structured,
to_structured_http,
)


@@ -1,4 +1,4 @@
-# All Rights Reserved.
+# Copyright 2018-Present The CloudEvents Authors
 #
 # Licensed under the Apache License, Version 2.0 (the "License"); you may
 # not use this file except in compliance with the License. You may obtain
@@ -11,6 +11,7 @@
 # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
 # License for the specific language governing permissions and limitations
 # under the License.
+import pytest

 import cloudevents.exceptions as cloud_exceptions
@ -30,4 +31,4 @@ def test_get_nonexistent_optional(event_class):
event = event_class()
event.SetExtensions({"ext1": "val"})
res = event.Get("ext1")
-    assert res[0] == "val" and res[1] == True
+    assert res[0] == "val" and res[1] is True


@@ -1,4 +1,4 @@
-# All Rights Reserved.
+# Copyright 2018-Present The CloudEvents Authors
 #
 # Licensed under the Apache License, Version 2.0 (the "License"); you may
 # not use this file except in compliance with the License. You may obtain
@ -11,16 +11,17 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import pytest
from cloudevents.sdk import exceptions
-from cloudevents.sdk.converters import base, binary, structured
+from cloudevents.sdk.converters import base, binary


 def test_binary_converter_raise_unsupported():
     with pytest.raises(exceptions.UnsupportedEvent):
         cnvtr = binary.BinaryHTTPCloudEventConverter()
-        cnvtr.read(None, {}, None, None)
+        cnvtr.read(None, {}, None, None)  # type: ignore[arg-type] # intentionally wrong type # noqa: E501


 def test_base_converters_raise_exceptions():
@@ -34,8 +35,8 @@ def test_base_converters_raise_exceptions():
     with pytest.raises(Exception):
         cnvtr = base.Converter()
-        cnvtr.write(None, None)
+        cnvtr.write(None, None)  # type: ignore[arg-type] # intentionally wrong type

     with pytest.raises(Exception):
         cnvtr = base.Converter()
-        cnvtr.read(None, None, None, None)
+        cnvtr.read(None, None, None, None)  # type: ignore[arg-type] # intentionally wrong type # noqa: E501


@@ -1,4 +1,4 @@
-# All Rights Reserved.
+# Copyright 2018-Present The CloudEvents Authors
 #
 # Licensed under the Apache License, Version 2.0 (the "License"); you may
 # not use this file except in compliance with the License. You may obtain
@ -12,15 +12,12 @@
# License for the specific language governing permissions and limitations
# under the License.
 import copy
-import io
 import json
 from uuid import uuid4

 import pytest

 from cloudevents.sdk import converters, marshaller
-from cloudevents.sdk.converters import structured
 from cloudevents.sdk.event import v1, v03
 from cloudevents.tests import data
@ -43,7 +40,7 @@ def test_general_binary_properties(event_class):
assert event is not None
assert event.type == data.ce_type
assert event.id == data.ce_id
assert event.content_type == data.contentType
assert event.content_type == data.content_type
assert event.source == data.source
# Test setters
@ -71,7 +68,6 @@ def test_general_binary_properties(event_class):
@pytest.mark.parametrize("event_class", [v03.Event, v1.Event])
def test_general_structured_properties(event_class):
copy_of_ce = copy.deepcopy(data.json_ce[event_class])
m = marshaller.NewDefaultHTTPMarshaller()
http_headers = {"content-type": "application/cloudevents+json"}
event = m.FromRequest(
@ -84,7 +80,7 @@ def test_general_structured_properties(event_class):
assert event is not None
assert event.type == data.ce_type
assert event.id == data.ce_id
assert event.content_type == data.contentType
assert event.content_type == data.content_type
assert event.source == data.source
new_headers, _ = m.ToRequest(event, converters.TypeStructured, lambda x: x)


@@ -1,4 +1,4 @@
-# All Rights Reserved.
+# Copyright 2018-Present The CloudEvents Authors
 #
 # Licensed under the Apache License, Version 2.0 (the "License"); you may
 # not use this file except in compliance with the License. You may obtain
@@ -11,6 +11,7 @@
 # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
 # License for the specific language governing permissions and limitations
 # under the License.
+import pytest

 from cloudevents.http import (


@@ -1,4 +1,4 @@
-# All Rights Reserved.
+# Copyright 2018-Present The CloudEvents Authors
 #
 # Licensed under the Apache License, Version 2.0 (the "License"); you may
 # not use this file except in compliance with the License. You may obtain
@ -11,6 +11,7 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import json
import pytest


@@ -1,4 +1,4 @@
-# All Rights Reserved.
+# Copyright 2018-Present The CloudEvents Authors
 #
 # Licensed under the Apache License, Version 2.0 (the "License"); you may
 # not use this file except in compliance with the License. You may obtain
@ -12,12 +12,11 @@
# License for the specific language governing permissions and limitations
# under the License.
-import io
 import json

 import pytest

-from cloudevents.sdk import exceptions, marshaller
+from cloudevents.sdk import marshaller
 from cloudevents.sdk.converters import binary, structured
 from cloudevents.sdk.event import v1, v03
 from cloudevents.tests import data
@ -25,23 +24,17 @@ from cloudevents.tests import data
@pytest.mark.parametrize("event_class", [v03.Event, v1.Event])
def test_binary_converter_upstream(event_class):
m = marshaller.NewHTTPMarshaller(
[binary.NewBinaryHTTPCloudEventConverter()]
)
event = m.FromRequest(
event_class(), data.headers[event_class], None, lambda x: x
)
m = marshaller.NewHTTPMarshaller([binary.NewBinaryHTTPCloudEventConverter()])
event = m.FromRequest(event_class(), data.headers[event_class], b"", lambda x: x)
assert event is not None
assert event.EventType() == data.ce_type
assert event.EventID() == data.ce_id
assert event.ContentType() == data.contentType
assert event.ContentType() == data.content_type
@pytest.mark.parametrize("event_class", [v03.Event, v1.Event])
def test_structured_converter_upstream(event_class):
m = marshaller.NewHTTPMarshaller(
[structured.NewJSONHTTPCloudEventConverter()]
)
m = marshaller.NewHTTPMarshaller([structured.NewJSONHTTPCloudEventConverter()])
event = m.FromRequest(
event_class(),
{"Content-Type": "application/cloudevents+json"},
@@ -52,7 +45,7 @@ def test_structured_converter_upstream(event_class):
assert event is not None
assert event.EventType() == data.ce_type
assert event.EventID() == data.ce_id
assert event.ContentType() == data.contentType
assert event.ContentType() == data.content_type
@pytest.mark.parametrize("event_class", [v03.Event, v1.Event])
@@ -68,7 +61,7 @@ def test_default_http_marshaller_with_structured(event_class):
assert event is not None
assert event.EventType() == data.ce_type
assert event.EventID() == data.ce_id
assert event.ContentType() == data.contentType
assert event.ContentType() == data.content_type
@pytest.mark.parametrize("event_class", [v03.Event, v1.Event])
@@ -84,5 +77,5 @@ def test_default_http_marshaller_with_binary(event_class):
assert event is not None
assert event.EventType() == data.ce_type
assert event.EventID() == data.ce_id
assert event.ContentType() == data.contentType
assert event.ContentType() == data.content_type
assert event.Data() == data.body

View File

@@ -1,4 +1,4 @@
# All Rights Reserved.
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@@ -12,7 +12,6 @@
# License for the specific language governing permissions and limitations
# under the License.
import io
import json
import pytest
@@ -27,11 +26,11 @@ from cloudevents.tests import data
def test_event_pipeline_upstream(event_class):
event = (
event_class()
.SetContentType(data.contentType)
.SetContentType(data.content_type)
.SetData(data.body)
.SetEventID(data.ce_id)
.SetSource(data.source)
.SetEventTime(data.eventTime)
.SetEventTime(data.event_time)
.SetEventType(data.ce_type)
)
m = marshaller.NewDefaultHTTPMarshaller()
@@ -60,14 +59,8 @@ def test_extensions_are_set_upstream():
def test_binary_event_v1():
event = (
v1.Event()
.SetContentType("application/octet-stream")
.SetData(b"\x00\x01")
)
m = marshaller.NewHTTPMarshaller(
[structured.NewJSONHTTPCloudEventConverter()]
)
event = v1.Event().SetContentType("application/octet-stream").SetData(b"\x00\x01")
m = marshaller.NewHTTPMarshaller([structured.NewJSONHTTPCloudEventConverter()])
_, body = m.ToRequest(event, converters.TypeStructured, lambda x: x)
assert isinstance(body, bytes)
@@ -77,23 +70,21 @@ def test_binary_event_v1():
def test_object_event_v1():
event = (
v1.Event().SetContentType("application/json").SetData({"name": "john"})
)
event = v1.Event().SetContentType("application/json").SetData({"name": "john"})
m = marshaller.NewDefaultHTTPMarshaller()
_, structuredBody = m.ToRequest(event)
assert isinstance(structuredBody, bytes)
structuredObj = json.loads(structuredBody)
errorMsg = f"Body was {structuredBody}, obj is {structuredObj}"
assert isinstance(structuredObj, dict), errorMsg
assert isinstance(structuredObj["data"], dict), errorMsg
assert len(structuredObj["data"]) == 1, errorMsg
assert structuredObj["data"]["name"] == "john", errorMsg
_, structured_body = m.ToRequest(event)
assert isinstance(structured_body, bytes)
structured_obj = json.loads(structured_body)
error_msg = f"Body was {structured_body!r}, obj is {structured_obj}"
assert isinstance(structured_obj, dict), error_msg
assert isinstance(structured_obj["data"], dict), error_msg
assert len(structured_obj["data"]) == 1, error_msg
assert structured_obj["data"]["name"] == "john", error_msg
headers, binaryBody = m.ToRequest(event, converters.TypeBinary)
headers, binary_body = m.ToRequest(event, converters.TypeBinary)
assert isinstance(headers, dict)
assert isinstance(binaryBody, bytes)
assert isinstance(binary_body, bytes)
assert headers["content-type"] == "application/json"
assert binaryBody == b'{"name": "john"}', f"Binary is {binaryBody!r}"
assert binary_body == b'{"name": "john"}', f"Binary is {binary_body!r}"

View File

@@ -1,4 +1,4 @@
# All Rights Reserved.
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@@ -12,14 +12,11 @@
# License for the specific language governing permissions and limitations
# under the License.
import copy
import io
import json
import pytest
from cloudevents.sdk import converters, marshaller
from cloudevents.sdk.converters import structured
from cloudevents.sdk.event import v1, v03
from cloudevents.tests import data
@@ -36,7 +33,7 @@ def test_binary_event_to_request_upstream(event_class):
assert event is not None
assert event.EventType() == data.ce_type
assert event.EventID() == data.ce_id
assert event.ContentType() == data.contentType
assert event.ContentType() == data.content_type
new_headers, _ = m.ToRequest(event, converters.TypeBinary, lambda x: x)
assert new_headers is not None
@@ -45,7 +42,6 @@ def test_binary_event_to_request_upstream(event_class):
@pytest.mark.parametrize("event_class", [v03.Event, v1.Event])
def test_structured_event_to_request_upstream(event_class):
copy_of_ce = copy.deepcopy(data.json_ce[event_class])
m = marshaller.NewDefaultHTTPMarshaller()
http_headers = {"content-type": "application/cloudevents+json"}
event = m.FromRequest(
@@ -54,7 +50,7 @@ def test_structured_event_to_request_upstream(event_class):
assert event is not None
assert event.EventType() == data.ce_type
assert event.EventID() == data.ce_id
assert event.ContentType() == data.contentType
assert event.ContentType() == data.content_type
new_headers, _ = m.ToRequest(event, converters.TypeStructured, lambda x: x)
for key in new_headers:

View File

@@ -1,13 +1,32 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import pytest
import cloudevents.exceptions as cloud_exceptions
from cloudevents.conversion import _json_or_string
from cloudevents.http import CloudEvent
from cloudevents.http.util import _json_or_string
@pytest.mark.parametrize("specversion", ["0.3", "1.0"])
def test_http_cloudevent_equality(specversion):
attributes = {
@pytest.fixture(params=["0.3", "1.0"])
def specversion(request):
return request.param
@pytest.fixture()
def dummy_attributes(specversion):
return {
"source": "<source>",
"specversion": specversion,
"id": "my-id",
@@ -16,48 +35,80 @@ def test_http_cloudevent_equality(specversion):
"datacontenttype": "application/json",
"subject": "my-subject",
}
data = '{"name":"john"}'
event1 = CloudEvent(attributes, data)
event2 = CloudEvent(attributes, data)
@pytest.fixture()
def my_dummy_data():
return '{"name":"john"}'
@pytest.fixture()
def your_dummy_data():
return '{"name":"paul"}'
@pytest.fixture()
def dummy_event(dummy_attributes, my_dummy_data):
return CloudEvent(attributes=dummy_attributes, data=my_dummy_data)
@pytest.fixture()
def non_exiting_attribute_name(dummy_event):
result = "nonexisting"
assert result not in dummy_event
return result
def test_http_cloudevent_equality(dummy_attributes, my_dummy_data, your_dummy_data):
data = my_dummy_data
event1 = CloudEvent(dummy_attributes, data)
event2 = CloudEvent(dummy_attributes, data)
assert event1 == event2
# Test different attributes
for key in attributes:
for key in dummy_attributes:
if key == "specversion":
continue
else:
attributes[key] = f"noise-{key}"
event3 = CloudEvent(attributes, data)
event2 = CloudEvent(attributes, data)
dummy_attributes[key] = f"noise-{key}"
event3 = CloudEvent(dummy_attributes, data)
event2 = CloudEvent(dummy_attributes, data)
assert event2 == event3
assert event1 != event2 and event3 != event1
# Test different data
data = '{"name":"paul"}'
event3 = CloudEvent(attributes, data)
event2 = CloudEvent(attributes, data)
data = your_dummy_data
event3 = CloudEvent(dummy_attributes, data)
event2 = CloudEvent(dummy_attributes, data)
assert event2 == event3
assert event1 != event2 and event3 != event1
@pytest.mark.parametrize("specversion", ["0.3", "1.0"])
def test_http_cloudevent_mutates_equality(specversion):
attributes = {
"source": "<source>",
"specversion": specversion,
"id": "my-id",
"time": "tomorrow",
"type": "tests.cloudevents.override",
"datacontenttype": "application/json",
"subject": "my-subject",
}
data = '{"name":"john"}'
event1 = CloudEvent(attributes, data)
event2 = CloudEvent(attributes, data)
event3 = CloudEvent(attributes, data)
@pytest.mark.parametrize(
"non_cloudevent_value",
(
1,
None,
object(),
"Hello World",
),
)
def test_http_cloudevent_must_not_equal_to_non_cloudevent_value(
dummy_event, non_cloudevent_value
):
assert not dummy_event == non_cloudevent_value
def test_http_cloudevent_mutates_equality(
dummy_attributes, my_dummy_data, your_dummy_data
):
data = my_dummy_data
event1 = CloudEvent(dummy_attributes, data)
event2 = CloudEvent(dummy_attributes, data)
event3 = CloudEvent(dummy_attributes, data)
assert event1 == event2
# Test different attributes
for key in attributes:
for key in dummy_attributes:
if key == "specversion":
continue
else:
@@ -67,8 +118,8 @@ def test_http_cloudevent_mutates_equality(specversion):
assert event1 != event2 and event3 != event1
# Test different data
event2.data = '{"name":"paul"}'
event3.data = '{"name":"paul"}'
event2.data = your_dummy_data
event3.data = your_dummy_data
assert event2 == event3
assert event1 != event2 and event3 != event1
@@ -117,5 +168,46 @@ def test_cloudevent_general_overrides():
assert len(event) == 0
def test_none_json_or_string():
assert _json_or_string(None) is None
@pytest.mark.parametrize(
"given, expected",
[
(None, None),
('{"hello": "world"}', {"hello": "world"}),
(b'{"hello": "world"}', {"hello": "world"}),
(b"Hello World", b"Hello World"),
("Hello World", "Hello World"),
(b"\x00\x00\x11Hello World", b"\x00\x00\x11Hello World"),
],
)
def test_json_or_string_match_golden_sample(given, expected):
assert _json_or_string(given) == expected
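The golden samples above pin down the contract of `_json_or_string`: `None` passes through, valid JSON (str or bytes) is decoded, and anything else falls back to the raw value. A minimal stdlib sketch of that contract (a hypothetical stand-in, not the SDK's actual implementation in `cloudevents.conversion`):

```python
import json

def json_or_string_sketch(content):
    """Decode JSON when possible; otherwise return the input unchanged."""
    if content is None:
        return None
    try:
        return json.loads(content)
    except (ValueError, TypeError):
        # Not valid JSON, or bytes that cannot be decoded: keep the raw value.
        return content

# The golden samples from the parametrized test:
assert json_or_string_sketch(None) is None
assert json_or_string_sketch('{"hello": "world"}') == {"hello": "world"}
assert json_or_string_sketch(b"Hello World") == b"Hello World"
assert json_or_string_sketch(b"\x00\x00\x11Hello World") == b"\x00\x00\x11Hello World"
```

Note that `UnicodeDecodeError` and `json.JSONDecodeError` are both subclasses of `ValueError`, so a single `except` arm covers undecodable bytes as well as malformed JSON.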
def test_get_operation_on_non_existing_attribute_must_not_raise_exception(
dummy_event, non_exiting_attribute_name
):
dummy_event.get(non_exiting_attribute_name)
def test_get_must_return_attribute_value_if_exists(dummy_event):
assert dummy_event.get("source") == dummy_event["source"]
def test_get_operation_on_non_existing_attribute_must_return_none_by_default(
dummy_event, non_exiting_attribute_name
):
assert dummy_event.get(non_exiting_attribute_name) is None
def test_get_operation_on_non_existing_attribute_must_return_default_value_if_given(
dummy_event, non_exiting_attribute_name
):
dummy_value = "Hello World"
assert dummy_event.get(non_exiting_attribute_name, dummy_value) == dummy_value
def test_get_operation_on_non_existing_attribute_should_not_copy_default_value(
dummy_event, non_exiting_attribute_name
):
dummy_value = object()
assert dummy_event.get(non_exiting_attribute_name, dummy_value) is dummy_value

View File

@@ -1,4 +1,4 @@
# All Rights Reserved.
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@@ -11,12 +11,16 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import base64
import datetime
import json
import pytest
from cloudevents.http import CloudEvent, from_json, to_json
from cloudevents.conversion import to_dict, to_json
from cloudevents.http import CloudEvent, from_dict, from_json
from cloudevents.sdk.event.attribute import SpecVersion
test_data = json.dumps({"data-key": "val"})
test_attributes = {
@@ -126,3 +130,30 @@ def test_json_can_talk_to_itself_base64(specversion):
for key, val in test_attributes.items():
assert event[key] == val
assert event.data == data
def test_from_dict():
given = {
"data": b"\x00\x00\x11Hello World",
"datacontenttype": "application/octet-stream",
"dataschema": None,
"id": "11775cb2-fd00-4487-a18b-30c3600eaa5f",
"source": "dummy:source",
"specversion": SpecVersion.v1_0,
"subject": None,
"time": datetime.datetime(
2022, 7, 16, 12, 3, 20, 519216, tzinfo=datetime.timezone.utc
),
"type": "dummy.type",
}
assert to_dict(from_dict(given)) == {
"data": b"\x00\x00\x11Hello World",
"datacontenttype": "application/octet-stream",
"dataschema": None,
"id": "11775cb2-fd00-4487-a18b-30c3600eaa5f",
"source": "dummy:source",
"specversion": "1.0",
"subject": None,
"time": "2022-07-16T12:03:20.519216+00:00",
"type": "dummy.type",
}
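The round trip in `test_from_dict` turns the tz-aware `datetime` into the RFC 3339 string seen in the expected dict. Assuming the time attribute is serialized with `isoformat()`-style formatting, the mapping can be checked directly with the stdlib:

```python
import datetime

# The tz-aware timestamp used in the round-trip test above.
dt = datetime.datetime(2022, 7, 16, 12, 3, 20, 519216, tzinfo=datetime.timezone.utc)

# isoformat() keeps microseconds and renders the UTC offset as +00:00,
# matching the serialized "time" attribute in the expected dict.
assert dt.isoformat() == "2022-07-16T12:03:20.519216+00:00"
```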

View File

@@ -1,4 +1,4 @@
# All Rights Reserved.
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@@ -11,27 +11,23 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from __future__ import annotations
import bz2
import copy
import io
import json
import typing
import pytest
from sanic import Sanic, response
import cloudevents.exceptions as cloud_exceptions
from cloudevents.http import (
CloudEvent,
from_http,
is_binary,
is_structured,
to_binary,
to_binary_http,
to_structured,
to_structured_http,
)
from cloudevents.http import CloudEvent, from_http, to_binary, to_structured
from cloudevents.http.event_type import is_binary as deprecated_is_binary
from cloudevents.http.event_type import is_structured as deprecated_is_structured
from cloudevents.sdk import converters
from cloudevents.sdk.converters.binary import is_binary
from cloudevents.sdk.converters.structured import is_structured
invalid_test_headers = [
{
@@ -69,7 +65,7 @@ invalid_cloudevent_request_body = [
test_data = {"payload-content": "Hello World!"}
app = Sanic(__name__)
app = Sanic("test_http_events")
@app.route("/event", ["POST"])
@@ -77,9 +73,7 @@ async def echo(request):
decoder = None
if "binary-payload" in request.headers:
decoder = lambda x: x
event = from_http(
dict(request.headers), request.body, data_unmarshaller=decoder
)
event = from_http(dict(request.headers), request.body, data_unmarshaller=decoder)
data = (
event.data
if isinstance(event.data, (bytes, bytearray, memoryview))
@@ -90,10 +84,9 @@ async def echo(request):
@pytest.mark.parametrize("body", invalid_cloudevent_request_body)
def test_missing_required_fields_structured(body):
with pytest.raises(cloud_exceptions.MissingRequiredFields) as e:
with pytest.raises(cloud_exceptions.MissingRequiredFields):
_ = from_http(
{"Content-Type": "application/cloudevents+json"}, json.dumps(body),
{"Content-Type": "application/cloudevents+json"}, json.dumps(body)
)
@@ -146,9 +139,7 @@ def test_emit_structured_event(specversion):
"specversion": specversion,
"data": test_data,
}
_, r = app.test_client.post(
"/event", headers=headers, data=json.dumps(body)
)
_, r = app.test_client.post("/event", headers=headers, data=json.dumps(body))
# Convert byte array to dict
# e.g. r.body = b'{"payload-content": "Hello World!"}'
@@ -198,7 +189,6 @@ def test_missing_ce_prefix_binary_event(specversion):
"ce-specversion": specversion,
}
for key in headers:
# breaking prefix e.g. e-id instead of ce-id
prefixed_headers[key[1:]] = headers[key]
@@ -220,7 +210,7 @@ def test_valid_binary_events(specversion):
headers = {
"ce-id": f"id{i}",
"ce-source": f"source{i}.com.test",
"ce-type": f"cloudevent.test.type",
"ce-type": "cloudevent.test.type",
"ce-specversion": specversion,
}
data = {"payload": f"payload-{i}"}
@@ -252,7 +242,26 @@ def test_structured_to_request(specversion):
assert headers["content-type"] == "application/cloudevents+json"
for key in attributes:
assert body[key] == attributes[key]
assert body["data"] == data, f"|{body_bytes}|| {body}"
assert body["data"] == data, f"|{body_bytes!r}|| {body}"
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_attributes_view_accessor(specversion: str) -> None:
attributes: dict[str, typing.Any] = {
"specversion": specversion,
"type": "word.found.name",
"id": "96fb5f0b-001e-0108-6dfe-da6e2806f124",
"source": "pytest",
}
data = {"message": "Hello World!"}
event: CloudEvent = CloudEvent(attributes, data)
event_attributes: typing.Mapping[str, typing.Any] = event.get_attributes()
assert event_attributes["specversion"] == attributes["specversion"]
assert event_attributes["type"] == attributes["type"]
assert event_attributes["id"] == attributes["id"]
assert event_attributes["source"] == attributes["source"]
assert event_attributes["time"]
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
@@ -289,14 +298,14 @@ def test_empty_data_structured_event(specversion):
event = from_http(
{"content-type": "application/cloudevents+json"}, json.dumps(attributes)
)
assert event.data == None
assert event.data is None
attributes["data"] = ""
# Data of empty string will be marshalled into None
event = from_http(
{"content-type": "application/cloudevents+json"}, json.dumps(attributes)
)
assert event.data == None
assert event.data is None
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
@@ -311,12 +320,12 @@ def test_empty_data_binary_event(specversion):
"ce-source": "<source-url>",
}
event = from_http(headers, None)
assert event.data == None
assert event.data is None
data = ""
# Data of empty string will be marshalled into None
event = from_http(headers, data)
assert event.data == None
assert event.data is None
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
@@ -325,17 +334,17 @@ def test_valid_structured_events(specversion):
events_queue = []
num_cloudevents = 30
for i in range(num_cloudevents):
event = {
raw_event = {
"id": f"id{i}",
"source": f"source{i}.com.test",
"type": f"cloudevent.test.type",
"type": "cloudevent.test.type",
"specversion": specversion,
"data": {"payload": f"payload-{i}"},
}
events_queue.append(
from_http(
{"content-type": "application/cloudevents+json"},
json.dumps(event),
json.dumps(raw_event),
)
)
@@ -365,23 +374,36 @@ def test_structured_no_content_type(specversion):
assert event.data[key] == val
def test_is_binary():
headers = {
"ce-id": "my-id",
"ce-source": "<event-source>",
"ce-type": "cloudevent.event.type",
"ce-specversion": "1.0",
"Content-Type": "text/plain",
}
assert is_binary(headers)
parameterize_binary_func = pytest.mark.parametrize(
"is_binary_func", [is_binary, deprecated_is_binary]
)
headers = {
"Content-Type": "application/cloudevents+json",
}
assert not is_binary(headers)
headers = {}
assert not is_binary(headers)
@parameterize_binary_func
def test_empty_headers_must_not_be_recognized_as_binary(is_binary_func):
assert not is_binary_func({})
@parameterize_binary_func
def test_non_binary_headers_must_not_be_recognized_as_binary(is_binary_func):
assert not is_binary_func(
{
"Content-Type": "application/cloudevents+json",
}
)
@parameterize_binary_func
def test_binary_ce_headers_must_be_recognize_as_binary(is_binary_func):
assert is_binary_func(
{
"ce-id": "my-id",
"ce-source": "<event-source>",
"ce-type": "cloudevent.event.type",
"ce-specversion": "1.0",
"Content-Type": "text/plain",
}
)
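The three tests above characterize binary content-mode detection: empty headers and a bare `application/cloudevents+json` content type are not binary, while `ce-*` attribute headers are. A hypothetical stdlib sketch of that behavior (`looks_like_binary` is an illustrative stand-in, not the SDK's `is_binary`):

```python
def looks_like_binary(headers):
    # Hypothetical sketch: binary content mode is signalled by ce-* attribute
    # headers carrying the event metadata; a structured request carries the
    # whole event in the body and is marked by its content type instead.
    normalized = {key.lower() for key in headers}
    return "ce-specversion" in normalized

assert not looks_like_binary({})
assert not looks_like_binary({"Content-Type": "application/cloudevents+json"})
assert looks_like_binary(
    {
        "ce-id": "my-id",
        "ce-source": "<event-source>",
        "ce-type": "cloudevent.event.type",
        "ce-specversion": "1.0",
        "Content-Type": "text/plain",
    }
)
```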
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
@@ -395,10 +417,10 @@ def test_cloudevent_repr(specversion):
"ce-source": "<source-url>",
}
event = from_http(headers, "")
# Testing to make sure event is printable. I could runevent. __repr__() but
# Testing to make sure event is printable. I could run event. __repr__() but
# we had issues in the past where event.__repr__() could run but
# print(event) would fail.
print(event)
print(event) # noqa T201
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
@@ -433,7 +455,7 @@ def test_invalid_data_format_structured_from_http():
headers = {"Content-Type": "application/cloudevents+json"}
data = 20
with pytest.raises(cloud_exceptions.InvalidStructuredJSON) as e:
from_http(headers, data)
from_http(headers, data) # type: ignore[arg-type] # intentionally wrong type
assert "Expected json of type (str, bytes, bytearray)" in str(e.value)
@@ -445,11 +467,14 @@ def test_wrong_specversion_to_request():
assert "Unsupported specversion: 0.2" in str(e.value)
def test_is_structured():
@pytest.mark.parametrize(
"is_structured_func", [is_structured, deprecated_is_structured]
)
def test_is_structured(is_structured_func):
headers = {
"Content-Type": "application/cloudevents+json",
}
assert is_structured(headers)
assert is_structured_func(headers)
headers = {
"ce-id": "my-id",
@@ -458,19 +483,15 @@ def test_is_structured():
"ce-specversion": "1.0",
"Content-Type": "text/plain",
}
assert not is_structured(headers)
assert not is_structured_func(headers)
def test_empty_json_structured():
headers = {"Content-Type": "application/cloudevents+json"}
data = ""
with pytest.raises(cloud_exceptions.MissingRequiredFields) as e:
from_http(
headers, data,
)
assert "Failed to read specversion from both headers and data" in str(
e.value
)
from_http(headers, data)
assert "Failed to read specversion from both headers and data" in str(e.value)
def test_uppercase_headers_with_none_data_binary():
@@ -484,10 +505,10 @@ def test_uppercase_headers_with_none_data_binary():
for key in headers:
assert event[key.lower()[3:]] == headers[key]
assert event.data == None
assert event.data is None
_, new_data = to_binary(event)
assert new_data == None
assert new_data is None
def test_generic_exception():
@@ -506,7 +527,7 @@ def test_generic_exception():
e.errisinstance(cloud_exceptions.MissingRequiredFields)
with pytest.raises(cloud_exceptions.GenericException) as e:
from_http({}, 123)
from_http({}, 123) # type: ignore[arg-type] # intentionally wrong type
e.errisinstance(cloud_exceptions.InvalidStructuredJSON)
with pytest.raises(cloud_exceptions.GenericException) as e:
@@ -524,10 +545,6 @@ def test_non_dict_data_no_headers_bug():
headers = {"Content-Type": "application/cloudevents+json"}
data = "123"
with pytest.raises(cloud_exceptions.MissingRequiredFields) as e:
from_http(
headers, data,
)
assert "Failed to read specversion from both headers and data" in str(
e.value
)
from_http(headers, data)
assert "Failed to read specversion from both headers and data" in str(e.value)
assert "The following deserialized data has no 'get' method" in str(e.value)

View File

@@ -0,0 +1,515 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import base64
import datetime
import json
import pytest
from cloudevents import exceptions as cloud_exceptions
from cloudevents.abstract.event import AnyCloudEvent
from cloudevents.http import CloudEvent
from cloudevents.kafka.conversion import (
KafkaMessage,
from_binary,
from_structured,
to_binary,
to_structured,
)
from cloudevents.kafka.exceptions import KeyMapperError
from cloudevents.sdk import types
def simple_serialize(data: dict) -> bytes:
return bytes(json.dumps(data).encode("utf-8"))
def simple_deserialize(data: bytes) -> dict:
value = json.loads(data.decode())
assert isinstance(value, dict)
return value
def failing_func(*args):
raise Exception("fail")
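The `TestToBinary` assertions below pin down the Kafka binary mapping: attributes become `ce_`-prefixed utf-8 header values, `datacontenttype` maps to the `content-type` header, `partitionkey` becomes the message key, and the data payload is marshalled into the message value. A miniature stdlib sketch of that mapping (`to_binary_sketch` is a hypothetical helper, not the SDK's `to_binary`):

```python
import json

def to_binary_sketch(attributes, data):
    # Hypothetical miniature of the Kafka binary mapping the tests pin down.
    headers = {}
    for name, value in attributes.items():
        if name == "datacontenttype":
            headers["content-type"] = value.encode("utf-8")
        elif name != "partitionkey":
            headers["ce_" + name] = value.encode("utf-8")
    key = attributes.get("partitionkey")
    value = json.dumps(data).encode("utf-8")
    return headers, key, value

headers, key, value = to_binary_sketch(
    {"specversion": "1.0", "id": "1234", "partitionkey": "test_key_123"},
    {"name": "test", "amount": 1},
)
assert headers == {"ce_specversion": b"1.0", "ce_id": b"1234"}
assert key == "test_key_123"
assert value == b'{"name": "test", "amount": 1}'
```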
class KafkaConversionTestBase:
expected_data = {"name": "test", "amount": 1}
expected_custom_mapped_key = "custom-key"
def custom_key_mapper(self, _: AnyCloudEvent) -> str:
return self.expected_custom_mapped_key
@pytest.fixture
def source_event(self) -> CloudEvent:
return CloudEvent.create(
attributes={
"specversion": "1.0",
"id": "1234-1234-1234",
"source": "pytest",
"type": "com.pytest.test",
"time": datetime.datetime(2000, 1, 1, 6, 42, 33).isoformat(),
"datacontenttype": "foo",
"partitionkey": "test_key_123",
},
data=self.expected_data,
)
@pytest.fixture
def custom_marshaller(self) -> types.MarshallerType:
return simple_serialize
@pytest.fixture
def custom_unmarshaller(self) -> types.UnmarshallerType:
return simple_deserialize
def test_custom_marshaller_can_talk_to_itself(
self, custom_marshaller, custom_unmarshaller
):
data = self.expected_data
marshalled = custom_marshaller(data)
unmarshalled = custom_unmarshaller(marshalled)
for k, v in data.items():
assert unmarshalled[k] == v
class TestToBinary(KafkaConversionTestBase):
def test_sets_value_default_marshaller(self, source_event):
result = to_binary(source_event)
assert result.value == json.dumps(source_event.data).encode("utf-8")
def test_sets_value_custom_marshaller(self, source_event, custom_marshaller):
result = to_binary(source_event, data_marshaller=custom_marshaller)
assert result.value == custom_marshaller(source_event.data)
def test_sets_key(self, source_event):
result = to_binary(source_event)
assert result.key == source_event["partitionkey"]
def test_key_mapper(self, source_event):
result = to_binary(source_event, key_mapper=self.custom_key_mapper)
assert result.key == self.expected_custom_mapped_key
def test_key_mapper_error(self, source_event):
with pytest.raises(KeyMapperError):
to_binary(source_event, key_mapper=failing_func)
def test_none_key(self, source_event):
source_event["partitionkey"] = None
result = to_binary(source_event)
assert result.key is None
def test_no_key(self, source_event):
del source_event["partitionkey"]
result = to_binary(source_event)
assert result.key is None
def test_sets_headers(self, source_event):
result = to_binary(source_event)
assert result.headers["ce_id"] == source_event["id"].encode("utf-8")
assert result.headers["ce_specversion"] == source_event["specversion"].encode(
"utf-8"
)
assert result.headers["ce_source"] == source_event["source"].encode("utf-8")
assert result.headers["ce_type"] == source_event["type"].encode("utf-8")
assert result.headers["ce_time"] == source_event["time"].encode("utf-8")
assert result.headers["content-type"] == source_event["datacontenttype"].encode(
"utf-8"
)
assert "data" not in result.headers
assert "partitionkey" not in result.headers
def test_raise_marshaller_exception(self, source_event):
with pytest.raises(cloud_exceptions.DataMarshallerError):
to_binary(source_event, data_marshaller=failing_func)
class TestFromBinary(KafkaConversionTestBase):
@pytest.fixture
def source_binary_json_message(self) -> KafkaMessage:
return KafkaMessage(
headers={
"ce_specversion": "1.0".encode("utf-8"),
"ce_id": "1234-1234-1234".encode("utf-8"),
"ce_source": "pytest".encode("utf-8"),
"ce_type": "com.pytest.test".encode("utf-8"),
"ce_time": datetime.datetime(2000, 1, 1, 6, 42, 33)
.isoformat()
.encode("utf-8"),
"content-type": "foo".encode("utf-8"),
},
value=json.dumps(self.expected_data).encode("utf-8"),
key="test_key_123",
)
@pytest.fixture
def source_binary_bytes_message(self) -> KafkaMessage:
return KafkaMessage(
headers={
"ce_specversion": "1.0".encode("utf-8"),
"ce_id": "1234-1234-1234".encode("utf-8"),
"ce_source": "pytest".encode("utf-8"),
"ce_type": "com.pytest.test".encode("utf-8"),
"ce_time": datetime.datetime(2000, 1, 1, 6, 42, 33)
.isoformat()
.encode("utf-8"),
"datacontenttype": "foo".encode("utf-8"),
},
value=simple_serialize(self.expected_data),
key="test_key_123",
)
def test_default_marshaller(self, source_binary_json_message):
result = from_binary(source_binary_json_message)
assert result.data == json.loads(source_binary_json_message.value.decode())
def test_custom_marshaller(self, source_binary_bytes_message, custom_unmarshaller):
result = from_binary(
source_binary_bytes_message, data_unmarshaller=custom_unmarshaller
)
assert result.data == custom_unmarshaller(source_binary_bytes_message.value)
def test_sets_key(self, source_binary_json_message):
result = from_binary(source_binary_json_message)
assert result["partitionkey"] == source_binary_json_message.key
def test_no_key(self, source_binary_json_message):
keyless_message = KafkaMessage(
headers=source_binary_json_message.headers,
key=None,
value=source_binary_json_message.value,
)
result = from_binary(keyless_message)
assert "partitionkey" not in result.get_attributes()
def test_sets_attrs_from_headers(self, source_binary_json_message):
result = from_binary(source_binary_json_message)
assert result["id"] == source_binary_json_message.headers["ce_id"].decode()
assert (
result["specversion"]
== source_binary_json_message.headers["ce_specversion"].decode()
)
assert (
result["source"] == source_binary_json_message.headers["ce_source"].decode()
)
assert result["type"] == source_binary_json_message.headers["ce_type"].decode()
assert result["time"] == source_binary_json_message.headers["ce_time"].decode()
assert (
result["datacontenttype"]
== source_binary_json_message.headers["content-type"].decode()
)
def test_unmarshaller_exception(self, source_binary_json_message):
with pytest.raises(cloud_exceptions.DataUnmarshallerError):
from_binary(source_binary_json_message, data_unmarshaller=failing_func)
class TestToFromBinary(KafkaConversionTestBase):
def test_can_talk_to_itself(self, source_event):
message = to_binary(source_event)
event = from_binary(message)
for key, val in source_event.get_attributes().items():
assert event[key] == val
for key, val in source_event.data.items():
assert event.data[key] == val
def test_can_talk_to_itself_custom_marshaller(
self, source_event, custom_marshaller, custom_unmarshaller
):
message = to_binary(source_event, data_marshaller=custom_marshaller)
event = from_binary(message, data_unmarshaller=custom_unmarshaller)
for key, val in source_event.get_attributes().items():
assert event[key] == val
for key, val in source_event.data.items():
assert event.data[key] == val
class TestToStructured(KafkaConversionTestBase):
def test_sets_value_default_marshallers(self, source_event):
result = to_structured(source_event)
assert result.value == json.dumps(
{
"specversion": source_event["specversion"],
"id": source_event["id"],
"source": source_event["source"],
"type": source_event["type"],
"time": source_event["time"],
"partitionkey": source_event["partitionkey"],
"data": self.expected_data,
}
).encode("utf-8")
def test_sets_value_custom_data_marshaller_default_envelope(
self, source_event, custom_marshaller
):
result = to_structured(source_event, data_marshaller=custom_marshaller)
assert result.value == json.dumps(
{
"specversion": source_event["specversion"],
"id": source_event["id"],
"source": source_event["source"],
"type": source_event["type"],
"time": source_event["time"],
"partitionkey": source_event["partitionkey"],
"data_base64": base64.b64encode(
custom_marshaller(self.expected_data)
).decode("ascii"),
}
).encode("utf-8")
def test_sets_value_custom_envelope_marshaller(
self, source_event, custom_marshaller
):
result = to_structured(source_event, envelope_marshaller=custom_marshaller)
assert result.value == custom_marshaller(
{
"specversion": source_event["specversion"],
"id": source_event["id"],
"source": source_event["source"],
"type": source_event["type"],
"time": source_event["time"],
"partitionkey": source_event["partitionkey"],
"data": self.expected_data,
}
)
def test_sets_value_custom_marshallers(self, source_event, custom_marshaller):
result = to_structured(
source_event,
data_marshaller=custom_marshaller,
envelope_marshaller=custom_marshaller,
)
assert result.value == custom_marshaller(
{
"specversion": source_event["specversion"],
"id": source_event["id"],
"source": source_event["source"],
"type": source_event["type"],
"time": source_event["time"],
"partitionkey": source_event["partitionkey"],
"data_base64": base64.b64encode(
custom_marshaller(self.expected_data)
).decode("ascii"),
}
)
def test_sets_key(self, source_event):
result = to_structured(source_event)
assert result.key == source_event["partitionkey"]
def test_key_mapper(self, source_event):
result = to_structured(source_event, key_mapper=self.custom_key_mapper)
assert result.key == self.expected_custom_mapped_key
def test_key_mapper_error(self, source_event):
with pytest.raises(KeyMapperError):
to_structured(source_event, key_mapper=failing_func)
def test_none_key(self, source_event):
source_event["partitionkey"] = None
result = to_structured(source_event)
assert result.key is None
def test_no_key(self, source_event):
del source_event["partitionkey"]
result = to_structured(source_event)
assert result.key is None
def test_sets_headers(self, source_event):
result = to_structured(source_event)
assert len(result.headers) == 1
assert result.headers["content-type"] == source_event["datacontenttype"].encode(
"utf-8"
)
def test_datamarshaller_exception(self, source_event):
with pytest.raises(cloud_exceptions.DataMarshallerError):
to_structured(source_event, data_marshaller=failing_func)
def test_envelope_datamarshaller_exception(self, source_event):
with pytest.raises(cloud_exceptions.DataMarshallerError):
to_structured(source_event, envelope_marshaller=failing_func)
class TestToFromStructured(KafkaConversionTestBase):
def test_can_talk_to_itself(self, source_event):
message = to_structured(source_event)
event = from_structured(message)
for key, val in source_event.get_attributes().items():
assert event[key] == val
for key, val in source_event.data.items():
assert event.data[key] == val
class TestFromStructured(KafkaConversionTestBase):
@pytest.fixture
def source_structured_json_message(self) -> KafkaMessage:
return KafkaMessage(
headers={
"content-type": "foo".encode("utf-8"),
},
value=json.dumps(
{
"specversion": "1.0",
"id": "1234-1234-1234",
"source": "pytest",
"type": "com.pytest.test",
"time": datetime.datetime(2000, 1, 1, 6, 42, 33).isoformat(),
"partitionkey": "test_key_123",
"data": self.expected_data,
}
).encode("utf-8"),
key="test_key_123",
)
@pytest.fixture
def source_structured_json_bytes_message(self) -> KafkaMessage:
return KafkaMessage(
headers={
"content-type": "foo".encode("utf-8"),
},
value=json.dumps(
{
"specversion": "1.0",
"id": "1234-1234-1234",
"source": "pytest",
"type": "com.pytest.test",
"time": datetime.datetime(2000, 1, 1, 6, 42, 33).isoformat(),
"partitionkey": "test_key_123",
"data_base64": base64.b64encode(
simple_serialize(self.expected_data)
).decode("ascii"),
}
).encode("utf-8"),
key="test_key_123",
)
@pytest.fixture
def source_structured_bytes_bytes_message(self) -> KafkaMessage:
return KafkaMessage(
headers={
"content-type": "foo".encode("utf-8"),
},
value=simple_serialize(
{
"specversion": "1.0",
"id": "1234-1234-1234",
"source": "pytest",
"type": "com.pytest.test",
"time": datetime.datetime(2000, 1, 1, 6, 42, 33).isoformat(),
"partitionkey": "test_key_123",
"data_base64": base64.b64encode(
simple_serialize(self.expected_data)
).decode("ascii"),
}
),
key="test_key_123",
)
def test_sets_data_default_data_unmarshaller(
self,
source_structured_json_message,
):
result = from_structured(source_structured_json_message)
assert result.data == self.expected_data
def test_sets_data_custom_data_unmarshaller(
self, source_structured_json_bytes_message, custom_unmarshaller
):
result = from_structured(
source_structured_json_bytes_message, data_unmarshaller=custom_unmarshaller
)
assert result.data == self.expected_data
def test_sets_data_custom_unmarshallers(
self, source_structured_bytes_bytes_message, custom_unmarshaller
):
result = from_structured(
source_structured_bytes_bytes_message,
data_unmarshaller=custom_unmarshaller,
envelope_unmarshaller=custom_unmarshaller,
)
assert result.data == self.expected_data
def test_sets_attrs_default_enveloper_unmarshaller(
self,
source_structured_json_message,
):
result = from_structured(source_structured_json_message)
for key, value in json.loads(
source_structured_json_message.value.decode()
).items():
if key != "data":
assert result[key] == value
def test_sets_attrs_custom_enveloper_unmarshaller(
self,
source_structured_bytes_bytes_message,
custom_unmarshaller,
):
result = from_structured(
source_structured_bytes_bytes_message,
data_unmarshaller=custom_unmarshaller,
envelope_unmarshaller=custom_unmarshaller,
)
for key, value in custom_unmarshaller(
source_structured_bytes_bytes_message.value
).items():
if key not in ["data_base64"]:
assert result[key] == value
def test_sets_content_type_default_envelope_unmarshaller(
self,
source_structured_json_message,
):
result = from_structured(source_structured_json_message)
assert (
result["datacontenttype"]
== source_structured_json_message.headers["content-type"].decode()
)
def test_sets_content_type_custom_envelope_unmarshaller(
self, source_structured_bytes_bytes_message, custom_unmarshaller
):
result = from_structured(
source_structured_bytes_bytes_message,
data_unmarshaller=custom_unmarshaller,
envelope_unmarshaller=custom_unmarshaller,
)
assert (
result["datacontenttype"]
== source_structured_bytes_bytes_message.headers["content-type"].decode()
)
def test_data_unmarshaller_exception(
self, source_structured_bytes_bytes_message, custom_unmarshaller
):
with pytest.raises(cloud_exceptions.DataUnmarshallerError):
from_structured(
source_structured_bytes_bytes_message,
data_unmarshaller=failing_func,
envelope_unmarshaller=custom_unmarshaller,
)
def test_envelope_unmarshaller_exception(
self,
source_structured_bytes_bytes_message,
):
with pytest.raises(cloud_exceptions.DataUnmarshallerError):
from_structured(
source_structured_bytes_bytes_message,
envelope_unmarshaller=failing_func,
)
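Taken together, the structured-mode tests describe a wire format that can be sketched with the stdlib alone: the whole event travels as one JSON envelope in the record value, with attributes at the top level, JSON-friendly payloads under `data`, and raw bytes base64-encoded under `data_base64`. A simplified illustration of that envelope (the helper name is invented here; this is not the SDK's implementation):

```python
import base64
import json


def structured_value(attributes: dict, data) -> bytes:
    """Encode an event as a structured-mode Kafka record value (sketch)."""
    envelope = dict(attributes)
    if isinstance(data, (bytes, bytearray)):
        # Binary payloads travel base64-encoded under "data_base64".
        envelope["data_base64"] = base64.b64encode(bytes(data)).decode("ascii")
    else:
        # JSON-serializable payloads travel as-is under "data".
        envelope["data"] = data
    return json.dumps(envelope).encode("utf-8")


attrs = {"specversion": "1.0", "id": "1234", "source": "pytest", "type": "com.pytest.test"}
decoded = json.loads(structured_value(attrs, {"name": "test"}))
assert decoded["data"] == {"name": "test"}
assert base64.b64decode(
    json.loads(structured_value(attrs, b"\x00hi"))["data_base64"]
) == b"\x00hi"
```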


@@ -1,4 +1,4 @@
-# All Rights Reserved.
+# Copyright 2018-Present The CloudEvents Authors
 #
 # Licensed under the Apache License, Version 2.0 (the "License"); you may
 # not use this file except in compliance with the License. You may obtain
@@ -18,7 +18,7 @@ import pytest
 import cloudevents.exceptions as cloud_exceptions
 from cloudevents.http import CloudEvent, from_http, to_binary, to_structured
-from cloudevents.sdk import converters, exceptions, marshaller
+from cloudevents.sdk import exceptions, marshaller
 from cloudevents.sdk.converters import binary, structured
 from cloudevents.sdk.event import v1
@@ -49,34 +49,30 @@ def structured_data():
 def test_from_request_wrong_unmarshaller():
     with pytest.raises(exceptions.InvalidDataUnmarshaller):
         m = marshaller.NewDefaultHTTPMarshaller()
-        _ = m.FromRequest(v1.Event(), {}, "", None)
+        _ = m.FromRequest(
+            event=v1.Event(), headers={}, body="", data_unmarshaller=object()  # type: ignore[arg-type] # intentionally wrong type # noqa: E501
+        )


 def test_to_request_wrong_marshaller():
     with pytest.raises(exceptions.InvalidDataMarshaller):
         m = marshaller.NewDefaultHTTPMarshaller()
-        _ = m.ToRequest(v1.Event(), data_marshaller="")
+        _ = m.ToRequest(v1.Event(), data_marshaller="")  # type: ignore[arg-type] # intentionally wrong type # noqa: E501


 def test_from_request_cannot_read(binary_headers):
     with pytest.raises(exceptions.UnsupportedEventConverter):
-        m = marshaller.HTTPMarshaller(
-            [binary.NewBinaryHTTPCloudEventConverter(),]
-        )
+        m = marshaller.HTTPMarshaller([binary.NewBinaryHTTPCloudEventConverter()])
         m.FromRequest(v1.Event(), {}, "")

     with pytest.raises(exceptions.UnsupportedEventConverter):
-        m = marshaller.HTTPMarshaller(
-            [structured.NewJSONHTTPCloudEventConverter()]
-        )
+        m = marshaller.HTTPMarshaller([structured.NewJSONHTTPCloudEventConverter()])
         m.FromRequest(v1.Event(), binary_headers, "")


 def test_to_request_invalid_converter():
     with pytest.raises(exceptions.NoSuchConverter):
-        m = marshaller.HTTPMarshaller(
-            [structured.NewJSONHTTPCloudEventConverter()]
-        )
+        m = marshaller.HTTPMarshaller([structured.NewJSONHTTPCloudEventConverter()])
         m.ToRequest(v1.Event(), "")


@@ -1,4 +1,4 @@
-# All Rights Reserved.
+# Copyright 2018-Present The CloudEvents Authors
 #
 # Licensed under the Apache License, Version 2.0 (the "License"); you may
 # not use this file except in compliance with the License. You may obtain


@@ -0,0 +1,395 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import datetime
from json import loads
import pytest
from pydantic import ValidationError as PydanticV2ValidationError
from pydantic.v1 import ValidationError as PydanticV1ValidationError
from cloudevents.conversion import _json_or_string
from cloudevents.exceptions import IncompatibleArgumentsError
from cloudevents.pydantic.v1.event import CloudEvent as PydanticV1CloudEvent
from cloudevents.pydantic.v2.event import CloudEvent as PydanticV2CloudEvent
from cloudevents.sdk.event.attribute import SpecVersion
_DUMMY_SOURCE = "dummy:source"
_DUMMY_TYPE = "tests.cloudevents.override"
_DUMMY_TIME = "2022-07-16T11:20:34.284130+00:00"
_DUMMY_ID = "my-id"
@pytest.fixture(params=["0.3", "1.0"])
def specversion(request):
return request.param
_pydantic_implementation = {
"v1": {
"event": PydanticV1CloudEvent,
"validation_error": PydanticV1ValidationError,
"pydantic_version": "v1",
},
"v2": {
"event": PydanticV2CloudEvent,
"validation_error": PydanticV2ValidationError,
"pydantic_version": "v2",
},
}
@pytest.fixture(params=["v1", "v2"])
def cloudevents_implementation(request):
return _pydantic_implementation[request.param]
@pytest.fixture()
def dummy_attributes(specversion):
return {
"source": _DUMMY_SOURCE,
"specversion": specversion,
"id": _DUMMY_ID,
"time": _DUMMY_TIME,
"type": _DUMMY_TYPE,
"datacontenttype": "application/json",
"subject": "my-subject",
"dataschema": "myschema:dummy",
}
@pytest.fixture()
def my_dummy_data():
return '{"name":"john"}'
@pytest.fixture()
def your_dummy_data():
return '{"name":"paul"}'
@pytest.fixture()
def dummy_event(dummy_attributes, my_dummy_data, cloudevents_implementation):
return cloudevents_implementation["event"](
attributes=dummy_attributes, data=my_dummy_data
)
@pytest.fixture()
def non_exiting_attribute_name(dummy_event):
result = "nonexisting"
assert result not in dummy_event
return result
def test_pydantic_cloudevent_equality(
dummy_attributes, my_dummy_data, your_dummy_data, cloudevents_implementation
):
data = my_dummy_data
event1 = cloudevents_implementation["event"](dummy_attributes, data)
event2 = cloudevents_implementation["event"](dummy_attributes, data)
assert event1 == event2
# Test different attributes
for key in dummy_attributes:
if key in ("specversion", "time", "datacontenttype", "dataschema"):
continue
else:
dummy_attributes[key] = f"noise-{key}"
event3 = cloudevents_implementation["event"](dummy_attributes, data)
event2 = cloudevents_implementation["event"](dummy_attributes, data)
assert event2 == event3
assert event1 != event2 and event3 != event1
# Test different data
data = your_dummy_data
event3 = cloudevents_implementation["event"](dummy_attributes, data)
event2 = cloudevents_implementation["event"](dummy_attributes, data)
assert event2 == event3
assert event1 != event2 and event3 != event1
@pytest.mark.parametrize(
"non_cloudevent_value",
(
1,
None,
object(),
"Hello World",
),
)
def test_http_cloudevent_must_not_equal_to_non_cloudevent_value(
dummy_event, non_cloudevent_value
):
assert not dummy_event == non_cloudevent_value
def test_http_cloudevent_mutates_equality(
dummy_attributes, my_dummy_data, your_dummy_data, cloudevents_implementation
):
data = my_dummy_data
event1 = cloudevents_implementation["event"](dummy_attributes, data)
event2 = cloudevents_implementation["event"](dummy_attributes, data)
event3 = cloudevents_implementation["event"](dummy_attributes, data)
assert event1 == event2
# Test different attributes
for key in dummy_attributes:
if key in ("specversion", "time", "datacontenttype"):
continue
else:
event2[key] = f"noise-{key}"
event3[key] = f"noise-{key}"
assert event2 == event3
assert event1 != event2 and event3 != event1
# Test different data
event2.data = your_dummy_data
event3.data = your_dummy_data
assert event2 == event3
assert event1 != event2 and event3 != event1
def test_cloudevent_missing_specversion(cloudevents_implementation):
errors = {
"v1": "value is not a valid enumeration member; permitted: '0.3', '1.0'",
"v2": "Input should be '0.3' or '1.0'",
}
attributes = {"specversion": "0.2", "source": "s", "type": "t"}
with pytest.raises(cloudevents_implementation["validation_error"]) as e:
_ = cloudevents_implementation["event"](attributes, None)
assert errors[cloudevents_implementation["pydantic_version"]] in str(e.value)
def test_cloudevent_missing_minimal_required_fields(cloudevents_implementation):
attributes = {"type": "t"}
errors = {
"v1": "\nsource\n field required ",
"v2": "\nsource\n Field required ",
}
with pytest.raises(cloudevents_implementation["validation_error"]) as e:
_ = cloudevents_implementation["event"](attributes, None)
assert errors[cloudevents_implementation["pydantic_version"]] in str(e.value)
attributes = {"source": "s"}
errors = {
"v1": "\ntype\n field required ",
"v2": "\ntype\n Field required ",
}
with pytest.raises(cloudevents_implementation["validation_error"]) as e:
_ = cloudevents_implementation["event"](attributes, None)
assert errors[cloudevents_implementation["pydantic_version"]] in str(e.value)
def test_cloudevent_general_overrides(cloudevents_implementation):
event = cloudevents_implementation["event"](
{
"source": "my-source",
"type": "com.test.overrides",
"subject": "my-subject",
},
None,
)
expected_attributes = [
"time",
"source",
"id",
"specversion",
"type",
"subject",
"datacontenttype",
"dataschema",
]
assert len(event) == len(expected_attributes)
for attribute in expected_attributes:
assert attribute in event
del event[attribute]
assert len(event) == 0
def test_none_json_or_string():
assert _json_or_string(None) is None
def test_get_operation_on_non_existing_attribute_must_not_raise_exception(
dummy_event, non_exiting_attribute_name
):
dummy_event.get(non_exiting_attribute_name)
def test_get_must_return_attribute_value_if_exists(dummy_event):
assert dummy_event.get("source") == dummy_event["source"]
def test_get_operation_on_non_existing_attribute_must_return_none_by_default(
dummy_event, non_exiting_attribute_name
):
assert dummy_event.get(non_exiting_attribute_name) is None
def test_get_operation_on_non_existing_attribute_must_return_default_value_if_given(
dummy_event, non_exiting_attribute_name
):
dummy_value = "Hello World"
assert dummy_event.get(non_exiting_attribute_name, dummy_value) == dummy_value
def test_get_operation_on_non_existing_attribute_should_not_copy_default_value(
dummy_event, non_exiting_attribute_name
):
dummy_value = object()
assert dummy_event.get(non_exiting_attribute_name, dummy_value) is dummy_value
@pytest.mark.xfail() # https://github.com/cloudevents/sdk-python/issues/185
def test_json_data_serialization_without_explicit_type(cloudevents_implementation):
assert loads(
cloudevents_implementation["event"](
source=_DUMMY_SOURCE, type=_DUMMY_TYPE, data='{"hello": "world"}'
).json()
)["data"] == {"hello": "world"}
@pytest.mark.xfail() # https://github.com/cloudevents/sdk-python/issues/185
@pytest.mark.parametrize(
"json_content_type",
[
"application/json",
"application/ld+json",
"application/x-my-custom-type+json",
"text/html+json",
],
)
def test_json_data_serialization_with_explicit_json_content_type(
dummy_attributes, json_content_type, cloudevents_implementation
):
dummy_attributes["datacontenttype"] = json_content_type
assert loads(
cloudevents_implementation["event"](
dummy_attributes,
data='{"hello": "world"}',
).json()
)["data"] == {"hello": "world"}
_NON_JSON_CONTENT_TYPES = [
pytest.param("video/mp2t", id="MPEG transport stream"),
pytest.param("text/plain", id="Text, (generally ASCII or ISO 8859-n)"),
pytest.param("application/vnd.visio", id="Microsoft Visio"),
pytest.param("audio/wav", id="Waveform Audio Format"),
pytest.param("audio/webm", id="WEBM audio"),
pytest.param("video/webm", id="WEBM video"),
pytest.param("image/webp", id="WEBP image"),
pytest.param("application/gzip", id="GZip Compressed Archive"),
pytest.param("image/gif", id="Graphics Interchange Format (GIF)"),
pytest.param("text/html", id="HyperText Markup Language (HTML)"),
pytest.param("image/vnd.microsoft.icon", id="Icon format"),
pytest.param("text/calendar", id="iCalendar format"),
pytest.param("application/java-archive", id="Java Archive (JAR)"),
pytest.param("image/jpeg", id="JPEG images"),
]
@pytest.mark.parametrize("datacontenttype", _NON_JSON_CONTENT_TYPES)
def test_json_data_serialization_with_explicit_non_json_content_type(
dummy_attributes, datacontenttype, cloudevents_implementation
):
dummy_attributes["datacontenttype"] = datacontenttype
event = cloudevents_implementation["event"](
dummy_attributes,
data='{"hello": "world"}',
).json()
assert loads(event)["data"] == '{"hello": "world"}'
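The parametrizations above split media types into JSON-like (serialized under `data`) and everything else (kept as a string, or base64-encoded when binary). That split hinges on recognizing JSON media types; a minimal predicate consistent with the content types listed in these tests (an illustration only — the SDK's actual check may differ, and the `+json` serialization case is marked xfail above):

```python
def looks_like_json(datacontenttype: str) -> bool:
    """True for "application/json", "text/html+json", and other +json types;
    False for "video/mp2t", "text/plain", and the rest of the non-JSON list."""
    subtype = datacontenttype.split("/", 1)[-1]
    return subtype == "json" or subtype.endswith("+json")


assert looks_like_json("application/json")
assert looks_like_json("application/x-my-custom-type+json")
assert not looks_like_json("application/gzip")
```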
@pytest.mark.parametrize("datacontenttype", _NON_JSON_CONTENT_TYPES)
def test_binary_data_serialization(
dummy_attributes, datacontenttype, cloudevents_implementation
):
dummy_attributes["datacontenttype"] = datacontenttype
event = cloudevents_implementation["event"](
dummy_attributes,
data=b"\x00\x00\x11Hello World",
).json()
result_json = loads(event)
assert result_json["data_base64"] == "AAARSGVsbG8gV29ybGQ="
assert "data" not in result_json
def test_binary_data_deserialization(cloudevents_implementation):
given = (
b'{"source": "dummy:source", "id": "11775cb2-fd00-4487-a18b-30c3600eaa5f",'
b' "type": "dummy.type", "specversion": "1.0", "time":'
b' "2022-07-16T12:03:20.519216+00:00", "subject": null, "datacontenttype":'
b' "application/octet-stream", "dataschema": null, "data_base64":'
b' "AAARSGVsbG8gV29ybGQ="}'
)
expected = {
"data": b"\x00\x00\x11Hello World",
"datacontenttype": "application/octet-stream",
"dataschema": None,
"id": "11775cb2-fd00-4487-a18b-30c3600eaa5f",
"source": "dummy:source",
"specversion": SpecVersion.v1_0,
"subject": None,
"time": datetime.datetime(
2022, 7, 16, 12, 3, 20, 519216, tzinfo=datetime.timezone.utc
),
"type": "dummy.type",
}
assert cloudevents_implementation["event"].parse_raw(given).dict() == expected
if cloudevents_implementation["pydantic_version"] == "v2":
assert (
cloudevents_implementation["event"].model_validate_json(given).dict()
== expected
)
def test_access_data_event_attribute_should_raise_key_error(dummy_event):
with pytest.raises(KeyError):
dummy_event["data"]
def test_delete_data_event_attribute_should_raise_key_error(dummy_event):
with pytest.raises(KeyError):
del dummy_event["data"]
def test_setting_data_attribute_should_not_affect_actual_data(dummy_event):
my_data = object()
dummy_event["data"] = my_data
assert dummy_event.data != my_data
def test_event_length(dummy_event, dummy_attributes):
assert len(dummy_event) == len(dummy_attributes)
def test_access_data_attribute_with_get_should_return_default(dummy_event):
default = object()
assert dummy_event.get("data", default) is default
def test_pydantic_repr_should_contain_attributes_and_data(dummy_event):
assert "attributes" in repr(dummy_event)
assert "data" in repr(dummy_event)
def test_data_must_never_exist_as_an_attribute_name(dummy_event):
assert "data" not in dummy_event
def test_attributes_and_kwards_are_incompatible(cloudevents_implementation):
with pytest.raises(IncompatibleArgumentsError):
cloudevents_implementation["event"]({"a": "b"}, other="hello world")
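The literal `"AAARSGVsbG8gV29ybGQ="` asserted in the binary (de)serialization tests above is ordinary base64 of the 14-byte payload, which is easy to confirm with the stdlib:

```python
import base64

binary_payload = b"\x00\x00\x11Hello World"
encoded = base64.b64encode(binary_payload).decode("ascii")
assert encoded == "AAARSGVsbG8gV29ybGQ="
# Decoding round-trips back to the original bytes.
assert base64.b64decode("AAARSGVsbG8gV29ybGQ=") == binary_payload
```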


@@ -0,0 +1,179 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import base64
import datetime
import json
import pytest
from pydantic import ValidationError as PydanticV2ValidationError
from pydantic.v1 import ValidationError as PydanticV1ValidationError
from cloudevents.conversion import to_json
from cloudevents.pydantic.v1.conversion import from_dict as pydantic_v1_from_dict
from cloudevents.pydantic.v1.conversion import from_json as pydantic_v1_from_json
from cloudevents.pydantic.v1.event import CloudEvent as PydanticV1CloudEvent
from cloudevents.pydantic.v2.conversion import from_dict as pydantic_v2_from_dict
from cloudevents.pydantic.v2.conversion import from_json as pydantic_v2_from_json
from cloudevents.pydantic.v2.event import CloudEvent as PydanticV2CloudEvent
from cloudevents.sdk.event.attribute import SpecVersion
test_data = json.dumps({"data-key": "val"})
test_attributes = {
"type": "com.example.string",
"source": "https://example.com/event-producer",
"extension-attribute": "extension-attribute-test-value",
}
_pydantic_implementation = {
"v1": {
"event": PydanticV1CloudEvent,
"validation_error": PydanticV1ValidationError,
"from_dict": pydantic_v1_from_dict,
"from_json": pydantic_v1_from_json,
"pydantic_version": "v1",
},
"v2": {
"event": PydanticV2CloudEvent,
"validation_error": PydanticV2ValidationError,
"from_dict": pydantic_v2_from_dict,
"from_json": pydantic_v2_from_json,
"pydantic_version": "v2",
},
}
@pytest.fixture(params=["v1", "v2"])
def cloudevents_implementation(request):
return _pydantic_implementation[request.param]
@pytest.mark.parametrize("specversion", ["0.3", "1.0"])
def test_to_json(specversion, cloudevents_implementation):
event = cloudevents_implementation["event"](test_attributes, test_data)
event_json = to_json(event)
event_dict = json.loads(event_json)
for key, val in test_attributes.items():
assert event_dict[key] == val
assert event_dict["data"] == test_data
@pytest.mark.parametrize("specversion", ["0.3", "1.0"])
def test_to_json_base64(specversion, cloudevents_implementation):
data = b"test123"
event = cloudevents_implementation["event"](test_attributes, data)
event_json = to_json(event)
event_dict = json.loads(event_json)
for key, val in test_attributes.items():
assert event_dict[key] == val
# test data was properly marshalled into data_base64
data_base64 = event_dict["data_base64"].encode()
test_data_base64 = base64.b64encode(data)
assert data_base64 == test_data_base64
@pytest.mark.parametrize("specversion", ["0.3", "1.0"])
def test_from_json(specversion, cloudevents_implementation):
payload = {
"type": "com.example.string",
"source": "https://example.com/event-producer",
"id": "1234",
"specversion": specversion,
"data": {"data-key": "val"},
}
event = cloudevents_implementation["from_json"](json.dumps(payload))
for key, val in payload.items():
if key == "data":
assert event.data == payload["data"]
else:
assert event[key] == val
@pytest.mark.parametrize("specversion", ["0.3", "1.0"])
def test_from_json_base64(specversion, cloudevents_implementation):
# Create base64 encoded data
raw_data = {"data-key": "val"}
data = json.dumps(raw_data).encode()
data_base64_str = base64.b64encode(data).decode()
# Create json payload
payload = {
"type": "com.example.string",
"source": "https://example.com/event-producer",
"id": "1234",
"specversion": specversion,
"data_base64": data_base64_str,
}
payload_json = json.dumps(payload)
# Create event
event = cloudevents_implementation["from_json"](payload_json)
# Test fields were marshalled properly
for key, val in payload.items():
if key == "data_base64":
# Check data_base64 was unmarshalled properly
assert event.data == raw_data
else:
assert event[key] == val
@pytest.mark.parametrize("specversion", ["0.3", "1.0"])
def test_json_can_talk_to_itself(specversion, cloudevents_implementation):
event = cloudevents_implementation["event"](test_attributes, test_data)
event_json = to_json(event)
event = cloudevents_implementation["from_json"](event_json)
for key, val in test_attributes.items():
assert event[key] == val
assert event.data == test_data
@pytest.mark.parametrize("specversion", ["0.3", "1.0"])
def test_json_can_talk_to_itself_base64(specversion, cloudevents_implementation):
data = b"test123"
event = cloudevents_implementation["event"](test_attributes, data)
event_json = to_json(event)
event = cloudevents_implementation["from_json"](event_json)
for key, val in test_attributes.items():
assert event[key] == val
assert event.data == data
def test_from_dict(cloudevents_implementation):
given = {
"data": b"\x00\x00\x11Hello World",
"datacontenttype": "application/octet-stream",
"dataschema": None,
"id": "11775cb2-fd00-4487-a18b-30c3600eaa5f",
"source": "dummy:source",
"specversion": SpecVersion.v1_0,
"subject": None,
"time": datetime.datetime(
2022, 7, 16, 12, 3, 20, 519216, tzinfo=datetime.timezone.utc
),
"type": "dummy.type",
}
assert cloudevents_implementation["from_dict"](given).dict() == given
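The to/from-JSON round-trip tests above rely on one unmarshalling rule: attributes live at the top level of the JSON envelope, and the payload comes back either from `data` as-is or from `data_base64` after decoding. A stdlib-only sketch of that rule (illustrative; not the SDK's `from_json`):

```python
import base64
import json


def parse_event_json(payload: str):
    """Split a CloudEvents JSON envelope into (attributes, data) — a sketch."""
    envelope = json.loads(payload)
    if "data_base64" in envelope:
        # Binary payloads come back as bytes.
        data = base64.b64decode(envelope.pop("data_base64"))
    else:
        data = envelope.pop("data", None)
    return envelope, data


attrs, data = parse_event_json('{"id": "1234", "data_base64": "dGVzdDEyMw=="}')
assert data == b"test123"
assert attrs == {"id": "1234"}
```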


@@ -0,0 +1,654 @@
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from __future__ import annotations
import bz2
import io
import json
import typing
import pytest
from pydantic import ValidationError as PydanticV2ValidationError
from pydantic.v1 import ValidationError as PydanticV1ValidationError
from sanic import Sanic, response
import cloudevents.exceptions as cloud_exceptions
from cloudevents.conversion import to_binary, to_structured
from cloudevents.pydantic.v1.conversion import from_http as pydantic_v1_from_http
from cloudevents.pydantic.v1.event import CloudEvent as PydanticV1CloudEvent
from cloudevents.pydantic.v2.conversion import from_http as pydantic_v2_from_http
from cloudevents.pydantic.v2.event import CloudEvent as PydanticV2CloudEvent
from cloudevents.sdk import converters, types
from cloudevents.sdk.converters.binary import is_binary
from cloudevents.sdk.converters.structured import is_structured
if typing.TYPE_CHECKING:
from typing_extensions import TypeAlias
invalid_test_headers = [
{
"ce-source": "<event-source>",
"ce-type": "cloudevent.event.type",
"ce-specversion": "1.0",
},
{
"ce-id": "my-id",
"ce-type": "cloudevent.event.type",
"ce-specversion": "1.0",
},
{"ce-id": "my-id", "ce-source": "<event-source>", "ce-specversion": "1.0"},
{
"ce-id": "my-id",
"ce-source": "<event-source>",
"ce-type": "cloudevent.event.type",
},
]
invalid_cloudevent_request_body = [
{
"source": "<event-source>",
"type": "cloudevent.event.type",
"specversion": "1.0",
},
{"id": "my-id", "type": "cloudevent.event.type", "specversion": "1.0"},
{"id": "my-id", "source": "<event-source>", "specversion": "1.0"},
{
"id": "my-id",
"source": "<event-source>",
"type": "cloudevent.event.type",
},
]
test_data = {"payload-content": "Hello World!"}
app = Sanic("test_pydantic_http_events")
AnyPydanticCloudEvent: TypeAlias = typing.Union[
PydanticV1CloudEvent, PydanticV2CloudEvent
]
class FromHttpFn(typing.Protocol):
def __call__(
self,
headers: typing.Dict[str, str],
data: typing.Optional[typing.AnyStr],
data_unmarshaller: typing.Optional[types.UnmarshallerType] = None,
) -> AnyPydanticCloudEvent:
pass
class PydanticImplementation(typing.TypedDict):
event: typing.Type[AnyPydanticCloudEvent]
validation_error: typing.Type[Exception]
from_http: FromHttpFn
pydantic_version: typing.Literal["v1", "v2"]
_pydantic_implementation: typing.Mapping[str, PydanticImplementation] = {
"v1": {
"event": PydanticV1CloudEvent,
"validation_error": PydanticV1ValidationError,
"from_http": pydantic_v1_from_http,
"pydantic_version": "v1",
},
"v2": {
"event": PydanticV2CloudEvent,
"validation_error": PydanticV2ValidationError,
"from_http": pydantic_v2_from_http,
"pydantic_version": "v2",
},
}
@pytest.fixture(params=["v1", "v2"])
def cloudevents_implementation(
request: pytest.FixtureRequest,
) -> PydanticImplementation:
return _pydantic_implementation[request.param]
@app.route("/event/<pydantic_version>", ["POST"])
async def echo(request, pydantic_version):
decoder = None
if "binary-payload" in request.headers:
decoder = lambda x: x
event = _pydantic_implementation[pydantic_version]["from_http"](
dict(request.headers), request.body, data_unmarshaller=decoder
)
data = (
event.data
if isinstance(event.data, (bytes, bytearray, memoryview))
else json.dumps(event.data).encode()
)
return response.raw(data, headers={k: event[k] for k in event})
@pytest.mark.parametrize("body", invalid_cloudevent_request_body)
def test_missing_required_fields_structured(
body: dict, cloudevents_implementation: PydanticImplementation
) -> None:
with pytest.raises(cloud_exceptions.MissingRequiredFields):
_ = cloudevents_implementation["from_http"](
{"Content-Type": "application/cloudevents+json"}, json.dumps(body)
)
@pytest.mark.parametrize("headers", invalid_test_headers)
def test_missing_required_fields_binary(
headers: dict, cloudevents_implementation: PydanticImplementation
) -> None:
with pytest.raises(cloud_exceptions.MissingRequiredFields):
_ = cloudevents_implementation["from_http"](headers, json.dumps(test_data))
@pytest.mark.parametrize("headers", invalid_test_headers)
def test_missing_required_fields_empty_data_binary(
headers: dict, cloudevents_implementation: PydanticImplementation
) -> None:
# Test for issue #115
with pytest.raises(cloud_exceptions.MissingRequiredFields):
_ = cloudevents_implementation["from_http"](headers, None)
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_emit_binary_event(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
headers = {
"ce-id": "my-id",
"ce-source": "<event-source>",
"ce-type": "cloudevent.event.type",
"ce-specversion": specversion,
"Content-Type": "text/plain",
}
data = json.dumps(test_data)
_, r = app.test_client.post(
f"/event/{cloudevents_implementation['pydantic_version']}",
headers=headers,
data=data,
)
# Convert byte array to dict
# e.g. r.body = b'{"payload-content": "Hello World!"}'
body = json.loads(r.body.decode("utf-8"))
# Check response fields
for key in test_data:
assert body[key] == test_data[key], body
for key in headers:
if key != "Content-Type":
attribute_key = key[3:]
assert r.headers[attribute_key] == headers[key]
assert r.status_code == 200
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_emit_structured_event(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
headers = {"Content-Type": "application/cloudevents+json"}
body = {
"id": "my-id",
"source": "<event-source>",
"type": "cloudevent.event.type",
"specversion": specversion,
"data": test_data,
}
_, r = app.test_client.post(
f"/event/{cloudevents_implementation['pydantic_version']}",
headers=headers,
data=json.dumps(body),
)
# Convert byte array to dict
# e.g. r.body = b'{"payload-content": "Hello World!"}'
body = json.loads(r.body.decode("utf-8"))
# Check response fields
for key in test_data:
assert body[key] == test_data[key]
assert r.status_code == 200

@pytest.mark.parametrize(
"converter", [converters.TypeBinary, converters.TypeStructured]
)
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_roundtrip_non_json_event(
converter: str,
specversion: str,
cloudevents_implementation: PydanticImplementation,
) -> None:
input_data = io.BytesIO()
for _ in range(100):
for j in range(20):
assert 1 == input_data.write(j.to_bytes(1, byteorder="big"))
compressed_data = bz2.compress(input_data.getvalue())
attrs = {"source": "test", "type": "t"}
event = cloudevents_implementation["event"](attrs, compressed_data)
if converter == converters.TypeStructured:
headers, data = to_structured(event, data_marshaller=lambda x: x)
elif converter == converters.TypeBinary:
headers, data = to_binary(event, data_marshaller=lambda x: x)
headers["binary-payload"] = "true" # Decoding hint for server
_, r = app.test_client.post(
f"/event/{cloudevents_implementation['pydantic_version']}",
headers=headers,
data=data,
)
assert r.status_code == 200
for key in attrs:
assert r.headers[key] == attrs[key]
assert compressed_data == r.body, r.body
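The round-trip test above streams a fixed byte pattern into a `BytesIO` buffer, bz2-compresses it, and ships it through the SDK with identity marshallers (`lambda x: x`) so the bytes arrive unmodified. A minimal stdlib-only sketch of the payload construction, independent of the SDK:

```python
import bz2
import io

# Build the same 100 x 20-byte pattern the test writes into the event payload.
buf = io.BytesIO()
for _ in range(100):
    for j in range(20):
        buf.write(j.to_bytes(1, byteorder="big"))

raw = buf.getvalue()
compressed = bz2.compress(raw)

# bz2 round-trips losslessly, so the server can recover the exact payload.
assert bz2.decompress(compressed) == raw
assert len(raw) == 2000
```

Because the payload is not JSON, the test must bypass the default JSON (un)marshallers on both ends; the identity marshaller keeps the compressed bytes opaque to the transport.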
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_missing_ce_prefix_binary_event(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
prefixed_headers = {}
headers = {
"ce-id": "my-id",
"ce-source": "<event-source>",
"ce-type": "cloudevent.event.type",
"ce-specversion": specversion,
}
for key in headers:
# breaking prefix e.g. e-id instead of ce-id
prefixed_headers[key[1:]] = headers[key]
with pytest.raises(cloud_exceptions.MissingRequiredFields):
# CloudEvent constructor throws TypeError if missing required field
# and NotImplementedError because structured calls aren't
# implemented. In this instance one of the required keys should have
# prefix e-id instead of ce-id therefore it should throw
_ = cloudevents_implementation["from_http"](
prefixed_headers, json.dumps(test_data)
)

@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_valid_binary_events(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
# Test creating multiple cloud events
events_queue: list[AnyPydanticCloudEvent] = []
headers = {}
num_cloudevents = 30
for i in range(num_cloudevents):
headers = {
"ce-id": f"id{i}",
"ce-source": f"source{i}.com.test",
"ce-type": "cloudevent.test.type",
"ce-specversion": specversion,
}
data = {"payload": f"payload-{i}"}
events_queue.append(
cloudevents_implementation["from_http"](headers, json.dumps(data))
)
for i, event in enumerate(events_queue):
assert isinstance(event.data, dict)
assert event["id"] == f"id{i}"
assert event["source"] == f"source{i}.com.test"
assert event["specversion"] == specversion
assert event.data["payload"] == f"payload-{i}"

@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_structured_to_request(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
attributes = {
"specversion": specversion,
"type": "word.found.name",
"id": "96fb5f0b-001e-0108-6dfe-da6e2806f124",
"source": "pytest",
}
data = {"message": "Hello World!"}
event = cloudevents_implementation["event"](attributes, data)
headers, body_bytes = to_structured(event)
assert isinstance(body_bytes, bytes)
body = json.loads(body_bytes)
assert headers["content-type"] == "application/cloudevents+json"
for key in attributes:
assert body[key] == attributes[key]
assert body["data"] == data, f"|{body_bytes!r}|| {body}"

@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_attributes_view_accessor(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
attributes: dict[str, typing.Any] = {
"specversion": specversion,
"type": "word.found.name",
"id": "96fb5f0b-001e-0108-6dfe-da6e2806f124",
"source": "pytest",
}
data = {"message": "Hello World!"}
event = cloudevents_implementation["event"](attributes, data)
event_attributes: typing.Mapping[str, typing.Any] = event.get_attributes()
assert event_attributes["specversion"] == attributes["specversion"]
assert event_attributes["type"] == attributes["type"]
assert event_attributes["id"] == attributes["id"]
assert event_attributes["source"] == attributes["source"]
assert event_attributes["time"]

@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_binary_to_request(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
attributes = {
"specversion": specversion,
"type": "word.found.name",
"id": "96fb5f0b-001e-0108-6dfe-da6e2806f124",
"source": "pytest",
}
data = {"message": "Hello World!"}
event = cloudevents_implementation["event"](attributes, data)
headers, body_bytes = to_binary(event)
body = json.loads(body_bytes)
for key in data:
assert body[key] == data[key]
for key in attributes:
assert attributes[key] == headers["ce-" + key]

@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_empty_data_structured_event(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
# Testing if cloudevent breaks when no structured data field present
attributes = {
"specversion": specversion,
"datacontenttype": "application/cloudevents+json",
"type": "word.found.name",
"id": "96fb5f0b-001e-0108-6dfe-da6e2806f124",
"time": "2018-10-23T12:28:22.4579346Z",
"source": "<source-url>",
}
event = cloudevents_implementation["from_http"](
{"content-type": "application/cloudevents+json"}, json.dumps(attributes)
)
assert event.data is None
attributes["data"] = ""
# Data of empty string will be marshalled into None
event = cloudevents_implementation["from_http"](
{"content-type": "application/cloudevents+json"}, json.dumps(attributes)
)
assert event.data is None

@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_empty_data_binary_event(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
# Testing if cloudevent breaks when no structured data field present
headers = {
"Content-Type": "application/octet-stream",
"ce-specversion": specversion,
"ce-type": "word.found.name",
"ce-id": "96fb5f0b-001e-0108-6dfe-da6e2806f124",
"ce-time": "2018-10-23T12:28:22.4579346Z",
"ce-source": "<source-url>",
}
event = cloudevents_implementation["from_http"](headers, None)
assert event.data is None
data = ""
# Data of empty string will be marshalled into None
event = cloudevents_implementation["from_http"](headers, data)
assert event.data is None

@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_valid_structured_events(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
# Test creating multiple cloud events
events_queue: list[AnyPydanticCloudEvent] = []
num_cloudevents = 30
for i in range(num_cloudevents):
raw_event = {
"id": f"id{i}",
"source": f"source{i}.com.test",
"type": "cloudevent.test.type",
"specversion": specversion,
"data": {"payload": f"payload-{i}"},
}
events_queue.append(
cloudevents_implementation["from_http"](
{"content-type": "application/cloudevents+json"},
json.dumps(raw_event),
)
)
for i, event in enumerate(events_queue):
assert isinstance(event.data, dict)
assert event["id"] == f"id{i}"
assert event["source"] == f"source{i}.com.test"
assert event["specversion"] == specversion
assert event.data["payload"] == f"payload-{i}"

@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_structured_no_content_type(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
# Test creating multiple cloud events
data = {
"id": "id",
"source": "source.com.test",
"type": "cloudevent.test.type",
"specversion": specversion,
"data": test_data,
}
event = cloudevents_implementation["from_http"]({}, json.dumps(data))
assert isinstance(event.data, dict)
assert event["id"] == "id"
assert event["source"] == "source.com.test"
assert event["specversion"] == specversion
for key, val in test_data.items():
assert event.data[key] == val

def test_is_binary():
headers = {
"ce-id": "my-id",
"ce-source": "<event-source>",
"ce-type": "cloudevent.event.type",
"ce-specversion": "1.0",
"Content-Type": "text/plain",
}
assert is_binary(headers)
headers = {
"Content-Type": "application/cloudevents+json",
}
assert not is_binary(headers)
headers = {}
assert not is_binary(headers)
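The checks above rely on binary content mode being signalled by `ce-*` attribute headers rather than a structured content type. A simplified stand-in for that check (the function name `looks_binary` is hypothetical; this is not the SDK's actual `is_binary` implementation):

```python
def looks_binary(headers: dict) -> bool:
    # Binary-mode CloudEvents carry their attributes in ce-* HTTP headers,
    # so the presence of any such header is the distinguishing signal.
    return any(k.lower().startswith("ce-") for k in headers)


assert looks_binary({"ce-id": "my-id", "ce-specversion": "1.0"})
assert not looks_binary({"Content-Type": "application/cloudevents+json"})
assert not looks_binary({})
```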
@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_cloudevent_repr(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
headers = {
"Content-Type": "application/octet-stream",
"ce-specversion": specversion,
"ce-type": "word.found.name",
"ce-id": "96fb5f0b-001e-0108-6dfe-da6e2806f124",
"ce-time": "2018-10-23T12:28:22.4579346Z",
"ce-source": "<source-url>",
}
event = cloudevents_implementation["from_http"](headers, "")
# Ensure the event is printable. We could call event.__repr__() directly, but
# we have seen cases in the past where event.__repr__() succeeded while
# print(event) failed.
print(event) # noqa T201

@pytest.mark.parametrize("specversion", ["1.0", "0.3"])
def test_none_data_cloudevent(
specversion: str, cloudevents_implementation: PydanticImplementation
) -> None:
event = cloudevents_implementation["event"](
{
"source": "<my-url>",
"type": "issue.example",
"specversion": specversion,
}
)
to_binary(event)
to_structured(event)

def test_wrong_specversion(cloudevents_implementation: PydanticImplementation) -> None:
headers = {"Content-Type": "application/cloudevents+json"}
data = json.dumps(
{
"specversion": "0.2",
"type": "word.found.name",
"id": "96fb5f0b-001e-0108-6dfe-da6e2806f124",
"source": "<my-source>",
}
)
with pytest.raises(cloud_exceptions.InvalidRequiredFields) as e:
cloudevents_implementation["from_http"](headers, data)
assert "Found invalid specversion 0.2" in str(e.value)

def test_invalid_data_format_structured_from_http(
cloudevents_implementation: PydanticImplementation,
) -> None:
headers = {"Content-Type": "application/cloudevents+json"}
data = 20
with pytest.raises(cloud_exceptions.InvalidStructuredJSON) as e:
cloudevents_implementation["from_http"](headers, data) # type: ignore[type-var] # intentionally wrong type # noqa: E501
assert "Expected json of type (str, bytes, bytearray)" in str(e.value)

def test_wrong_specversion_to_request(
cloudevents_implementation: PydanticImplementation,
) -> None:
event = cloudevents_implementation["event"]({"source": "s", "type": "t"}, None)
with pytest.raises(cloud_exceptions.InvalidRequiredFields) as e:
event["specversion"] = "0.2"
to_binary(event)
assert "Unsupported specversion: 0.2" in str(e.value)

def test_is_structured():
headers = {
"Content-Type": "application/cloudevents+json",
}
assert is_structured(headers)
headers = {
"ce-id": "my-id",
"ce-source": "<event-source>",
"ce-type": "cloudevent.event.type",
"ce-specversion": "1.0",
"Content-Type": "text/plain",
}
assert not is_structured(headers)

def test_empty_json_structured(
cloudevents_implementation: PydanticImplementation,
) -> None:
headers = {"Content-Type": "application/cloudevents+json"}
data = ""
with pytest.raises(cloud_exceptions.MissingRequiredFields) as e:
cloudevents_implementation["from_http"](headers, data)
assert "Failed to read specversion from both headers and data" in str(e.value)

def test_uppercase_headers_with_none_data_binary(
cloudevents_implementation: PydanticImplementation,
) -> None:
headers = {
"Ce-Id": "my-id",
"Ce-Source": "<event-source>",
"Ce-Type": "cloudevent.event.type",
"Ce-Specversion": "1.0",
}
event = cloudevents_implementation["from_http"](headers, None)
for key in headers:
assert event[key.lower()[3:]] == headers[key]
assert event.data is None
_, new_data = to_binary(event)
assert new_data is None
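The loop above recovers each attribute name by lowercasing the header and dropping the three-character `ce-` prefix (`key.lower()[3:]`), which is why mixed-case `Ce-*` headers still match. As a standalone sketch of that mapping (not the SDK's internal parser):

```python
headers = {
    "Ce-Id": "my-id",
    "Ce-Source": "<event-source>",
    "Ce-Type": "cloudevent.event.type",
    "Ce-Specversion": "1.0",
}

# Lowercase each header name and strip the "ce-" prefix to get the attribute.
attributes = {
    k.lower()[3:]: v for k, v in headers.items() if k.lower().startswith("ce-")
}

assert attributes == {
    "id": "my-id",
    "source": "<event-source>",
    "type": "cloudevent.event.type",
    "specversion": "1.0",
}
```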
def test_generic_exception(cloudevents_implementation: PydanticImplementation) -> None:
headers = {"Content-Type": "application/cloudevents+json"}
data = json.dumps(
{
"specversion": "1.0",
"source": "s",
"type": "t",
"id": "1234-1234-1234",
"data": "",
}
)
with pytest.raises(cloud_exceptions.GenericException) as e:
cloudevents_implementation["from_http"]({}, None)
assert e.errisinstance(cloud_exceptions.MissingRequiredFields)

with pytest.raises(cloud_exceptions.GenericException) as e:
cloudevents_implementation["from_http"]({}, 123) # type: ignore[type-var] # intentionally wrong type # noqa: E501
assert e.errisinstance(cloud_exceptions.InvalidStructuredJSON)

with pytest.raises(cloud_exceptions.GenericException) as e:
cloudevents_implementation["from_http"](
headers, data, data_unmarshaller=lambda x: 1 / 0
)
assert e.errisinstance(cloud_exceptions.DataUnmarshallerError)

with pytest.raises(cloud_exceptions.GenericException) as e:
event = cloudevents_implementation["from_http"](headers, data)
to_binary(event, data_marshaller=lambda x: 1 / 0)
assert e.errisinstance(cloud_exceptions.DataMarshallerError)

def test_non_dict_data_no_headers_bug(
cloudevents_implementation: PydanticImplementation,
) -> None:
# Test for issue #116
headers = {"Content-Type": "application/cloudevents+json"}
data = "123"
with pytest.raises(cloud_exceptions.MissingRequiredFields) as e:
cloudevents_implementation["from_http"](headers, data)
assert "Failed to read specversion from both headers and data" in str(e.value)
assert "The following deserialized data has no 'get' method" in str(e.value)


@@ -1,4 +1,4 @@
# All Rights Reserved.
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@@ -11,7 +11,6 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import pytest
from cloudevents.sdk.event import v03


@@ -1,4 +1,4 @@
# All Rights Reserved.
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@@ -12,8 +12,6 @@
# License for the specific language governing permissions and limitations
# under the License.
import pytest
from cloudevents.sdk.event import v1


@@ -1,4 +1,4 @@
# All Rights Reserved.
# Copyright 2018-Present The CloudEvents Authors
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@@ -19,7 +19,7 @@ from cloudevents.sdk.event import v1
from cloudevents.tests import data as test_data
m = marshaller.NewDefaultHTTPMarshaller()
app = Sanic(__name__)
app = Sanic("test_with_sanic")
@app.route("/is-ok", ["POST"])
@@ -30,11 +30,9 @@
@app.route("/echo", ["POST"])
async def echo(request):
event = m.FromRequest(
v1.Event(), dict(request.headers), request.body, lambda x: x
)
event = m.FromRequest(v1.Event(), dict(request.headers), request.body, lambda x: x)
hs, body = m.ToRequest(event, converters.TypeBinary, lambda x: x)
return response.text(body, headers=hs)
return response.text(body.decode("utf-8"), headers=hs)
def test_reusable_marshaller():

Binary file not shown.

Binary file not shown.


@@ -1,4 +0,0 @@
# Sphinx build info version 1
# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
config: 7a2eda13b1d0d4202963ea48c547f2cb
tags: 645f666f9bcd5a90fca523b33c5a78b7


@@ -1,20 +0,0 @@
.. CloudEvents Python SDK documentation master file, created by
sphinx-quickstart on Mon Nov 19 11:59:03 2018.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to CloudEvents Python SDK's documentation!
==================================================
.. toctree::
:maxdepth: 2
:caption: Contents:
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

Binary file not shown.



@@ -1,701 +0,0 @@
@import url("basic.css");
/* -- page layout ----------------------------------------------------------- */
body {
font-family: Georgia, serif;
font-size: 17px;
background-color: #fff;
color: #000;
margin: 0;
padding: 0;
}
div.document {
width: 940px;
margin: 30px auto 0 auto;
}
div.documentwrapper {
float: left;
width: 100%;
}
div.bodywrapper {
margin: 0 0 0 220px;
}
div.sphinxsidebar {
width: 220px;
font-size: 14px;
line-height: 1.5;
}
hr {
border: 1px solid #B1B4B6;
}
div.body {
background-color: #fff;
color: #3E4349;
padding: 0 30px 0 30px;
}
div.body > .section {
text-align: left;
}
div.footer {
width: 940px;
margin: 20px auto 30px auto;
font-size: 14px;
color: #888;
text-align: right;
}
div.footer a {
color: #888;
}
p.caption {
font-family: inherit;
font-size: inherit;
}
div.relations {
display: none;
}
div.sphinxsidebar a {
color: #444;
text-decoration: none;
border-bottom: 1px dotted #999;
}
div.sphinxsidebar a:hover {
border-bottom: 1px solid #999;
}
div.sphinxsidebarwrapper {
padding: 18px 10px;
}
div.sphinxsidebarwrapper p.logo {
padding: 0;
margin: -10px 0 0 0px;
text-align: center;
}
div.sphinxsidebarwrapper h1.logo {
margin-top: -10px;
text-align: center;
margin-bottom: 5px;
text-align: left;
}
div.sphinxsidebarwrapper h1.logo-name {
margin-top: 0px;
}
div.sphinxsidebarwrapper p.blurb {
margin-top: 0;
font-style: normal;
}
div.sphinxsidebar h3,
div.sphinxsidebar h4 {
font-family: Georgia, serif;
color: #444;
font-size: 24px;
font-weight: normal;
margin: 0 0 5px 0;
padding: 0;
}
div.sphinxsidebar h4 {
font-size: 20px;
}
div.sphinxsidebar h3 a {
color: #444;
}
div.sphinxsidebar p.logo a,
div.sphinxsidebar h3 a,
div.sphinxsidebar p.logo a:hover,
div.sphinxsidebar h3 a:hover {
border: none;
}
div.sphinxsidebar p {
color: #555;
margin: 10px 0;
}
div.sphinxsidebar ul {
margin: 10px 0;
padding: 0;
color: #000;
}
div.sphinxsidebar ul li.toctree-l1 > a {
font-size: 120%;
}
div.sphinxsidebar ul li.toctree-l2 > a {
font-size: 110%;
}
div.sphinxsidebar input {
border: 1px solid #CCC;
font-family: Georgia, serif;
font-size: 1em;
}
div.sphinxsidebar hr {
border: none;
height: 1px;
color: #AAA;
background: #AAA;
text-align: left;
margin-left: 0;
width: 50%;
}
div.sphinxsidebar .badge {
border-bottom: none;
}
div.sphinxsidebar .badge:hover {
border-bottom: none;
}
/* To address an issue with donation coming after search */
div.sphinxsidebar h3.donation {
margin-top: 10px;
}
/* -- body styles ----------------------------------------------------------- */
a {
color: #004B6B;
text-decoration: underline;
}
a:hover {
color: #6D4100;
text-decoration: underline;
}
div.body h1,
div.body h2,
div.body h3,
div.body h4,
div.body h5,
div.body h6 {
font-family: Georgia, serif;
font-weight: normal;
margin: 30px 0px 10px 0px;
padding: 0;
}
div.body h1 { margin-top: 0; padding-top: 0; font-size: 240%; }
div.body h2 { font-size: 180%; }
div.body h3 { font-size: 150%; }
div.body h4 { font-size: 130%; }
div.body h5 { font-size: 100%; }
div.body h6 { font-size: 100%; }
a.headerlink {
color: #DDD;
padding: 0 4px;
text-decoration: none;
}
a.headerlink:hover {
color: #444;
background: #EAEAEA;
}
div.body p, div.body dd, div.body li {
line-height: 1.4em;
}
div.admonition {
margin: 20px 0px;
padding: 10px 30px;
background-color: #EEE;
border: 1px solid #CCC;
}
div.admonition tt.xref, div.admonition code.xref, div.admonition a tt {
background-color: #FBFBFB;
border-bottom: 1px solid #fafafa;
}
div.admonition p.admonition-title {
font-family: Georgia, serif;
font-weight: normal;
font-size: 24px;
margin: 0 0 10px 0;
padding: 0;
line-height: 1;
}
div.admonition p.last {
margin-bottom: 0;
}
div.highlight {
background-color: #fff;
}
dt:target, .highlight {
background: #FAF3E8;
}
div.warning {
background-color: #FCC;
border: 1px solid #FAA;
}
div.danger {
background-color: #FCC;
border: 1px solid #FAA;
-moz-box-shadow: 2px 2px 4px #D52C2C;
-webkit-box-shadow: 2px 2px 4px #D52C2C;
box-shadow: 2px 2px 4px #D52C2C;
}
div.error {
background-color: #FCC;
border: 1px solid #FAA;
-moz-box-shadow: 2px 2px 4px #D52C2C;
-webkit-box-shadow: 2px 2px 4px #D52C2C;
box-shadow: 2px 2px 4px #D52C2C;
}
div.caution {
background-color: #FCC;
border: 1px solid #FAA;
}
div.attention {
background-color: #FCC;
border: 1px solid #FAA;
}
div.important {
background-color: #EEE;
border: 1px solid #CCC;
}
div.note {
background-color: #EEE;
border: 1px solid #CCC;
}
div.tip {
background-color: #EEE;
border: 1px solid #CCC;
}
div.hint {
background-color: #EEE;
border: 1px solid #CCC;
}
div.seealso {
background-color: #EEE;
border: 1px solid #CCC;
}
div.topic {
background-color: #EEE;
}
p.admonition-title {
display: inline;
}
p.admonition-title:after {
content: ":";
}
pre, tt, code {
font-family: 'Consolas', 'Menlo', 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', monospace;
font-size: 0.9em;
}
.hll {
background-color: #FFC;
margin: 0 -12px;
padding: 0 12px;
display: block;
}
img.screenshot {
}
tt.descname, tt.descclassname, code.descname, code.descclassname {
font-size: 0.95em;
}
tt.descname, code.descname {
padding-right: 0.08em;
}
img.screenshot {
-moz-box-shadow: 2px 2px 4px #EEE;
-webkit-box-shadow: 2px 2px 4px #EEE;
box-shadow: 2px 2px 4px #EEE;
}
table.docutils {
border: 1px solid #888;
-moz-box-shadow: 2px 2px 4px #EEE;
-webkit-box-shadow: 2px 2px 4px #EEE;
box-shadow: 2px 2px 4px #EEE;
}
table.docutils td, table.docutils th {
border: 1px solid #888;
padding: 0.25em 0.7em;
}
table.field-list, table.footnote {
border: none;
-moz-box-shadow: none;
-webkit-box-shadow: none;
box-shadow: none;
}
table.footnote {
margin: 15px 0;
width: 100%;
border: 1px solid #EEE;
background: #FDFDFD;
font-size: 0.9em;
}
table.footnote + table.footnote {
margin-top: -15px;
border-top: none;
}
table.field-list th {
padding: 0 0.8em 0 0;
}
table.field-list td {
padding: 0;
}
table.field-list p {
margin-bottom: 0.8em;
}
/* Cloned from
* https://github.com/sphinx-doc/sphinx/commit/ef60dbfce09286b20b7385333d63a60321784e68
*/
.field-name {
-moz-hyphens: manual;
-ms-hyphens: manual;
-webkit-hyphens: manual;
hyphens: manual;
}
table.footnote td.label {
width: .1px;
padding: 0.3em 0 0.3em 0.5em;
}
table.footnote td {
padding: 0.3em 0.5em;
}
dl {
margin: 0;
padding: 0;
}
dl dd {
margin-left: 30px;
}
blockquote {
margin: 0 0 0 30px;
padding: 0;
}
ul, ol {
/* Matches the 30px from the narrow-screen "li > ul" selector below */
margin: 10px 0 10px 30px;
padding: 0;
}
pre {
background: #EEE;
padding: 7px 30px;
margin: 15px 0px;
line-height: 1.3em;
}
div.viewcode-block:target {
background: #ffd;
}
dl pre, blockquote pre, li pre {
margin-left: 0;
padding-left: 30px;
}
tt, code {
background-color: #ecf0f3;
color: #222;
/* padding: 1px 2px; */
}
tt.xref, code.xref, a tt {
background-color: #FBFBFB;
border-bottom: 1px solid #fff;
}
a.reference {
text-decoration: none;
border-bottom: 1px dotted #004B6B;
}
/* Don't put an underline on images */
a.image-reference, a.image-reference:hover {
border-bottom: none;
}
a.reference:hover {
border-bottom: 1px solid #6D4100;
}
a.footnote-reference {
text-decoration: none;
font-size: 0.7em;
vertical-align: top;
border-bottom: 1px dotted #004B6B;
}
a.footnote-reference:hover {
border-bottom: 1px solid #6D4100;
}
a:hover tt, a:hover code {
background: #EEE;
}
@media screen and (max-width: 870px) {
div.sphinxsidebar {
display: none;
}
div.document {
width: 100%;
}
div.documentwrapper {
margin-left: 0;
margin-top: 0;
margin-right: 0;
margin-bottom: 0;
}
div.bodywrapper {
margin-top: 0;
margin-right: 0;
margin-bottom: 0;
margin-left: 0;
}
ul {
margin-left: 0;
}
li > ul {
/* Matches the 30px from the "ul, ol" selector above */
margin-left: 30px;
}
.document {
width: auto;
}
.footer {
width: auto;
}
.bodywrapper {
margin: 0;
}
.footer {
width: auto;
}
.github {
display: none;
}
}
@media screen and (max-width: 875px) {
body {
margin: 0;
padding: 20px 30px;
}
div.documentwrapper {
float: none;
background: #fff;
}
div.sphinxsidebar {
display: block;
float: none;
width: 102.5%;
margin: 50px -30px -20px -30px;
padding: 10px 20px;
background: #333;
color: #FFF;
}
div.sphinxsidebar h3, div.sphinxsidebar h4, div.sphinxsidebar p,
div.sphinxsidebar h3 a {
color: #fff;
}
div.sphinxsidebar a {
color: #AAA;
}
div.sphinxsidebar p.logo {
display: none;
}
div.document {
width: 100%;
margin: 0;
}
div.footer {
display: none;
}
div.bodywrapper {
margin: 0;
}
div.body {
min-height: 0;
padding: 0;
}
.rtd_doc_footer {
display: none;
}
.document {
width: auto;
}
.footer {
width: auto;
}
.footer {
width: auto;
}
.github {
display: none;
}
}
/* misc. */
.revsys-inline {
display: none!important;
}
/* Make nested-list/multi-paragraph items look better in Releases changelog
* pages. Without this, docutils' magical list fuckery causes inconsistent
* formatting between different release sub-lists.
*/
div#changelog > div.section > ul > li > p:only-child {
margin-bottom: 0;
}
/* Hide fugly table cell borders in ..bibliography:: directive output */
table.docutils.citation, table.docutils.citation td, table.docutils.citation th {
border: none;
/* Below needed in some edge cases; if not applied, bottom shadows appear */
-moz-box-shadow: none;
-webkit-box-shadow: none;
box-shadow: none;
}
/* relbar */
.related {
line-height: 30px;
width: 100%;
font-size: 0.9rem;
}
.related.top {
border-bottom: 1px solid #EEE;
margin-bottom: 20px;
}
.related.bottom {
border-top: 1px solid #EEE;
}
.related ul {
padding: 0;
margin: 0;
list-style: none;
}
.related li {
display: inline;
}
nav#rellinks {
float: right;
}
nav#rellinks li+li:before {
content: "|";
}
nav#breadcrumbs li+li:before {
content: "\00BB";
}
/* Hide certain items when printing */
@media print {
div.related {
display: none;
}
}


@@ -1,676 +0,0 @@
/*
* basic.css
* ~~~~~~~~~
*
* Sphinx stylesheet -- basic theme.
*
* :copyright: Copyright 2007-2018 by the Sphinx team, see AUTHORS.
* :license: BSD, see LICENSE for details.
*
*/
/* -- main layout ----------------------------------------------------------- */
div.clearer {
clear: both;
}
/* -- relbar ---------------------------------------------------------------- */
div.related {
width: 100%;
font-size: 90%;
}
div.related h3 {
display: none;
}
div.related ul {
margin: 0;
padding: 0 0 0 10px;
list-style: none;
}
div.related li {
display: inline;
}
div.related li.right {
float: right;
margin-right: 5px;
}
/* -- sidebar --------------------------------------------------------------- */
div.sphinxsidebarwrapper {
padding: 10px 5px 0 10px;
}
div.sphinxsidebar {
float: left;
width: 230px;
margin-left: -100%;
font-size: 90%;
word-wrap: break-word;
overflow-wrap : break-word;
}
div.sphinxsidebar ul {
list-style: none;
}
div.sphinxsidebar ul ul,
div.sphinxsidebar ul.want-points {
margin-left: 20px;
list-style: square;
}
div.sphinxsidebar ul ul {
margin-top: 0;
margin-bottom: 0;
}
div.sphinxsidebar form {
margin-top: 10px;
}
div.sphinxsidebar input {
border: 1px solid #98dbcc;
font-family: sans-serif;
font-size: 1em;
}
div.sphinxsidebar #searchbox form.search {
overflow: hidden;
}
div.sphinxsidebar #searchbox input[type="text"] {
float: left;
width: 80%;
padding: 0.25em;
box-sizing: border-box;
}
div.sphinxsidebar #searchbox input[type="submit"] {
float: left;
width: 20%;
border-left: none;
padding: 0.25em;
box-sizing: border-box;
}
img {
border: 0;
max-width: 100%;
}
/* -- search page ----------------------------------------------------------- */
ul.search {
margin: 10px 0 0 20px;
padding: 0;
}
ul.search li {
padding: 5px 0 5px 20px;
background-image: url(file.png);
background-repeat: no-repeat;
background-position: 0 7px;
}
ul.search li a {
font-weight: bold;
}
ul.search li div.context {
color: #888;
margin: 2px 0 0 30px;
text-align: left;
}
ul.keywordmatches li.goodmatch a {
font-weight: bold;
}
/* -- index page ------------------------------------------------------------ */
table.contentstable {
width: 90%;
margin-left: auto;
margin-right: auto;
}
table.contentstable p.biglink {
line-height: 150%;
}
a.biglink {
font-size: 1.3em;
}
span.linkdescr {
font-style: italic;
padding-top: 5px;
font-size: 90%;
}
/* -- general index --------------------------------------------------------- */
table.indextable {
width: 100%;
}
table.indextable td {
text-align: left;
vertical-align: top;
}
table.indextable ul {
margin-top: 0;
margin-bottom: 0;
list-style-type: none;
}
table.indextable > tbody > tr > td > ul {
padding-left: 0em;
}
table.indextable tr.pcap {
height: 10px;
}
table.indextable tr.cap {
margin-top: 10px;
background-color: #f2f2f2;
}
img.toggler {
margin-right: 3px;
margin-top: 3px;
cursor: pointer;
}
div.modindex-jumpbox {
border-top: 1px solid #ddd;
border-bottom: 1px solid #ddd;
margin: 1em 0 1em 0;
padding: 0.4em;
}
div.genindex-jumpbox {
border-top: 1px solid #ddd;
border-bottom: 1px solid #ddd;
margin: 1em 0 1em 0;
padding: 0.4em;
}
/* -- domain module index --------------------------------------------------- */
table.modindextable td {
padding: 2px;
border-collapse: collapse;
}
/* -- general body styles --------------------------------------------------- */
div.body {
min-width: 450px;
max-width: 800px;
}
div.body p, div.body dd, div.body li, div.body blockquote {
-moz-hyphens: auto;
-ms-hyphens: auto;
-webkit-hyphens: auto;
hyphens: auto;
}
a.headerlink {
visibility: hidden;
}
h1:hover > a.headerlink,
h2:hover > a.headerlink,
h3:hover > a.headerlink,
h4:hover > a.headerlink,
h5:hover > a.headerlink,
h6:hover > a.headerlink,
dt:hover > a.headerlink,
caption:hover > a.headerlink,
p.caption:hover > a.headerlink,
div.code-block-caption:hover > a.headerlink {
visibility: visible;
}
div.body p.caption {
text-align: inherit;
}
div.body td {
text-align: left;
}
.first {
margin-top: 0 !important;
}
p.rubric {
margin-top: 30px;
font-weight: bold;
}
img.align-left, .figure.align-left, object.align-left {
clear: left;
float: left;
margin-right: 1em;
}
img.align-right, .figure.align-right, object.align-right {
clear: right;
float: right;
margin-left: 1em;
}
img.align-center, .figure.align-center, object.align-center {
display: block;
margin-left: auto;
margin-right: auto;
}
.align-left {
text-align: left;
}
.align-center {
text-align: center;
}
.align-right {
text-align: right;
}
/* -- sidebars -------------------------------------------------------------- */
div.sidebar {
margin: 0 0 0.5em 1em;
border: 1px solid #ddb;
padding: 7px 7px 0 7px;
background-color: #ffe;
width: 40%;
float: right;
}
p.sidebar-title {
font-weight: bold;
}
/* -- topics ---------------------------------------------------------------- */
div.topic {
border: 1px solid #ccc;
padding: 7px 7px 0 7px;
margin: 10px 0 10px 0;
}
p.topic-title {
font-size: 1.1em;
font-weight: bold;
margin-top: 10px;
}
/* -- admonitions ----------------------------------------------------------- */
div.admonition {
margin-top: 10px;
margin-bottom: 10px;
padding: 7px;
}
div.admonition dt {
font-weight: bold;
}
div.admonition dl {
margin-bottom: 0;
}
p.admonition-title {
margin: 0px 10px 5px 0px;
font-weight: bold;
}
div.body p.centered {
text-align: center;
margin-top: 25px;
}
/* -- tables ---------------------------------------------------------------- */
table.docutils {
border: 0;
border-collapse: collapse;
}
table.align-center {
margin-left: auto;
margin-right: auto;
}
table caption span.caption-number {
font-style: italic;
}
table caption span.caption-text {
}
table.docutils td, table.docutils th {
padding: 1px 8px 1px 5px;
border-top: 0;
border-left: 0;
border-right: 0;
border-bottom: 1px solid #aaa;
}
table.footnote td, table.footnote th {
border: 0 !important;
}
th {
text-align: left;
padding-right: 5px;
}
table.citation {
border-left: solid 1px gray;
margin-left: 1px;
}
table.citation td {
border-bottom: none;
}
/* -- figures --------------------------------------------------------------- */
div.figure {
margin: 0.5em;
padding: 0.5em;
}
div.figure p.caption {
padding: 0.3em;
}
div.figure p.caption span.caption-number {
font-style: italic;
}
div.figure p.caption span.caption-text {
}
/* -- field list styles ----------------------------------------------------- */
table.field-list td, table.field-list th {
border: 0 !important;
}
.field-list ul {
margin: 0;
padding-left: 1em;
}
.field-list p {
margin: 0;
}
.field-name {
-moz-hyphens: manual;
-ms-hyphens: manual;
-webkit-hyphens: manual;
hyphens: manual;
}
/* -- hlist styles ---------------------------------------------------------- */
table.hlist td {
vertical-align: top;
}
/* -- other body styles ----------------------------------------------------- */
ol.arabic {
list-style: decimal;
}
ol.loweralpha {
list-style: lower-alpha;
}
ol.upperalpha {
list-style: upper-alpha;
}
ol.lowerroman {
list-style: lower-roman;
}
ol.upperroman {
list-style: upper-roman;
}
dl {
margin-bottom: 15px;
}
dd p {
margin-top: 0px;
}
dd ul, dd table {
margin-bottom: 10px;
}
dd {
margin-top: 3px;
margin-bottom: 10px;
margin-left: 30px;
}
dt:target, span.highlighted {
background-color: #fbe54e;
}
rect.highlighted {
fill: #fbe54e;
}
dl.glossary dt {
font-weight: bold;
font-size: 1.1em;
}
.optional {
font-size: 1.3em;
}
.sig-paren {
font-size: larger;
}
.versionmodified {
font-style: italic;
}
.system-message {
background-color: #fda;
padding: 5px;
border: 3px solid red;
}
.footnote:target {
background-color: #ffa;
}
.line-block {
display: block;
margin-top: 1em;
margin-bottom: 1em;
}
.line-block .line-block {
margin-top: 0;
margin-bottom: 0;
margin-left: 1.5em;
}
.guilabel, .menuselection {
font-family: sans-serif;
}
.accelerator {
text-decoration: underline;
}
.classifier {
font-style: oblique;
}
abbr, acronym {
border-bottom: dotted 1px;
cursor: help;
}
/* -- code displays --------------------------------------------------------- */
pre {
overflow: auto;
overflow-y: hidden; /* fixes display issues on Chrome browsers */
}
span.pre {
-moz-hyphens: none;
-ms-hyphens: none;
-webkit-hyphens: none;
hyphens: none;
}
td.linenos pre {
padding: 5px 0px;
border: 0;
background-color: transparent;
color: #aaa;
}
table.highlighttable {
margin-left: 0.5em;
}
table.highlighttable td {
padding: 0 0.5em 0 0.5em;
}
div.code-block-caption {
padding: 2px 5px;
font-size: small;
}
div.code-block-caption code {
background-color: transparent;
}
div.code-block-caption + div > div.highlight > pre {
margin-top: 0;
}
div.code-block-caption span.caption-number {
padding: 0.1em 0.3em;
font-style: italic;
}
div.code-block-caption span.caption-text {
}
div.literal-block-wrapper {
padding: 1em 1em 0;
}
div.literal-block-wrapper div.highlight {
margin: 0;
}
code.descname {
background-color: transparent;
font-weight: bold;
font-size: 1.2em;
}
code.descclassname {
background-color: transparent;
}
code.xref, a code {
background-color: transparent;
font-weight: bold;
}
h1 code, h2 code, h3 code, h4 code, h5 code, h6 code {
background-color: transparent;
}
.viewcode-link {
float: right;
}
.viewcode-back {
float: right;
font-family: sans-serif;
}
div.viewcode-block:target {
margin: -1px -10px;
padding: 0 10px;
}
/* -- math display ---------------------------------------------------------- */
img.math {
vertical-align: middle;
}
div.body div.math p {
text-align: center;
}
span.eqno {
float: right;
}
span.eqno a.headerlink {
position: relative;
left: 0px;
z-index: 1;
}
div.math:hover a.headerlink {
visibility: visible;
}
/* -- printout stylesheet --------------------------------------------------- */
@media print {
div.document,
div.documentwrapper,
div.bodywrapper {
margin: 0 !important;
width: 100%;
}
div.sphinxsidebar,
div.related,
div.footer,
#top-link {
display: none;
}
}


@@ -1 +0,0 @@
/* This file intentionally left blank. */


@@ -1,315 +0,0 @@
/*
* doctools.js
* ~~~~~~~~~~~
*
* Sphinx JavaScript utilities for all documentation.
*
* :copyright: Copyright 2007-2018 by the Sphinx team, see AUTHORS.
* :license: BSD, see LICENSE for details.
*
*/
/**
* select a different prefix for underscore
*/
$u = _.noConflict();
/**
* make the code below compatible with browsers without
* an installed firebug like debugger
if (!window.console || !console.firebug) {
var names = ["log", "debug", "info", "warn", "error", "assert", "dir",
"dirxml", "group", "groupEnd", "time", "timeEnd", "count", "trace",
"profile", "profileEnd"];
window.console = {};
for (var i = 0; i < names.length; ++i)
window.console[names[i]] = function() {};
}
*/
/**
* small helper function to urldecode strings
*/
jQuery.urldecode = function(x) {
return decodeURIComponent(x).replace(/\+/g, ' ');
};
/**
* small helper function to urlencode strings
*/
jQuery.urlencode = encodeURIComponent;
/**
* This function returns the parsed url parameters of the
* current request. Multiple values per key are supported,
* it will always return arrays of strings for the value parts.
*/
jQuery.getQueryParameters = function(s) {
if (typeof s === 'undefined')
s = document.location.search;
var parts = s.substr(s.indexOf('?') + 1).split('&');
var result = {};
for (var i = 0; i < parts.length; i++) {
var tmp = parts[i].split('=', 2);
var key = jQuery.urldecode(tmp[0]);
var value = jQuery.urldecode(tmp[1]);
if (key in result)
result[key].push(value);
else
result[key] = [value];
}
return result;
};
/**
* highlight a given string on a jquery object by wrapping it in
* span elements with the given class name.
*/
jQuery.fn.highlightText = function(text, className) {
function highlight(node, addItems) {
if (node.nodeType === 3) {
var val = node.nodeValue;
var pos = val.toLowerCase().indexOf(text);
if (pos >= 0 &&
!jQuery(node.parentNode).hasClass(className) &&
!jQuery(node.parentNode).hasClass("nohighlight")) {
var span;
var isInSVG = jQuery(node).closest("body, svg, foreignObject").is("svg");
if (isInSVG) {
span = document.createElementNS("http://www.w3.org/2000/svg", "tspan");
} else {
span = document.createElement("span");
span.className = className;
}
span.appendChild(document.createTextNode(val.substr(pos, text.length)));
node.parentNode.insertBefore(span, node.parentNode.insertBefore(
document.createTextNode(val.substr(pos + text.length)),
node.nextSibling));
node.nodeValue = val.substr(0, pos);
if (isInSVG) {
var bbox = span.getBBox();
var rect = document.createElementNS("http://www.w3.org/2000/svg", "rect");
rect.x.baseVal.value = bbox.x;
rect.y.baseVal.value = bbox.y;
rect.width.baseVal.value = bbox.width;
rect.height.baseVal.value = bbox.height;
rect.setAttribute('class', className);
var parentOfText = node.parentNode.parentNode;
addItems.push({
"parent": node.parentNode,
"target": rect});
}
}
}
else if (!jQuery(node).is("button, select, textarea")) {
jQuery.each(node.childNodes, function() {
highlight(this, addItems);
});
}
}
var addItems = [];
var result = this.each(function() {
highlight(this, addItems);
});
for (var i = 0; i < addItems.length; ++i) {
jQuery(addItems[i].parent).before(addItems[i].target);
}
return result;
};
/*
* backward compatibility for jQuery.browser
* This will be supported until firefox bug is fixed.
*/
if (!jQuery.browser) {
jQuery.uaMatch = function(ua) {
ua = ua.toLowerCase();
var match = /(chrome)[ \/]([\w.]+)/.exec(ua) ||
/(webkit)[ \/]([\w.]+)/.exec(ua) ||
/(opera)(?:.*version|)[ \/]([\w.]+)/.exec(ua) ||
/(msie) ([\w.]+)/.exec(ua) ||
ua.indexOf("compatible") < 0 && /(mozilla)(?:.*? rv:([\w.]+)|)/.exec(ua) ||
[];
return {
browser: match[ 1 ] || "",
version: match[ 2 ] || "0"
};
};
jQuery.browser = {};
jQuery.browser[jQuery.uaMatch(navigator.userAgent).browser] = true;
}
/**
* Small JavaScript module for the documentation.
*/
var Documentation = {
init : function() {
this.fixFirefoxAnchorBug();
this.highlightSearchWords();
this.initIndexTable();
if (DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) {
this.initOnKeyListeners();
}
},
/**
* i18n support
*/
TRANSLATIONS : {},
PLURAL_EXPR : function(n) { return n === 1 ? 0 : 1; },
LOCALE : 'unknown',
// gettext and ngettext don't access this so that the functions
// can safely be bound to a different name (_ = Documentation.gettext)
gettext : function(string) {
var translated = Documentation.TRANSLATIONS[string];
if (typeof translated === 'undefined')
return string;
return (typeof translated === 'string') ? translated : translated[0];
},
ngettext : function(singular, plural, n) {
var translated = Documentation.TRANSLATIONS[singular];
if (typeof translated === 'undefined')
return (n == 1) ? singular : plural;
return translated[Documentation.PLURALEXPR(n)];
},
addTranslations : function(catalog) {
for (var key in catalog.messages)
this.TRANSLATIONS[key] = catalog.messages[key];
this.PLURAL_EXPR = new Function('n', 'return +(' + catalog.plural_expr + ')');
this.LOCALE = catalog.locale;
},
/**
* add context elements like header anchor links
*/
addContextElements : function() {
$('div[id] > :header:first').each(function() {
$('<a class="headerlink">\u00B6</a>').
attr('href', '#' + this.id).
attr('title', _('Permalink to this headline')).
appendTo(this);
});
$('dt[id]').each(function() {
$('<a class="headerlink">\u00B6</a>').
attr('href', '#' + this.id).
attr('title', _('Permalink to this definition')).
appendTo(this);
});
},
/**
* workaround a firefox stupidity
* see: https://bugzilla.mozilla.org/show_bug.cgi?id=645075
*/
fixFirefoxAnchorBug : function() {
if (document.location.hash && $.browser.mozilla)
window.setTimeout(function() {
document.location.href += '';
}, 10);
},
/**
* highlight the search words provided in the url in the text
*/
highlightSearchWords : function() {
var params = $.getQueryParameters();
var terms = (params.highlight) ? params.highlight[0].split(/\s+/) : [];
if (terms.length) {
var body = $('div.body');
if (!body.length) {
body = $('body');
}
window.setTimeout(function() {
$.each(terms, function() {
body.highlightText(this.toLowerCase(), 'highlighted');
});
}, 10);
$('<p class="highlight-link"><a href="javascript:Documentation.' +
'hideSearchWords()">' + _('Hide Search Matches') + '</a></p>')
.appendTo($('#searchbox'));
}
},
/**
* init the domain index toggle buttons
*/
initIndexTable : function() {
var togglers = $('img.toggler').click(function() {
var src = $(this).attr('src');
var idnum = $(this).attr('id').substr(7);
$('tr.cg-' + idnum).toggle();
if (src.substr(-9) === 'minus.png')
$(this).attr('src', src.substr(0, src.length-9) + 'plus.png');
else
$(this).attr('src', src.substr(0, src.length-8) + 'minus.png');
}).css('display', '');
if (DOCUMENTATION_OPTIONS.COLLAPSE_INDEX) {
togglers.click();
}
},
/**
* helper function to hide the search marks again
*/
hideSearchWords : function() {
$('#searchbox .highlight-link').fadeOut(300);
$('span.highlighted').removeClass('highlighted');
},
/**
* make the url absolute
*/
makeURL : function(relativeURL) {
return DOCUMENTATION_OPTIONS.URL_ROOT + '/' + relativeURL;
},
/**
* get the current relative url
*/
getCurrentURL : function() {
var path = document.location.pathname;
var parts = path.split(/\//);
$.each(DOCUMENTATION_OPTIONS.URL_ROOT.split(/\//), function() {
if (this === '..')
parts.pop();
});
var url = parts.join('/');
return path.substring(url.lastIndexOf('/') + 1, path.length - 1);
},
initOnKeyListeners: function() {
$(document).keyup(function(event) {
var activeElementType = document.activeElement.tagName;
// don't navigate when in search box or textarea
if (activeElementType !== 'TEXTAREA' && activeElementType !== 'INPUT' && activeElementType !== 'SELECT') {
switch (event.keyCode) {
case 37: // left
var prevHref = $('link[rel="prev"]').prop('href');
if (prevHref) {
window.location.href = prevHref;
return false;
}
case 39: // right
var nextHref = $('link[rel="next"]').prop('href');
if (nextHref) {
window.location.href = nextHref;
return false;
}
}
}
});
}
};
// quick alias for translations
_ = Documentation.gettext;
$(document).ready(function() {
Documentation.init();
});


@@ -1,296 +0,0 @@
var DOCUMENTATION_OPTIONS = {
URL_ROOT: document.getElementById("documentation_options").getAttribute('data-url_root'),
VERSION: '',
LANGUAGE: 'None',
COLLAPSE_INDEX: false,
FILE_SUFFIX: '.html',
HAS_SOURCE: true,
SOURCELINK_SUFFIX: '.txt',
NAVIGATION_WITH_KEYS: false,
SEARCH_LANGUAGE_STOP_WORDS: ["a","and","are","as","at","be","but","by","for","if","in","into","is","it","near","no","not","of","on","or","such","that","the","their","then","there","these","they","this","to","was","will","with"]
};
/* Non-minified version JS is _stemmer.js if file is provided */
/**
* Porter Stemmer
*/
var Stemmer = function() {
var step2list = {
ational: 'ate',
tional: 'tion',
enci: 'ence',
anci: 'ance',
izer: 'ize',
bli: 'ble',
alli: 'al',
entli: 'ent',
eli: 'e',
ousli: 'ous',
ization: 'ize',
ation: 'ate',
ator: 'ate',
alism: 'al',
iveness: 'ive',
fulness: 'ful',
ousness: 'ous',
aliti: 'al',
iviti: 'ive',
biliti: 'ble',
logi: 'log'
};
var step3list = {
icate: 'ic',
ative: '',
alize: 'al',
iciti: 'ic',
ical: 'ic',
ful: '',
ness: ''
};
var c = "[^aeiou]"; // consonant
var v = "[aeiouy]"; // vowel
var C = c + "[^aeiouy]*"; // consonant sequence
var V = v + "[aeiou]*"; // vowel sequence
var mgr0 = "^(" + C + ")?" + V + C; // [C]VC... is m>0
var meq1 = "^(" + C + ")?" + V + C + "(" + V + ")?$"; // [C]VC[V] is m=1
var mgr1 = "^(" + C + ")?" + V + C + V + C; // [C]VCVC... is m>1
var s_v = "^(" + C + ")?" + v; // vowel in stem
this.stemWord = function (w) {
var stem;
var suffix;
var firstch;
var origword = w;
if (w.length < 3)
return w;
var re;
var re2;
var re3;
var re4;
firstch = w.substr(0,1);
if (firstch == "y")
w = firstch.toUpperCase() + w.substr(1);
// Step 1a
re = /^(.+?)(ss|i)es$/;
re2 = /^(.+?)([^s])s$/;
if (re.test(w))
w = w.replace(re,"$1$2");
else if (re2.test(w))
w = w.replace(re2,"$1$2");
// Step 1b
re = /^(.+?)eed$/;
re2 = /^(.+?)(ed|ing)$/;
if (re.test(w)) {
var fp = re.exec(w);
re = new RegExp(mgr0);
if (re.test(fp[1])) {
re = /.$/;
w = w.replace(re,"");
}
}
else if (re2.test(w)) {
var fp = re2.exec(w);
stem = fp[1];
re2 = new RegExp(s_v);
if (re2.test(stem)) {
w = stem;
re2 = /(at|bl|iz)$/;
re3 = new RegExp("([^aeiouylsz])\\1$");
re4 = new RegExp("^" + C + v + "[^aeiouwxy]$");
if (re2.test(w))
w = w + "e";
else if (re3.test(w)) {
re = /.$/;
w = w.replace(re,"");
}
else if (re4.test(w))
w = w + "e";
}
}
// Step 1c
re = /^(.+?)y$/;
if (re.test(w)) {
var fp = re.exec(w);
stem = fp[1];
re = new RegExp(s_v);
if (re.test(stem))
w = stem + "i";
}
// Step 2
re = /^(.+?)(ational|tional|enci|anci|izer|bli|alli|entli|eli|ousli|ization|ation|ator|alism|iveness|fulness|ousness|aliti|iviti|biliti|logi)$/;
if (re.test(w)) {
var fp = re.exec(w);
stem = fp[1];
suffix = fp[2];
re = new RegExp(mgr0);
if (re.test(stem))
w = stem + step2list[suffix];
}
// Step 3
re = /^(.+?)(icate|ative|alize|iciti|ical|ful|ness)$/;
if (re.test(w)) {
var fp = re.exec(w);
stem = fp[1];
suffix = fp[2];
re = new RegExp(mgr0);
if (re.test(stem))
w = stem + step3list[suffix];
}
// Step 4
re = /^(.+?)(al|ance|ence|er|ic|able|ible|ant|ement|ment|ent|ou|ism|ate|iti|ous|ive|ize)$/;
re2 = /^(.+?)(s|t)(ion)$/;
if (re.test(w)) {
var fp = re.exec(w);
stem = fp[1];
re = new RegExp(mgr1);
if (re.test(stem))
w = stem;
}
else if (re2.test(w)) {
var fp = re2.exec(w);
stem = fp[1] + fp[2];
re2 = new RegExp(mgr1);
if (re2.test(stem))
w = stem;
}
// Step 5
re = /^(.+?)e$/;
if (re.test(w)) {
var fp = re.exec(w);
stem = fp[1];
re = new RegExp(mgr1);
re2 = new RegExp(meq1);
re3 = new RegExp("^" + C + v + "[^aeiouwxy]$");
if (re.test(stem) || (re2.test(stem) && !(re3.test(stem))))
w = stem;
}
re = /ll$/;
re2 = new RegExp(mgr1);
if (re.test(w) && re2.test(w)) {
re = /.$/;
w = w.replace(re,"");
}
// and turn initial Y back to y
if (firstch == "y")
w = firstch.toLowerCase() + w.substr(1);
return w;
}
}
var splitChars = (function() {
var result = {};
var singles = [96, 180, 187, 191, 215, 247, 749, 885, 903, 907, 909, 930, 1014, 1648,
1748, 1809, 2416, 2473, 2481, 2526, 2601, 2609, 2612, 2615, 2653, 2702,
2706, 2729, 2737, 2740, 2857, 2865, 2868, 2910, 2928, 2948, 2961, 2971,
2973, 3085, 3089, 3113, 3124, 3213, 3217, 3241, 3252, 3295, 3341, 3345,
3369, 3506, 3516, 3633, 3715, 3721, 3736, 3744, 3748, 3750, 3756, 3761,
3781, 3912, 4239, 4347, 4681, 4695, 4697, 4745, 4785, 4799, 4801, 4823,
4881, 5760, 5901, 5997, 6313, 7405, 8024, 8026, 8028, 8030, 8117, 8125,
8133, 8181, 8468, 8485, 8487, 8489, 8494, 8527, 11311, 11359, 11687, 11695,
11703, 11711, 11719, 11727, 11735, 12448, 12539, 43010, 43014, 43019, 43587,
43696, 43713, 64286, 64297, 64311, 64317, 64319, 64322, 64325, 65141];
var i, j, start, end;
for (i = 0; i < singles.length; i++) {
result[singles[i]] = true;
}
var ranges = [[0, 47], [58, 64], [91, 94], [123, 169], [171, 177], [182, 184], [706, 709],
[722, 735], [741, 747], [751, 879], [888, 889], [894, 901], [1154, 1161],
[1318, 1328], [1367, 1368], [1370, 1376], [1416, 1487], [1515, 1519], [1523, 1568],
[1611, 1631], [1642, 1645], [1750, 1764], [1767, 1773], [1789, 1790], [1792, 1807],
[1840, 1868], [1958, 1968], [1970, 1983], [2027, 2035], [2038, 2041], [2043, 2047],
[2070, 2073], [2075, 2083], [2085, 2087], [2089, 2307], [2362, 2364], [2366, 2383],
[2385, 2391], [2402, 2405], [2419, 2424], [2432, 2436], [2445, 2446], [2449, 2450],
[2483, 2485], [2490, 2492], [2494, 2509], [2511, 2523], [2530, 2533], [2546, 2547],
[2554, 2564], [2571, 2574], [2577, 2578], [2618, 2648], [2655, 2661], [2672, 2673],
[2677, 2692], [2746, 2748], [2750, 2767], [2769, 2783], [2786, 2789], [2800, 2820],
[2829, 2830], [2833, 2834], [2874, 2876], [2878, 2907], [2914, 2917], [2930, 2946],
[2955, 2957], [2966, 2968], [2976, 2978], [2981, 2983], [2987, 2989], [3002, 3023],
[3025, 3045], [3059, 3076], [3130, 3132], [3134, 3159], [3162, 3167], [3170, 3173],
[3184, 3191], [3199, 3204], [3258, 3260], [3262, 3293], [3298, 3301], [3312, 3332],
[3386, 3388], [3390, 3423], [3426, 3429], [3446, 3449], [3456, 3460], [3479, 3481],
[3518, 3519], [3527, 3584], [3636, 3647], [3655, 3663], [3674, 3712], [3717, 3718],
[3723, 3724], [3726, 3731], [3752, 3753], [3764, 3772], [3774, 3775], [3783, 3791],
[3802, 3803], [3806, 3839], [3841, 3871], [3892, 3903], [3949, 3975], [3980, 4095],
[4139, 4158], [4170, 4175], [4182, 4185], [4190, 4192], [4194, 4196], [4199, 4205],
[4209, 4212], [4226, 4237], [4250, 4255], [4294, 4303], [4349, 4351], [4686, 4687],
[4702, 4703], [4750, 4751], [4790, 4791], [4806, 4807], [4886, 4887], [4955, 4968],
[4989, 4991], [5008, 5023], [5109, 5120], [5741, 5742], [5787, 5791], [5867, 5869],
[5873, 5887], [5906, 5919], [5938, 5951], [5970, 5983], [6001, 6015], [6068, 6102],
[6104, 6107], [6109, 6111], [6122, 6127], [6138, 6159], [6170, 6175], [6264, 6271],
[6315, 6319], [6390, 6399], [6429, 6469], [6510, 6511], [6517, 6527], [6572, 6592],
[6600, 6607], [6619, 6655], [6679, 6687], [6741, 6783], [6794, 6799], [6810, 6822],
[6824, 6916], [6964, 6980], [6988, 6991], [7002, 7042], [7073, 7085], [7098, 7167],
[7204, 7231], [7242, 7244], [7294, 7400], [7410, 7423], [7616, 7679], [7958, 7959],
[7966, 7967], [8006, 8007], [8014, 8015], [8062, 8063], [8127, 8129], [8141, 8143],
[8148, 8149], [8156, 8159], [8173, 8177], [8189, 8303], [8306, 8307], [8314, 8318],
[8330, 8335], [8341, 8449], [8451, 8454], [8456, 8457], [8470, 8472], [8478, 8483],
[8506, 8507], [8512, 8516], [8522, 8525], [8586, 9311], [9372, 9449], [9472, 10101],
[10132, 11263], [11493, 11498], [11503, 11516], [11518, 11519], [11558, 11567],
[11622, 11630], [11632, 11647], [11671, 11679], [11743, 11822], [11824, 12292],
[12296, 12320], [12330, 12336], [12342, 12343], [12349, 12352], [12439, 12444],
[12544, 12548], [12590, 12592], [12687, 12689], [12694, 12703], [12728, 12783],
[12800, 12831], [12842, 12880], [12896, 12927], [12938, 12976], [12992, 13311],
[19894, 19967], [40908, 40959], [42125, 42191], [42238, 42239], [42509, 42511],
[42540, 42559], [42592, 42593], [42607, 42622], [42648, 42655], [42736, 42774],
[42784, 42785], [42889, 42890], [42893, 43002], [43043, 43055], [43062, 43071],
[43124, 43137], [43188, 43215], [43226, 43249], [43256, 43258], [43260, 43263],
[43302, 43311], [43335, 43359], [43389, 43395], [43443, 43470], [43482, 43519],
[43561, 43583], [43596, 43599], [43610, 43615], [43639, 43641], [43643, 43647],
[43698, 43700], [43703, 43704], [43710, 43711], [43715, 43738], [43742, 43967],
[44003, 44015], [44026, 44031], [55204, 55215], [55239, 55242], [55292, 55295],
[57344, 63743], [64046, 64047], [64110, 64111], [64218, 64255], [64263, 64274],
[64280, 64284], [64434, 64466], [64830, 64847], [64912, 64913], [64968, 65007],
[65020, 65135], [65277, 65295], [65306, 65312], [65339, 65344], [65371, 65381],
[65471, 65473], [65480, 65481], [65488, 65489], [65496, 65497]];
for (i = 0; i < ranges.length; i++) {
start = ranges[i][0];
end = ranges[i][1];
for (j = start; j <= end; j++) {
result[j] = true;
}
}
return result;
})();
function splitQuery(query) {
var result = [];
var start = -1;
for (var i = 0; i < query.length; i++) {
if (splitChars[query.charCodeAt(i)]) {
if (start !== -1) {
result.push(query.slice(start, i));
start = -1;
}
} else if (start === -1) {
start = i;
}
}
if (start !== -1) {
result.push(query.slice(start));
}
return result;
}


@@ -1,310 +0,0 @@
/*
* default.css_t
* ~~~~~~~~~~~~~
*
* Sphinx stylesheet -- default theme.
*
* :copyright: Copyright 2007-2018 by the Sphinx team, see AUTHORS.
* :license: BSD, see LICENSE for details.
*
*/
@import url("basic.css");
/* -- page layout ----------------------------------------------------------- */
body {
font-family: {{ theme_bodyfont }};
font-size: 100%;
background-color: {{ theme_footerbgcolor }};
color: #000;
margin: 0;
padding: 0;
}
div.document {
background-color: {{ theme_sidebarbgcolor }};
}
div.documentwrapper {
float: left;
width: 100%;
}
div.bodywrapper {
margin: 0 0 0 230px;
}
div.body {
background-color: {{ theme_bgcolor }};
color: {{ theme_textcolor }};
padding: 0 20px 30px 20px;
}
{%- if theme_rightsidebar|tobool %}
div.bodywrapper {
margin: 0 230px 0 0;
}
{%- endif %}
div.footer {
color: {{ theme_footertextcolor }};
width: 100%;
padding: 9px 0 9px 0;
text-align: center;
font-size: 75%;
}
div.footer a {
color: {{ theme_footertextcolor }};
text-decoration: underline;
}
div.related {
background-color: {{ theme_relbarbgcolor }};
line-height: 30px;
color: {{ theme_relbartextcolor }};
}
div.related a {
color: {{ theme_relbarlinkcolor }};
}
div.sphinxsidebar {
{%- if theme_stickysidebar|tobool %}
top: 30px;
bottom: 0;
margin: 0;
position: fixed;
overflow: auto;
height: auto;
{%- endif %}
{%- if theme_rightsidebar|tobool %}
float: right;
{%- if theme_stickysidebar|tobool %}
right: 0;
{%- endif %}
{%- endif %}
}
{%- if theme_stickysidebar|tobool %}
/* this is nice, but it leads to hidden headings when jumping
to an anchor */
/*
div.related {
position: fixed;
}
div.documentwrapper {
margin-top: 30px;
}
*/
{%- endif %}
div.sphinxsidebar h3 {
font-family: {{ theme_headfont }};
color: {{ theme_sidebartextcolor }};
font-size: 1.4em;
font-weight: normal;
margin: 0;
padding: 0;
}
div.sphinxsidebar h3 a {
color: {{ theme_sidebartextcolor }};
}
div.sphinxsidebar h4 {
font-family: {{ theme_headfont }};
color: {{ theme_sidebartextcolor }};
font-size: 1.3em;
font-weight: normal;
margin: 5px 0 0 0;
padding: 0;
}
div.sphinxsidebar p {
color: {{ theme_sidebartextcolor }};
}
div.sphinxsidebar p.topless {
margin: 5px 10px 10px 10px;
}
div.sphinxsidebar ul {
margin: 10px;
padding: 0;
color: {{ theme_sidebartextcolor }};
}
div.sphinxsidebar a {
color: {{ theme_sidebarlinkcolor }};
}
div.sphinxsidebar input {
border: 1px solid {{ theme_sidebarlinkcolor }};
font-family: sans-serif;
font-size: 1em;
}
{% if theme_collapsiblesidebar|tobool %}
/* for collapsible sidebar */
div#sidebarbutton {
background-color: {{ theme_sidebarbtncolor }};
}
{% endif %}
/* -- hyperlink styles ------------------------------------------------------ */
a {
color: {{ theme_linkcolor }};
text-decoration: none;
}
a:visited {
color: {{ theme_visitedlinkcolor }};
text-decoration: none;
}
a:hover {
text-decoration: underline;
}
{% if theme_externalrefs|tobool %}
a.external {
text-decoration: none;
border-bottom: 1px dashed {{ theme_linkcolor }};
}
a.external:hover {
text-decoration: none;
border-bottom: none;
}
a.external:visited {
text-decoration: none;
border-bottom: 1px dashed {{ theme_visitedlinkcolor }};
}
{% endif %}
/* -- body styles ----------------------------------------------------------- */
div.body h1,
div.body h2,
div.body h3,
div.body h4,
div.body h5,
div.body h6 {
font-family: {{ theme_headfont }};
background-color: {{ theme_headbgcolor }};
font-weight: normal;
color: {{ theme_headtextcolor }};
border-bottom: 1px solid #ccc;
margin: 20px -20px 10px -20px;
padding: 3px 0 3px 10px;
}
div.body h1 { margin-top: 0; font-size: 200%; }
div.body h2 { font-size: 160%; }
div.body h3 { font-size: 140%; }
div.body h4 { font-size: 120%; }
div.body h5 { font-size: 110%; }
div.body h6 { font-size: 100%; }
a.headerlink {
color: {{ theme_headlinkcolor }};
font-size: 0.8em;
padding: 0 4px 0 4px;
text-decoration: none;
}
a.headerlink:hover {
background-color: {{ theme_headlinkcolor }};
color: white;
}
div.body p, div.body dd, div.body li {
text-align: justify;
line-height: 130%;
}
div.admonition p.admonition-title + p {
display: inline;
}
div.admonition p {
margin-bottom: 5px;
}
div.admonition pre {
margin-bottom: 5px;
}
div.admonition ul, div.admonition ol {
margin-bottom: 5px;
}
div.note {
background-color: #eee;
border: 1px solid #ccc;
}
div.seealso {
background-color: #ffc;
border: 1px solid #ff6;
}
div.topic {
background-color: #eee;
}
div.warning {
background-color: #ffe4e4;
border: 1px solid #f66;
}
p.admonition-title {
display: inline;
}
p.admonition-title:after {
content: ":";
}
pre {
padding: 5px;
background-color: {{ theme_codebgcolor }};
color: {{ theme_codetextcolor }};
line-height: 120%;
border: 1px solid #ac9;
border-left: none;
border-right: none;
}
code {
background-color: #ecf0f3;
padding: 0 1px 0 1px;
font-size: 0.95em;
}
th {
background-color: #ede;
}
.warning code {
background: #efc2c2;
}
.note code {
background: #d6d6d6;
}
.viewcode-back {
font-family: {{ theme_bodyfont }};
}
div.viewcode-block:target {
background-color: #f4debf;
border-top: 1px solid #ac9;
border-bottom: 1px solid #ac9;
}


Some files were not shown because too many files have changed in this diff.