Feat/dev env cleanup (#167)
* build: Update pre-commit config versions and setup.
* build: Migrate isort config to `pyproject`.
* style: Use recommended black-compatible flake8 options.
* build: Add standard pre-commit hooks.
* docs: Add a note about this PR to the changelog.
* docs: Clean up docs, fix links. Add links to respective tooling.
* build: Add dev-only dependencies.
* style: Reformat using new style/format configs.
* build: Add pre-commit to dev dependencies.
* style: Run pre-commit hooks on all the files.
* docs: Add dev status to the classifiers.
* docs: Add missing links and dates for releases and PRs.
* docs: Add latest PR to the changelog.
* ci: Add new maintainers.

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>
This commit is contained in:
parent
8483e8e310
commit
885d365dd2
@@ -1,4 +0,0 @@
[settings]
line_length = 80
multi_line_output = 3
include_trailing_comma = True
@@ -1,10 +1,17 @@
repos:
- repo: https://github.com/timothycrosley/isort/
rev: 5.0.4
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.3.0
hooks:
- id: isort
- repo: https://github.com/psf/black
rev: 19.10b0
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-toml
- repo: https://github.com/pycqa/isort
rev: 5.10.1
hooks:
- id: black
language_version: python3.9
- id: isort
args: [ "--profile", "black", "--filter-files" ]
- repo: https://github.com/psf/black
rev: 22.6.0
hooks:
- id: black
language_version: python3.10
CHANGELOG.md (34 changed lines)
@@ -6,6 +6,12 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

## [Unreleased]

### Added
- Added `.get` accessor for event properties ([#165])

### Changed
- Code quality and styling tooling is unified and configs compatibility is ensured ([#167])

## [1.3.0] — 2022-09-07
### Added
- Python 3.9 support ([#144])
@@ -17,11 +23,11 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
- Fixed typings issues ([#149])
- The package redistributive ability by inlining required `pypi-packaging.py` functions ([#151])

## [1.2.0]
## [1.2.0] — 2020-08-20
### Added
- Added GenericException, DataMarshallingError and DataUnmarshallingError ([#120])

## [1.1.0]
## [1.1.0] — 2020-08-18
### Changed
- Changed from_http to now expect headers argument before data ([#110])
- Renamed exception names ([#111])
@@ -32,12 +38,12 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
### Deprecated
- Renamed to_binary_http and to_structured_http. ([#108])

## [1.0.1]
## [1.0.1] — 2020-08-14
### Added
- CloudEvent exceptions and event type checking in http module ([#96])
- CloudEvent equality override ([#98])

## [1.0.0]
## [1.0.0] — 2020-08-11
### Added
- Update types and handle data_base64 structured ([#34])
- Added a user friendly CloudEvent class with data validation ([#36])
@@ -51,7 +57,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
### Removed
- Removed support for Cloudevents V0.2 and V0.1 ([#43])

## [0.3.0]
## [0.3.0] — 2020-07-11
### Added
- Added Cloudevents V0.3 and V1 implementations ([#22])
- Add helpful text to README ([#23])
@@ -92,7 +98,12 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
### Added
- Initial release

[0.3.0]: https://github.com/cloudevents/sdk-python/compare/0.2.4...HEAD
[1.3.0]: https://github.com/cloudevents/sdk-python/compare/1.2.0...1.3.0
[1.2.0]: https://github.com/cloudevents/sdk-python/compare/1.1.0...1.2.0
[1.1.0]: https://github.com/cloudevents/sdk-python/compare/1.0.1...1.1.0
[1.0.1]: https://github.com/cloudevents/sdk-python/compare/1.0.0...1.0.1
[1.0.0]: https://github.com/cloudevents/sdk-python/compare/0.3.0...1.0.0
[0.3.0]: https://github.com/cloudevents/sdk-python/compare/0.2.4...0.3.0
[0.2.4]: https://github.com/cloudevents/sdk-python/compare/0.2.3...0.2.4
[0.2.3]: https://github.com/cloudevents/sdk-python/compare/0.2.2...0.2.3
[0.2.2]: https://github.com/cloudevents/sdk-python/compare/0.2.1...0.2.2
@@ -126,4 +137,13 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
[#110]: https://github.com/cloudevents/sdk-python/pull/110
[#111]: https://github.com/cloudevents/sdk-python/pull/111
[#119]: https://github.com/cloudevents/sdk-python/pull/119
[#120]: https://github.com/cloudevents/sdk-python/pull/120
[#120]: https://github.com/cloudevents/sdk-python/pull/120
[#144]: https://github.com/cloudevents/sdk-python/pull/144
[#149]: https://github.com/cloudevents/sdk-python/pull/149
[#150]: https://github.com/cloudevents/sdk-python/pull/150
[#151]: https://github.com/cloudevents/sdk-python/pull/151
[#158]: https://github.com/cloudevents/sdk-python/pull/158
[#159]: https://github.com/cloudevents/sdk-python/pull/159
[#160]: https://github.com/cloudevents/sdk-python/pull/160
[#165]: https://github.com/cloudevents/sdk-python/pull/165
[#167]: https://github.com/cloudevents/sdk-python/pull/167
Makefile (2 changed lines)
@@ -16,4 +16,4 @@ help:
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
README.md (16 changed lines)
@@ -133,19 +133,19 @@ the same API. It will use semantic versioning with following rules:
Each SDK may have its own unique processes, tooling and guidelines, common
governance related material can be found in the
[CloudEvents `community`](https://github.com/cloudevents/spec/tree/master/community)
[CloudEvents `docs`](https://github.com/cloudevents/spec/tree/main/docs)
directory. In particular, in there you will find information concerning
how SDK projects are
[managed](https://github.com/cloudevents/spec/blob/master/community/SDK-GOVERNANCE.md),
[guidelines](https://github.com/cloudevents/spec/blob/master/community/SDK-maintainer-guidelines.md)
[managed](https://github.com/cloudevents/spec/blob/main/docs/GOVERNANCE.md),
[guidelines](https://github.com/cloudevents/spec/blob/main/docs/SDK-maintainer-guidelines.md)
for how PR reviews and approval, and our
[Code of Conduct](https://github.com/cloudevents/spec/blob/master/community/GOVERNANCE.md#additional-information)
[Code of Conduct](https://github.com/cloudevents/spec/blob/main/docs/GOVERNANCE.md#additional-information)
information.

## Maintenance

We use black and isort for autoformatting. We setup a tox environment to reformat
the codebase.
We use [black][black] and [isort][isort] for autoformatting. We set up a [tox][tox] environment
to reformat the codebase.

e.g.
@@ -155,3 +155,7 @@ tox -e reformat
```

For information on releasing version bumps see [RELEASING.md](RELEASING.md)

[black]: https://black.readthedocs.io/
[isort]: https://pycqa.github.io/isort/
[tox]: https://tox.wiki/
@@ -26,9 +26,7 @@ class CloudEvent:
Supports both binary and structured mode CloudEvents
"""

def __init__(
self, attributes: typing.Dict[str, str], data: typing.Any = None
):
def __init__(self, attributes: typing.Dict[str, str], data: typing.Any = None):
"""
Event Constructor
:param attributes: a dict with cloudevent attributes. Minimally
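The reformatted constructor above takes an attributes dict plus optional data. A minimal usage sketch, assuming `type` and `source` are the minimally required attribute keys (the docstring above is truncated, and all values here are illustrative):

```python
from cloudevents.http import CloudEvent

# Illustrative values only; "type" and "source" are assumed to be the
# minimally required attribute keys per the truncated docstring above.
attributes = {
    "type": "com.example.sampletype",
    "source": "https://example.com/event-producer",
}
data = {"message": "Hello World!"}

event = CloudEvent(attributes, data)
print(event["type"], event.data)
```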
@@ -26,6 +26,4 @@ def is_structured(headers: typing.Dict[str, str]) -> bool:
headers = {key.lower(): value for key, value in headers.items()}
content_type = headers.get("content-type", "")
structured_parser = structured.JSONHTTPCloudEventConverter()
return structured_parser.can_read(
content_type=content_type, headers=headers
)
return structured_parser.can_read(content_type=content_type, headers=headers)
@@ -26,9 +26,7 @@ class JSONHTTPCloudEventConverter(base.Converter):
TYPE = "structured"
MIME_TYPE = "application/cloudevents+json"

def can_read(
self, content_type: str, headers: typing.Dict[str, str] = {}
) -> bool:
def can_read(self, content_type: str, headers: typing.Dict[str, str] = {}) -> bool:
return (
isinstance(content_type, str)
and content_type.startswith(self.MIME_TYPE)
@@ -206,8 +206,7 @@ class BaseEvent(EventGetterSetter):
data = data_marshaller(data)
except Exception as e:
raise cloud_exceptions.DataMarshallerError(
"Failed to marshall data with error: "
f"{type(e).__name__}('{e}')"
f"Failed to marshall data with error: {type(e).__name__}('{e}')"
)
if isinstance(data, (bytes, bytes, memoryview)):
props["data_base64"] = base64.b64encode(data).decode("ascii")
@@ -256,9 +255,7 @@ class BaseEvent(EventGetterSetter):
body: typing.Union[bytes, str],
data_unmarshaller: types.UnmarshallerType,
):
required_binary_fields = {
f"ce-{field}" for field in self._ce_required_fields
}
required_binary_fields = {f"ce-{field}" for field in self._ce_required_fields}
missing_fields = required_binary_fields - headers.keys()

if len(missing_fields) > 0:
@@ -277,8 +274,7 @@ class BaseEvent(EventGetterSetter):
raw_ce = data_unmarshaller(body)
except Exception as e:
raise cloud_exceptions.DataUnmarshallerError(
"Failed to unmarshall data with error: "
f"{type(e).__name__}('{e}')"
f"Failed to unmarshall data with error: {type(e).__name__}('{e}')"
)
self.Set("data", raw_ce)
@@ -304,8 +300,7 @@ class BaseEvent(EventGetterSetter):
data = data_marshaller(data)
except Exception as e:
raise cloud_exceptions.DataMarshallerError(
"Failed to marshall data with error: "
f"{type(e).__name__}('{e}')"
f"Failed to marshall data with error: {type(e).__name__}('{e}')"
)
if isinstance(data, str): # Convenience method for json.dumps
data = data.encode("utf-8")
@@ -33,9 +33,7 @@ class Event(base.BaseEvent):
self.ce__type = opt.Option("type", None, True)

self.ce__datacontenttype = opt.Option("datacontenttype", None, False)
self.ce__datacontentencoding = opt.Option(
"datacontentencoding", None, False
)
self.ce__datacontentencoding = opt.Option("datacontentencoding", None, False)
self.ce__subject = opt.Option("subject", None, False)
self.ce__time = opt.Option("time", None, False)
self.ce__schemaurl = opt.Option("schemaurl", None, False)
@@ -24,12 +24,8 @@ from cloudevents.tests import data

@pytest.mark.parametrize("event_class", [v03.Event, v1.Event])
def test_binary_converter_upstream(event_class):
m = marshaller.NewHTTPMarshaller(
[binary.NewBinaryHTTPCloudEventConverter()]
)
event = m.FromRequest(
event_class(), data.headers[event_class], None, lambda x: x
)
m = marshaller.NewHTTPMarshaller([binary.NewBinaryHTTPCloudEventConverter()])
event = m.FromRequest(event_class(), data.headers[event_class], None, lambda x: x)
assert event is not None
assert event.EventType() == data.ce_type
assert event.EventID() == data.ce_id
@@ -38,9 +34,7 @@ def test_binary_converter_upstream(event_class):

@pytest.mark.parametrize("event_class", [v03.Event, v1.Event])
def test_structured_converter_upstream(event_class):
m = marshaller.NewHTTPMarshaller(
[structured.NewJSONHTTPCloudEventConverter()]
)
m = marshaller.NewHTTPMarshaller([structured.NewJSONHTTPCloudEventConverter()])
event = m.FromRequest(
event_class(),
{"Content-Type": "application/cloudevents+json"},
@@ -59,14 +59,8 @@ def test_extensions_are_set_upstream():


def test_binary_event_v1():
event = (
v1.Event()
.SetContentType("application/octet-stream")
.SetData(b"\x00\x01")
)
m = marshaller.NewHTTPMarshaller(
[structured.NewJSONHTTPCloudEventConverter()]
)
event = v1.Event().SetContentType("application/octet-stream").SetData(b"\x00\x01")
m = marshaller.NewHTTPMarshaller([structured.NewJSONHTTPCloudEventConverter()])

_, body = m.ToRequest(event, converters.TypeStructured, lambda x: x)
assert isinstance(body, bytes)
@@ -76,9 +70,7 @@ def test_binary_event_v1():


def test_object_event_v1():
event = (
v1.Event().SetContentType("application/json").SetData({"name": "john"})
)
event = v1.Event().SetContentType("application/json").SetData({"name": "john"})

m = marshaller.NewDefaultHTTPMarshaller()
@@ -33,9 +33,7 @@ def your_dummy_data():
return '{"name":"paul"}'


def test_http_cloudevent_equality(
dummy_attributes, my_dummy_data, your_dummy_data
):
def test_http_cloudevent_equality(dummy_attributes, my_dummy_data, your_dummy_data):
data = my_dummy_data
event1 = CloudEvent(dummy_attributes, data)
event2 = CloudEvent(dummy_attributes, data)
@@ -165,15 +163,11 @@ def test_get_operation_on_non_existing_attribute_must_return_default_value_if_gi
dummy_event, non_exiting_attribute_name
):
dummy_value = "Hello World"
assert (
dummy_event.get(non_exiting_attribute_name, dummy_value) == dummy_value
)
assert dummy_event.get(non_exiting_attribute_name, dummy_value) == dummy_value


def test_get_operation_on_non_existing_attribute_should_not_copy_default_value(
dummy_event, non_exiting_attribute_name
):
dummy_value = object()
assert (
dummy_event.get(non_exiting_attribute_name, dummy_value) is dummy_value
)
assert dummy_event.get(non_exiting_attribute_name, dummy_value) is dummy_value
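The assertions above exercise the `.get` accessor added in [#165]. A short sketch of the behaviour they check, assuming the same `CloudEvent` class from `cloudevents.http` (attribute names and values are illustrative):

```python
from cloudevents.http import CloudEvent

event = CloudEvent(
    {"type": "com.example.sampletype", "source": "https://example.com/source"}
)

# A missing attribute falls back to the supplied default instead of raising.
assert event.get("subject") is None
assert event.get("subject", "fallback-subject") == "fallback-subject"

# An attribute that exists is returned as-is.
assert event.get("type") == "com.example.sampletype"
```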
@@ -74,9 +74,7 @@ async def echo(request):
decoder = None
if "binary-payload" in request.headers:
decoder = lambda x: x
event = from_http(
dict(request.headers), request.body, data_unmarshaller=decoder
)
event = from_http(dict(request.headers), request.body, data_unmarshaller=decoder)
data = (
event.data
if isinstance(event.data, (bytes, bytearray, memoryview))
@@ -143,9 +141,7 @@ def test_emit_structured_event(specversion):
"specversion": specversion,
"data": test_data,
}
_, r = app.test_client.post(
"/event", headers=headers, data=json.dumps(body)
)
_, r = app.test_client.post("/event", headers=headers, data=json.dumps(body))

# Convert byte array to dict
# e.g. r.body = b'{"payload-content": "Hello World!"}'
@@ -463,9 +459,7 @@ def test_empty_json_structured():
data = ""
with pytest.raises(cloud_exceptions.MissingRequiredFields) as e:
from_http(headers, data)
assert "Failed to read specversion from both headers and data" in str(
e.value
)
assert "Failed to read specversion from both headers and data" in str(e.value)


def test_uppercase_headers_with_none_data_binary():
@@ -520,7 +514,5 @@ def test_non_dict_data_no_headers_bug():
data = "123"
with pytest.raises(cloud_exceptions.MissingRequiredFields) as e:
from_http(headers, data)
assert "Failed to read specversion from both headers and data" in str(
e.value
)
assert "Failed to read specversion from both headers and data" in str(e.value)
assert "The following deserialized data has no 'get' method" in str(e.value)
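The tests above feed `from_http` incomplete payloads and assert on the error text. For contrast, a hedged sketch of a well-formed structured-mode call (field values are illustrative; the `headers, data` argument order follows the change noted for [#110] in the changelog):

```python
import json

from cloudevents.http import CloudEvent, from_http

headers = {"Content-Type": "application/cloudevents+json"}
body = json.dumps(
    {
        "specversion": "1.0",
        "type": "com.example.sampletype",
        "source": "https://example.com/source",
        "id": "A234-1234-1234",
        "data": {"message": "Hello World!"},
    }
)

# Parse a structured-mode CloudEvent from HTTP headers and body.
event = from_http(headers, body)
assert isinstance(event, CloudEvent)
assert event["id"] == "A234-1234-1234"
assert event.data == {"message": "Hello World!"}
```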
@@ -60,23 +60,17 @@ def test_to_request_wrong_marshaller():

def test_from_request_cannot_read(binary_headers):
with pytest.raises(exceptions.UnsupportedEventConverter):
m = marshaller.HTTPMarshaller(
[binary.NewBinaryHTTPCloudEventConverter()]
)
m = marshaller.HTTPMarshaller([binary.NewBinaryHTTPCloudEventConverter()])
m.FromRequest(v1.Event(), {}, "")

with pytest.raises(exceptions.UnsupportedEventConverter):
m = marshaller.HTTPMarshaller(
[structured.NewJSONHTTPCloudEventConverter()]
)
m = marshaller.HTTPMarshaller([structured.NewJSONHTTPCloudEventConverter()])
m.FromRequest(v1.Event(), binary_headers, "")


def test_to_request_invalid_converter():
with pytest.raises(exceptions.NoSuchConverter):
m = marshaller.HTTPMarshaller(
[structured.NewJSONHTTPCloudEventConverter()]
)
m = marshaller.HTTPMarshaller([structured.NewJSONHTTPCloudEventConverter()])
m.ToRequest(v1.Event(), "")
@@ -30,9 +30,7 @@ async def is_ok(request):

@app.route("/echo", ["POST"])
async def echo(request):
event = m.FromRequest(
v1.Event(), dict(request.headers), request.body, lambda x: x
)
event = m.FromRequest(v1.Event(), dict(request.headers), request.body, lambda x: x)
hs, body = m.ToRequest(event, converters.TypeBinary, lambda x: x)
return response.text(body.decode("utf-8"), headers=hs)
@@ -1,4 +1,5 @@
import os

import pkg_resources

from setup import pypi_config
@@ -1,5 +1,5 @@
[tool.black]
line-length = 80
line-length = 88
include = '\.pyi?$'
exclude = '''
/(
@@ -14,3 +14,6 @@ exclude = '''
| dist
)/
'''

[tool.isort]
profile = "black"
@@ -0,0 +1,9 @@
black
isort
flake8
pep8-naming
flake8-import-order
flake8-print
flake8-strict
tox
pre-commit
@@ -63,9 +63,7 @@ def send_structured_cloud_event(url: str):
if __name__ == "__main__":
# Run client.py via: 'python3 client.py http://localhost:3000/'
if len(sys.argv) < 2:
sys.exit(
"Usage: python with_requests.py " "<CloudEvents controller URL>"
)
sys.exit("Usage: python with_requests.py <CloudEvents controller URL>")

url = sys.argv[1]
send_binary_cloud_event(url)
@@ -54,9 +54,7 @@ if __name__ == "__main__":
# expects a url from command line.
# e.g. python3 client.py http://localhost:3000/
if len(sys.argv) < 2:
sys.exit(
"Usage: python with_requests.py " "<CloudEvents controller URL>"
)
sys.exit("Usage: python with_requests.py <CloudEvents controller URL>")

url = sys.argv[1]
send_binary_cloud_event(url)
setup.py (8 changed lines)
@@ -11,12 +11,12 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from setuptools import setup, find_packages

import os
import codecs
import os
import pathlib

from setuptools import find_packages, setup


def read(rel_path):
here = os.path.abspath(os.path.dirname(__file__))
@@ -54,7 +54,9 @@ if __name__ == "__main__":
classifiers=[
"Intended Audience :: Information Technology",
"Intended Audience :: System Administrators",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Development Status :: 5 - Production/Stable",
"Operating System :: POSIX :: Linux",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.6",
tox.ini (4 changed lines)
@@ -13,7 +13,7 @@ setenv =
commands = pytest {env:PYTESTARGS} {posargs}

[testenv:reformat]
basepython=python3.10
basepython = python3.10
deps =
black
isort
@@ -30,4 +30,4 @@ deps =
commands =
black --check .
isort -c cloudevents samples
flake8 cloudevents samples --ignore W503,E731 --max-line-length 88
flake8 cloudevents samples --ignore W503,E731 --extend-ignore E203 --max-line-length 88