* remove unused imports
* use google as isort profile
* sort imports
* format with yapf
* clean up end-of-file newlines, trailing whitespace, and double-quoted strings
* Reformat only the SDK using the new yapf config.
* Reformat docstrings using docformatter.
* update golden files to resolve diff caused by whitespaces
* fix some tests
* format .py files under sdk/python/tests using yapf
* additional docformatter
* fix some tests
* fix uri placeholder in v2 compatible mode
* fix tests
* fix path generation
* fix tests
* fix test
* cleanup
* clean up
* fix test
* fix test
* fix test
* handle type annotation with Optional
* remove typing._GenericAlias, which is only available in Python 3.7+
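A minimal sketch of the kind of Optional handling this implies, assuming all we need is to unwrap `Optional[T]` (i.e. `Union[T, None]`) without relying on `typing._GenericAlias`; the helper name is illustrative, not the SDK's:
```python
import typing

def _unwrap_optional(annotation):
    """Illustrative helper: returns T for Optional[T], else the annotation unchanged."""
    # Optional[T] is Union[T, None]; __origin__/__args__ exist on Python 3.6 and 3.7+,
    # unlike typing._GenericAlias which only exists in Python 3.7+.
    if getattr(annotation, '__origin__', None) is typing.Union:
        non_none = [arg for arg in annotation.__args__ if arg is not type(None)]
        if len(non_none) == 1:
            return non_none[0]
    return annotation

assert _unwrap_optional(typing.Optional[str]) is str
assert _unwrap_optional(int) is int
```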
* fix
* fix py3.6 test
* address review comments
* Update syntax for lightweight components v2.
Use Input[T] and Output[T] to represent input/output artifacts of type T. This lets users see these annotations as merely the classes T, enabling autocompletion, reducing verbosity, and simplifying I/O type handling.
Other changes include:
- Users no longer need to use .get() to obtain the artifact class.
- The `data` file is no longer auto-created for output artifacts, which allows artifacts to be either directories or files.
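A minimal sketch of the new annotation style, assuming the v2 lightweight-component decorator; exact import paths vary between SDK versions:
```python
from kfp.v2.dsl import Dataset, Input, Model, Output, component

@component
def train(examples: Input[Dataset], model: Output[Model]):
    # `examples` and `model` behave as instances of Dataset/Model, so editors
    # can autocomplete their attributes without calling .get().
    with open(examples.path) as f:
        n = len(f.read())
    # The output path is no longer pre-created as a `data` file, so the
    # component is free to write either a single file or a directory here.
    with open(model.path, 'w') as f:
        f.write('trained on %d bytes' % n)
```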
* fix for Py 3.6
* Remove stray file.
* Add an error message when Input/Output is used with non-Artifact types.
* Fix e2e test.
* SDK - Refactored command-line resolving
Moved the execution engine specific code to the component bridge.
* Added placeholder_resolver
This simplifies adding custom placeholder resolving logic.
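A purely hypothetical sketch of the shape such a hook can take (the actual `placeholder_resolver` signature in the component bridge may differ); the idea is that the resolver handles the placeholders it cares about and returns None to defer to the default logic:
```python
def my_placeholder_resolver(placeholder, component_spec, arguments):
    """Hypothetical custom resolver: handles input-value placeholders, defers the rest."""
    if type(placeholder).__name__ == 'InputValuePlaceholder':
        # Resolve the placeholder to the concrete argument value.
        return str(arguments.get(placeholder.input_name, ''))
    return None  # fall back to the default placeholder resolution
```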
* init
* Update type annotation.
Arguments can be String, Integer, Float, Bool, or a PipelineParam passed from the pipeline.
* Fix type annotations consistently.
* Add unit test.
* Fix unit test case.
* Fix after review, simplify by converting arguments and command to lists of str.
Adds Metrics, ClassificationMetrics, and SlicedClassificationMetrics to the ontology types.
Common methods used by the helper metric classes and the Artifact class are moved to
_artifact_utils.py.
This change updates the dsl Artifact class to support protobuf-based metadata,
removing the existing implementation that uses properties/custom_properties
to handle metadata.
It also updates the ontology artifacts with enhanced schemas for
models and datasets.
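A short sketch of how the helper classes are meant to be used (method names as described here; import paths may differ by SDK version):
```python
from kfp.v2.dsl import ClassificationMetrics, Metrics, Output, component

@component
def evaluate(metrics: Output[Metrics],
             clf_metrics: Output[ClassificationMetrics]):
    # Scalar metrics end up in the artifact's protobuf-backed metadata.
    metrics.log_metric('accuracy', 0.92)
    # The classification helper builds the structured confusion-matrix metadata.
    clf_metrics.log_confusion_matrix(
        categories=['cat', 'dog'],
        matrix=[[10, 2], [1, 12]])
```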
When calling the delete() method of a ResourceOp we need to ensure we do
not wait for its deletion.
The reason for this is described in [1]: If a pipeline creates a
resource which is being consumed by its steps (e.g., a PVC), the step
deleting the resource will hang waiting for the Kubernetes resource
deletion which, in turn, is waiting for the other steps to get deleted.
As a result, the pipeline never finishes.
This commit allows specifying flags for the ResourceOp kubectl commands
and defaults to the '--wait=false' flag for the deletion.
Specifying flags for a ResourceTemplate is not supported in Argo v2.7, which we
currently deploy, but it will be once we upgrade to v2.11+
[2]. This does not affect the delete() method because we don't rely on
Argo's ResourceTemplate for it.
[1] https://github.com/kubeflow/pipelines/issues/4506
[2] https://github.com/kubeflow/pipelines/issues/4553
Signed-off-by: Ilias Katsakioris <elikatsis@arrikto.com>
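A minimal sketch of the intended usage; the exact `delete()` signature (e.g. how flags are passed) may differ from what is shown:
```python
import kfp.dsl as dsl

@dsl.pipeline(name='pvc-cleanup-example')
def pvc_cleanup_example():
    vop = dsl.VolumeOp(name='create-pvc', resource_name='my-pvc', size='1Gi')

    step = dsl.ContainerOp(
        name='use-pvc',
        image='busybox',
        command=['sh', '-c', 'echo hello > /data/out.txt'],
        pvolumes={'/data': vop.volume})

    # The deletion defaults to `kubectl delete --wait=false`, so this step
    # does not hang waiting for a PVC that is still mounted by other pods.
    vop.delete().after(step)
```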
* add tests for pythonic and non-pythonic component outputs
* fix: graph for non-pythonic container output names
Loading a container component from component.yaml creates both
pythonic and original output names. The graph component iterated over
all outputs, applying the pythonic-to-output conversion to each of them.
If some names were not identical to their pythonic versions, the
lookup table raised a KeyError.
This commit fixes the problem by using a default value for the lookup.
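A tiny illustration of the fix (the names and lookup table are made up):
```python
# Before: direct indexing raised KeyError for outputs whose name has no
# separate entry in the lookup table.
pythonic_to_original = {'mlpipeline_ui_metadata': 'MLPipeline UI metadata'}

def to_original_name(name):
    # After: fall back to the name itself when it is not in the table.
    return pythonic_to_original.get(name, name)

assert to_original_name('mlpipeline_ui_metadata') == 'MLPipeline UI metadata'
assert to_original_name('output_2') == 'output_2'
```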
* remove depythonification of outputs - not needed anymore
* SDK - Added warning when not using components
We have long advised our users to create reusable components.
Creating reusable components is as easy as creating ContainerOp instances, but components are shareable, portable, and easier to support going forward.
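For example, a step that might otherwise be a raw ContainerOp can be captured as a small, shareable component definition (a sketch; names and image are illustrative):
```python
import kfp.components as comp
import kfp.dsl as dsl

echo_op = comp.load_component_from_text('''
name: Echo
inputs:
- {name: message, type: String}
implementation:
  container:
    image: busybox
    command: [echo, {inputValue: message}]
''')

@dsl.pipeline(name='echo-pipeline')
def echo_pipeline(message: str = 'hello'):
    # The component factory is called like a function instead of constructing
    # a ContainerOp by hand, and the YAML above can be shared and reused.
    echo_op(message=message)
```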
* Disable warning for TFX
* Fixed the warning disabling logic
* Added tests
* SDK - Made outputs with original names available in ContainerOp.outputs
Previously, ContainerOp had strict requirements for the output names, so we had to convert all the names before passing them to the ContainerOp constructor. Outputs with non-pythonic names could not be accessed using their original names.
Now ContainerOp supports any output names, so we're now using the original output names.
However, to support legacy pipelines, we're also adding output references with pythonic names.
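A sketch of what this enables (the component and its output names are illustrative):
```python
import kfp.components as comp
import kfp.dsl as dsl

produce_op = comp.load_component_from_text('''
name: Produce data
outputs:
- {name: Output data}
implementation:
  container:
    image: busybox
    command: [sh, -c, 'echo hi > "$0"', {outputPath: Output data}]
''')

@dsl.pipeline(name='outputs-demo')
def outputs_demo():
    task = produce_op()
    # The original output name and its legacy pythonic alias now both resolve:
    original = task.outputs['Output data']
    legacy = task.outputs['output_data']
```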
* Fixed the compiler test data
* Fixed the duplicate parameter outputs in the compiled workflow
* Fixed long line
* Stabilized the output naming conflict resolution
* Fix case of missing special outputs
Added test_fail_on_handling_list_arguments_containing_python_objects
Added test_handling_list_arguments_containing_serializable_python_objects
Moved test_handling_list_arguments_containing_pipelineparam to component_bridge_tests
* SDK - Tests - Testing command-line resolving explicitly
After the recent small refactoring of the task resolving flow in the component library, some tests were not updated; instead, compatibility shims were added to make them pass.
This PR updates the remaining tests and removes the shims.
This mostly involves explicitly using `_resolve_command_line_and_paths`.
Some tests that validate the behavior of the dsl bridge were moved to `component_bridge_tests.py`
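A rough sketch of the testing pattern, assuming the private helper keeps roughly this shape (the exact signature and result fields are assumptions, so treat this as illustrative only):
```python
import kfp.components as comp
from kfp.components._components import _resolve_command_line_and_paths

producer_op = comp.load_component_from_text('''
name: Producer
inputs:
- {name: message, type: String}
implementation:
  container:
    image: busybox
    command: [echo, {inputValue: message}]
''')

resolved = _resolve_command_line_and_paths(
    component_spec=producer_op.component_spec,
    arguments={'message': 'hello'},
)
# The resolved structure exposes the concrete command line the container
# would run, which the tests can now assert on directly.
print(resolved.command)
```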
* Indented the component texts
* SDK/DSL: Enable the deletion of a resource via ResourceOp method
* Add the method delete() to ResourceOps
* Extend ResourceOp & VolumeOp tests
Signed-off-by: Ilias Katsakioris <elikatsis@arrikto.com>
* Fix ValueError not being raised
* Replaced `_instance_to_dict(obj)` with `obj.to_dict()`
* Fixed the capitalization in _python_function_name_to_component_name
It now only changes the case of the first letter.
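A sketch of the described behavior (the helper shown is illustrative, not the SDK's internal one):
```python
def python_function_name_to_component_name(name: str) -> str:
    # Replace underscores with spaces and upper-case only the first letter,
    # leaving the rest of the casing (e.g. acronyms) untouched.
    spaced = name.replace('_', ' ')
    return spaced[:1].upper() + spaced[1:]

assert python_function_name_to_component_name('train_XGBoost_model') == 'Train XGBoost model'
```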
* Replaced the _extract_component_metadata function with _extract_component_interface
* Stopped adding newline to the component description.
* Handling None inputs and outputs
* Not including empty inputs and outputs in component spec
* Renamed the private attributes that the @pipeline decorator sets
* Changed _extract_pipeline_metadata to use _extract_component_interface
* Fixed issues based on feedback
* SDK/DSL: Fix PipelineVolume name length
Volume name must be no more than 63 characters
Signed-off-by: Ilias Katsakioris <elikatsis@arrikto.com>
* Change which part of the hash value we make use of
Signed-off-by: Ilias Katsakioris <elikatsis@arrikto.com>
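An illustrative sketch of the constraint being enforced (the prefix and which slice of the hash is used are assumptions):
```python
import hashlib

def make_pipeline_volume_name(pvc_name: str) -> str:
    # Kubernetes volume names must be at most 63 characters, so the generated
    # name combines a short prefix with a slice of a content hash.
    digest = hashlib.sha256(pvc_name.encode()).hexdigest()
    name = 'pvolume-' + digest[:45]
    assert len(name) <= 63
    return name
```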
* added new secret support
* updated the documentation and env settings
* updated after feedback
* added tests
* naming issue fixed
* renamed test to follow unittest standard
* updated after feedback
* the new test after renaming
* added the test to main
* updates after feedback
* added license agreement
* removed space
* updated the volume name to be generated
* secret_name as volume name and updated test
* updated the file structure
* fixed build
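An illustrative sketch of mounting a Kubernetes secret into a step with the secret name used as the volume name; this is a generic example, not necessarily the exact helper added here:
```python
from kubernetes import client as k8s_client
import kfp.dsl as dsl

def use_secret_as_volume(task: dsl.ContainerOp, secret_name: str, mount_path: str):
    # The secret name doubles as the volume name so repeated mounts stay stable.
    task.add_volume(
        k8s_client.V1Volume(
            name=secret_name,
            secret=k8s_client.V1SecretVolumeSource(secret_name=secret_name)))
    task.container.add_volume_mount(
        k8s_client.V1VolumeMount(name=secret_name, mount_path=mount_path))
    return task
```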
* SDK - Refactoring - Split the K8sHelper class
One part was only used by the container builder and provided a higher-level API over the K8s client.
The other part was used by the compiler and did not use the kubernetes library.
* Updated the license year.
Two PRs have been merged that turned out to be slightly incompatible. This PR fixes the failing tests.
Root causes:
* The pipeline parameter default values were not properly serialized when constructing the metadata object (see the sketch after this list).
* The `ParameterMeta` class did not validate the default value type, so the missing serialization was not caught. `ParameterMeta` was replaced by `InputSpec`, which has strict type validation.
* Previously we did not have samples with complex pipeline parameter default values (e.g. lists) that could trigger the failures. Then two samples were added that had complex default values.
* Travis does not re-run tests before merging
* Prow does not re-run Travis tests before merging
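The serialization gap mentioned above, illustrated with a minimal sketch (assuming a JSON-style serialization of complex defaults; the SDK's actual serializer may differ):
```python
import json

# A complex default value (e.g. a list) must be serialized to a string before
# it is stored on the parameter metadata; passing the raw list was what broke.
default_value = ['gs://bucket/a.csv', 'gs://bucket/b.csv']
serialized_default = json.dumps(default_value)
print(serialized_default)  # '["gs://bucket/a.csv", "gs://bucket/b.csv"]'
```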