* SDK - Moved the _container_builder from kfp.compiler to kfp.containers
This only moves the files. The imports remain the same for now.
* Simplified the imports.
* SDK - Compiler - Fix large data passing
Stop outputting parameters unless they're consumed as parameters downstream.
This prevents the situation where a component outputs a big file, but the DSL compiler instructs Argo to pick it up as a parameter (parameters can only hold a few kilobytes of data).
As a byproduct, this change fixes some minor compiler data-passing bugs where some parameters were being passed around but never consumed (this happened with `ResourceOp`, `dsl.Condition` and recursion).
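A minimal sketch of the file-based pattern this fix protects, assuming the `InputPath`/`OutputPath` annotations from `kfp.components` (pipeline and data sizes are illustrative):

```python
from kfp.components import InputPath, OutputPath, func_to_container_op

def produce_big_file(data_path: OutputPath(str)):
    # Large data goes to a file, which the compiler should treat as an
    # artifact rather than an Argo parameter.
    with open(data_path, 'w') as f:
        f.write('x' * (10 * 1024 * 1024))

def consume_big_file(data_path: InputPath(str)):
    # The data arrives as a file artifact; nothing is squeezed through
    # a parameter value.
    with open(data_path) as f:
        print(len(f.read()))

produce_op = func_to_container_op(produce_big_file)
consume_op = func_to_container_op(consume_big_file)

def big_data_pipeline():
    consume_op(produce_op().outputs['data'])
```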
* Replaced `...` with `raise AssertionError`
* Fixed small bug
* Removed unused variables
* Fixed names of the mark_upstream_ios_of_* functions
* Fixed detection of parameter output references
* Fixed handling of volumes
Two PRs have been merged that turned out to be slightly incompatible. This PR fixes the failing tests.
Root causes:
* The pipeline parameter default values were not properly serialized when constructing the metadata object.
* The `ParameterMeta` class did not validate the default value type, so the missing serialization was not caught. `ParameterMeta` was replaced by `InputSpec`, which has strict type validation.
* Previously we did not have samples with complex pipeline parameter default values (e.g. lists) that could trigger the failures. Then two samples were added that had complex default values.
* Travis does not re-run tests before merging
* Prow does not re-run Travis tests before merging
Currently, the parameter output values are not saved to storage, and they are lost as soon as the garbage collector removes the workflow object.
This change makes it so that the parameter output values are persisted.
* first working commit
* incremental commit
* in the middle of converting loop args constructor to accept pipeline param
* both cases working
* output works, passed doesn't
* about to redo compiler section
* rewrite draft done
* added withparam tests
* removed sdk/python/comp.yaml
* minor
* subvars work
* more tests
* removed unneeded artifact outputs from test yaml
* sort keys
* removed dead artifact code
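These commits build up the loop-arguments support that compiles to Argo's `withItems`/`withParam`. A minimal sketch of the resulting usage, assuming the `dsl.ParallelFor` construct (names and values illustrative):

```python
import kfp.dsl as dsl

@dsl.pipeline(name='loop-example')
def loop_pipeline(loop_args: list = [{'a': 1, 'b': 2}, {'a': 10, 'b': 20}]):
    # Fans out one 'echo' task per item; item.a / item.b are the subvars.
    with dsl.ParallelFor(loop_args) as item:
        dsl.ContainerOp(
            name='echo',
            image='alpine',
            command=['sh', '-c'],
            arguments=['echo a=%s b=%s' % (item.a, item.b)],
        )
```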
* Refactor. Expose a public API to append a pipeline param without interacting with the dsl.Pipeline object.
* Add unit test and fix.
* Fix docstring.
* Fix test
* Fix test
* Fix two nit problems
* Refactor
* Explicitly added mlpipeline outputs to the components that actually produce them
* Updated samples
* SDK - DSL - Stopped adding mlpipeline artifacts to every compiled template
Fixes https://github.com/kubeflow/pipelines/issues/1421
Fixes https://github.com/kubeflow/pipelines/issues/1422
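With the automatic injection gone, a component that really produces metrics or UI metadata declares those outputs explicitly. A minimal sketch using `ContainerOp`'s `output_artifact_paths` (the image name is hypothetical):

```python
import kfp.dsl as dsl

trainer = dsl.ContainerOp(
    name='trainer',
    image='gcr.io/my-project/trainer',  # hypothetical image
    output_artifact_paths={
        # Only components that actually write these files declare them now.
        'mlpipeline-ui-metadata': '/mlpipeline-ui-metadata.json',
        'mlpipeline-metrics': '/mlpipeline-metrics.json',
    },
)
```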
* Updated the Lightweight sample
* Updated the compiler tests
* Fixed the lightweight sample
* Reverted the change to one file under contrib/samples/openvino
The sample will still work fine as it is now.
I'll add the change to that file as a separate PR.
* SDK - Switching python container components to Lightweight components code generator
* Fixed the tests
Had to remove the Python 2 test since Python 2 code generation is going away (Python 2 is nearing its end of life, and Kubeflow Pipelines only supports Python 3.5+).
* Added description for the internal add_files parameter
* Fixed typo
* Removed the `test_func_to_entrypoint` test
This was proposed by @gaoning777: `_func_to_entrypoint` is now just a reference to `_func_to_component_spec` which is extensively covered by other tests.
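For reference, the lightweight components code generator that python container components now share is the one behind `func_to_container_op`; a minimal sketch:

```python
from kfp.components import func_to_container_op

def add(a: float, b: float) -> float:
    '''Returns the sum of the two arguments.'''
    return a + b

# The generator turns the function into a component spec with a
# self-contained program as the container entrypoint.
add_op = func_to_container_op(add, base_image='python:3.7')
```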
* SDK - Added support for raw artifact values to ContainerOp
* `ContainerOp` now gets artifact arguments from the command line instead of the constructor.
* Added back input_artifact_arguments to the ContainerOp constructor.
In some scenarios it's hard to provide the artifact arguments through the `command` list when it already has resolved artifact paths.
* Exporting InputArtifactArgument from kfp.dsl
* Updated the sample
* Properly passing artifact arguments as task arguments
as opposed to default input values.
* Renamed input_artifact_arguments to artifact_arguments to reduce confusion
* Renamed InputArtifactArgument to InputArgumentPath
Also renamed input_artifact_arguments to artifact_argument_paths in the ContainerOp's constructor
* Replaced getattr with isinstance checks.
getattr is too fragile and can be broken by renames.
* Fixed the type annotations
* Unlocked the input artifact support in components
Added the test_input_path_placeholder_with_constant_argument test
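A minimal sketch of the resulting API: a raw value passed through `dsl.InputArgumentPath` is materialized as an input artifact, and the placeholder resolves to its path inside the container:

```python
import kfp.dsl as dsl

task = dsl.ContainerOp(
    name='consume',
    image='alpine',
    # 'Hello world' is written to a file and the placeholder becomes its path.
    command=['cat', dsl.InputArgumentPath('Hello world')],
)
```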
* Fixed a bug where delete resource ops incorrectly had success_condition, failure_condition, and output parameters
* remove unnecessary whitespace
* compiler test for delete resource ops should retrieve templates from spec instead of root
* SDK - Refactoring - Serialized PipelineParam does not need type
Only the types in non-serialized PipelineParams are ever used.
* SDK - Refactoring - Serialized PipelineParam does not need value
Default values are only relevant when a PipelineParam is used in the pipeline function signature, and even in that case the compiler captures them explicitly from the PipelineParam objects in the signature.
There are no other uses for them.
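A sketch of the serialized form after these two refactorings, which should carry only the op and parameter names:

```python
from kfp.dsl import PipelineParam

param = PipelineParam(name='my_param', op_name='my-op')
# The serialized placeholder no longer embeds value or type:
print(str(param))  # {{pipelineparam:op=my-op;name=my_param}}
```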
* avoid istio injector in the container builder
* find the correct namespace
* configure the default namespace to kubeflow when running out of cluster; fix unit tests
* container build default gcs bucket
* resolve comments
* code refactor; add create_bucket_if_not_exist in containerbuilder
* support loading kube config and outputting errors; useful for AI Platform notebooks/local notebooks
* remove create_bucket_if_not_exist param
* SDK - Containers - Returning image name with digest
Image building functions now return the image name with digest: image_repo@sha256:digest
Fixes https://github.com/kubeflow/pipelines/issues/1715
* Added comments
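A minimal sketch of the new return value, assuming the `build_image_from_working_dir` helper from `kfp.containers` (the repo name is hypothetical):

```python
from kfp.containers import build_image_from_working_dir

# Builds and pushes the image, then returns the name pinned by digest,
# e.g. 'gcr.io/my-project/my-image@sha256:3f1b...'
image = build_image_from_working_dir(
    image_name='gcr.io/my-project/my-image')  # hypothetical repo
print(image)
```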
* Remove redundant import.
* Simplify sample_test.yaml by using withItem syntax.
* Simplify sample_test.yaml by using withItem syntax.
* Change dict to str in withItems.
* Add image pull secret sample.
* Move imagepullsecret sample from test dir to sample dir. Waiting on corresponding unit test infra refactoring.
* Update the location of imagepullsecrets so that it can serve as an example.
* Add minimal comments documenting usage.
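A minimal sketch of the documented usage, assuming `PipelineConf.set_image_pull_secrets` (the secret name is hypothetical):

```python
import kfp.dsl as dsl
from kubernetes import client as k8s_client

@dsl.pipeline(name='imagepullsecret-example')
def secret_pipeline():
    # Lets the workflow pull images from a private registry.
    dsl.get_pipeline_conf().set_image_pull_secrets(
        [k8s_client.V1LocalObjectReference(name='my-registry-secret')])
```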
* Add preemptible gpu tpu sample and unittest
* Update a test utility function.
* Separate the locations of the sample and golden .yaml files for testing purposes.
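A minimal sketch of what the sample demonstrates, assuming `use_preemptible_nodepool` from `kfp.gcp` (the image is hypothetical):

```python
import kfp.dsl as dsl
from kfp import gcp

@dsl.pipeline(name='preemptible-example')
def preemptible_pipeline():
    train = dsl.ContainerOp(
        name='train',
        image='gcr.io/my-project/train')  # hypothetical image
    # Adds the toleration/node selector for a preemptible GKE node pool.
    train.apply(gcp.use_preemptible_nodepool())
```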
* Added support for multiple outputs
* Added test for multiple outputs
* Adding sample for multiple outputs
* func_signature is now in a shorter form
* Added parameters tag
* Fixed func_signature mistake
* refactor component build code
* remove unnecessary import
* minor changes
* fix unit tests
* separate the container build from the component build; add support for directories in the containerbuilder
* minor fixes
* fix unit test
* fix tarball error
* revert changes
* unit test fix
* minor fix
* addressing comments
* removing the check_gcs_path function
* move namespace to the constructor of the container builder
* fix bugs
* refactor component build code
* remove unnecessary import
* minor changes
* fix unit tests
* fix sample test bug
* revert the change of the dependency orders
* add -u to disable python stdout buffering
* address the comments
* separate lines to look clean
* fix unit tests
* fix
* Add PipelineConf method to set ttlSecondsAfterFinished in argo workflow spec
* remove unnecessary compile test for ttl. add unit test for ttl instead.
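A minimal sketch of the new method (values illustrative):

```python
import kfp.dsl as dsl
import kfp.compiler as compiler

@dsl.pipeline(name='ttl-example')
def ttl_pipeline():
    dsl.ContainerOp(name='echo', image='alpine', command=['echo', 'done'])

conf = dsl.PipelineConf()
# Argo deletes the finished workflow object an hour after completion.
conf.set_ttl_seconds_after_finished(3600)
compiler.Compiler().compile(ttl_pipeline, 'ttl_pipeline.yaml', pipeline_conf=conf)
```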
* Configure gcp connectors in dsl
* Make configure_gcp_connector more extensible
* Add add_pod_env op handler.
* Only apply add_pod_env on ContainerOp
* Update license header
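A sketch of applying one of the GCP connectors to a single task, assuming the `use_gcp_secret` helper from `kfp.gcp` (the bucket is hypothetical):

```python
import kfp.dsl as dsl
from kfp import gcp

@dsl.pipeline(name='gcp-secret-example')
def gcp_pipeline():
    task = dsl.ContainerOp(
        name='gcs-reader',
        image='google/cloud-sdk:alpine',
        command=['gsutil', 'ls', 'gs://my-bucket'])  # hypothetical bucket
    # Mounts the 'user-gcp-sa' secret and points
    # GOOGLE_APPLICATION_CREDENTIALS at the mounted key file.
    task.apply(gcp.use_gcp_secret('user-gcp-sa'))
```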
* Frontend - Show customized task display names
* Added customized name test
* Added ContainerOp.set_display_name(name) method
* Stopped writing human_name to display_name annotation for now
Reason: It's a change to existing pipelines.
* Added test for op.set_display_name
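A minimal sketch of the new method:

```python
import kfp.dsl as dsl

@dsl.pipeline(name='display-name-example')
def display_pipeline():
    task = dsl.ContainerOp(name='step1', image='alpine', command=['echo', 'hi'])
    # The frontend shows this name instead of the template name.
    task.set_display_name('Human-friendly step name')
```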
* Fix for tests that have workflows with status nodes, but without any spec or templates
* Fixed the test workflow
* Fix linter error
Error: "The key 'metadata' is not sorted alphabetically"
* add default value type checking
* add jsonschema dependency
* fix unit test error
* workaround for travis python package installation
* add back jsonschema version
* fix sample test error in type checking sample
* add jsonschema to requirements so that Sphinx works fine
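A sketch of the kind of mismatch the jsonschema-based check should now catch, assuming the dsl accepts string type annotations like 'Integer' (names illustrative):

```python
import kfp.dsl as dsl

@dsl.pipeline(name='default-type-check')
def bad_defaults(num_steps: 'Integer' = 'not-a-number'):
    # Compiling this pipeline should now fail: the default value does
    # not validate against the Integer type's schema.
    dsl.ContainerOp(name='echo', image='alpine', command=['echo', num_steps])
```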