* Added a warning message to NewRun.tsx when a pipeline has empty parameters
* Updated existing test snapshots
* Created unit tests for the new missing-parameters message
* Refactor: expose a public API to append a pipeline param without interacting with the dsl.Pipeline object.
* Add unit test and fix.
* Fix docstring.
* Fix test
* Fix test
* Fix two nit problems
* Refactor
* SDK - Testing - Run some unit tests in a more correct way
Replaced `@unittest.expectedFailure` with `with self.assertRaises(...):`.
Replaced `assert` with `self.assertEqual(...)`.
Stopped producing the stray "comp.yaml" file.
Enabled the test_load_component_from_url test.
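A minimal sketch of the two assertion patterns described above (the test names and values are illustrative, not the actual tests):

```python
import unittest


class ComponentTests(unittest.TestCase):
    def test_loading_invalid_component_raises(self):
        # Instead of decorating the whole test with @unittest.expectedFailure,
        # assert that only the specific call raises.
        with self.assertRaises(ValueError):
            raise ValueError('invalid component spec')  # placeholder for the real call

    def test_loaded_component_name(self):
        loaded_name = 'comp'  # placeholder for the real loaded value
        # Unlike a bare assert, assertEqual reports both values on failure.
        self.assertEqual(loaded_name, 'comp')


if __name__ == '__main__':
    unittest.main()
```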
* Removed a stray comment
* Added two tests for output_component_file
* Fix bug where source and variables are not accessible to visualization
* Updated snapshot
* Removed test_generate_test_visualization_html_from_notebook
* Added test cases to ensure roc_curve, table, and tfdv visualizations can be generated
* Made test requirements identical to normal requirements
* Fixed source links
* Updated test_server.py to use table visualization
* Update .travis.yml
* Add logging to debug travis tests
* Add tensorflow back to requirements.txt
* Updated .travis.yml and requirements.txt; also added a comment that specifies the libraries required to run tests
* Testing TFDV visualization with different source
* Changed remote paths to be local due to timeout issues
* Removed visualization tests due to continued failure
* Reverted .gitignore and removed tensorflow from text_exporter pip install command
* Moved where dependencies are installed in .travis.yml
* Revert "Made test requirements identical to normal requirements"
This reverts commit 7f11c43c44.
* Added pip install requirements to .travis file
* Removed new unit test and requirements.txt install
* Cleaned up tests and re-added test.py predefined visualization
* Cleanup
* pass in secret
* fix
* use application name by default for database prefix
* bug fixes and bump kfp version
* Update application.yaml
* fix objectstore name
* fix objectstore name
* store db pwd as secret
* fix
* fix
* fix
* fix
* Add logic to detect extension name.
* Rename notebook samples
* Change to use config yaml for papermill preprocess.
* Remove ad hoc logic
* Remove duplicated logic
* Refactor
* Add run_pipeline flag in config yaml
* Add run pipeline flag for .py sample as well.
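A rough sketch of what a config-driven papermill preprocess could look like; the file names and config keys (`notebook`, `parameters`, `run_pipeline`) are assumptions for illustration, not the sample test's actual schema:

```python
import papermill as pm
import yaml

# Load the per-sample config (hypothetical schema).
with open('config.yaml') as f:
    config = yaml.safe_load(f)

# Execute the sample notebook, injecting parameters via papermill.
pm.execute_notebook(
    config['notebook'],            # e.g. 'sample.ipynb'
    'output.ipynb',
    parameters=config.get('parameters', {}),
)

# The run_pipeline flag decides whether the compiled pipeline is also submitted.
if config.get('run_pipeline', False):
    print('Submitting the compiled pipeline for execution...')
```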
* Fix extension name
* Fix
* Fix problems in docstring.
* refactor run_sample_test.py into two functions
* Refactor the procedure into 3 steps
* Fix bug in exit code format
* Remove two redundant functions.
* SDK - Containers - Build container image from current environment
* Removed the ability to capture the active python environment (as requested by @hongye-sun)
* Added the type hint and docstring for the return type.
* Renamed `build_image_from_env` function to `build_image_from_working_dir`
as requested by @hongye-sun
* Explained the function behavior in the documentation.
* Removed extra empty line
* Improved caching by copying python files only after installing python packages
* Made test more portable
* Added support for specifying the base_image
`kfp.containers.default_base_image = ...`
The image can also be a callable returning the image name.
* Renamed `get_python_image` to `get_python_image_for_current_version`
* Switched the default base image to Google Deep Learning container image as requested by @hongye-sun
The size of this image is 4.35GB which really concerns me. The GPU image size is 6.45GB.
* Stopped importing kfp.containers.* into kfp.*
* Fixed test
* Fixed the regex string
* Fixed the type annotation style
* Addressed @hongye-sun feedback
* Removed the container image size warning
* Fixed import failure
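A hedged usage sketch of the API described above (exact parameter names and defaults may differ between SDK versions):

```python
import kfp.containers

# Optionally override the default base image. Per the notes above, this can
# also be a callable that returns an image name.
kfp.containers.default_base_image = 'python:3.7'

# Builds an image from the current working directory and returns its name.
# Python packages are installed before the *.py sources are copied, so
# source-only changes do not invalidate the package-install layer.
image_name = kfp.containers.build_image_from_working_dir()
print(image_name)
```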
* Working, though the request seems malformed
* Working with grpc-web; trying to push to cluster
* WIP
* With great hax comes great success
* Begin moving some metadata UI pages to KFP
* Artifact list and details pages work! A lot of clean up is needed. Look for console.log and TODO
* Clean up
* Fixes filtering of artifact list
* More cleanup
* Revert ui deployment
* Updates tests
* Update envoy deployment
* add more core samples in the sample test
* remove retry test for now
* update pipeline transformer sample
* add default value to volume_ops and fix error
* remove volume snapshot and resource op sample tests, which require platform support
* remove volume_ops since it is already included in the e2e
* add timeout
* SDK - Components - Hiding signature attribute from CloudPickle
Cloudpickle has some issues with pickling type annotations on Python versions < 3.7, so the cloudpickle maintainers disabled that: https://github.com/cloudpipe/cloudpickle/issues/196
`create_component_from_airflow_op` spoofs the function signature by setting the `func.__signature__` attribute. Cloudpickle then tries to pickle that attribute, which leads to failures during unpickling.
To prevent this, we remove the `.__signature__` attribute before pickling (see the sketch below).
* Added comments
# Hack to prevent cloudpickle from trying to pickle generic types that might be present in the signature. See https://github.com/cloudpipe/cloudpickle/issues/196
# Currently the __signature__ is only set by Airflow components as a means to spoof/pass the function signature to _func_to_component_spec
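A minimal sketch of the workaround (the real change lives in the component serialization code; `pickle_without_signature` is a name invented for this illustration):

```python
import cloudpickle


def pickle_without_signature(func):
    # Hack to prevent cloudpickle from trying to pickle generic types that
    # might be present in the spoofed __signature__ attribute.
    # See https://github.com/cloudpipe/cloudpickle/issues/196
    old_signature = getattr(func, '__signature__', None)
    if old_signature is not None:
        del func.__signature__
    try:
        return cloudpickle.dumps(func)
    finally:
        # Restore the attribute so the in-memory function is left unchanged.
        if old_signature is not None:
            func.__signature__ = old_signature
```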
* Explicitly added mlpipeline outputs to the components that actually produce them
* Updated samples
* SDK - DSL - Stopped adding mlpipeline artifacts to every compiled template
Fixes https://github.com/kubeflow/pipelines/issues/1421
Fixes https://github.com/kubeflow/pipelines/issues/1422
* Updated the Lightweight sample
* Updated the compiler tests
* Fixed the lightweight sample
* Reverted the change to one contrib/samples/openvino
The sample will still work fine as it is now.
I'll add the change to that file as a separate PR.
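For ops that do produce these artifacts, a hedged sketch of declaring them explicitly with the pre-v2 DSL (the image and command are placeholders):

```python
import kfp.dsl as dsl


def produce_metrics_op():
    return dsl.ContainerOp(
        name='produce-metrics',
        image='python:3.7',
        command=['sh', '-c', 'echo "{}" > /mlpipeline-metrics.json'],
        # Declare mlpipeline artifacts only on ops that actually write them,
        # instead of the compiler attaching them to every template.
        output_artifact_paths={
            'mlpipeline-metrics': '/mlpipeline-metrics.json',
        },
    )
```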
If no `name` is provided to the PipelineVolume constructor, a custom name is generated. The generation relies on `json.dumps()` of the struct after it has been converted to a dict.
When a `pvc` is provided but a `name` is not, the following error is raised:
TypeError: Object of type PipelineParam is not JSON serializable
This commit fixes the error and extends the tests to catch it; a minimal reproduction follows.
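A reproduction of the failure mode with one possible fix, stringifying non-serializable values before hashing (the stand-in class and the exact fix are illustrative, not the commit's actual code):

```python
import hashlib
import json


class PipelineParam:
    """Stand-in for kfp.dsl.PipelineParam; not JSON serializable by default."""

    def __init__(self, name):
        self.name = name

    def __str__(self):
        # The real class also defines __str__, yielding a stable placeholder.
        return '{{pipelineparam:op=;name=%s}}' % self.name


spec = {'pvc': PipelineParam('my-pvc')}

# json.dumps(spec) raises:
#   TypeError: Object of type PipelineParam is not JSON serializable
# Passing default=str makes the dump fall back to str() for unknown objects,
# which keeps the generated name deterministic.
serialized = json.dumps(spec, sort_keys=True, default=str)
name = 'pvolume-' + hashlib.sha256(serialized.encode()).hexdigest()[:10]
print(name)
```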
* SDK - Switching python container components to Lightweight components code generator
* Fixed the tests
Had to remove the python2 test since Python 2 code generation is going away (Python 2 is near its end of life and Kubeflow Pipelines only supports Python 3.5+).
* Added description for the internal add_files parameter
* Fixed typo
* Removed the `test_func_to_entrypoint` test
This was proposed by @gaoning777: `_func_to_entrypoint` is now just a reference to `_func_to_component_spec` which is extensively covered by other tests.
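For reference, a minimal lightweight-component sketch that exercises the shared code generator (API as in the 0.1.x SDK):

```python
import kfp.components


def add(a: float, b: float) -> float:
    """Returns the sum of two numbers."""
    return a + b


# func_to_container_op builds the component spec from the function signature;
# per the note above, _func_to_component_spec covers this code path.
add_op = kfp.components.func_to_container_op(add, base_image='python:3.7')
```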