* kfp pipeline upload
* kfp pipeline get
* kfp pipeline list
* Use logging (set the logging level to INFO)
* Catch exceptions raised in command calls and log them as errors
* Add a TODO comment for kfp pipeline delete
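The CLI commands above roughly correspond to existing `kfp.Client` calls. A minimal sketch of the equivalent SDK usage, assuming a reachable KFP endpoint (the host, package path, and pipeline name are illustrative):
```python
import kfp

# Host, package path, and pipeline name are illustrative.
client = kfp.Client(host='http://localhost:8888')

# Roughly what `kfp pipeline upload` does: upload a compiled pipeline package.
pipeline = client.upload_pipeline('my_pipeline.tar.gz', pipeline_name='my-pipeline')

# Roughly what `kfp pipeline get` and `kfp pipeline list` do.
print(client.get_pipeline(pipeline.id))
for p in client.list_pipelines().pipelines or []:
    print(p.id, p.name)
```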
* SDK - Client - Added a way to set experiment name using environment variables
This is useful for launching notebooks or pipeline files that submit themselves for execution (a sketch follows this group of changes).
* Switched to subprocess.run, which supports passing env
* Setting the environment variable differently
Looks like `subprocess.run` uses `PATH` to search for the program.
* Convert return code to string
* Changed the way the experiment name is being set
* Changed how the notebook installs the SDK
The notebook was overriding the SDK being tested.
* Not installing the KFP SDK package
* Removed the experiment_name from samples and configs.
* Changed the SDK installation lines in samples
Otherwise the sample tests do not correctly test the new SDK code.
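A minimal sketch of how a self-submitting pipeline file can pick the experiment name up from the environment; the environment variable name below is an assumption for illustration and should be checked against the Client source:
```python
import os
import kfp
import kfp.dsl as dsl

# Assumed variable name, for illustration only; check kfp.Client for the exact name.
os.environ['KF_PIPELINES_DEFAULT_EXPERIMENT_NAME'] = 'my-experiment'

@dsl.pipeline(name='hello-world', description='A trivial example pipeline.')
def my_pipeline():
    dsl.ContainerOp(name='echo', image='alpine', command=['echo', 'hello'])

# No experiment name is passed here; the client reads it from the environment.
kfp.Client().create_run_from_pipeline_func(my_pipeline, arguments={})
```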
* SDK - Compiler - Allow creating portable pipelines
This change allows directly passing a PipelineConf instance to the compiler or launcher, which makes it easier to create portable pipelines by letting the environment-specific configuration be passed directly to the environment-specific launcher.
Background:
PipelineConf holds all pipeline-level configuration, including `op_transformers`, `image_pull_secrets`, etc. Some of these are specific to a particular execution environment (e.g. GCP secrets, Argo artifact locations, or Kubernetes-specific options).
Previously, the only way to modify `PipelineConf` was to do it inside the pipeline function. That tied the pipeline function to a specific execution environment (e.g. GCP, Argo, or Kubernetes).
Solution: This change allows directly passing the PipelineConf instance to the compiler or launcher, making it possible to write portable, environment-agnostic pipeline functions; all environment-specific configuration can be moved to the launching stage.
Before:
```python
# Defining pipeline
def my_pipeline():
    # portable pipeline code
    dsl.get_pipeline_conf().add_op_transformer(gcp.use_gcp_secret('user-gcp-sa'))

# Launching pipeline
kfp.Client().create_run_from_pipeline_func(my_pipeline, arguments={})
```
After:
```python
# Defining pipeline
def my_pipeline():
    # portable pipeline code
    ...

# Launching pipeline
pipeline_conf = dsl.PipelineConf()
pipeline_conf.add_op_transformer(gcp.use_gcp_secret('user-gcp-sa'))
kfp.Client().create_run_from_pipeline_func(my_pipeline, arguments={}, pipeline_conf=pipeline_conf)
```
After 2 (launching the same portable pipeline using different launchers):
```python
# Loading portable pipeline
from portable_pipeline import my_pipeline

# Launching pipeline on Kubeflow
pipeline_conf = dsl.PipelineConf()
pipeline_conf.add_op_transformer(gcp.use_gcp_secret('user-gcp-sa'))
kfp.Client().create_run_from_pipeline_func(my_pipeline, arguments={}, pipeline_conf=pipeline_conf)

# Launching pipeline locally (not implemented yet)
kfp.run_pipeline_func_locally(my_pipeline, arguments={})
```
* Added parameter docstring
* add support for flexible config (via env var) for the pipeline service and UI; fix broken links that pointed to the API service instead of the UI service (see the sketch below)
* support https prefix
* change to LF
* init _uihost
* set _uihost from the host param if no env var is specified
* change config.host to host
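A minimal sketch of the flexible configuration; the environment variable names below reflect common KFP SDK conventions but are assumptions here and should be verified against the Client implementation:
```python
import os
import kfp

# Assumed variable names, for illustration only; verify against kfp.Client.
os.environ['KF_PIPELINES_ENDPOINT'] = 'https://my-cluster.example.com/pipeline'
os.environ['KF_PIPELINES_UI_ENDPOINT'] = 'https://my-cluster.example.com/pipeline'

# With the endpoints in the environment, no host argument is needed, and the
# run/experiment links the client prints point at the UI service rather than
# the API service.
client = kfp.Client()
```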
* SDK - Make it easier to compile and submit a pipeline run
Adds the `Client.run_pipeline_func_on_kfp(func)` function that compiles the pipeline and submits it for execution on a KFP-enabled cluster.
* Renamed run_pipeline_func_on_kfp to create_run_from_pipeline_func
* SDK - Separated the generated api client package
* Splitting the package build scripts
* Pinning the API client package version
* Moved import kfp_server_api to the top of the file
* Added the Mac OS X prerequisite install instructions
* Moved the build_kfp_server_api_python_package.sh script to the backend dir
* Updated the dependency version range
* dsl generate zip file
* minor fix
* fix zip read in the unit test
* update sample tests
* dsl compiler chooses the pipeline package format based on the output file name suffix
* add unit tests for different output format
* update the SDK client to support tar, zip, and yaml packages (see the sketch below)
* fix typo
* fix file write
* add comments
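A minimal sketch of the suffix-driven output formats and of the client accepting them; the pipeline, file names, and experiment name are illustrative:
```python
import kfp
import kfp.dsl as dsl
from kfp import compiler

@dsl.pipeline(name='hello-world', description='A trivial example pipeline.')
def my_pipeline():
    dsl.ContainerOp(name='echo', image='alpine', command=['echo', 'hello'])

# The package format follows the output file name suffix.
compiler.Compiler().compile(my_pipeline, 'my_pipeline.tar.gz')  # tarball
compiler.Compiler().compile(my_pipeline, 'my_pipeline.zip')     # zip archive
compiler.Compiler().compile(my_pipeline, 'my_pipeline.yaml')    # plain workflow YAML

# The client accepts any of these package formats when submitting a run.
client = kfp.Client()
experiment = client.create_experiment('Default')
client.run_pipeline(experiment.id, 'hello-world', 'my_pipeline.zip', params={})
```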
* relocate functions in the compiler to group similar functions; make _build_conventional_artifact a nested function
* consolidate the sanitize functions into one in the DSL
* more comments
* move all sanitization(op name, param name) from dsl to compiler
* sanitize pipelineparam name and op_name; remove format check in pipelineparam
* remove unit test for pipelineparam op_name format checking
* fix bug: correctly replace input in the argument list
* fix bug: replace arguments with found ones
* Sanitize the file_outputs keys; match the param in the args/cmds against the whole serialized param string; verify both the param name and the container name
* loosen the ContainerOp and param name restrictions
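A small sketch of the kind of name sanitization the compiler now applies to op and parameter names; the helper below is hypothetical and only illustrates the transformation, the real implementation lives inside the compiler and may differ in details:
```python
import re

# Hypothetical helper: lowercase the name and collapse characters outside
# [a-z0-9-] into single dashes, as Kubernetes resource names require.
def sanitize_k8s_name(name: str) -> str:
    return re.sub('-+', '-', re.sub('[^-0-9a-z]+', '-', name.lower())).strip('-')

print(sanitize_k8s_name('My Training_Op'))   # -> my-training-op
print(sanitize_k8s_name('learning_rate'))    # -> learning-rate
```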
* Update sample notebook to clean up deployed models.
Update SDK client to return correct links in local Jupyter with user's own proxy connection.
* Fix sample tests.
* add get_experiment_id and list_runs_by_experiment
* offer only one get_experiment function
* return the experiment body instead of the id
* simplify code
* simplify code (2)
* remove the experiment_id check in the while loop
* fix a minor bug
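A minimal sketch of the updated client calls; the experiment name and page size are illustrative:
```python
import kfp

client = kfp.Client()

# get_experiment accepts either an id or a name and returns the experiment body.
experiment = client.get_experiment(experiment_name='Default')
print(experiment.id, experiment.name)

# List the runs that belong to that experiment.
response = client.list_runs(experiment_id=experiment.id, page_size=10)
for run in response.runs or []:
    print(run.id, run.name, run.status)
```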
* add notebook sample tests for tfx
* parameterize component image tag
* parameterize base and target image tags
* install tensorflow package for the notebook tfx sample test
* bug fixes
* start debug mode
* fix bugs
* add namespace arg to check_notebook_results, copy test results to GCS, fix minor bugs
add CMLE model deletion
* install the correct KFP version in the notebook; parameterize deployer model name and version
* fix CMLE model name bug
* add notebook sample test in v2
* add GCP SA in the notebook tfx sample and shut down debug mode
* import kfp.gcp