chore(release): bumped version to 1.5.0-rc.2

Yuan Gong 2021-04-02 10:06:48 +00:00
parent 29676cef53
commit f870570a79
72 changed files with 120 additions and 105 deletions

@@ -1,5 +1,20 @@
# Changelog
## [1.5.0-rc.2](https://github.com/kubeflow/pipelines/compare/1.5.0-rc.1...1.5.0-rc.2) (2021-04-02)
### Bug Fixes
* **frontend:** Avoid crashing, and show error message for invalid trigger date/time format. ([\#5418](https://github.com/kubeflow/pipelines/issues/5418)) ([19667e1](https://github.com/kubeflow/pipelines/commit/19667e1086593075bb0db3168b4500b5a89e73fe))
* **manifests:** Multi-User manifests fixes for 1.3 ([\#5416](https://github.com/kubeflow/pipelines/issues/5416)) ([6033718](https://github.com/kubeflow/pipelines/commit/6033718786574cd64b1ebd59e0246e7c5ba1fc88))
* **sdk:** Support returning plain tuples in lightweight function components v2. ([\#5412](https://github.com/kubeflow/pipelines/issues/5412)) ([a6a7255](https://github.com/kubeflow/pipelines/commit/a6a725560c30a769da3c49de739c5e008e761a4f))
* **sdk.v2:** fix bug for component input parameter/artifact referencing ([\#5419](https://github.com/kubeflow/pipelines/issues/5419)) ([eb55842](https://github.com/kubeflow/pipelines/commit/eb558423ecdca969f2dea55e0d1b0197fafbcb13))
### Other Pull Requests
* metadata-writer: Fix multi-user mode ([\#5417](https://github.com/kubeflow/pipelines/issues/5417)) ([684d639](https://github.com/kubeflow/pipelines/commit/684d6392d35f9d847caec373ff13b947243da424))
## [1.5.0-rc.1](https://github.com/kubeflow/pipelines/compare/1.5.0-rc.0...1.5.0-rc.1) (2021-04-01)
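An aside on the tag ordering used in the changelog links above (`1.5.0-rc.1...1.5.0-rc.2`): release candidates sort below the eventual final release. A minimal sketch of that ordering, assuming a simple `X.Y.Z[-rc.N]` tag shape rather than full PEP 440 or semver parsing:

```python
import re

def release_key(tag: str):
    """Sort key for X.Y.Z[-rc.N] tags; final releases rank above their RCs."""
    m = re.fullmatch(r"(\d+)\.(\d+)\.(\d+)(?:-rc\.(\d+))?", tag)
    major, minor, patch, rc = m.groups()
    # A missing rc component means a final release, which sorts last.
    rc_rank = (1, 0) if rc is None else (0, int(rc))
    return (int(major), int(minor), int(patch)) + rc_rank

tags = ["1.5.0", "1.5.0-rc.2", "1.5.0-rc.1"]
print(sorted(tags, key=release_key))  # ['1.5.0-rc.1', '1.5.0-rc.2', '1.5.0']
```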

@@ -1 +1 @@
1.5.0-rc.1
1.5.0-rc.2

@@ -3,8 +3,8 @@ This file contains REST API specification for Kubeflow Pipelines. The file is au
This Python package is automatically generated by the [OpenAPI Generator](https://openapi-generator.tech) project:
- API version: 1.5.0-rc.1
- Package version: 1.5.0-rc.1
- API version: 1.5.0-rc.2
- Package version: 1.5.0-rc.2
- Build package: org.openapitools.codegen.languages.PythonClientCodegen
For more information, please visit [https://www.google.com](https://www.google.com)

@@ -14,7 +14,7 @@
from __future__ import absolute_import
__version__ = "1.5.0-rc.1"
__version__ = "1.5.0-rc.2"
# import apis into sdk package
from kfp_server_api.api.experiment_service_api import ExperimentServiceApi

@@ -78,7 +78,7 @@ class ApiClient(object):
self.default_headers[header_name] = header_value
self.cookie = cookie
# Set default User-Agent.
self.user_agent = 'OpenAPI-Generator/1.5.0-rc.1/python'
self.user_agent = 'OpenAPI-Generator/1.5.0-rc.2/python'
self.client_side_validation = configuration.client_side_validation
def __enter__(self):

@@ -351,8 +351,8 @@ conf = kfp_server_api.Configuration(
return "Python SDK Debug Report:\n"\
"OS: {env}\n"\
"Python Version: {pyversion}\n"\
"Version of the API: 1.5.0-rc.1\n"\
"SDK Package Version: 1.5.0-rc.1".\
"Version of the API: 1.5.0-rc.2\n"\
"SDK Package Version: 1.5.0-rc.2".\
format(env=sys.platform, pyversion=sys.version)
def get_host_settings(self):

@@ -13,7 +13,7 @@
from setuptools import setup, find_packages # noqa: H301
NAME = "kfp-server-api"
VERSION = "1.5.0-rc.1"
VERSION = "1.5.0-rc.2"
# To install the library, run the following
#
# python setup.py install
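Note that the `VERSION = "1.5.0-rc.2"` string set in this `setup.py` is not what pip displays: PEP 440 normalizes a `-rc.N` suffix to `rcN` (so the package installs as `1.5.0rc2`). A sketch of that normalization, using a minimal regex that covers only the `-rc.N` spelling used in this repo:

```python
import re

def normalize_rc(version: str) -> str:
    """Collapse a '-rc.N' pre-release suffix into PEP 440's 'rcN' spelling."""
    return re.sub(r"-rc\.(\d+)$", r"rc\1", version)

print(normalize_rc("1.5.0-rc.2"))  # 1.5.0rc2
```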

@@ -2,7 +2,7 @@
"swagger": "2.0",
"info": {
"title": "Kubeflow Pipelines API",
"version": "1.5.0-rc.1",
"version": "1.5.0-rc.2",
"description": "This file contains REST API specification for Kubeflow Pipelines. The file is autogenerated from the swagger definition.",
"contact": {
"name": "google",

@@ -179,37 +179,37 @@ Also see the tutorials for [data passing for components based on python function
/ [gcp](https://github.com/kubeflow/pipelines/tree/0795597562e076437a21745e524b5c960b1edb68/components/gcp) / [automl](https://github.com/kubeflow/pipelines/tree/0795597562e076437a21745e524b5c960b1edb68/components/gcp/automl) / [split_dataset_table_column_names](https://github.com/kubeflow/pipelines/tree/0795597562e076437a21745e524b5c960b1edb68/components/gcp/automl/split_dataset_table_column_names) / [Automl split dataset table column names](https://raw.githubusercontent.com/kubeflow/pipelines/0795597562e076437a21745e524b5c960b1edb68/components/gcp/automl/split_dataset_table_column_names/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [bigquery](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/bigquery) / [query](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/bigquery/query) / [to_CSV](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/bigquery/query/to_CSV) / [Bigquery - Query](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/bigquery/query/to_CSV/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [bigquery](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/bigquery) / [query](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/bigquery/query) / [to_CSV](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/bigquery/query/to_CSV) / [Bigquery - Query](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/bigquery/query/to_CSV/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [bigquery](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/bigquery) / [query](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/bigquery/query) / [to_gcs](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/bigquery/query/to_gcs) / [Bigquery - Query](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/bigquery/query/to_gcs/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [bigquery](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/bigquery) / [query](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/bigquery/query) / [to_gcs](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/bigquery/query/to_gcs) / [Bigquery - Query](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/bigquery/query/to_gcs/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [bigquery](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/bigquery) / [query](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/bigquery/query) / [to_table](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/bigquery/query/to_table) / [Bigquery - Query](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/bigquery/query/to_table/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [bigquery](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/bigquery) / [query](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/bigquery/query) / [to_table](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/bigquery/query/to_table) / [Bigquery - Query](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/bigquery/query/to_table/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataflow](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataflow) / [launch_python](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataflow/launch_python) / [Launch Python](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataflow/launch_python/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataflow](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataflow) / [launch_python](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataflow/launch_python) / [Launch Python](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataflow/launch_python/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataflow](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataflow) / [launch_template](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataflow/launch_template) / [Launch Dataflow Template](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataflow/launch_template/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataflow](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataflow) / [launch_template](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataflow/launch_template) / [Launch Dataflow Template](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataflow/launch_template/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataproc](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc) / [create_cluster](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc/create_cluster) / [dataproc_create_cluster](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/create_cluster/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataproc](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc) / [create_cluster](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc/create_cluster) / [dataproc_create_cluster](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/create_cluster/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataproc](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc) / [delete_cluster](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc/delete_cluster) / [dataproc_delete_cluster](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/delete_cluster/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataproc](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc) / [delete_cluster](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc/delete_cluster) / [dataproc_delete_cluster](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/delete_cluster/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataproc](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc) / [submit_hadoop_job](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc/submit_hadoop_job) / [dataproc_submit_hadoop_job](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_hadoop_job/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataproc](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc) / [submit_hadoop_job](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc/submit_hadoop_job) / [dataproc_submit_hadoop_job](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_hadoop_job/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataproc](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc) / [submit_hive_job](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc/submit_hive_job) / [dataproc_submit_hive_job](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_hive_job/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataproc](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc) / [submit_hive_job](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc/submit_hive_job) / [dataproc_submit_hive_job](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_hive_job/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataproc](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc) / [submit_pig_job](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc/submit_pig_job) / [dataproc_submit_pig_job](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_pig_job/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataproc](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc) / [submit_pig_job](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc/submit_pig_job) / [dataproc_submit_pig_job](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_pig_job/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataproc](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc) / [submit_pyspark_job](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc/submit_pyspark_job) / [dataproc_submit_pyspark_job](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_pyspark_job/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataproc](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc) / [submit_pyspark_job](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc/submit_pyspark_job) / [dataproc_submit_pyspark_job](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_pyspark_job/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataproc](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc) / [submit_spark_job](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc/submit_spark_job) / [dataproc_submit_spark_job](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_spark_job/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataproc](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc) / [submit_spark_job](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc/submit_spark_job) / [dataproc_submit_spark_job](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_spark_job/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataproc](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc) / [submit_sparksql_job](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc/submit_sparksql_job) / [dataproc_submit_sparksql_job](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_sparksql_job/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [dataproc](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc) / [submit_sparksql_job](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/dataproc/submit_sparksql_job) / [dataproc_submit_sparksql_job](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_sparksql_job/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [ml_engine](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/ml_engine) / [batch_predict](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/ml_engine/batch_predict) / [Batch predict against a model with Cloud ML Engine](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/ml_engine/batch_predict/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [ml_engine](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/ml_engine) / [batch_predict](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/ml_engine/batch_predict) / [Batch predict against a model with Cloud ML Engine](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/ml_engine/batch_predict/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [ml_engine](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/ml_engine) / [deploy](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/ml_engine/deploy) / [Deploying a trained model to Cloud Machine Learning Engine](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/ml_engine/deploy/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [ml_engine](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/ml_engine) / [deploy](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/ml_engine/deploy) / [Deploying a trained model to Cloud Machine Learning Engine](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/ml_engine/deploy/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [ml_engine](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/ml_engine) / [train](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/ml_engine/train) / [Submitting a Cloud ML training job as a pipeline step](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/ml_engine/train/component.yaml)
/ [gcp](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp) / [ml_engine](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/ml_engine) / [train](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/gcp/ml_engine/train) / [Submitting a Cloud ML training job as a pipeline step](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/ml_engine/train/component.yaml)
/ [git](https://github.com/kubeflow/pipelines/tree/0795597562e076437a21745e524b5c960b1edb68/components/git) / [clone](https://github.com/kubeflow/pipelines/tree/0795597562e076437a21745e524b5c960b1edb68/components/git/clone) / [Git clone](https://raw.githubusercontent.com/kubeflow/pipelines/0795597562e076437a21745e524b5c960b1edb68/components/git/clone/component.yaml)
@@ -273,9 +273,9 @@ Also see the tutorials for [data passing for components based on python function
/ [keras](https://github.com/kubeflow/pipelines/tree/0795597562e076437a21745e524b5c960b1edb68/components/keras) / [Train_classifier](https://github.com/kubeflow/pipelines/tree/0795597562e076437a21745e524b5c960b1edb68/components/keras/Train_classifier) / [from_CSV](https://github.com/kubeflow/pipelines/tree/0795597562e076437a21745e524b5c960b1edb68/components/keras/Train_classifier/from_CSV) / [Keras train classifier from csv](https://raw.githubusercontent.com/kubeflow/pipelines/0795597562e076437a21745e524b5c960b1edb68/components/keras/Train_classifier/from_CSV/component.yaml)
/ [kubeflow](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/kubeflow) / [deployer](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/kubeflow/deployer) / [Kubeflow - Serve TF model](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/kubeflow/deployer/component.yaml)
/ [kubeflow](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/kubeflow) / [deployer](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/kubeflow/deployer) / [Kubeflow - Serve TF model](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/kubeflow/deployer/component.yaml)
/ [kubeflow](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/kubeflow) / [dnntrainer](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/kubeflow/dnntrainer) / [Train FC DNN using TF](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/kubeflow/dnntrainer/component.yaml)
/ [kubeflow](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/kubeflow) / [dnntrainer](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/kubeflow/dnntrainer) / [Train FC DNN using TF](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/kubeflow/dnntrainer/component.yaml)
/ [kubeflow](https://github.com/kubeflow/pipelines/tree/34d23aa924720ead13fe67ebca5c1ab9926585ee/components/kubeflow) / [katib-launcher](https://github.com/kubeflow/pipelines/tree/34d23aa924720ead13fe67ebca5c1ab9926585ee/components/kubeflow/katib-launcher) / [Katib - Launch Experiment](https://raw.githubusercontent.com/kubeflow/pipelines/34d23aa924720ead13fe67ebca5c1ab9926585ee/components/kubeflow/katib-launcher/component.yaml)
@@ -283,9 +283,9 @@ Also see the tutorials for [data passing for components based on python function
/ [kubeflow](https://github.com/kubeflow/pipelines/tree/dd31142a57053e0b6f1416a3ecb4c8a94faa27f9/components/kubeflow) / [launcher](https://github.com/kubeflow/pipelines/tree/dd31142a57053e0b6f1416a3ecb4c8a94faa27f9/components/kubeflow/launcher) / [Kubeflow - Launch TFJob](https://raw.githubusercontent.com/kubeflow/pipelines/dd31142a57053e0b6f1416a3ecb4c8a94faa27f9/components/kubeflow/launcher/component.yaml)
/ [local](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/local) / [confusion_matrix](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/local/confusion_matrix) / [Confusion matrix](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/local/confusion_matrix/component.yaml)
/ [local](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/local) / [confusion_matrix](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/local/confusion_matrix) / [Confusion matrix](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/local/confusion_matrix/component.yaml)
/ [local](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/local) / [roc](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/local/roc) / [ROC curve](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/local/roc/component.yaml)
/ [local](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/local) / [roc](https://github.com/kubeflow/pipelines/tree/8b3d741c6ef9f80190c962d4640690ea723b71e9/components/local/roc) / [ROC curve](https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/local/roc/component.yaml)
/ [ml_metrics](https://github.com/kubeflow/pipelines/tree/0795597562e076437a21745e524b5c960b1edb68/components/ml_metrics) / [Aggregate_regression_metrics](https://github.com/kubeflow/pipelines/tree/0795597562e076437a21745e524b5c960b1edb68/components/ml_metrics/Aggregate_regression_metrics) / [Aggregate regression metrics](https://raw.githubusercontent.com/kubeflow/pipelines/0795597562e076437a21745e524b5c960b1edb68/components/ml_metrics/Aggregate_regression_metrics/component.yaml)

@@ -47,7 +47,7 @@ outputs:
type: CSV
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.2
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.bigquery, query,

@@ -69,7 +69,7 @@ outputs:
type: UI metadata
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.2
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.bigquery, query,

@@ -77,7 +77,7 @@ Note: The following sample code works in an IPython notebook or directly in Pyth
import kfp.components as comp
bigquery_query_op = comp.load_component_from_url(
'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/bigquery/query/to_table/component.yaml')
'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/bigquery/query/to_table/component.yaml')
help(bigquery_query_op)
```
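The pattern behind every URL touched in this commit: component YAML is fetched from `raw.githubusercontent.com` at a release tag, so bumping the tag from `1.5.0-rc.1` to `1.5.0-rc.2` re-pins each `load_component_from_url` call. A small sketch of that URL construction (the helper name `component_url` is illustrative, not part of the KFP SDK):

```python
def component_url(tag: str, path: str) -> str:
    """Build a raw.githubusercontent.com URL pinned to a release tag."""
    return f"https://raw.githubusercontent.com/kubeflow/pipelines/{tag}/{path}"

url = component_url("1.5.0-rc.2",
                    "components/gcp/bigquery/query/to_table/component.yaml")
```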

@@ -56,7 +56,7 @@ outputs:
type: UI metadata
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.2
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.bigquery, query,

@@ -15,7 +15,7 @@
from setuptools import setup
PACKAGE_NAME = 'kfp-component'
VERSION = '1.5.0-rc.1'
VERSION = '1.5.0-rc.2'
setup(
name=PACKAGE_NAME,

@@ -90,7 +90,7 @@ The steps to use the component in a pipeline are:
```python
import kfp.components as comp
dataflow_python_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataflow/launch_python/component.yaml')
dataflow_python_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataflow/launch_python/component.yaml')
help(dataflow_python_op)
```

@@ -56,7 +56,7 @@ outputs:
type: UI metadata
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.2
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.dataflow, launch_python,

@@ -92,7 +92,7 @@
"import kfp.components as comp\n",
"\n",
"dataflow_python_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataflow/launch_python/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataflow/launch_python/component.yaml')\n",
"help(dataflow_python_op)"
]
},

@@ -63,7 +63,7 @@ Follow these steps to use the component in a pipeline:
import kfp.components as comp
dataflow_template_op = comp.load_component_from_url(
'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataflow/launch_template/component.yaml')
'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataflow/launch_template/component.yaml')
help(dataflow_template_op)
```

@@ -63,7 +63,7 @@ outputs:
type: UI metadata
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.2
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.dataflow, launch_template,

@@ -81,7 +81,7 @@
"import kfp.components as comp\n",
"\n",
"dataflow_template_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataflow/launch_template/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataflow/launch_template/component.yaml')\n",
"help(dataflow_template_op)"
]
},

@@ -87,7 +87,7 @@ Follow these steps to use the component in a pipeline:
```python
import kfp.components as comp
dataproc_create_cluster_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/create_cluster/component.yaml')
dataproc_create_cluster_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/create_cluster/component.yaml')
help(dataproc_create_cluster_op)
```

@@ -70,7 +70,7 @@ outputs:
type: UI metadata
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.2
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.dataproc, create_cluster,


@ -87,7 +87,7 @@
"import kfp.components as comp\n",
"\n",
"dataproc_create_cluster_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/create_cluster/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/create_cluster/component.yaml')\n",
"help(dataproc_create_cluster_op)"
]
},


@ -65,7 +65,7 @@ Follow these steps to use the component in a pipeline:
```python
import kfp.components as comp
dataproc_delete_cluster_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/delete_cluster/component.yaml')
dataproc_delete_cluster_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/delete_cluster/component.yaml')
help(dataproc_delete_cluster_op)
```


@ -36,7 +36,7 @@ inputs:
type: Integer
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.2
args: [
kfp_component.google.dataproc, delete_cluster,
--project_id, {inputValue: project_id},


@ -70,7 +70,7 @@
"import kfp.components as comp\n",
"\n",
"dataproc_delete_cluster_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/delete_cluster/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/delete_cluster/component.yaml')\n",
"help(dataproc_delete_cluster_op)"
]
},


@ -82,7 +82,7 @@ Follow these steps to use the component in a pipeline:
```python
import kfp.components as comp
dataproc_submit_hadoop_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_hadoop_job/component.yaml')
dataproc_submit_hadoop_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_hadoop_job/component.yaml')
help(dataproc_submit_hadoop_job_op)
```


@ -80,7 +80,7 @@ outputs:
type: UI metadata
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.2
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.dataproc, submit_hadoop_job,


@ -85,7 +85,7 @@
"import kfp.components as comp\n",
"\n",
"dataproc_submit_hadoop_job_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_hadoop_job/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_hadoop_job/component.yaml')\n",
"help(dataproc_submit_hadoop_job_op)"
]
},


@ -72,7 +72,7 @@ Follow these steps to use the component in a pipeline:
```python
import kfp.components as comp
dataproc_submit_hive_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_hive_job/component.yaml')
dataproc_submit_hive_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_hive_job/component.yaml')
help(dataproc_submit_hive_job_op)
```


@ -75,7 +75,7 @@ outputs:
type: UI metadata
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.2
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.dataproc, submit_hive_job,


@ -76,7 +76,7 @@
"import kfp.components as comp\n",
"\n",
"dataproc_submit_hive_job_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_hive_job/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_hive_job/component.yaml')\n",
"help(dataproc_submit_hive_job_op)"
]
},


@ -81,7 +81,7 @@ Follow these steps to use the component in a pipeline:
```python
import kfp.components as comp
dataproc_submit_pig_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_pig_job/component.yaml')
dataproc_submit_pig_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_pig_job/component.yaml')
help(dataproc_submit_pig_job_op)
```


@ -75,7 +75,7 @@ outputs:
type: UI metadata
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.2
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.dataproc, submit_pig_job,


@ -79,7 +79,7 @@
"import kfp.components as comp\n",
"\n",
"dataproc_submit_pig_job_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_pig_job/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_pig_job/component.yaml')\n",
"help(dataproc_submit_pig_job_op)"
]
},


@ -78,7 +78,7 @@ Follow these steps to use the component in a pipeline:
```python
import kfp.components as comp
dataproc_submit_pyspark_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_pyspark_job/component.yaml')
dataproc_submit_pyspark_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_pyspark_job/component.yaml')
help(dataproc_submit_pyspark_job_op)
```


@ -69,7 +69,7 @@ outputs:
type: UI metadata
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.2
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.dataproc, submit_pyspark_job,


@ -81,7 +81,7 @@
"import kfp.components as comp\n",
"\n",
"dataproc_submit_pyspark_job_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_pyspark_job/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_pyspark_job/component.yaml')\n",
"help(dataproc_submit_pyspark_job_op)"
]
},


@ -94,7 +94,7 @@ Follow these steps to use the component in a pipeline:
import kfp.components as comp
dataproc_submit_spark_job_op = comp.load_component_from_url(
'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_spark_job/component.yaml')
'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_spark_job/component.yaml')
help(dataproc_submit_spark_job_op)
```


@ -76,7 +76,7 @@ outputs:
type: UI metadata
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.2
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.dataproc, submit_spark_job,


@ -92,7 +92,7 @@
"import kfp.components as comp\n",
"\n",
"dataproc_submit_spark_job_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_spark_job/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_spark_job/component.yaml')\n",
"help(dataproc_submit_spark_job_op)"
]
},


@ -73,7 +73,7 @@ Follow these steps to use the component in a pipeline:
```python
import kfp.components as comp
dataproc_submit_sparksql_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_sparksql_job/component.yaml')
dataproc_submit_sparksql_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_sparksql_job/component.yaml')
help(dataproc_submit_sparksql_job_op)
```


@ -75,7 +75,7 @@ outputs:
type: UI metadata
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.2
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.dataproc, submit_sparksql_job,


@ -77,7 +77,7 @@
"import kfp.components as comp\n",
"\n",
"dataproc_submit_sparksql_job_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_sparksql_job/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_sparksql_job/component.yaml')\n",
"help(dataproc_submit_sparksql_job_op)"
]
},


@ -86,7 +86,7 @@ Follow these steps to use the component in a pipeline:
import kfp.components as comp
mlengine_batch_predict_op = comp.load_component_from_url(
'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/ml_engine/batch_predict/component.yaml')
'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/ml_engine/batch_predict/component.yaml')
help(mlengine_batch_predict_op)
```


@ -69,7 +69,7 @@ outputs:
type: UI metadata
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.2
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.ml_engine, batch_predict,


@ -105,7 +105,7 @@
"import kfp.components as comp\n",
"\n",
"mlengine_batch_predict_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/ml_engine/batch_predict/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/ml_engine/batch_predict/component.yaml')\n",
"help(mlengine_batch_predict_op)"
]
},


@ -103,7 +103,7 @@ Follow these steps to use the component in a pipeline:
import kfp.components as comp
mlengine_deploy_op = comp.load_component_from_url(
'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/ml_engine/deploy/component.yaml')
'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/ml_engine/deploy/component.yaml')
help(mlengine_deploy_op)
```


@ -95,7 +95,7 @@ outputs:
type: UI metadata
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.2
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.ml_engine, deploy,


@ -120,7 +120,7 @@
"import kfp.components as comp\n",
"\n",
"mlengine_deploy_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/ml_engine/deploy/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/ml_engine/deploy/component.yaml')\n",
"help(mlengine_deploy_op)"
]
},


@ -99,7 +99,7 @@ The steps to use the component in a pipeline are:
```python
import kfp.components as comp
mlengine_train_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/ml_engine/train/component.yaml')
mlengine_train_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/ml_engine/train/component.yaml')
help(mlengine_train_op)
```
### Sample


@ -109,7 +109,7 @@ outputs:
type: UI metadata
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.5.0-rc.2
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.ml_engine, train,


@ -99,7 +99,7 @@
"import kfp.components as comp\n",
"\n",
"mlengine_train_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/ml_engine/train/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/ml_engine/train/component.yaml')\n",
"help(mlengine_train_op)"
]
},


@ -11,7 +11,7 @@ inputs:
# - {name: Endpoint URI, type: Serving URI, description: 'URI of the deployed prediction service.'}
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-deployer:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-deployer:1.5.0-rc.2
command: [/bin/deploy.sh]
args: [
--model-export-path, {inputValue: Model dir},


@ -16,7 +16,7 @@ outputs:
- {name: MLPipeline UI metadata, type: UI metadata}
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:1.5.0-rc.2
command: [python2, -m, trainer.task]
args: [
--transformed-data-dir, {inputValue: Transformed data dir},


@ -9,7 +9,7 @@ outputs:
- {name: MLPipeline Metrics, type: Metrics}
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-local-confusion-matrix:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-local-confusion-matrix:1.5.0-rc.2
command: [python2, /ml/confusion_matrix.py]
args: [
--predictions, {inputValue: Predictions},


@ -11,7 +11,7 @@ outputs:
- {name: MLPipeline Metrics, type: Metrics}
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-local-confusion-matrix:1.5.0-rc.1
image: gcr.io/ml-pipeline/ml-pipeline-local-confusion-matrix:1.5.0-rc.2
command: [python2, /ml/roc.py]
args: [
--predictions, {inputValue: Predictions dir},


@ -12,7 +12,7 @@ metadata:
spec:
descriptor:
type: Kubeflow Pipelines
version: 1.5.0-rc.1
version: 1.5.0-rc.2
description: |-
Reusable end-to-end ML workflow
maintainers:


@ -1,9 +1,9 @@
x-google-marketplace:
schemaVersion: v2
applicationApiVersion: v1beta1
publishedVersion: 1.5.0-rc.1
publishedVersion: 1.5.0-rc.2
publishedVersionMetadata:
releaseNote: Based on 1.5.0-rc.1 version.
releaseNote: Based on 1.5.0-rc.2 version.
releaseTypes:
- Feature
recommended: false


@ -8,4 +8,4 @@ commonLabels:
app: cache-deployer
images:
- name: gcr.io/ml-pipeline/cache-deployer
newTag: 1.5.0-rc.1
newTag: 1.5.0-rc.2


@ -10,4 +10,4 @@ commonLabels:
app: cache-server
images:
- name: gcr.io/ml-pipeline/cache-server
newTag: 1.5.0-rc.1
newTag: 1.5.0-rc.2


@ -4,7 +4,7 @@ metadata:
name: pipeline-install-config
data:
appName: pipeline
appVersion: 1.5.0-rc.1
appVersion: 1.5.0-rc.2
dbHost: mysql
dbPort: "3306"
mlmdDb: metadb


@ -9,4 +9,4 @@ resources:
- metadata-grpc-sa.yaml
images:
- name: gcr.io/ml-pipeline/metadata-envoy
newTag: 1.5.0-rc.1
newTag: 1.5.0-rc.2


@ -36,14 +36,14 @@ resources:
- viewer-sa.yaml
images:
- name: gcr.io/ml-pipeline/api-server
newTag: 1.5.0-rc.1
newTag: 1.5.0-rc.2
- name: gcr.io/ml-pipeline/persistenceagent
newTag: 1.5.0-rc.1
newTag: 1.5.0-rc.2
- name: gcr.io/ml-pipeline/scheduledworkflow
newTag: 1.5.0-rc.1
newTag: 1.5.0-rc.2
- name: gcr.io/ml-pipeline/frontend
newTag: 1.5.0-rc.1
newTag: 1.5.0-rc.2
- name: gcr.io/ml-pipeline/viewer-crd-controller
newTag: 1.5.0-rc.1
newTag: 1.5.0-rc.2
- name: gcr.io/ml-pipeline/visualization-server
newTag: 1.5.0-rc.1
newTag: 1.5.0-rc.2


@ -7,4 +7,4 @@ resources:
- metadata-writer-sa.yaml
images:
- name: gcr.io/ml-pipeline/metadata-writer
newTag: 1.5.0-rc.1
newTag: 1.5.0-rc.2


@ -2,7 +2,7 @@ apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
images:
- name: gcr.io/ml-pipeline/inverse-proxy-agent
newTag: 1.5.0-rc.1
newTag: 1.5.0-rc.2
resources:
- proxy-configmap.yaml
- proxy-deployment.yaml


@ -516,7 +516,7 @@
"outputs": [],
"source": [
"mlengine_deploy_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/ml_engine/deploy/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/ml_engine/deploy/component.yaml')\n",
"\n",
"def deploy(\n",
" project_id,\n",
@ -542,7 +542,7 @@
"Kubeflow serving deployment component as an option. **Note that, the deployed Endppoint URI is not availabe as output of this component.**\n",
"```python\n",
"kubeflow_deploy_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/ml_engine/deploy/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/ml_engine/deploy/component.yaml')\n",
"\n",
"def deploy_kubeflow(\n",
" model_dir,\n",


@ -154,7 +154,7 @@
"outputs": [],
"source": [
"mlengine_train_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/ml_engine/train/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/ml_engine/train/component.yaml')\n",
"\n",
"def train(project_id,\n",
" trainer_args,\n",
@ -192,7 +192,7 @@
"outputs": [],
"source": [
"mlengine_deploy_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/ml_engine/deploy/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/ml_engine/deploy/component.yaml')\n",
"\n",
"def deploy(\n",
" project_id,\n",


@ -137,7 +137,7 @@
"import kfp.components as comp\n",
"\n",
"dataflow_python_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataflow/launch_python/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataflow/launch_python/component.yaml')\n",
"help(dataflow_python_op)"
]
},


@ -176,7 +176,7 @@
"ts = int(time.time())\n",
"model_version = str(ts) # Here we use timestamp as version to avoid conflict \n",
"output = 'Your-Gcs-Path' # A GCS bucket for asset outputs\n",
"KUBEFLOW_DEPLOYER_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-deployer:1.5.0-rc.1'"
"KUBEFLOW_DEPLOYER_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-deployer:1.5.0-rc.2'"
]
},
{


@ -24,22 +24,22 @@ import subprocess
diagnose_me_op = components.load_component_from_url(
'https://raw.githubusercontent.com/kubeflow/pipelines/566dddfdfc0a6a725b6e50ea85e73d8d5578bbb9/components/diagnostics/diagnose_me/component.yaml')
confusion_matrix_op = components.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/local/confusion_matrix/component.yaml')
confusion_matrix_op = components.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/local/confusion_matrix/component.yaml')
roc_op = components.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/local/roc/component.yaml')
roc_op = components.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/local/roc/component.yaml')
dataproc_create_cluster_op = components.load_component_from_url(
'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/create_cluster/component.yaml')
'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/create_cluster/component.yaml')
dataproc_delete_cluster_op = components.load_component_from_url(
'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/delete_cluster/component.yaml')
'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/delete_cluster/component.yaml')
dataproc_submit_pyspark_op = components.load_component_from_url(
'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_pyspark_job/component.yaml'
'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_pyspark_job/component.yaml'
)
dataproc_submit_spark_op = components.load_component_from_url(
'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.1/components/gcp/dataproc/submit_spark_job/component.yaml'
'https://raw.githubusercontent.com/kubeflow/pipelines/1.5.0-rc.2/components/gcp/dataproc/submit_spark_job/component.yaml'
)
_PYSRC_PREFIX = 'gs://ml-pipeline/sample-pipeline/xgboost' # Common path to python src.


@ -16,7 +16,7 @@
# https://packaging.python.org/guides/packaging-namespace-packages/#pkgutil-style-namespace-packages
__path__ = __import__("pkgutil").extend_path(__path__, __name__)
__version__ = '1.5.0-rc.1'
__version__ = '1.5.0-rc.2'
from . import components
from . import containers
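
Every hunk in this commit makes the same mechanical change: a pinned component URL, container image tag, or version string is retargeted from 1.5.0-rc.1 to 1.5.0-rc.2. As an illustrative sketch (the helper name and regex are not part of the KFP SDK or its release tooling), the URL rewrite amounts to:

```python
import re

def retag_component_url(url: str, new_tag: str) -> str:
    """Repoint a raw.githubusercontent.com kubeflow/pipelines URL at a
    different release tag. Illustrative helper, not part of the KFP SDK."""
    # The path segment right after "kubeflow/pipelines/" is the git ref.
    return re.sub(
        r"(githubusercontent\.com/kubeflow/pipelines/)[^/]+(/)",
        r"\g<1>" + new_tag + r"\g<2>",
        url,
    )

old = ("https://raw.githubusercontent.com/kubeflow/pipelines/"
       "1.5.0-rc.1/components/gcp/ml_engine/train/component.yaml")
print(retag_component_url(old, "1.5.0-rc.2"))
```

A release script applying this across all component READMEs, notebooks, and YAML manifests would produce exactly the kind of repetitive diff shown above.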