mirror of https://github.com/kubeflow/website.git
Restructured Kubeflow Pipelines docs (#3737)
* Restructured Kubeflow Pipelines docs
* Fixed broken links
* Moved new redirects to the end of the file
* Renamed How-to/User guides to User guides
* Fixed redirects
* Removed double slash typo
* Fixed "Legacy v1" title
* Added "Operator Guides" section
* Added brackets to Legacy page
* Moved Introduction and getting-started.md to the root
* Reverted changes to other components
* Removed catch-all redirects and moved new redirects to the bottom of the file
* Renamed the how-to directory to user-guides
* Regrouped user guides
* Moved "Concepts" to Pipelines
* Moved component-spec.md
* Moved interfaces.md
* Moved pipeline-root.md
* Reverted changes outside of pipelines
* Update content/en/docs/components/pipelines/reference/sdk.md
* Removed warning
* Fixed /docs/components/pipelines/installation/ broken links
* Removed beta status flag from pipeline-root.md
* Fixed broken links
* Fixed broken links
* Update content/en/_redirects
* Update content/en/_redirects
* Update content/en/docs/components/pipelines/concepts/component.md
* Update content/en/docs/components/pipelines/legacy-v1/sdk/manipulate-resources.md
* Update content/en/docs/components/pipelines/legacy-v1/sdk/python-function-components.ipynb
* Update content/en/docs/components/pipelines/legacy-v1/sdk/python-function-components.md
* Update content/en/docs/components/pipelines/legacy-v1/tutorials/api-pipelines.md
* Update content/en/docs/components/pipelines/legacy-v1/tutorials/build-pipeline.md
* Fixed redirects
* Fixed redirects

Signed-off-by: hbelmiro <helber.belmiro@gmail.com>
Co-authored-by: Mathew Wicks <5735406+thesuperzapper@users.noreply.github.com>
parent 62b4fae8cc
commit 8e56df75dc
@@ -54,7 +54,7 @@
 inkscape:pagecheckerboard="false" />
 <a
 id="a3946"
-xlink:href="https://www.kubeflow.org/docs/components/pipelines/introduction/"
+xlink:href="https://www.kubeflow.org/docs/components/pipelines/overview/"
 target="_blank"
 transform="matrix(0.49363962,0,0,0.49963206,1.4381833,0.08830732)">
 <image
Before: Size 23 KiB | After: Size 23 KiB
@@ -41,7 +41,7 @@ title = "Kubeflow"
 <div class="container">
 <div class="card-deck">
 <div class="card border-primary-dark">
-<a href="/docs/components/pipelines/v2/introduction/" target="_blank" rel="noopener" >
+<a href="/docs/components/pipelines/overview/" target="_blank" rel="noopener" >
 <img
 src="/docs/images/logos/kubeflow.png"
 class="card-img-top"
@@ -53,7 +53,7 @@ title = "Kubeflow"
 <div class="card-body bg-primary-dark">
 <h5 class="card-title text-white section-head">Pipelines</h5>
 <p class="card-text text-white">
-<a target="_blank" rel="noopener" href="/docs/components/pipelines/v2/introduction/">Kubeflow Pipelines</a> (KFP) is a platform for building then deploying portable and scalable machine learning workflows using Kubernetes.
+<a target="_blank" rel="noopener" href="/docs/components/pipelines/overview/">Kubeflow Pipelines</a> (KFP) is a platform for building then deploying portable and scalable machine learning workflows using Kubernetes.
 </p>
 </div>
 </div>
@@ -16,40 +16,40 @@
 /docs/guides/pipelines/deploy-pipelines-service/ /docs/components/pipelines/pipelines-quickstart/
 
 # Merged duplicate page pipelines-samples.md into build-pipeline.md
-/docs/pipelines/pipelines-samples/ /docs/components/pipelines/build-pipeline/
+/docs/pipelines/pipelines-samples/ /docs/components/pipelines/legacy-v1/sdk/build-pipeline/
 
 # Removed redundant UI guide. Quickstart is a better destination.
 /docs/pipelines/pipelines-ui/ /docs/components/pipelines/pipelines-quickstart/
 
 # Restructured the pipelines docs.
 /docs/pipelines/ /docs/components/pipelines
-/docs/pipelines/output-viewer/ /docs/components/pipelines/sdk/output-viewer/
-/docs/pipelines/pipelines-metrics/ /docs/components/pipelines/sdk/pipelines-metrics/
-/docs/pipelines/build-component/ /docs/components/pipelines/sdk/build-component/
-/docs/pipelines/install-sdk/ /docs/components/pipelines/sdk/install-sdk/
-/docs/pipelines/lightweight-python-components/ /docs/components/pipelines/sdk/python-function-components/
-/docs/pipelines/sdk/lightweight-python-components/ /docs/components/pipelines/sdk/python-function-components/
-/docs/pipelines/build-pipeline/ /docs/components/pipelines/tutorials/build-pipeline/
-/docs/pipelines/pipelines-tutorial/ /docs/components/pipelines/tutorials/cloud-tutorials/
-/docs/pipelines/tutorials/pipelines-tutorial/ /docs/components/pipelines/tutorials/cloud-tutorials/
-/docs/gke/pipelines-tutorial/ /docs/components/pipelines/tutorials/cloud-tutorials/
-/docs/gke/pipelines/pipelines-tutorial/ /docs/components/pipelines/tutorials/cloud-tutorials/
+/docs/pipelines/output-viewer/ /docs/components/pipelines/legacy-v1/sdk/output-viewer/
+/docs/pipelines/pipelines-metrics/ /docs/components/pipelines/legacy-v1/sdk/pipelines-metrics/
+/docs/pipelines/build-component/ /docs/components/pipelines/legacy-v1/sdk/component-development/
+/docs/pipelines/install-sdk/ /docs/components/pipelines/legacy-v1/sdk/install-sdk/
+/docs/pipelines/lightweight-python-components/ /docs/components/pipelines/legacy-v1/sdk/python-function-components/
+/docs/pipelines/sdk/lightweight-python-components/ /docs/components/pipelines/legacy-v1/sdk/python-function-components/
+/docs/pipelines/build-pipeline/ /docs/components/pipelines/legacy-v1/tutorials/build-pipeline/
+/docs/pipelines/pipelines-tutorial/ /docs/components/pipelines/legacy-v1/tutorials/cloud-tutorials/
+/docs/pipelines/tutorials/pipelines-tutorial/ /docs/components/pipelines/legacy-v1/tutorials/cloud-tutorials/
+/docs/gke/pipelines-tutorial/ /docs/components/pipelines/legacy-v1/tutorials/cloud-tutorials/
+/docs/gke/pipelines/pipelines-tutorial/ /docs/components/pipelines/legacy-v1/tutorials/cloud-tutorials/
 /docs/gke/authentication-pipelines/ /docs/distributions/gke/pipelines/authentication-pipelines/
 
-/docs/pipelines/metrics/ /docs/components/pipelines/sdk/pipelines-metrics/
-/docs/pipelines/metrics/pipelines-metrics/ /docs/components/pipelines/sdk/pipelines-metrics/
-/docs/pipelines/metrics/output-viewer/ /docs/components/pipelines/sdk/output-viewer/
-/docs/pipelines/pipelines-overview/ /docs/components/pipelines/overview/pipelines-overview/
-/docs/pipelines/enable-gpu-and-tpu/ /docs/distributions/gke/pipelines/enable-gpu-and-tpu/
-/docs/pipelines/sdk/enable-gpu-and-tpu/ /docs/distributions/gke/pipelines/enable-gpu-and-tpu/
-/docs/pipelines/sdk/gcp/enable-gpu-and-tpu/ /docs/distributions/gke/pipelines/enable-gpu-and-tpu/
-/docs/pipelines/preemptible/ /docs/distributions/gke/pipelines/preemptible/
-/docs/pipelines/sdk/gcp/preemptible/ /docs/distributions/gke/pipelines/preemptible/
-/docs/pipelines/reusable-components/ /docs/examples/shared-resources/
-/docs/pipelines/sdk/reusable-components/ /docs/examples/shared-resources/
-/docs/pipelines/sdk/build-component/ /docs/components/pipelines/sdk/build-pipeline/
-/docs/components/pipelines/sdk/build-component/ /docs/components/pipelines/sdk/build-pipeline/
-/docs/components/pipelines/upgrade/ /docs/components/pipelines/installation/upgrade/
+/docs/pipelines/metrics/ /docs/components/pipelines/legacy-v1/sdk/pipelines-metrics/
+/docs/pipelines/metrics/pipelines-metrics/ /docs/components/pipelines/legacy-v1/sdk/pipelines-metrics/
+/docs/pipelines/metrics/output-viewer/ /docs/components/pipelines/legacy-v1/sdk/output-viewer/
+/docs/pipelines/pipelines-overview/ /docs/components/pipelines/overview/pipelines-overview/
+/docs/pipelines/enable-gpu-and-tpu/ /docs/distributions/gke/pipelines/enable-gpu-and-tpu/
+/docs/pipelines/sdk/enable-gpu-and-tpu/ /docs/distributions/gke/pipelines/enable-gpu-and-tpu/
+/docs/pipelines/sdk/gcp/enable-gpu-and-tpu/ /docs/distributions/gke/pipelines/enable-gpu-and-tpu/
+/docs/pipelines/preemptible/ /docs/distributions/gke/pipelines/preemptible/
+/docs/pipelines/sdk/gcp/preemptible/ /docs/distributions/gke/pipelines/preemptible/
+/docs/pipelines/reusable-components/ /docs/examples/shared-resources/
+/docs/pipelines/sdk/reusable-components/ /docs/examples/shared-resources/
+/docs/pipelines/sdk/build-component/ /docs/components/pipelines/legacy-v1/sdk/build-pipeline/
+/docs/components/pipelines/sdk/build-component/ /docs/components/pipelines/legacy-v1/sdk/build-pipeline/
+/docs/components/pipelines/upgrade/ /docs/components/pipelines/legacy-v1/installation/upgrade/
 
 # Moved the guide to monitoring GKE deployments.
 /docs/other-guides/monitoring/ /docs/distributions/gke/monitoring/
@@ -58,10 +58,10 @@
 /docs/pipelines/pipelines-concepts/ /docs/components/pipelines/concepts/
 
 # Replaces Pipelines DSL overview with SDK overview
-/docs/pipelines/sdk/dsl-overview/ /docs/components/pipelines/sdk/sdk-overview/
+/docs/pipelines/sdk/dsl-overview/ /docs/components/pipelines/legacy-v1/sdk/sdk-overview/
 
 # Created a new section for pipelines installation.
-/docs/pipelines/standalone-deployment-gcp/ /docs/components/pipelines/installation/standalone-deployment/
+/docs/pipelines/standalone-deployment-gcp/ /docs/components/pipelines/operator-guides/installation/
 
 # Removed the downloads page from Reference to Getting Started with Kubeflow
 /docs/reference/downloads/ /docs/started/getting-started/
@@ -74,7 +74,6 @@ docs/started/requirements/ /docs/started/getting-started/
 /docs/components/modeldb /docs/components/misc/modeldb
 /docs/components/mpi /docs/components/training/mpi
 /docs/components/mxnet /docs/components/training/mxnet
-/docs/components/pipelines /docs/components/pipelines/pipelines
 /docs/components/pytorch /docs/components/training/pytorch
 /docs/components/nuclio /docs/components/misc/nuclio
 /docs/components/seldon /docs/external-add-ons/serving/seldon
@@ -174,17 +173,12 @@ docs/started/requirements/ /docs/started/getting-started/
 /docs/components/notebooks/why-use-jupyter-notebook /docs/components/notebooks/overview
 
 # Refactor Pipelines section
-/docs/components/pipelines/caching /docs/components/pipelines/overview/caching
-/docs/components/pipelines/caching-v2 /docs/components/pipelines/overview/caching-v2
-/docs/components/pipelines/multi-user /docs/components/pipelines/overview/multi-user
-/docs/components/pipelines/pipeline-root /docs/components/pipelines/overview/pipeline-root
-/docs/components/pipelines/overview/pipelines-overview /docs/components/pipelines/introduction
+/docs/components/pipelines/caching /docs/components/pipelines/legacy-v1/overview/caching
+/docs/components/pipelines/caching-v2 /docs/components/pipelines/user-guides/core-functions/caching
+/docs/components/pipelines/multi-user /docs/components/pipelines/legacy-v1/overview/multi-user
+/docs/components/pipelines/pipeline-root /docs/components/pipelines/concepts/pipeline-root
+/docs/components/pipelines/overview/pipelines-overview /docs/components/pipelines/overview
 /docs/components/pipelines/pipelines-quickstart /docs/components/pipelines/overview/quickstart
-/docs/components/pipelines/overview/concepts/* /docs/components/pipelines/concepts/:splat
-/docs/components/pipelines/sdk/v2/* /docs/components/pipelines/sdk-v2/:splat
-
-# pipelines v1 -> v2 redirects
-/docs/components/pipelines/* /docs/components/pipelines/v1/:splat
 
 # Restructure About section
 /docs/about/kubeflow /docs/started/introduction
@@ -213,8 +207,8 @@ docs/started/requirements/ /docs/started/getting-started/
 # Catch-all redirects should be added at the end of this file as redirects happen from top to bottom
 # ===============
 /docs/guides/* /docs/:splat
-/docs/pipelines/concepts/* /docs/components/pipelines/overview/concepts/:splat
-/docs/pipelines/* /docs/components/pipelines/:splat
+/docs/pipelines/concepts/* /docs/components/pipelines/concepts
+/docs/pipelines/* /docs/components/pipelines
 /docs/aws/* /docs/distributions/aws/
 /docs/azure/* /docs/distributions/azure/:splat
 /docs/gke/* /docs/distributions/gke/:splat
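As the comment in the hunk above notes, Netlify-style `_redirects` files are evaluated top to bottom and the first matching rule wins, which is why the specific `/docs/pipelines/concepts/*` rule must precede the broader `/docs/pipelines/*` catch-all. A minimal sketch of that matching logic (the `resolve` helper below is illustrative, not part of the site's tooling):

```python
def resolve(path, rules):
    """First-match-wins, top-to-bottom resolution, as in a Netlify-style
    `_redirects` file. Each rule is a (pattern, target) pair; a trailing
    `*` matches the rest of the path, and `:splat` in the target (if
    present) is replaced by the matched remainder."""
    for pattern, target in rules:
        if pattern.endswith("*"):
            prefix = pattern[:-1]
            if path.startswith(prefix):
                return target.replace(":splat", path[len(prefix):])
        elif path == pattern:
            return target
    return path  # no rule matched: serve the path unchanged


# Two of the catch-all rules from this commit, in file order.
rules = [
    ("/docs/pipelines/concepts/*", "/docs/components/pipelines/concepts"),
    ("/docs/pipelines/*", "/docs/components/pipelines"),
]

print(resolve("/docs/pipelines/concepts/graph/", rules))
# -> /docs/components/pipelines/concepts
```

If the two rules were swapped, the broader `/docs/pipelines/*` pattern would shadow the concepts rule, which is exactly why the file keeps catch-alls at the bottom.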
@@ -241,3 +235,91 @@ docs/started/requirements/ /docs/started/getting-started/
 /docs/components/katib/early-stopping /docs/components/katib/user-guides/early-stopping
 /docs/components/katib/resume-experiment /docs/components/katib/user-guides/resume-experiment
 /docs/components/katib/trial-template /docs/components/katib/user-guides/trial-template
+
+# Restructured the pipeline docs (https://github.com/kubeflow/website/issues/3716)
+/docs/components/pipelines/overview/quickstart/ /docs/components/pipelines/overview/
+/docs/components/pipelines/v1/ /docs/components/pipelines/legacy-v1/
+/docs/components/pipelines/v1/concepts/ /docs/components/pipelines/concepts/
+/docs/components/pipelines/v1/concepts/component/ /docs/components/pipelines/concepts/component/
+/docs/components/pipelines/v1/concepts/experiment/ /docs/components/pipelines/concepts/experiment/
+/docs/components/pipelines/v1/concepts/graph/ /docs/components/pipelines/concepts/graph/
+/docs/components/pipelines/v1/concepts/metadata/ /docs/components/pipelines/concepts/metadata/
+/docs/components/pipelines/v1/concepts/output-artifact/ /docs/components/pipelines/concepts/output-artifact/
+/docs/components/pipelines/v1/concepts/pipeline/ /docs/components/pipelines/concepts/pipeline/
+/docs/components/pipelines/v1/concepts/run-trigger/ /docs/components/pipelines/concepts/run-trigger/
+/docs/components/pipelines/v1/concepts/run/ /docs/components/pipelines/concepts/run/
+/docs/components/pipelines/v1/concepts/step/ /docs/components/pipelines/concepts/step/
+/docs/components/pipelines/v1/installation/ /docs/components/pipelines/legacy-v1/installation/
+/docs/components/pipelines/v1/installation/choose-executor/ /docs/components/pipelines/legacy-v1/installation/choose-executor/
+/docs/components/pipelines/v1/installation/compatibility-matrix/ /docs/components/pipelines/legacy-v1/installation/compatibility-matrix/
+/docs/components/pipelines/v1/installation/localcluster-deployment/ /docs/components/pipelines/legacy-v1/installation/localcluster-deployment/
+/docs/components/pipelines/v1/installation/overview/ /docs/components/pipelines/legacy-v1/installation/overview/
+/docs/components/pipelines/v1/installation/standalone-deployment/ /docs/components/pipelines/legacy-v1/installation/standalone-deployment/
+/docs/components/pipelines/v1/installation/upgrade/ /docs/components/pipelines/legacy-v1/installation/upgrade/
+/docs/components/pipelines/v1/introduction/ /docs/components/pipelines/legacy-v1/introduction/
+/docs/components/pipelines/v1/overview/ /docs/components/pipelines/legacy-v1/overview/
+/docs/components/pipelines/v1/overview/caching/ /docs/components/pipelines/legacy-v1/overview/caching/
+/docs/components/pipelines/v1/overview/interfaces/ /docs/components/pipelines/interfaces/
+/docs/components/pipelines/v1/overview/multi-user/ /docs/components/pipelines/legacy-v1/overview/multi-user/
+/docs/components/pipelines/v1/overview/pipeline-root/ /docs/components/pipelines/concepts/pipeline-root/
+/docs/components/pipelines/v1/overview/quickstart/ /docs/components/pipelines/legacy-v1/overview/quickstart/
+/docs/components/pipelines/v1/reference/ /docs/components/pipelines/legacy-v1/reference/
+/docs/components/pipelines/v1/reference/api/kubeflow-pipeline-api-spec/ /docs/components/pipelines/legacy-v1/reference/api/kubeflow-pipeline-api-spec/
+/docs/components/pipelines/v1/reference/component-spec/ /docs/components/pipelines/reference/component-spec/
+/docs/components/pipelines/v1/reference/sdk/ /docs/components/pipelines/legacy-v1/reference/sdk/
+/docs/components/pipelines/v1/sdk/ /docs/components/pipelines/legacy-v1/sdk/
+/docs/components/pipelines/v1/sdk/best-practices/ /docs/components/pipelines/legacy-v1/sdk/best-practices/
+/docs/components/pipelines/v1/sdk/build-pipeline/ /docs/components/pipelines/legacy-v1/sdk/build-pipeline/
+/docs/components/pipelines/v1/sdk/component-development/ /docs/components/pipelines/legacy-v1/sdk/component-development/
+/docs/components/pipelines/v1/sdk/connect-api/ /docs/components/pipelines/user-guides/core-functions/connect-api/
+/docs/components/pipelines/v1/sdk/dsl-recursion/ /docs/components/pipelines/legacy-v1/sdk/dsl-recursion/
+/docs/components/pipelines/v1/sdk/enviroment_variables/ /docs/components/pipelines/legacy-v1/sdk/enviroment_variables/
+/docs/components/pipelines/v1/sdk/gcp/ /docs/components/pipelines/legacy-v1/sdk/gcp/
+/docs/components/pipelines/v1/sdk/install-sdk/ /docs/components/pipelines/legacy-v1/sdk/install-sdk/
+/docs/components/pipelines/v1/sdk/manipulate-resources/ /docs/components/pipelines/legacy-v1/sdk/manipulate-resources/
+/docs/components/pipelines/v1/sdk/output-viewer/ /docs/components/pipelines/legacy-v1/sdk/output-viewer/
+/docs/components/pipelines/v1/sdk/parameters/ /docs/components/pipelines/legacy-v1/sdk/parameters/
+/docs/components/pipelines/v1/sdk/pipelines-with-tekton/ /docs/components/pipelines/legacy-v1/sdk/pipelines-with-tekton/
+/docs/components/pipelines/v1/sdk/python-based-visualizations/ /docs/components/pipelines/legacy-v1/sdk/python-based-visualizations/
+/docs/components/pipelines/v1/sdk/python-function-components/ /docs/components/pipelines/legacy-v1/sdk/python-function-components/
+/docs/components/pipelines/v1/sdk/sdk-overview/ /docs/components/pipelines/legacy-v1/sdk/sdk-overview/
+/docs/components/pipelines/v1/sdk/static-type-checking/ /docs/components/pipelines/legacy-v1/sdk/static-type-checking/
+/docs/components/pipelines/v1/troubleshooting/ /docs/components/pipelines/legacy-v1/troubleshooting/
+/docs/components/pipelines/v1/tutorials/ /docs/components/pipelines/legacy-v1/tutorials/
+/docs/components/pipelines/v1/tutorials/api-pipelines/ /docs/components/pipelines/legacy-v1/tutorials/api-pipelines/
+/docs/components/pipelines/v1/tutorials/benchmark-examples/ /docs/components/pipelines/legacy-v1/tutorials/benchmark-examples/
+/docs/components/pipelines/v1/tutorials/build-pipeline/ /docs/components/pipelines/legacy-v1/tutorials/build-pipeline/
+/docs/components/pipelines/v1/tutorials/cloud-tutorials/ /docs/components/pipelines/legacy-v1/tutorials/cloud-tutorials/
+/docs/components/pipelines/v1/tutorials/sdk-examples/ /docs/components/pipelines/legacy-v1/tutorials/sdk-examples/
+/docs/components/pipelines/v2/ /docs/components/pipelines/
+/docs/components/pipelines/v2/administration/ /docs/components/pipelines/operator-guides/
+/docs/components/pipelines/v2/administration/server-config/ /docs/components/pipelines/operator-guides/server-config/
+/docs/components/pipelines/v2/caching/ /docs/components/pipelines/user-guides/core-functions/caching/
+/docs/components/pipelines/v2/cli/ /docs/components/pipelines/user-guides/core-functions/cli/
+/docs/components/pipelines/v2/community-and-support/ /docs/components/pipelines/reference/community-and-support/
+/docs/components/pipelines/v2/compile-a-pipeline/ /docs/components/pipelines/user-guides/core-functions/compile-a-pipeline/
+/docs/components/pipelines/v2/components/ /docs/components/pipelines/user-guides/components/
+/docs/components/pipelines/v2/components/additional-functionality/ /docs/components/pipelines/user-guides/components/additional-functionality/
+/docs/components/pipelines/v2/components/container-components/ /docs/components/pipelines/user-guides/components/container-components/
+/docs/components/pipelines/v2/components/containerized-python-components/ /docs/components/pipelines/user-guides/components/containerized-python-components/
+/docs/components/pipelines/v2/components/importer-component/ /docs/components/pipelines/user-guides/components/importer-component/
+/docs/components/pipelines/v2/components/lightweight-python-components/ /docs/components/pipelines/user-guides/components/lightweight-python-components/
+/docs/components/pipelines/v2/data-types/ /docs/components/pipelines/user-guides/data-handling/data-types/
+/docs/components/pipelines/v2/data-types/artifacts/ /docs/components/pipelines/user-guides/data-handling/artifacts/
+/docs/components/pipelines/v2/data-types/parameters/ /docs/components/pipelines/user-guides/data-handling/parameters/
+/docs/components/pipelines/v2/hello-world/ /docs/components/pipelines/getting-started/
+/docs/components/pipelines/v2/installation/ /docs/components/pipelines/operator-guides/installation/
+/docs/components/pipelines/v2/installation/quickstart/ /docs/components/pipelines/operator-guides/installation/
+/docs/components/pipelines/v2/introduction/ /docs/components/pipelines/overview/
+/docs/components/pipelines/v2/load-and-share-components/ /docs/components/pipelines/user-guides/components/load-and-share-components/
+/docs/components/pipelines/v2/local-execution/ /docs/components/pipelines/user-guides/core-functions/execute-kfp-pipelines-locally/
+/docs/components/pipelines/v2/migration/ /docs/components/pipelines/user-guides/migration/
+/docs/components/pipelines/v2/pipelines/ /docs/components/pipelines/user-guides/core-functions/
+/docs/components/pipelines/v2/pipelines/control-flow/ /docs/components/pipelines/user-guides/core-functions/control-flow/
+/docs/components/pipelines/v2/pipelines/pipeline-basics/ /docs/components/pipelines/user-guides/components/compose-components-into-pipelines/
+/docs/components/pipelines/v2/platform-specific-features/ /docs/components/pipelines/user-guides/core-functions/platform-specific-features/
+/docs/components/pipelines/v2/reference/ /docs/components/pipelines/reference/
+/docs/components/pipelines/v2/reference/api/kubeflow-pipeline-api-spec/ /docs/components/pipelines/reference/api/kubeflow-pipeline-api-spec/
+/docs/components/pipelines/v2/reference/sdk/ /docs/components/pipelines/reference/sdk/
+/docs/components/pipelines/v2/run-a-pipeline/ /docs/components/pipelines/user-guides/core-functions/run-a-pipeline/
+/docs/components/pipelines/v2/version-compatibility/ /docs/components/pipelines/reference/version-compatibility/
@@ -54,5 +54,5 @@ weight = 40
 ## Next steps
 
 - See a [simple example](https://github.com/kubeflow/examples/tree/master/pipelines/simple-notebook-pipeline) of creating Kubeflow pipelines in a Jupyter notebook.
-- Build machine-learning pipelines with the [Kubeflow Pipelines SDK](/docs/components/pipelines/sdk/sdk-overview/).
+- Build machine-learning pipelines with the [Kubeflow Pipelines SDK](/docs/components/pipelines/legacy-v1/sdk/sdk-overview/).
 - Learn the advanced features available from a Kubeflow notebook, such as [submitting Kubernetes resources](/docs/components/notebooks/submit-kubernetes/) or [building Docker images](/docs/components/notebooks/custom-notebook/).
@@ -1,5 +1,5 @@
 +++
 title = "Concepts"
 description = "Concepts used in Kubeflow Pipelines"
-weight = 30
+weight = 4
 +++
@@ -57,11 +57,11 @@ deserialize the data for use in the downstream component.
 
 ## Next steps
 
-* Read an [overview of Kubeflow Pipelines](/docs/components/pipelines/introduction/).
-* Follow the [pipelines quickstart guide](/docs/components/pipelines/overview/quickstart/)
+* Read an [overview of Kubeflow Pipelines](/docs/components/pipelines/overview/).
+* Follow the [pipelines quickstart guide](/docs/components/pipelines/getting-started/)
 to deploy Kubeflow and run a sample pipeline directly from the Kubeflow
 Pipelines UI.
 * Build your own
-[component and pipeline](/docs/components/pipelines/sdk/build-component/).
-* Build a [reusable component](/docs/components/pipelines/sdk/component-development/) for
+[component and pipeline](/docs/components/pipelines/legacy-v1/sdk/component-development/).
+* Build a [reusable component](/docs/components/pipelines/legacy-v1/sdk/component-development/) for
 sharing in multiple pipelines.
@@ -12,7 +12,7 @@ groups. Experiments can contain arbitrary runs, including
 
 ## Next steps
 
-* Read an [overview of Kubeflow Pipelines](/docs/components/pipelines/introduction/).
-* Follow the [pipelines quickstart guide](/docs/components/pipelines/overview/quickstart/)
+* Read an [overview of Kubeflow Pipelines](/docs/components/pipelines/overview/).
+* Follow the [pipelines quickstart guide](/docs/components/pipelines/getting-started/)
 to deploy Kubeflow and run a sample pipeline directly from the Kubeflow
 Pipelines UI.
@@ -24,7 +24,7 @@ parent contains a conditional clause.)
 
 ## Next steps
 
-* Read an [overview of Kubeflow Pipelines](/docs/components/pipelines/introduction/).
-* Follow the [pipelines quickstart guide](/docs/components/pipelines/overview/quickstart/)
+* Read an [overview of Kubeflow Pipelines](/docs/components/pipelines/overview/).
+* Follow the [pipelines quickstart guide](/docs/components/pipelines/getting-started/)
 to deploy Kubeflow and run a sample pipeline directly from the Kubeflow
 Pipelines UI.
@@ -15,11 +15,11 @@ data to rich interactive visualizations.
 
 ## Next steps
 
-* Read an [overview of Kubeflow Pipelines](/docs/components/pipelines/introduction/).
-* Follow the [pipelines quickstart guide](/docs/components/pipelines/overview/quickstart/)
+* Read an [overview of Kubeflow Pipelines](/docs/components/pipelines/overview/).
+* Follow the [pipelines quickstart guide](/docs/components/pipelines/getting-started/)
 to deploy Kubeflow and run a sample pipeline directly from the Kubeflow
 Pipelines UI.
 * Read more about the available
-[output viewers](/docs/components/pipelines/sdk/output-viewer/)
+[output viewers](/docs/components/pipelines/legacy-v1/sdk/output-viewer/)
 and how to provide the metadata to make use of the visualizations
 that the output viewers provide.
@@ -1,18 +1,16 @@
 +++
 title = "Pipeline Root"
 description = "Getting started with Kubeflow Pipelines pipeline root"
-weight = 50
+weight = 15
 
 +++
-{{% beta-status
-feedbacklink="https://github.com/kubeflow/pipelines/issues" %}}
 
-Starting from [Kubeflow Pipelines SDK v2](https://www.kubeflow.org/docs/components/pipelines/sdk-v2/) and Kubeflow Pipelines 1.7.0, Kubeflow Pipelines supports a new intermediate artifact repository feature: pipeline root in both [standalone deployment](https://www.kubeflow.org/docs/components/pipelines/installation/standalone-deployment/) and [AI Platform Pipelines](https://cloud.google.com/ai-platform/pipelines/docs).
+Starting from [Kubeflow Pipelines SDK v2](https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk-v2/) and Kubeflow Pipelines 1.7.0, Kubeflow Pipelines supports a new intermediate artifact repository feature: pipeline root in both [standalone deployment](/docs/components/pipelines/legacy-v1/installation/standalone-deployment/) and [AI Platform Pipelines](https://cloud.google.com/ai-platform/pipelines/docs).
 
 ## Before you start
 This guide tells you the basic concepts of Kubeflow Pipelines pipeline root and how to use it.
 This guide assumes that you already have Kubeflow Pipelines installed, or want to use standalone or AI Platform Pipelines options in the [Kubeflow Pipelines deployment
-guide](/docs/components/pipelines/installation/) to deploy Kubeflow Pipelines.
+guide](/docs/components/pipelines/operator-guides/installation/) to deploy Kubeflow Pipelines.
 
 ## What is pipeline root?
 
@@ -56,7 +54,7 @@ kubectl edit configMap kfp-launcher -n ${namespace}
 This pipeline root will be the default pipeline root for all pipelines running in the Kubernetes namespace unless you override it using one of the following options:

 #### Via Building Pipelines

-You can configure a pipeline root through the `kfp.dsl.pipeline` annotation when [building pipelines](https://www.kubeflow.org/docs/components/pipelines/sdk-v2/build-pipeline/#build-your-pipeline)
+You can configure a pipeline root through the `kfp.dsl.pipeline` annotation when [building pipelines](/docs/components/pipelines/legacy-v1/sdk/build-pipeline/#build-your-pipeline)

 #### Via Submitting a Pipeline through SDK

 You can configure pipeline root via `pipeline_root` argument when you submit a Pipeline using one of the following:
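As a concrete illustration of the `kubectl edit configMap kfp-launcher` route referenced in the hunk above, the namespace-wide default is carried by the ConfigMap's `defaultPipelineRoot` key. This is a hedged sketch: the bucket URI and namespace are placeholders, and the ConfigMap in your deployment may carry additional keys.

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: kfp-launcher
  namespace: kubeflow            # substitute your pipelines namespace
data:
  # Default pipeline root for all pipelines in this namespace;
  # the bucket path below is a placeholder.
  defaultPipelineRoot: "gs://my-bucket/kfp-artifacts"
```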
@@ -18,7 +18,7 @@ start Docker containers, and the containers in turn start your programs.
 After developing your pipeline, you can upload your pipeline using the Kubeflow Pipelines UI or the Kubeflow Pipelines SDK.

 ## Next steps

-* Read an [overview of Kubeflow Pipelines](/docs/components/pipelines/introduction/).
-* Follow the [pipelines quickstart guide](/docs/components/pipelines/overview/quickstart/)
+* Read an [overview of Kubeflow Pipelines](/docs/components/pipelines/overview/).
+* Follow the [pipelines quickstart guide](/docs/components/pipelines/getting-started/)
   to deploy Kubeflow and run a sample pipeline directly from the Kubeflow
   Pipelines UI.
@@ -15,7 +15,7 @@ available:

 ## Next steps

-* Read an [overview of Kubeflow Pipelines](/docs/components/pipelines/introduction/).
-* Follow the [pipelines quickstart guide](/docs/components/pipelines/overview/quickstart/)
+* Read an [overview of Kubeflow Pipelines](/docs/components/pipelines/overview/).
+* Follow the [pipelines quickstart guide](/docs/components/pipelines/getting-started/)
   to deploy Kubeflow and run a sample pipeline directly from the Kubeflow
   Pipelines UI.
@@ -25,7 +25,7 @@ triggered to run frequently.

 ## Next steps

-* Read an [overview of Kubeflow Pipelines](/docs/components/pipelines/introduction/).
-* Follow the [pipelines quickstart guide](/docs/components/pipelines/overview/quickstart/)
+* Read an [overview of Kubeflow Pipelines](/docs/components/pipelines/overview/).
+* Follow the [pipelines quickstart guide](/docs/components/pipelines/getting-started/)
   to deploy Kubeflow and run a sample pipeline directly from the Kubeflow
   Pipelines UI.
@@ -13,7 +13,7 @@ an if/else like clause in the pipeline code.

 ## Next steps

-* Read an [overview of Kubeflow Pipelines](/docs/components/pipelines/introduction/).
-* Follow the [pipelines quickstart guide](/docs/components/pipelines/overview/quickstart/)
+* Read an [overview of Kubeflow Pipelines](/docs/components/pipelines/overview/).
+* Follow the [pipelines quickstart guide](/docs/components/pipelines/getting-started/)
   to deploy Kubeflow and run a sample pipeline directly from the Kubeflow
   Pipelines UI.
@@ -1,7 +1,7 @@
 +++
-title = "Hello World Pipeline"
+title = "Getting started"
 description = "Create your first pipeline"
-weight = 3
+weight = 2
 +++

 {{% kfp-v2-keywords %}}
@@ -57,10 +57,10 @@ The client will print a link to view the pipeline execution graph and logs in th

 In the next few sections, you'll learn more about the core concepts of authoring pipelines and how to create more expressive, useful pipelines.

-[installation]: /docs/components/pipelines/v2/installation/
+[installation]: /docs/components/pipelines/operator-guides/installation/
 [client]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/client.html#kfp.client.Client
 [compiler]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/compiler.html#kfp.compiler.Compiler
-[ir-yaml]: /docs/components/pipelines/v2/compile-a-pipeline#ir-yaml
-[compile-a-pipeline]: /docs/components/pipelines/v2/compile-a-pipeline/
+[ir-yaml]: /docs/components/pipelines/user-guides/core-functions/compile-a-pipeline#ir-yaml
+[compile-a-pipeline]: /docs/components/pipelines/user-guides/core-functions/compile-a-pipeline/
 [dsl-pipeline]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/dsl.html#kfp.dsl.pipeline
 [dsl-component]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/dsl.html#kfp.dsl.component
@@ -1,7 +1,7 @@
 +++
-title = "Pipelines Interfaces"
+title = "Interfaces"
 description = "The ways you can interact with the Kubeflow Pipelines system"
-weight = 20
+weight = 3

 +++
@@ -21,7 +21,7 @@ From the Kubeflow Pipelines UI you can perform the following tasks:
 * Run one or more of the preloaded samples to try out pipelines quickly.
 * Upload a pipeline as a compressed file. The pipeline can be one that you
   have built (see how to [build a
-  pipeline](/docs/components/pipelines/sdk/build-pipeline/)) or one
+  pipeline](/docs/components/pipelines/legacy-v1/sdk/build-pipeline/)) or one
   that someone has shared with you.
 * Create an *experiment* to group one or more of your pipeline runs.
   See the [definition of an
|
|||
* Compare the results of one or more runs within an experiment.
|
||||
* Schedule runs by creating a recurring run.
|
||||
|
||||
See the [quickstart guide](/docs/components/pipelines/overview/quickstart/) for more
|
||||
See the [quickstart guide](/docs/components/pipelines/legacy-v1/overview/quickstart/) for more
|
||||
information about accessing the Kubeflow Pipelines UI and running the samples.
|
||||
|
||||
When building a pipeline component, you can write out information for display
|
||||
in the UI. See the guides to [exporting
|
||||
metrics](/docs/components/pipelines/sdk/pipelines-metrics/) and [visualizing results in
|
||||
the UI](/docs/components/pipelines/sdk/output-viewer/).
|
||||
metrics](/docs/components/pipelines/legacy-v1/sdk/pipelines-metrics/) and [visualizing results in
|
||||
the UI](/docs/components/pipelines/legacy-v1/sdk/output-viewer/).
|
||||
|
||||
## Python SDK
|
||||
|
||||
|
|
@ -47,7 +47,7 @@ The Kubeflow Pipelines SDK provides a set of Python packages that you can use to
|
|||
specify and run your ML workflows.
|
||||
|
||||
See the [introduction to the Kubeflow Pipelines
|
||||
SDK](/docs/components/pipelines/sdk/sdk-overview/) for an overview of the ways you can
|
||||
SDK](/docs/components/pipelines/legacy-v1/sdk/sdk-overview/) for an overview of the ways you can
|
||||
use the SDK to build pipeline components and pipelines.
|
||||
|
||||
## REST API
|
||||
|
|
@ -1,5 +1,5 @@
|
|||
+++
|
||||
title = "v1"
|
||||
title = "Legacy (v1)"
|
||||
description = "Kubeflow Pipelines v1 Documentation"
|
||||
weight = 15
|
||||
weight = 999
|
||||
+++
|
||||
|
|
@ -32,15 +32,15 @@ improvements can make it the default executor that most people should use going
|
|||
* Security: more secure
|
||||
* No `privileged` access.
|
||||
* Cannot escape the privileges of the pod's service account.
|
||||
* Migration: `command` must be specified in [Kubeflow Pipelines component specification](https://www.kubeflow.org/docs/components/pipelines/reference/component-spec/).
|
||||
* Migration: `command` must be specified in [Kubeflow Pipelines component specification](/docs/components/pipelines/reference/component-spec/).
|
||||
|
||||
Note, the same migration requirement is required by [Kubeflow Pipelines v2 compatible mode](https://www.kubeflow.org/docs/components/pipelines/sdk-v2/v2-compatibility/), refer to
|
||||
Note, the same migration requirement is required by [Kubeflow Pipelines v2 compatible mode](/docs/components/pipelines/legacy-v1/sdk/v2-compatibility/), refer to
|
||||
[known caveats & breaking changes](https://github.com/kubeflow/pipelines/issues/6133).
|
||||
|
||||
#### Migrate to Emissary Executor
|
||||
|
||||
Prerequisite: emissary executor is only available in Kubeflow Pipelines backend version 1.7+.
|
||||
To upgrade, refer to [upgrading Kubeflow Pipelines](/docs/components/pipelines/upgrade/).
|
||||
To upgrade, refer to [upgrading Kubeflow Pipelines](/docs/components/pipelines/legacy-v1/installation/upgrade//).
|
||||
|
||||
##### Configure an existing Kubeflow Pipelines cluster to use emissary executor
|
||||
|
||||
|
|
@ -92,7 +92,7 @@ To upgrade, refer to [upgrading Kubeflow Pipelines](/docs/components/pipelines/u
|
|||
|
||||
For [AI Platform Pipelines](https://cloud.google.com/ai-platform/pipelines/docs), check the "Use emissary executor" checkbox during installation.
|
||||
|
||||
For [Kubeflow Pipelines Standalone](https://www.kubeflow.org/docs/components/pipelines/installation/standalone-deployment/), install `env/platform-agnostic-emissary`:
|
||||
For [Kubeflow Pipelines Standalone](/docs/components/pipelines/legacy-v1/installation/standalone-deployment/), install `env/platform-agnostic-emissary`:
|
||||
|
||||
```bash
|
||||
kubectl apply -k "github.com/kubeflow/pipelines/manifests/kustomize/env/platform-agnostic-emissary?ref=$PIPELINE_VERSION"
|
||||
|
|
@ -105,7 +105,7 @@ existing clusters.
|
|||
##### Migrate pipeline components to run on emissary executor
|
||||
|
||||
Some pipeline components require manual updates to run on emissary executor.
|
||||
For [Kubeflow Pipelines component specification](https://www.kubeflow.org/docs/components/pipelines/reference/component-spec/) YAML,
|
||||
For [Kubeflow Pipelines component specification](/docs/components/pipelines/reference/component-spec/) YAML,
|
||||
the `command` field must be specified.
|
||||
|
||||
Step by step component migration tutorial:
|
||||
|
|
@ -152,7 +152,7 @@ Step by step component migration tutorial:
|
|||
1. The updated component can run on emissary executor now.
|
||||
|
||||
Note: Kubeflow Pipelines SDK compiler always specifies a command for
|
||||
[python function based components](https://www.kubeflow.org/docs/components/pipelines/sdk/python-function-components/).
|
||||
[python function based components](/docs/components/pipelines/legacy-v1/sdk/python-function-components/).
|
||||
Therefore, these components will continue to work on emissary executor without
|
||||
modifications.
|
||||
|
||||
|
|
@ -16,7 +16,7 @@ Such deployment methods can be part of your local environment using the supplied
|
|||
kustomize manifests for test purposes. This guide is an alternative to
|
||||
|
||||
[Deploying Kubeflow Pipelines
|
||||
(KFP)](/docs/started/getting-started/#installing-kubeflow).
|
||||
(KFP)](/docs/started/#installing-kubeflow).
|
||||
|
||||
## Before you get started
|
||||
|
||||
|
|
@ -14,7 +14,7 @@ portable installation that only includes Kubeflow Pipelines.
|
|||
* Kubeflow Pipelines as [part of a full Kubeflow deployment](#full-kubeflow-deployment) provides
|
||||
all Kubeflow components and more integration with each platform.
|
||||
* **Beta**: [Google Cloud AI Platform Pipelines](#google-cloud-ai-platform-pipelines) makes it easier to install and use Kubeflow Pipelines on Google Cloud by providing a management UI on [Google Cloud Console](https://console.cloud.google.com/ai-platform/pipelines/clusters).
|
||||
* A [local](/docs/components/pipelines/installation/localcluster-deployment) Kubeflow Pipelines deployment for testing purposes.
|
||||
* A [local](/docs/components/pipelines/legacy-v1/installation/localcluster-deployment) Kubeflow Pipelines deployment for testing purposes.
|
||||
|
||||
## Choosing an installation option
|
||||
|
||||
|
|
@ -23,7 +23,7 @@ all Kubeflow components and more integration with each platform.
|
|||
If yes, choose the [full Kubeflow deployment](#full-kubeflow-deployment).
|
||||
1. Can you use a cloud/on-prem Kubernetes cluster?
|
||||
|
||||
If you can't, you should try using Kubeflow Pipelines on a local Kubernetes cluster for learning and testing purposes by following the steps in [Deploying Kubeflow Pipelines on a local cluster](/docs/components/pipelines/installation/localcluster-deployment).
|
||||
If you can't, you should try using Kubeflow Pipelines on a local Kubernetes cluster for learning and testing purposes by following the steps in [Deploying Kubeflow Pipelines on a local cluster](/docs/components/pipelines/legacy-v1/installation/localcluster-deployment).
|
||||
1. Do you want to use Kubeflow Pipelines with [multi-user support](https://github.com/kubeflow/pipelines/issues/1223)?
|
||||
|
||||
If yes, choose the [full Kubeflow deployment](#full-kubeflow-deployment) with version >= v1.1.
|
||||
|
|
@ -56,7 +56,7 @@ Kubeflow Pipelines into an existing Kubernetes cluster.
|
|||
|
||||
Installation guide
|
||||
: [Kubeflow Pipelines Standalone deployment
|
||||
guide](/docs/components/pipelines/installation/standalone-deployment/)
|
||||
guide](/docs/components/pipelines/legacy-v1/installation/standalone-deployment/)
|
||||
|
||||
Interfaces
|
||||
:
|
||||
|
|
@ -74,7 +74,7 @@ Release Schedule
|
|||
You will have access to the latest features.
|
||||
|
||||
Upgrade Support (**Beta**)
|
||||
: [Upgrading Kubeflow Pipelines Standalone](/docs/components/pipelines/installation/standalone-deployment/#upgrading-kubeflow-pipelines) introduces how to upgrade
|
||||
: [Upgrading Kubeflow Pipelines Standalone](/docs/components/pipelines/legacy-v1/installation/standalone-deployment/#upgrading-kubeflow-pipelines) introduces how to upgrade
|
||||
in place.
|
||||
|
||||
Google Cloud Integrations
|
||||
|
|
@ -101,7 +101,7 @@ Use this option to deploy Kubeflow Pipelines to your local machine, on-premises,
|
|||
or to a cloud, as part of a full Kubeflow installation.
|
||||
|
||||
Installation guide
|
||||
: [Kubeflow installation guide](/docs/started/getting-started/)
|
||||
: [Kubeflow installation guide](/docs/started/)
|
||||
|
||||
Interfaces
|
||||
:
|
||||
|
|
@ -5,7 +5,7 @@ weight = 30
|
|||
+++
|
||||
|
||||
As an alternative to deploying Kubeflow Pipelines (KFP) as part of the
|
||||
[Kubeflow deployment](/docs/started/getting-started/#installing-kubeflow), you also have a choice
|
||||
[Kubeflow deployment](/docs/started/#installing-kubeflow), you also have a choice
|
||||
to deploy only Kubeflow Pipelines. Follow the instructions below to deploy
|
||||
Kubeflow Pipelines standalone using the supplied kustomize manifests.
|
||||
|
||||
|
|
@ -104,7 +104,7 @@ Kubeflow Pipelines will change default executor from Docker to Emissary starting
|
|||
deprecated on Kubernetes 1.20+.
|
||||
|
||||
For Kubeflow Pipelines before v1.8, configure to use Emissary executor by
|
||||
referring to [Argo Workflow Executors](/docs/components/pipelines/installation/choose-executor).
|
||||
referring to [Argo Workflow Executors](/docs/components/pipelines/legacy-v1/installation/choose-executor).
|
||||
{{% /alert %}}
|
||||
|
||||
1. Get the public URL for the Kubeflow Pipelines UI and use it to access the Kubeflow Pipelines UI:
|
||||
|
|
@ -115,7 +115,7 @@ referring to [Argo Workflow Executors](/docs/components/pipelines/installation/c
|
|||
|
||||
## Upgrading Kubeflow Pipelines
|
||||
|
||||
1. For release notices and breaking changes, refer to [Upgrading Kubeflow Pipelines](/docs/components/pipelines/upgrade).
|
||||
1. For release notices and breaking changes, refer to [Upgrading Kubeflow Pipelines](/docs/components/pipelines/legacy-v1/installation/upgrade/).
|
||||
|
||||
1. Check the [Kubeflow Pipelines GitHub repository](https://github.com/kubeflow/pipelines/releases) for available releases.
|
||||
|
||||
|
|
@ -12,13 +12,13 @@ For upgrade instructions, refer to distribution specific documentations:
|
|||
|
||||
## Upgrading to v2.0
|
||||
|
||||
* **Notice**: In v2.0 frontend, run metrics columns are deprecated in the run list page, but users can still get the same information by using [KFPv2 Scalar metrics](/docs/components/pipelines/v1/sdk/output-viewer/#scalar-metrics)
|
||||
* **Notice**: In v2.0 frontend, run metrics columns are deprecated in the run list page, but users can still get the same information by using [KFPv2 Scalar metrics](/docs/components/pipelines/legacy-v1/sdk/output-viewer/#scalar-metrics)
|
||||
|
||||
## Upgrading to [v1.7]
|
||||
|
||||
[v1.7]: https://github.com/kubeflow/pipelines/releases/tag/1.7.0
|
||||
|
||||
* **Breaking Change**: Metadata UI and visualizations are not compatible with TensorFlow Extended (TFX) <= v1.0.0. Upgrade to v1.2.0 or above, refer to [Kubeflow Pipelines Backend and TensorFlow Extended (TFX) compatibility matrix](/docs/components/pipelines/installation/compatibility-matrix/).
|
||||
* **Breaking Change**: Metadata UI and visualizations are not compatible with TensorFlow Extended (TFX) <= v1.0.0. Upgrade to v1.2.0 or above, refer to [Kubeflow Pipelines Backend and TensorFlow Extended (TFX) compatibility matrix](/docs/components/pipelines/legacy-v1/installation/compatibility-matrix/).
|
||||
|
||||
* **Notice**: Emissary executor (Alpha), a new argo workflow executor is available as an option. Due to [Kubernetes deprecating Docker as a container runtime after v1.20](https://kubernetes.io/blog/2020/12/02/dont-panic-kubernetes-and-docker/), emissary may become the default workflow executor for Kubeflow Pipelines in the near future.
|
||||
|
||||
|
|
@ -26,6 +26,6 @@ For upgrade instructions, refer to distribution specific documentations:
|
|||
|
||||
Alternatively, using emissary executor (Alpha) removes the restriction on container runtime, but note some of your pipelines may require manual migrations. The Kubeflow Pipelines team welcomes your feedback in [the Emissary Executor feedback github issue](https://github.com/kubeflow/pipelines/issues/6249).
|
||||
|
||||
For detailed configuration and migration instructions for both options, refer to [Argo Workflow Executors](https://www.kubeflow.org/docs/components/pipelines/installation/choose-executor/).
|
||||
For detailed configuration and migration instructions for both options, refer to [Argo Workflow Executors](/docs/components/pipelines/legacy-v1/installation/choose-executor/).
|
||||
|
||||
* **Notice**: [Kubeflow Pipelines SDK v2 compatibility mode](/docs/components/pipelines/sdk-v2/v2-compatibility/) (Beta) was recently released. The new mode adds support for tracking pipeline runs and artifacts using ML Metadata. In v1.7 backend, complete UI support and caching capabilities for v2 compatibility mode are newly added. We welcome any [feedback](https://github.com/kubeflow/pipelines/issues/6451) on positive experiences or issues you encounter.
|
||||
* **Notice**: [Kubeflow Pipelines SDK v2 compatibility mode](/docs/components/pipelines/legacy-v1/sdk/v2-compatibility/) (Beta) was recently released. The new mode adds support for tracking pipeline runs and artifacts using ML Metadata. In v1.7 backend, complete UI support and caching capabilities for v2 compatibility mode are newly added. We welcome any [feedback](https://github.com/kubeflow/pipelines/issues/6451) on positive experiences or issues you encounter.
|
||||
|
|
@ -13,7 +13,7 @@ scalable machine learning (ML) workflows based on Docker containers.
|
|||
## Quickstart
|
||||
|
||||
Run your first pipeline by following the
|
||||
[pipelines quickstart guide](/docs/components/pipelines/overview/quickstart).
|
||||
[pipelines quickstart guide](/docs/components/pipelines/legacy-v1/overview/quickstart).
|
||||
|
||||
## What is Kubeflow Pipelines?
|
||||
|
||||
|
|
@ -35,8 +35,8 @@ The following are the goals of Kubeflow Pipelines:
|
|||
|
||||
Kubeflow Pipelines is available as a core component of Kubeflow or as a standalone installation.
|
||||
|
||||
* [Learn more about installing Kubeflow](/docs/started/getting-started/).
|
||||
* [Learn more about installing Kubeflow Pipelines standalone](/docs/components/pipelines/installation/overview/).
|
||||
* [Learn more about installing Kubeflow](/docs/started/).
|
||||
* [Learn more about installing Kubeflow Pipelines standalone](/docs/components/pipelines/legacy-v1/overview/).
|
||||
|
||||
{{% pipelines-compatibility %}}
|
||||
|
||||
|
|
@ -275,10 +275,10 @@ At a high level, the execution of a pipeline proceeds as follows:
|
|||
## Next steps
|
||||
|
||||
* Follow the
|
||||
[pipelines quickstart guide](/docs/components/pipelines/overview/quickstart) to
|
||||
[pipelines quickstart guide](/docs/components/pipelines/legacy-v1/overview/quickstart) to
|
||||
deploy Kubeflow and run a sample pipeline directly from the
|
||||
Kubeflow Pipelines UI.
|
||||
* Build machine-learning pipelines with the [Kubeflow Pipelines
|
||||
SDK](/docs/components/pipelines/sdk/sdk-overview/).
|
||||
SDK](/docs/components/pipelines/legacy-v1/sdk/sdk-overview/).
|
||||
* Follow the full guide to experimenting with
|
||||
[the Kubeflow Pipelines samples](/docs/components/pipelines/tutorials/build-pipeline/).
|
||||
[the Kubeflow Pipelines samples](/docs/components/pipelines/legacy-v1/tutorials/build-pipeline/).
|
||||
|
|
@ -12,7 +12,7 @@ Starting from Kubeflow Pipelines 0.4, Kubeflow Pipelines supports step caching c
|
|||
## Before you start
|
||||
|
||||
This guide tells you the basic concepts of Kubeflow Pipelines step caching and how to use it.
|
||||
This guide assumes that you already have Kubeflow Pipelines installed or want to use options in the [Kubeflow Pipelines deployment guide](/docs/components/pipelines/installation/) to deploy Kubeflow Pipelines.
|
||||
This guide assumes that you already have Kubeflow Pipelines installed or want to use options in the [Kubeflow Pipelines deployment guide](/docs/components/pipelines/operator-guides/installation/) to deploy Kubeflow Pipelines.
|
||||
|
||||
## What is step caching?
|
||||
|
||||
|
|
@ -7,7 +7,7 @@ weight = 30
|
|||
Multi-user isolation for Kubeflow Pipelines is part of Kubeflow's overall [multi-tenancy](/docs/concepts/multi-tenancy/) feature.
|
||||
|
||||
{{% alert title="Tip" color="info" %}}
|
||||
* Kubeflow Pipelines multi-user isolation is only supported in ["full" Kubeflow deployments](/docs/components/pipelines/installation/overview/#full-kubeflow-deployment).
|
||||
* Kubeflow Pipelines multi-user isolation is only supported in ["full" Kubeflow deployments](/docs/components/pipelines/legacy-v1/overview/#full-kubeflow-deployment).
|
||||
* Refer to docs about [profiles and namespaces](/docs/components/central-dash/profiles/) for the common Kubeflow multi-user operations
|
||||
like [managing profile contributors](/docs/components/central-dash/profiles/#manage-profile-contributors).
|
||||
{{% /alert %}}
|
||||
|
|
@ -42,10 +42,10 @@ Pipeline definitions are not isolated right now, and are shared across all names
|
|||
|
||||
How to connect Pipelines SDK to Kubeflow Pipelines will depend on __what kind__ of Kubeflow deployment you have, and __from where you are running your code__.
|
||||
|
||||
* [Full Kubeflow (from inside cluster)](/docs/components/pipelines/sdk/connect-api/#full-kubeflow-subfrom-inside-clustersub)
|
||||
* [Full Kubeflow (from outside cluster)](/docs/components/pipelines/sdk/connect-api/#full-kubeflow-subfrom-outside-clustersub)
|
||||
* [Standalone Kubeflow Pipelines (from inside cluster)](/docs/components/pipelines/sdk/connect-api/#standalone-kubeflow-pipelines-subfrom-inside-clustersub)
|
||||
* [Standalone Kubeflow Pipelines (from outside cluster)](/docs/components/pipelines/sdk/connect-api/#standalone-kubeflow-pipelines-subfrom-outside-clustersub)
|
||||
* [Full Kubeflow (from inside cluster)](/docs/components/pipelines/legacy-v1/sdk/connect-api/#full-kubeflow-subfrom-inside-clustersub)
|
||||
* [Full Kubeflow (from outside cluster)](/docs/components/pipelines/legacy-v1/sdk/connect-api/#full-kubeflow-subfrom-outside-clustersub)
|
||||
* [Standalone Kubeflow Pipelines (from inside cluster)](/docs/components/pipelines/legacy-v1/sdk/connect-api/#standalone-kubeflow-pipelines-subfrom-inside-clustersub)
|
||||
* [Standalone Kubeflow Pipelines (from outside cluster)](/docs/components/pipelines/legacy-v1/sdk/connect-api/#standalone-kubeflow-pipelines-subfrom-outside-clustersub)
|
||||
|
||||
The following Python code will create an experiment (and associated run) from a Pod inside a full Kubeflow cluster.
|
||||
|
||||
|
|
@ -15,7 +15,7 @@ Kubeflow Pipelines UI.
|
|||
|
||||
## Deploy Kubeflow and open the Kubeflow Pipelines UI
|
||||
|
||||
There are several options to [deploy Kubeflow Pipelines](/docs/components/pipelines/installation/overview/), follow the option that best suits your needs. If you are uncertain and just want to try out kubeflow pipelines it is recommended to start with the [standalone deployment](/docs/components/pipelines/installation/standalone-deployment/).
|
||||
There are several options to [deploy Kubeflow Pipelines](/docs/components/pipelines/legacy-v1/overview/), follow the option that best suits your needs. If you are uncertain and just want to try out kubeflow pipelines it is recommended to start with the [standalone deployment](/docs/components/pipelines/legacy-v1/installation/standalone-deployment/).
|
||||
|
||||
Once you have deployed Kubeflow Pipelines, make sure you can access the UI. The steps to access the UI vary based on the method you used to deploy Kubeflow Pipelines.
|
||||
|
||||
|
|
@ -94,11 +94,11 @@ You can find the [source code for the **XGBoost - Iterative model training** dem
|
|||
## Next steps
|
||||
|
||||
* Learn more about the
|
||||
[important concepts](/docs/pipelines/overview/concepts/) in Kubeflow
|
||||
[important concepts](/docs/components/pipelines/concepts/) in Kubeflow
|
||||
Pipelines.
|
||||
* This page showed you how to run some of the examples supplied in the Kubeflow
|
||||
Pipelines UI. Next, you may want to run a pipeline from a notebook, or compile
|
||||
and run a sample from the code. See the guide to experimenting with
|
||||
[the Kubeflow Pipelines samples](/docs/components/pipelines/tutorials/build-pipeline/).
|
||||
[the Kubeflow Pipelines samples](/docs/components/pipelines/legacy-v1/tutorials/build-pipeline/).
|
||||
* Build your own machine-learning pipelines with the [Kubeflow Pipelines
|
||||
SDK](/docs/components/pipelines/sdk/sdk-overview/).
|
||||
SDK](/docs/components/pipelines/legacy-v1/sdk/sdk-overview/).
|
||||
|
|
@ -11,7 +11,7 @@ needs to be updated for Kubeflow 1.1.
|
|||
|
||||
This page describes some recommended practices for designing
|
||||
components. For an application of these best practices, see the
|
||||
[component development guide](/docs/components/pipelines/sdk/component-development). If
|
||||
[component development guide](/docs/components/pipelines/legacy-v1/sdk/component-development). If
|
||||
you're new to pipelines, see the conceptual guides to
|
||||
[pipelines](/docs/components/pipelines/concepts/pipeline/)
|
||||
and [components](/docs/components/pipelines/concepts/component/).
|
||||
|
|
@ -226,9 +226,9 @@
|
|||
"\n",
|
||||
"[container-op]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/dsl.html#kfp.dsl.ContainerOp\n",
|
||||
"[component-spec]: https://www.kubeflow.org/docs/components/pipelines/reference/component-spec/\n",
|
||||
"[python-function-component]: https://www.kubeflow.org/docs/components/pipelines/sdk/python-function-components/\n",
|
||||
"[component-dev]: https://www.kubeflow.org/docs/components/pipelines/sdk/component-development/\n",
|
||||
"[python-function-component-data-passing]: https://www.kubeflow.org/docs/components/pipelines/sdk/python-function-components/#understanding-how-data-is-passed-between-components\n",
|
||||
"[python-function-component]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/python-function-components/\n",
|
||||
"[component-dev]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/component-development/\n",
|
||||
"[python-function-component-data-passing]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/python-function-components/#understanding-how-data-is-passed-between-components\n",
|
||||
"[prebuilt-components]: https://www.kubeflow.org/docs/examples/shared-resources/"
|
||||
]
|
||||
},
|
||||
|
|
@ -358,7 +358,7 @@
|
|||
" The following example shows the updated `merge_csv` function.\n",
|
||||
"\n",
|
||||
"[web-download-component]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/components/web/Download/component.yaml\n",
|
||||
"[python-function-components]: https://www.kubeflow.org/docs/components/pipelines/sdk/python-function-components/\n",
|
||||
"[python-function-components]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/python-function-components/\n",
|
||||
"[input-path]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/components.html?highlight=inputpath#kfp.components.InputPath\n",
|
||||
"[output-path]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/components.html?highlight=outputpath#kfp.components.OutputPath"
|
||||
]
|
||||
|
|
@ -516,7 +516,7 @@
|
|||
"2. Upload and run your `pipeline.yaml` using the Kubeflow Pipelines user interface.\n",
|
||||
"See the guide to [getting started with the UI][quickstart].\n",
|
||||
"\n",
|
||||
"[quickstart]: https://www.kubeflow.org/docs/components/pipelines/overview/quickstart"
|
||||
"[quickstart]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/overview/quickstart"
|
||||
]
|
||||
},
|
||||
{
|
||||
|
|
@ -530,7 +530,7 @@
|
|||
"1. Create an instance of the [`kfp.Client` class][kfp-client] following steps in [connecting to Kubeflow Pipelines using the SDK client][connect-api].\n",
|
||||
"\n",
|
||||
"[kfp-client]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/client.html#kfp.Client\n",
|
||||
"[connect-api]: https://www.kubeflow.org/docs/components/pipelines/sdk/connect-api"
|
||||
"[connect-api]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/connect-api"
|
||||
]
|
||||
},
|
||||
{
|
||||
|
|
@ -580,8 +580,8 @@
|
|||
" pipeline][k8s-resources] (Experimental).\n",
|
||||
"\n",
|
||||
"[conditional]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/samples/tutorials/DSL%20-%20Control%20structures/DSL%20-%20Control%20structures.py\n",
|
||||
"[recursion]: https://www.kubeflow.org/docs/components/pipelines/sdk/dsl-recursion/\n",
|
||||
"[k8s-resources]: https://www.kubeflow.org/docs/components/pipelines/sdk/manipulate-resources/"
|
||||
"[recursion]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/dsl-recursion/\n",
|
||||
"[k8s-resources]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/manipulate-resources/"
|
||||
]
|
||||
}
|
||||
],
|
||||
|
|
@ -5,7 +5,7 @@ weight = 30
|
|||
+++
|
||||
|
||||
<!--
|
||||
AUTOGENERATED FROM content/en/docs/components/pipelines/sdk/build-pipeline.ipynb
|
||||
AUTOGENERATED FROM content/en/docs/components/pipelines/legacy-v1/sdk/build-pipeline.ipynb
|
||||
PLEASE UPDATE THE JUPYTER NOTEBOOK AND REGENERATE THIS FILE USING scripts/nb_to_md.py.-->
|
||||
|
||||
<style>
|
||||
|
|
@ -26,8 +26,8 @@ background-position: left center;
|
|||
}
|
||||
</style>
|
||||
<div class="notebook-links">
|
||||
<a class="colab-link" href="https://colab.research.google.com/github/kubeflow/website/blob/master/content/en/docs/components/pipelines/v1/sdk/build-pipeline.ipynb">Run in Google Colab</a>
|
||||
<a class="github-link" href="https://github.com/kubeflow/website/blob/master/content/en/docs/components/pipelines/v1/sdk/build-pipeline.ipynb">View source on GitHub</a>
|
||||
<a class="colab-link" href="https://colab.research.google.com/github/kubeflow/website/blob/master/content/en/docs/components/pipelines/legacy-v1/sdk/build-pipeline.ipynb">Run in Google Colab</a>
|
||||
<a class="github-link" href="https://github.com/kubeflow/website/blob/master/content/en/docs/components/pipelines/legacy-v1/sdk/build-pipeline.ipynb">View source on GitHub</a>
|
||||
</div>
|
||||
|
||||
|
||||
|
|
@ -221,11 +221,11 @@ when designing a pipeline.
|
|||
into a single file.
|
||||
|
||||
[container-op]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/dsl.html#kfp.dsl.ContainerOp
|
||||
[component-spec]: https://www.kubeflow.org/docs/components/pipelines/reference/component-spec/
|
||||
[python-function-component]: https://www.kubeflow.org/docs/components/pipelines/sdk/python-function-components/
|
||||
[component-dev]: https://www.kubeflow.org/docs/components/pipelines/sdk/component-development/
|
||||
[python-function-component-data-passing]: https://www.kubeflow.org/docs/components/pipelines/sdk/python-function-components/#understanding-how-data-is-passed-between-components
|
||||
[prebuilt-components]: https://www.kubeflow.org/docs/examples/shared-resources/
|
||||
[component-spec]: /docs/components/pipelines/reference/component-spec/
|
||||
[python-function-component]: /docs/components/pipelines/legacy-v1/sdk/python-function-components/
|
||||
[component-dev]: /docs/components/pipelines/legacy-v1/sdk/component-development/
|
||||
[python-function-component-data-passing]: /docs/components/pipelines/legacy-v1/sdk/python-function-components/#understanding-how-data-is-passed-between-components
|
||||
[prebuilt-components]: /docs/examples/shared-resources/
|
||||
|
||||
|
||||
```python
|
||||
|
|
@@ -319,7 +319,7 @@ $ head merged_data.csv
 The following example shows the updated `merge_csv` function.
 
 [web-download-component]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/components/web/Download/component.yaml
-[python-function-components]: https://www.kubeflow.org/docs/components/pipelines/sdk/python-function-components/
+[python-function-components]: /docs/components/pipelines/legacy-v1/sdk/python-function-components/
 [input-path]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/components.html?highlight=inputpath#kfp.components.InputPath
 [output-path]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/components.html?highlight=outputpath#kfp.components.OutputPath
 
@@ -415,14 +415,14 @@ kfp.compiler.Compiler().compile(
 2. Upload and run your `pipeline.yaml` using the Kubeflow Pipelines user interface.
 See the guide to [getting started with the UI][quickstart].
 
-[quickstart]: https://www.kubeflow.org/docs/components/pipelines/overview/quickstart
+[quickstart]: /docs/components/pipelines/legacy-v1/overview/quickstart
 
 #### Option 2: run the pipeline using Kubeflow Pipelines SDK client
 
 1. Create an instance of the [`kfp.Client` class][kfp-client] following steps in [connecting to Kubeflow Pipelines using the SDK client][connect-api].
 
 [kfp-client]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/client.html#kfp.Client
-[connect-api]: https://www.kubeflow.org/docs/components/pipelines/sdk/connect-api
+[connect-api]: /docs/components/pipelines/legacy-v1/sdk/connect-api
 
 
 ```python
@@ -450,11 +450,11 @@ client.create_run_from_pipeline_func(
 pipeline][k8s-resources] (Experimental).
 
 [conditional]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/samples/tutorials/DSL%20-%20Control%20structures/DSL%20-%20Control%20structures.py
-[recursion]: https://www.kubeflow.org/docs/components/pipelines/sdk/dsl-recursion/
-[k8s-resources]: https://www.kubeflow.org/docs/components/pipelines/sdk/manipulate-resources/
+[recursion]: /docs/components/pipelines/legacy-v1/sdk/dsl-recursion/
+[k8s-resources]: /docs/components/pipelines/legacy-v1/sdk/manipulate-resources/
 
 
 <div class="notebook-links">
-<a class="colab-link" href="https://colab.research.google.com/github/kubeflow/website/blob/master/content/en/docs/components/pipelines/v1/sdk/build-pipeline.ipynb">Run in Google Colab</a>
-<a class="github-link" href="https://github.com/kubeflow/website/blob/master/content/en/docs/components/pipelines/v1/sdk/build-pipeline.ipynb">View source on GitHub</a>
+<a class="colab-link" href="https://colab.research.google.com/github/kubeflow/website/blob/master/content/en/docs/components/pipelines/legacy-v1/sdk/build-pipeline.ipynb">Run in Google Colab</a>
+<a class="github-link" href="https://github.com/kubeflow/website/blob/master/content/en/docs/components/pipelines/legacy-v1/sdk/build-pipeline.ipynb">View source on GitHub</a>
 </div>
@@ -395,7 +395,7 @@ The following examples demonstrate how to specify your component's interface.
 ```
 
 [dsl-types]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/types.py
-[dsl-type-checking]: https://www.kubeflow.org/docs/components/pipelines/sdk/static-type-checking/
+[dsl-type-checking]: /docs/components/pipelines/legacy-v1/sdk/static-type-checking/
 
 ### Specify your component's metadata
 
@@ -588,15 +588,15 @@ See this [sample component][org-sample] for a real-life component example.
 ## Next steps
 
 * Consolidate what you've learned by reading the
-[best practices](/docs/components/pipelines/sdk/best-practices) for designing and
+[best practices](/docs/components/pipelines/legacy-v1/sdk/best-practices/) for designing and
 writing components.
 * For quick iteration,
-[build lightweight Python function-based components](/docs/components/pipelines/sdk/python-function-components/)
+[build lightweight Python function-based components](/docs/components/pipelines/legacy-v1/sdk/python-function-components/)
 directly from Python functions.
 * See how to [export metrics from your
-pipeline](/docs/components/pipelines/sdk/pipelines-metrics/).
+pipeline](/docs/components/pipelines/legacy-v1/sdk/pipelines-metrics/).
 * Visualize the output of your component by
 [adding metadata for an output
-viewer](/docs/components/pipelines/sdk/output-viewer/).
+viewer](/docs/components/pipelines/legacy-v1/sdk/output-viewer/).
 * Explore the [reusable components and other shared
 resources](/docs/examples/shared-resources/).
@@ -84,7 +84,7 @@ def pipeline():
 
 ## Limitations
 
-* [Type checking](/docs/components/pipelines/sdk/static-type-checking) does not work for the recursive functions. In other words, The type information that is annotated to the recursive
+* [Type checking](/docs/components/pipelines/legacy-v1/sdk/static-type-checking) does not work for the recursive functions. In other words, The type information that is annotated to the recursive
 function signature will not be checked.
 * Since the output of the recursive functions cannot be dynamically resolved, the downstream ContainerOps cannot
 access the output from the recursive functions.
@@ -16,8 +16,8 @@ components.
 
 Set up your environment:
 
-- [Install Kubeflow](/docs/started/getting-started/)
-- [Install the Kubeflow Pipelines SDK](/docs/components/pipelines/sdk/install-sdk/)
+- [Install Kubeflow](/docs/started/)
+- [Install the Kubeflow Pipelines SDK](/docs/components/pipelines/legacy-v1/sdk/install-sdk/)
 
 
 
@@ -26,7 +26,7 @@ Set up your environment:
 In this example, you pass an environment variable to a lightweight Python
 component, which writes the variable's value to the log.
 
-[Learn more about lightweight Python components](/docs/components/pipelines/sdk/lightweight-python-components/)
+[Learn more about lightweight Python components](/docs/components/pipelines/legacy-v1/sdk/lightweight-python-components/)
 
 To build a component, define a stand-alone Python function and then call
 [kfp.components.func_to_container_op(func)](https://kubeflow-pipelines.readthedocs.io/en/stable/source/components.html#kfp.components.func_to_container_op) to convert the
@@ -15,7 +15,7 @@ All of the SDK's classes and methods are described in the auto-generated [SDK re
 
 **Note:** If you are running [Kubeflow Pipelines with Tekton](https://github.com/kubeflow/kfp-tekton),
 instead of the default [Kubeflow Pipelines with Argo](https://github.com/kubeflow/pipelines), you should use the
-[Kubeflow Pipelines SDK for Tekton](/docs/components/pipelines/sdk/pipelines-with-tekton).
+[Kubeflow Pipelines SDK for Tekton](/docs/components/pipelines/legacy-v1/sdk/pipelines-with-tekton).
 
 ## Set up Python
 
@@ -115,7 +115,7 @@ The response should be something like this:
 
 ## Next steps
 
-* [See how to use the SDK](/docs/components/pipelines/sdk/sdk-overview/).
-* [Build a component and a pipeline](/docs/components/pipelines/sdk/component-development/).
-* [Get started with the UI](/docs/components/pipelines/overview/quickstart).
+* [See how to use the SDK](/docs/components/pipelines/legacy-v1/sdk/sdk-overview/).
+* [Build a component and a pipeline](/docs/components/pipelines/legacy-v1/sdk/component-development/).
+* [Get started with the UI](/docs/components/pipelines/legacy-v1/overview/quickstart).
 * [Understand pipeline concepts](/docs/components/pipelines/concepts/).
@@ -274,7 +274,7 @@ For better understanding, please refer to the following samples:
 and
 [VolumeSnapshotOps](https://github.com/kubeflow/pipelines/tree/sdk/release-1.8/samples/core/volume_snapshot_ops).
 * Learn more about the
-[Kubeflow Pipelines domain-specific language (DSL)](/docs/components/pipelines/sdk/dsl-overview/),
+[Kubeflow Pipelines domain-specific language (DSL)](/docs/components/pipelines/legacy-v1/sdk/dsl-overview/),
 a set of Python libraries that you can use to specify ML pipelines.
 * For quick iteration,
-[build components and pipelines](/docs/components/pipelines/sdk/build-component/).
+[build components and pipelines](/docs/components/pipelines/legacy-v1/sdk/component-development/).
@@ -59,7 +59,7 @@ of all metrics visualizations can be found in [metrics_visualization_v2.py](http
 
 ### Requirements
 
-* Use Kubeflow Pipelines v1.7.0 or above: [upgrade Kubeflow Pipelines](/docs/components/pipelines/installation/standalone-deployment/#upgrading-kubeflow-pipelines).
+* Use Kubeflow Pipelines v1.7.0 or above: [upgrade Kubeflow Pipelines](/docs/components/pipelines/legacy-v1/installation/standalone-deployment/#upgrading-kubeflow-pipelines).
 * Use `kfp.dsl.PipelineExecutionMode.V2_COMPATIBLE` mode when you [compile and run your pipelines](/docs/components/pipelines/sdk-v2/build-pipeline/#compile-and-run-your-pipeline).
 * Make sure to use the latest environment kustomize manifest [pipelines/manifests/kustomize/env/dev/kustomization.yaml](https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/manifests/kustomize/env/dev/kustomization.yaml).
 
@@ -383,7 +383,7 @@ viewers later on the page.
 below as examples.</p>
 <p><b>Be aware</b>, support for inline visualizations, other than
 markdown, was introduced in Kubeflow Pipelines 0.2.5. Before using these
-visualizations, [upgrade your Kubeflow Pipelines cluster](/docs/components/pipelines/upgrade/)
+visualizations, [upgrade your Kubeflow Pipelines cluster](/docs/components/pipelines/legacy-v1/installation/upgrade/)
 to version 0.2.5 or higher.</p>
 </td>
 </tr>
@@ -730,7 +730,7 @@ pre-installed when you deploy Kubeflow.
 You can run the sample by selecting
 **[Sample] ML - TFX - Taxi Tip Prediction Model Trainer** from the
 Kubeflow Pipelines UI. For help getting started with the UI, follow the
-[Kubeflow Pipelines quickstart](/docs/components/pipelines/overview/quickstart/).
+[Kubeflow Pipelines quickstart](/docs/components/pipelines/legacy-v1/overview/quickstart/).
 
 <!--- TODO: Will replace the tfx cab with tfx oss when it is ready.-->
 The pipeline uses a number of prebuilt, reusable components, including:
@@ -756,8 +756,3 @@ For a complete example of lightweigh Python component, you can refer to
 ## YAML component example
 
 You can also configure visualization in a component.yaml file. Refer to `{name: MLPipeline UI Metadata}` output in [Create Tensorboard Visualization component](https://github.com/kubeflow/pipelines/blob/f61048b5d2e1fb5a6a61782d570446b0ec940ff7/components/tensorflow/tensorboard/prepare_tensorboard/component.yaml#L12).
-
-## Next step
-
-See how to [export metrics from your
-pipeline](/docs/components/pipelines/metrics/pipelines-metrics/).
@@ -37,4 +37,4 @@ def my_pipeline(
 ```
 
 For more information, you can refer to the guide on
-[building components and pipelines](/docs/components/pipelines/sdk/component-development/).
+[building components and pipelines](/docs/components/pipelines/legacy-v1/sdk/component-development/).
@@ -11,7 +11,7 @@ to compile, upload and run your Kubeflow Pipeline DSL Python scripts on a
 
 ## SDK packages
 
-The `kfp-tekton` SDK is an extension to the [Kubeflow Pipelines SDK](/docs/components/pipelines/sdk/sdk-overview/)
+The `kfp-tekton` SDK is an extension to the [Kubeflow Pipelines SDK](/docs/components/pipelines/legacy-v1/sdk/sdk-overview/)
 adding the `TektonCompiler` and the `TektonClient`:
 
 * `kfp_tekton.compiler` includes classes and methods for compiling pipeline
@@ -64,7 +64,7 @@
 "3. Create an instance of the [`kfp.Client` class][kfp-client] following steps in [connecting to Kubeflow Pipelines using the SDK client][connect-api].\n",
 "\n",
 "[kfp-client]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/client.html#kfp.Client\n",
-"[connect-api]: https://www.kubeflow.org/docs/components/pipelines/sdk/connect-api"
+"[connect-api]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/connect-api"
 ]
 },
 {
@@ -131,7 +131,7 @@
 "source": [
 "3. Create and run your pipeline. [Learn more about creating and running pipelines][build-pipelines].\n",
 "\n",
-"[build-pipelines]: https://www.kubeflow.org/docs/components/pipelines/sdk/build-component/"
+"[build-pipelines]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/component-development/"
 ]
 },
 {
@@ -353,8 +353,8 @@
 "[dockerfile]: https://docs.docker.com/engine/reference/builder/\n",
 "[named-tuple-hint]: https://docs.python.org/3/library/typing.html#typing.NamedTuple\n",
 "[named-tuple]: https://docs.python.org/3/library/collections.html#collections.namedtuple\n",
-"[kfp-visualize]: https://www.kubeflow.org/docs/components/pipelines/sdk/output-viewer/\n",
-"[kfp-metrics]: https://www.kubeflow.org/docs/components/pipelines/sdk/pipelines-metrics/\n",
+"[kfp-visualize]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/output-viewer/\n",
+"[kfp-metrics]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/pipelines-metrics/\n",
 "[input-path]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/components.html#kfp.components.InputPath\n",
 "[output-path]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/components.html#kfp.components.OutputPath"
 ]
@@ -647,7 +647,7 @@
 "source": [
 "5. Compile and run your pipeline. [Learn more about compiling and running pipelines][build-pipelines].\n",
 "\n",
-"[build-pipelines]: https://www.kubeflow.org/docs/components/pipelines/sdk/build-pipeline/#compile-and-run-your-pipeline"
+"[build-pipelines]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/build-pipeline/#compile-and-run-your-pipeline"
 ]
 },
 {
@@ -5,7 +5,7 @@ weight = 50
 +++
 
 <!--
-AUTOGENERATED FROM content/en/docs/components/pipelines/sdk/python-function-components.ipynb
+AUTOGENERATED FROM content/en/docs/components/pipelines/legacy-v1/sdk/python-function-components.ipynb
 PLEASE UPDATE THE JUPYTER NOTEBOOK AND REGENERATE THIS FILE USING scripts/nb_to_md.py.-->
 
 <style>
@@ -26,8 +26,8 @@ background-position: left center;
 }
 </style>
 <div class="notebook-links">
-<a class="colab-link" href="https://colab.research.google.com/github/kubeflow/website/blob/master/content/en/docs/components/pipelines/v1/sdk/python-function-components.ipynb">Run in Google Colab</a>
-<a class="github-link" href="https://github.com/kubeflow/website/blob/master/content/en/docs/components/pipelines/v1/sdk/python-function-components.ipynb">View source on GitHub</a>
+<a class="colab-link" href="https://colab.research.google.com/github/kubeflow/website/blob/master/content/en/docs/components/pipelines/legacy-v1/sdk/python-function-components.ipynb">Run in Google Colab</a>
+<a class="github-link" href="https://github.com/kubeflow/website/blob/master/content/en/docs/components/pipelines/legacy-v1/sdk/python-function-components.ipynb">View source on GitHub</a>
 </div>
 
 
@@ -47,7 +47,7 @@ Python function-based components make it easier to iterate quickly by letting yo
 component code as a Python function and generating the [component specification][component-spec] for you.
 This document describes how to build Python function-based components and use them in your pipeline.
 
-[component-spec]: https://www.kubeflow.org/docs/components/pipelines/reference/component-spec/
+[component-spec]: /docs/components/pipelines/reference/component-spec/
 
 ## Before you begin
 
@@ -70,7 +70,7 @@ from kfp.components import create_component_from_func
 3. Create an instance of the [`kfp.Client` class][kfp-client] following steps in [connecting to Kubeflow Pipelines using the SDK client][connect-api].
 
 [kfp-client]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/client.html#kfp.Client
-[connect-api]: https://www.kubeflow.org/docs/components/pipelines/sdk/connect-api
+[connect-api]: /docs/components/pipelines/legacy-v1/sdk/connect-api
 
 
 ```python
@@ -110,7 +110,7 @@ add_op = create_component_from_func(
 
 3. Create and run your pipeline. [Learn more about creating and running pipelines][build-pipelines].
 
-[build-pipelines]: https://www.kubeflow.org/docs/components/pipelines/sdk/build-component/
+[build-pipelines]: /docs/components/pipelines/legacy-v1/sdk/component-development/
 
 
 ```python
@@ -323,8 +323,8 @@ including component metadata and metrics.
 [dockerfile]: https://docs.docker.com/engine/reference/builder/
 [named-tuple-hint]: https://docs.python.org/3/library/typing.html#typing.NamedTuple
 [named-tuple]: https://docs.python.org/3/library/collections.html#collections.namedtuple
-[kfp-visualize]: https://www.kubeflow.org/docs/components/pipelines/sdk/output-viewer/
-[kfp-metrics]: https://www.kubeflow.org/docs/components/pipelines/sdk/pipelines-metrics/
+[kfp-visualize]: /docs/components/pipelines/legacy-v1/sdk/output-viewer/
+[kfp-metrics]: /docs/components/pipelines/legacy-v1/sdk/pipelines-metrics/
 [input-path]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/components.html#kfp.components.InputPath
 [output-path]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/components.html#kfp.components.OutputPath
 
@@ -563,7 +563,7 @@ def calc_pipeline(
 
 5. Compile and run your pipeline. [Learn more about compiling and running pipelines][build-pipelines].
 
-[build-pipelines]: https://www.kubeflow.org/docs/components/pipelines/sdk/build-pipeline/#compile-and-run-your-pipeline
+[build-pipelines]: /docs/components/pipelines/legacy-v1/sdk/build-pipeline/#compile-and-run-your-pipeline
 
 
 ```python
@@ -576,6 +576,6 @@ client.create_run_from_pipeline_func(calc_pipeline, arguments=arguments)
 
 
 <div class="notebook-links">
-<a class="colab-link" href="https://colab.research.google.com/github/kubeflow/website/blob/master/content/en/docs/components/pipelines/v1/sdk/python-function-components.ipynb">Run in Google Colab</a>
-<a class="github-link" href="https://github.com/kubeflow/website/blob/master/content/en/docs/components/pipelines/v1/sdk/python-function-components.ipynb">View source on GitHub</a>
+<a class="colab-link" href="https://colab.research.google.com/github/kubeflow/website/blob/master/content/en/docs/components/pipelines/legacy-v1/sdk/python-function-components.ipynb">Run in Google Colab</a>
+<a class="github-link" href="https://github.com/kubeflow/website/blob/master/content/en/docs/components/pipelines/legacy-v1/sdk/python-function-components.ipynb">View source on GitHub</a>
 </div>
@@ -16,7 +16,7 @@ workflow and how the components interact with each other.
 
 **Note**: The SDK documentation here refers to [Kubeflow Pipelines with Argo](https://github.com/kubeflow/pipelines) which is the default.
 If you are running [Kubeflow Pipelines with Tekton](https://github.com/kubeflow/kfp-tekton) instead,
-please follow the [Kubeflow Pipelines SDK for Tekton](/docs/components/pipelines/sdk/pipelines-with-tekton) documentation.
+please follow the [Kubeflow Pipelines SDK for Tekton](/docs/components/pipelines/legacy-v1/sdk/pipelines-with-tekton) documentation.
 
 ## SDK packages
 
@@ -65,7 +65,7 @@ The Kubeflow Pipelines SDK includes the following packages:
 
 * `kfp.dsl.PipelineParam` represents a pipeline parameter that you can pass
 from one pipeline component to another. See the guide to
-[pipeline parameters](/docs/components/pipelines/sdk/parameters/).
+[pipeline parameters](/docs/components/pipelines/legacy-v1/sdk/parameters/).
 * `kfp.dsl.component` is a decorator for DSL functions that returns a
 pipeline component.
 ([`ContainerOp`](https://kubeflow-pipelines.readthedocs.io/en/stable/source/dsl.html#kfp.dsl.ContainerOp)).
@@ -78,7 +78,7 @@ The Kubeflow Pipelines SDK includes the following packages:
 include basic types like `String`, `Integer`, `Float`, and `Bool`, as well
 as domain-specific types like `GCPProjectID` and `GCRPath`.
 See the guide to
-[DSL static type checking](/docs/components/pipelines/sdk/static-type-checking).
+[DSL static type checking](/docs/components/pipelines/legacy-v1/sdk/static-type-checking).
 * [`kfp.dsl.ResourceOp`](https://kubeflow-pipelines.readthedocs.io/en/stable/source/dsl.html#kfp.dsl.ResourceOp)
 represents a pipeline task (op) which lets you directly manipulate
 Kubernetes resources (`create`, `get`, `apply`, ...).
@@ -119,7 +119,7 @@ The Kubeflow Pipelines SDK includes the following packages:
 for execution on Kubeflow Pipelines.
 * `kfp.Client.create_run_from_pipeline_package` runs a local pipeline package on Kubeflow Pipelines.
 * `kfp.Client.upload_pipeline` uploads a local file to create a new pipeline in Kubeflow Pipelines.
-* `kfp.Client.upload_pipeline_version` uploads a local file to create a pipeline version. [Follow an example to learn more about creating a pipeline version](/docs/components/pipelines/tutorials/sdk-examples).
+* `kfp.Client.upload_pipeline_version` uploads a local file to create a pipeline version. [Follow an example to learn more about creating a pipeline version](/docs/components/pipelines/legacy-v1/tutorials/sdk-examples).
 
 * [Kubeflow Pipelines extension modules](https://kubeflow-pipelines.readthedocs.io/en/stable/source/kfp.extensions.html)
 include classes and functions for specific platforms on which you can use
@@ -155,7 +155,7 @@ The Kubeflow Pipelines CLI tool enables you to use a subset of the Kubeflow Pipe
 ## Installing the SDK
 
 Follow the guide to
-[installing the Kubeflow Pipelines SDK](/docs/components/pipelines/sdk/install-sdk/).
+[installing the Kubeflow Pipelines SDK](/docs/components/pipelines/legacy-v1/sdk/install-sdk/).
 
 ## Building pipelines and components
 
@@ -166,7 +166,7 @@ A Kubeflow _pipeline_ is a portable and scalable definition of an ML workflow.
 Each step in your ML workflow, such as preparing data or training a model,
 is an instance of a pipeline component.
 
-[Learn more about building pipelines](/docs/components/pipelines/sdk/build-pipeline).
+[Learn more about building pipelines](/docs/components/pipelines/legacy-v1/sdk/build-pipeline).
 
 A pipeline _component_ is a self-contained set of code that performs one step
 in your ML workflow. Components are defined in a component specification, which
@@ -183,13 +183,13 @@ Use the following options to create or reuse pipeline components.
 * You can build components by defining a component specification for a
 containerized application.
 
-[Learn more about building pipeline components](/docs/components/pipelines/sdk/component-development).
+[Learn more about building pipeline components](/docs/components/pipelines/legacy-v1/sdk/component-development).
 
 * Lightweight Python function-based components make it easier to build a
 component by using the Kubeflow Pipelines SDK to generate the component
 specification for a Python function.
 
-[Learn how to build a Python function-based component](/docs/components/pipelines/sdk/python-function-components).
+[Learn how to build a Python function-based component](/docs/components/pipelines/legacy-v1/sdk/python-function-components).
 
 * You can reuse prebuilt components in your pipeline.
 
@@ -199,7 +199,7 @@ Use the following options to create or reuse pipeline components.
 ## Next steps
 
 * Learn how to [write recursive functions in the
-DSL](/docs/components/pipelines/sdk/dsl-recursion).
-* Build a [pipeline component](/docs/components/pipelines/sdk/component-development/).
+DSL](/docs/components/pipelines/legacy-v1/sdk/dsl-recursion).
+* Build a [pipeline component](/docs/components/pipelines/legacy-v1/sdk/component-development/).
 * Find out how to use the DSL to [manipulate Kubernetes resources dynamically
-as steps of your pipeline](/docs/components/pipelines/sdk/manipulate-resources/).
+as steps of your pipeline](/docs/components/pipelines/legacy-v1/sdk/manipulate-resources/).
@@ -10,7 +10,7 @@ static type checking for fast development iterations.
 
 ## Motivation
 
-A pipeline is a workflow consisting of [components](/docs/components/pipelines/sdk/build-component#overview-of-pipelines-and-components) and each
+A pipeline is a workflow consisting of [components](/docs/components/pipelines/legacy-v1/sdk/component-development/#overview-of-pipelines-and-components) and each
 component contains inputs and outputs. The DSL compiler supports static type checking to ensure the type consistency among the component
 I/Os within the same pipeline. Static type checking helps you to identify component I/O inconsistencies without running the pipeline.
 It also shortens the development cycles by catching the errors early.
@@ -11,7 +11,7 @@ may encounter.
 ## Diagnosing problems in your Kubeflow Pipelines environment
 
 For help diagnosing environment issues that affect Kubeflow Pipelines, run
-the [`kfp diagnose_me` command-line tool](/docs/components/pipelines/sdk/sdk-overview/#kfp-cli-tool).
+the [`kfp diagnose_me` command-line tool](/docs/components/pipelines/legacy-v1/sdk/sdk-overview/#kfp-cli-tool).
 
 The `kfp diagnose_me` CLI reports on the configuration of your local
 development environment, Kubernetes cluster, or Google Cloud environment.
@@ -23,7 +23,7 @@ Use this command to help resolve issues like the following:
 
 To use the `kfp diagnose_me` CLI, follow these steps:
 
-1. Install the [Kubeflow Pipelines SDK](/docs/components/pipelines/sdk/install-sdk/).
+1. Install the [Kubeflow Pipelines SDK](/docs/components/pipelines/legacy-v1/sdk/install-sdk/).
 1. Follow the [guide to configuring access to Kubernetes clusters][kubeconfig],
 to update your kubeconfig file with appropriate credentials and endpoint
 information to access your Kubeflow cluster.
@@ -44,7 +44,7 @@ installing or using the Kubeflow Pipelines SDK.
 
 This error indicates that you have not installed the `kfp` package in your
 Python3 environment. Follow the instructions in the [Kubeflow Pipelines SDK
-installation guide](/docs/components/pipelines/sdk/install-sdk/), if you have not already
+installation guide](/docs/components/pipelines/legacy-v1/sdk/install-sdk/), if you have not already
 installed the SDK.
 
 If you have already installed the Kubeflow Pipelines SDK, check that you have
@@ -88,4 +88,4 @@ You can resolve this issue by using one of the following options:
 
 ## TFX visualizations do not show up or throw an error
 
-Confirm your Kubeflow Pipelines backend version is compatible with your TFX version, refer to [Kubeflow Pipelines Compatibility Matrix](/docs/components/pipelines/installation/compatibility-matrix/).
+Confirm your Kubeflow Pipelines backend version is compatible with your TFX version, refer to [Kubeflow Pipelines Compatibility Matrix](/docs/components/pipelines/legacy-v1/installation/compatibility-matrix/).
@@ -18,11 +18,11 @@ kubectl port-forward -n kubeflow svc/ml-pipeline ${SVC_PORT}:8888
 
 This tutorial assumes that the service is accessible on localhost.
 
-You also need to install [jq](https://stedolan.github.io/jq/download/), and the [Kubeflow Pipelines SDK](/docs/components/pipelines/sdk/install-sdk/).
+You also need to install [jq](https://stedolan.github.io/jq/download/), and the [Kubeflow Pipelines SDK](/docs/components/pipelines/legacy-v1/sdk/install-sdk/).
 
 ## Building and running a pipeline
 
-Follow this guide to download, compile, and run the [`sequential.py` sample pipeline](https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/samples/core/sequential/sequential.py). To learn how to compile and run pipelines using the Kubeflow Pipelines SDK or a Jupyter notebook, follow the [experimenting with Kubeflow Pipelines samples tutorial](/docs/components/pipelines/tutorials/build-pipeline/).
+Follow this guide to download, compile, and run the [`sequential.py` sample pipeline](https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/samples/core/sequential/sequential.py). To learn how to compile and run pipelines using the Kubeflow Pipelines SDK or a Jupyter notebook, follow the [experimenting with Kubeflow Pipelines samples tutorial](/docs/components/pipelines/legacy-v1/tutorials/build-pipeline/).
 
 ```
 PIPELINE_URL=https://raw.githubusercontent.com/kubeflow/pipelines/master/samples/core/sequential/sequential.py
@@ -33,7 +33,7 @@ wget -O ${PIPELINE_FILE} ${PIPELINE_URL}
 dsl-compile --py ${PIPELINE_FILE} --output ${PIPELINE_NAME}.tar.gz
 ```
 
-After running the commands above, you should get two files in your current directory: `sequential.py` and `sequential.tar.gz`. Run the following command to deploy the generated `.tar.gz` file as you would do using the [Kubeflow Pipelines UI](/docs/components/pipelines/sdk/build-component/#deploy-the-pipeline), but this time using the REST API.
+After running the commands above, you should get two files in your current directory: `sequential.py` and `sequential.tar.gz`. Run the following command to deploy the generated `.tar.gz` file as you would do using the [Kubeflow Pipelines UI](/docs/components/pipelines/legacy-v1/sdk/component-development/#deploy-the-pipeline), but this time using the REST API.
 
 ```
 SVC=localhost:8888
@ -51,7 +51,7 @@ To run the provided benchmark scripts, you need the following:
|
|||
run, job, and experiment services from your Jupyter notebook environment.
|
||||
* A Kubeflow Pipelines cluster. If you do not have a Kubeflow Pipelines
|
||||
cluster, learn more about your [options for installing Kubeflow
|
||||
Pipelines](/docs/components/pipelines/installation/overview/).
|
||||
Pipelines](/docs/components/pipelines/legacy-v1/overview/).
|
||||
* A pipeline manifest. For example, this guide uses the
|
||||
[taxi_updated_pool.yaml](https://storage.googleapis.com/ml-pipeline/sample-benchmark/taxi_updated_pool.yaml)
|
||||
pipeline manifest file.
|
||||
|
|
@@ -21,7 +21,7 @@ Set up your environment:
 
 1. Clone or download the
    [Kubeflow Pipelines samples](https://github.com/kubeflow/pipelines/tree/sdk/release-1.8/samples).
-1. Install the [Kubeflow Pipelines SDK](/docs/components/pipelines/sdk/install-sdk/).
+1. Install the [Kubeflow Pipelines SDK](/docs/components/pipelines/legacy-v1/sdk/install-sdk/).
 1. Activate your Python 3 environment if you haven't done so already:
 
 ```
@@ -64,7 +64,7 @@ dsl-compile --py ${DIR}/sequential.py --output ${DIR}/sequential.tar.gz
 ### Deploy the pipeline
 
 Upload the generated `.tar.gz` file through the Kubeflow Pipelines UI. See the
-guide to [getting started with the UI](/docs/components/pipelines/overview/quickstart).
+guide to [getting started with the UI](/docs/components/pipelines/legacy-v1/overview/quickstart).
 
 ## Building a pipeline in a Jupyter notebook
 
@@ -127,8 +127,8 @@ The following notebooks are available:
 ## Next steps
 
 * Learn the various ways to use the [Kubeflow Pipelines
-  SDK](/docs/components/pipelines/sdk/sdk-overview/).
+  SDK](/docs/components/pipelines/legacy-v1/sdk/sdk-overview/).
 * See how to
-  [build your own pipeline components](/docs/components/pipelines/sdk/build-component/).
+  [build your own pipeline components](/docs/components/pipelines/legacy-v1/sdk/component-development/).
 * Read more about
-  [building lightweight components](/docs/components/pipelines/sdk/lightweight-python-components/).
+  [building lightweight components](/docs/components/pipelines/legacy-v1/sdk/lightweight-python-components/).
@@ -13,7 +13,7 @@ To follow the examples in this guide, you must have Kubeflow Pipelines SDK
 version 0.2.5 or higher installed. Use the following instructions to install
 the Kubeflow Pipelines SDK and check the SDK version.
 
-1. Install the [Kubeflow Pipelines SDK](/docs/components/pipelines/sdk/install-sdk/)
+1. Install the [Kubeflow Pipelines SDK](/docs/components/pipelines/legacy-v1/sdk/install-sdk/)
 1. Run the following command to check the version of the SDK
    ```
    pip list | grep kfp
@@ -0,0 +1,5 @@
++++
+title = "Operator Guides"
+description = "Documentation for operators of Kubeflow Pipelines."
+weight = 6
++++
@@ -1,11 +1,11 @@
 +++
 title = "Installation"
 description = "Options for deploying Kubeflow Pipelines"
-weight = 4
+weight = 1
 +++
 
 {{% kfp-v2-keywords %}}
 
 This page will be available soon. For similar information, see [KFP v1 installation documentation][v1-installation].
 
-[v1-installation]: /docs/components/pipelines/v1/installation/
+[v1-installation]: /docs/components/pipelines/legacy-v1/installation/
@@ -1,7 +1,7 @@
 +++
 title = "Server Configuration"
 description = "Guidance on managing your Kubeflow Pipelines instances"
-weight = 1
+weight = 2
 +++
 
 
@@ -1,5 +1,5 @@
 +++
-title = "Introduction"
+title = "Overview"
 description = "What is Kubeflow Pipelines?"
 weight = 1
 +++
@@ -37,10 +37,10 @@ A [pipeline][pipelines] is a definition of a workflow that composes one or more
 * Learn more about [authoring components][components]
 * Learn more about [authoring pipelines][pipelines]
 
-[components]: /docs/components/pipelines/v2/components
-[pipelines]: /docs/components/pipelines/v2/pipelines
-[installation]: /docs/components/pipelines/v2/installation
-[ir-yaml]: /docs/components/pipelines/v2/compile-a-pipeline#ir-yaml
+[components]: /docs/components/pipelines/user-guides/components
+[pipelines]: /docs/components/pipelines/user-guides
+[installation]: /docs/components/pipelines/operator-guides/installation
+[ir-yaml]: /docs/components/pipelines/user-guides/core-functions/compile-a-pipeline#ir-yaml
 [pypi]: https://pypi.org/project/kfp/
-[hello-world-pipeline]: /docs/components/pipelines/v2/hello-world
-[control-flow]: /docs/components/pipelines/v2/pipelines/control-flow
+[hello-world-pipeline]: /docs/components/pipelines/getting-started
+[control-flow]: /docs/components/pipelines/user-guides/core-functions/control-flow
@@ -1,5 +1,5 @@
 +++
 title = "Reference"
 description = "Reference docs for Kubeflow Pipelines Version 2"
-weight = 100
+weight = 8
 +++
@@ -1,7 +1,7 @@
 +++
 title = "Pipelines API Reference"
 description = "Reference documentation for the Kubeflow Pipelines API Version 2"
-weight = 10
+weight = 3
 +++
 <!DOCTYPE html>
 <html>
@@ -1,7 +1,7 @@
 +++
 title = "Community and Support"
 description = "Where to get help, contribute, and learn more"
-weight = 11
+weight = 1
 +++
 
 {{% kfp-v2-keywords %}}
@@ -1,13 +1,9 @@
 +++
 title = "Component Specification"
 description = "Definition of a Kubeflow Pipelines component"
-weight = 10
+weight = 4
 
 +++
-{{% alert title="Out of date" color="warning" %}}
-This guide contains outdated information pertaining to Kubeflow 1.0. This guide
-needs to be updated for Kubeflow 1.1.
-{{% /alert %}}
 
 This specification describes the container component data model for Kubeflow
 Pipelines. The data model is serialized to a file in YAML format for sharing.
@@ -1,7 +1,7 @@
 +++
-title = " Pipelines SDK Reference"
+title = "Pipelines SDK Reference"
 description = "Reference documentation for the Kubeflow Pipelines SDK Version 2"
-weight = 20
+weight = 5
 
 +++
 
@@ -1,7 +1,7 @@
 +++
 title = "Version Compatibility"
 description = "Version compatibility between KFP Runtime and KFP SDK"
-weight = 80
+weight = 2
 +++
 
 {{% kfp-v2-keywords %}}
@@ -33,5 +33,5 @@ For more detailed information on feature support, please refer to the version-sp
 * [Kubeflow Pipelines v1][kfp-v1-doc]
 * [Kubeflow Pipelines v2][kfp-v2-doc]
 
-[kfp-v1-doc]: /docs/components/pipelines/v1
-[kfp-v2-doc]: /docs/components/pipelines/v2
+[kfp-v1-doc]: /docs/components/pipelines/legacy-v1
+[kfp-v2-doc]: /docs/components/pipelines/
@@ -0,0 +1,5 @@
++++
+title = "User Guides"
+description = "Documentation for users of Kubeflow Pipelines."
+weight = 7
++++
@@ -1,7 +1,6 @@
 +++
-title = "Components"
-description = "Author KFP components"
-weight = 5
+title = "Create components"
+weight = 3
 +++
 
 {{% kfp-v2-keywords %}}
@@ -62,9 +62,9 @@ def dataset_concatenator(
 
 Note that if you provide a `description` argument to the [`@dsl.pipeline`][dsl-pipeline] decorator, KFP will use this description instead of the docstring description.
 
-[ir-yaml]: /docs/components/pipelines/v2/compile-a-pipeline#ir-yaml
+[ir-yaml]: /docs/components/pipelines/user-guides/core-functions/compile-a-pipeline#ir-yaml
 [google-docstring-style]: https://sphinxcontrib-napoleon.readthedocs.io/en/latest/example_google.html
 [dsl-pipeline]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/dsl.html#kfp.dsl.pipeline
-[output-artifacts]: /docs/components/pipelines/v2/data-types/artifacts#declaring-inputoutput-artifacts
+[output-artifacts]: /docs/components/pipelines/user-guides/data-handling/artifacts#declaring-inputoutput-artifacts
 [dsl-outputpath]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/dsl.html#kfp.dsl.OutputPath
 [output-type-marker]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/dsl.html#kfp.dsl.Output
@@ -1,7 +1,6 @@
 +++
-title = "Pipeline Basics"
-description = "Compose components into pipelines"
-weight = 1
+title = "Compose components into pipelines"
+weight = 2
 +++
 
 {{% kfp-v2-keywords %}}
@@ -203,14 +202,14 @@ def pythagorean(a: float = 1.2, b: float = 1.2) -> float:
 <!-- TODO: make this reference more precise throughout -->
 [dsl-reference-docs]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/dsl.html
 [dsl-pipeline]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/dsl.html#kfp.dsl.pipeline
-[control-flow]: /docs/components/pipelines/v2/pipelines/control-flow
-[components]: /docs/components/pipelines/v2/components
+[control-flow]: /docs/components/pipelines/user-guides/core-functions/control-flow
+[components]: /docs/components/pipelines/user-guides/components
 [pipelinetask]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/dsl.html#kfp.dsl.PipelineTask
 [vertex-pipelines]: https://cloud.google.com/vertex-ai/docs/pipelines/introduction
-[oss-be]: /docs/components/pipelines/v2/installation/
-[data-types]: /docs/components/pipelines/v2/data-types
-[output-artifacts]: /docs/components/pipelines/v2/data-types/artifacts#using-output-artifacts
-[container-component-outputs]: /docs/components/pipelines/v2/components/container-components#create-component-outputs
-[parameters-namedtuple]: /docs/components/pipelines/v2/data-types/parameters#multiple-output-parameters
+[oss-be]: /docs/components/pipelines/operator-guides/installation/
+[data-types]: /docs/components/pipelines/user-guides/data-handling/data-types
+[output-artifacts]: /docs/components/pipelines/user-guides/data-handling/artifacts#using-output-artifacts
+[container-component-outputs]: /docs/components/pipelines/user-guides/components/container-components#create-component-outputs
+[parameters-namedtuple]: /docs/components/pipelines/user-guides/data-handling/parameters#multiple-output-parameters
 [dsl-pipeline-job-name-placeholder]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/dsl.html#kfp.dsl.PIPELINE_JOB_NAME_PLACEHOLDER
-[component-docstring-format]: /docs/components/pipelines/v2/components/additional-functionality#component-docstring-format
+[component-docstring-format]: /docs/components/pipelines/user-guides/components/additional-functionality#component-docstring-format
@@ -184,8 +184,8 @@ def hello_someone(optional_name: str = None):
 Arguments to `then` and `else_` may be a list of any combination of static strings, upstream outputs, pipeline parameters, or other instances of `dsl.ConcatPlaceholder` or `dsl.IfPresentPlaceholder`
 
 
-[hello-world-pipeline]: /docs/components/pipelines/v2/hello-world
-[pipeline-basics]: /docs/components/pipelines/v2/pipelines/pipeline-basics
+[hello-world-pipeline]: /docs/components/pipelines/getting-started
+[pipeline-basics]: /docs/components/pipelines/user-guides/components/compose-components-into-pipelines
 [alpine]: https://hub.docker.com/_/alpine
 [dsl-outputpath]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/dsl.html#kfp.dsl.OutputPath
 [dsl-container-component]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/dsl.html#kfp.dsl.container_component
@@ -103,10 +103,10 @@ Since `add`'s `target_image` uses [Google Cloud Artifact Registry][artifact-regi
 
 
 [kfp-component-build]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/cli.html#kfp-component-build
-[lightweight-python-components]: /docs/components/pipelines/v2/components/lightweight-python-components
+[lightweight-python-components]: /docs/components/pipelines/user-guides/components/lightweight-python-components/
 [image-tag]: https://docs.docker.com/engine/reference/commandline/tag/
 [docker-from]: https://docs.docker.com/engine/reference/builder/#from
 [artifact-registry]: https://cloud.google.com/artifact-registry/docs/docker/authentication
 [vertex-pipelines]: https://cloud.google.com/vertex-ai/docs/pipelines/introduction
 [iam]: https://cloud.google.com/iam
-[packages-to-install]: https://www.kubeflow.org/docs/components/pipelines/v2/components/lightweight-python-components/#packages_to_install
+[packages-to-install]: /docs/components/pipelines/user-guides/components/lightweight-python-components#packages_to_install
@@ -34,7 +34,7 @@ The `importer` component permits setting artifact metadata via the `metadata` ar
 
 You may also specify a boolean `reimport` argument. If `reimport` is `False`, KFP will check to see if the artifact has already been imported to ML Metadata and, if so, use it. This is useful for avoiding duplicative artifact entries in ML Metadata when multiple pipeline runs import the same artifact. If `reimport` is `True`, KFP will reimport the artifact as a new artifact in ML Metadata regardless of whether it was previously imported.
 
-[pipeline-basics]: /docs/components/pipelines/v2/pipelines/pipeline-basics
+[pipeline-basics]: /docs/components/pipelines/user-guides/components/compose-components-into-pipelines
 [dsl-importer]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/dsl.html#kfp.dsl.importer
-[artifacts]: /docs/components/pipelines/v2/data-types/artifacts
+[artifacts]: /docs/components/pipelines/user-guides/data-handling/artifacts
 [ml-metadata]: https://github.com/google/ml-metadata
@@ -126,12 +126,12 @@ By default, Python Components install `kfp` at runtime. This is required to defi
 
 Note that setting `install_kfp_package` to `False` is rarely necessary and is discouraged for the majority of use cases.
 
-[hello-world-pipeline]: /docs/components/pipelines/v2/hello-world
-[containerized-python-components]: /docs/components/pipelines/v2/components/containerized-python-components
+[hello-world-pipeline]: /docs/components/pipelines/getting-started
+[containerized-python-components]: /docs/components/pipelines/user-guides/components/containerized-python-components
 [dsl-component]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/dsl.html#kfp.dsl.component
-[data-types]: /docs/components/pipelines/v2/data-types
-[parameters]: /docs/components/pipelines/v2/data-types/parameters
-[artifacts]: /docs/components/pipelines/v2/data-types/artifacts
+[data-types]: /docs/components/pipelines/user-guides/data-handling/data-types
+[parameters]: /docs/components/pipelines/user-guides/data-handling/parameters
+[artifacts]: /docs/components/pipelines/user-guides/data-handling/artifacts
 [requirements-txt]: https://pip.pypa.io/en/stable/reference/requirements-file-format/
 [pypi-org]: https://pypi.org/
 [pip-install]: https://pip.pypa.io/en/stable/cli/pip_install/
@@ -51,7 +51,7 @@ def my_pipeline():
 
 Some libraries, such as [Google Cloud Pipeline Components][gcpc] package and provide reusable components in a pip-installable [Python package][gcpc-pypi].
 
-[pipeline-as-component]: /docs/components/pipelines/v2/pipelines/pipeline-basics#pipelines-as-components
+[pipeline-as-component]: /docs/components/pipelines/user-guides/components/compose-components-into-pipelines#pipelines-as-components
 [gcpc]: https://cloud.google.com/vertex-ai/docs/pipelines/components-introduction
 [gcpc-pypi]: https://pypi.org/project/google-cloud-pipeline-components/
 [components-module]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/components.html
@@ -0,0 +1,5 @@
++++
+title = "Core Functions"
+description = "Documentation for users of Kubeflow Pipelines."
+weight = 2
++++
@@ -0,0 +1,147 @@
++++
+title = "Build a More Advanced ML Pipeline"
+weight = 6
++++
+
+{{% kfp-v2-keywords %}}
+
+This step demonstrates how to build a more advanced machine learning (ML) pipeline that leverages additional KFP pipeline composition features.
+
+The following ML pipeline creates a dataset, normalizes the features of the dataset as a preprocessing step, and trains a simple ML model on the data using different hyperparameters:
+
+```python
+from typing import List
+
+from kfp import client
+from kfp import dsl
+from kfp.dsl import Dataset
+from kfp.dsl import Input
+from kfp.dsl import Model
+from kfp.dsl import Output
+
+
+@dsl.component(packages_to_install=['pandas==1.3.5'])
+def create_dataset(iris_dataset: Output[Dataset]):
+    import pandas as pd
+
+    csv_url = 'https://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data'
+    col_names = [
+        'Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width', 'Labels'
+    ]
+    df = pd.read_csv(csv_url, names=col_names)
+
+    with open(iris_dataset.path, 'w') as f:
+        df.to_csv(f)
+
+
+@dsl.component(packages_to_install=['pandas==1.3.5', 'scikit-learn==1.0.2'])
+def normalize_dataset(
+    input_iris_dataset: Input[Dataset],
+    normalized_iris_dataset: Output[Dataset],
+    standard_scaler: bool,
+    min_max_scaler: bool,
+):
+    if standard_scaler is min_max_scaler:
+        raise ValueError(
+            'Exactly one of standard_scaler or min_max_scaler must be True.')
+
+    import pandas as pd
+    from sklearn.preprocessing import MinMaxScaler
+    from sklearn.preprocessing import StandardScaler
+
+    with open(input_iris_dataset.path) as f:
+        df = pd.read_csv(f)
+    labels = df.pop('Labels')
+
+    if standard_scaler:
+        scaler = StandardScaler()
+    if min_max_scaler:
+        scaler = MinMaxScaler()
+
+    df = pd.DataFrame(scaler.fit_transform(df))
+    df['Labels'] = labels
+    with open(normalized_iris_dataset.path, 'w') as f:
+        df.to_csv(f)
+
+
+@dsl.component(packages_to_install=['pandas==1.3.5', 'scikit-learn==1.0.2'])
+def train_model(
+    normalized_iris_dataset: Input[Dataset],
+    model: Output[Model],
+    n_neighbors: int,
+):
+    import pickle
+
+    import pandas as pd
+    from sklearn.model_selection import train_test_split
+    from sklearn.neighbors import KNeighborsClassifier
+
+    with open(normalized_iris_dataset.path) as f:
+        df = pd.read_csv(f)
+
+    y = df.pop('Labels')
+    X = df
+
+    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
+
+    clf = KNeighborsClassifier(n_neighbors=n_neighbors)
+    clf.fit(X_train, y_train)
+    with open(model.path, 'wb') as f:
+        pickle.dump(clf, f)
+
+
+@dsl.pipeline(name='iris-training-pipeline')
+def my_pipeline(
+    standard_scaler: bool,
+    min_max_scaler: bool,
+    neighbors: List[int],
+):
+    create_dataset_task = create_dataset()
+
+    normalize_dataset_task = normalize_dataset(
+        input_iris_dataset=create_dataset_task.outputs['iris_dataset'],
+        standard_scaler=True,
+        min_max_scaler=False)
+
+    with dsl.ParallelFor(neighbors) as n_neighbors:
+        train_model(
+            normalized_iris_dataset=normalize_dataset_task
+            .outputs['normalized_iris_dataset'],
+            n_neighbors=n_neighbors)
+
+
+endpoint = '<KFP_UI_URL>'
+kfp_client = client.Client(host=endpoint)
+run = kfp_client.create_run_from_pipeline_func(
+    my_pipeline,
+    arguments={
+        'min_max_scaler': True,
+        'standard_scaler': False,
+        'neighbors': [3, 6, 9]
+    },
+)
+url = f'{endpoint}/#/runs/details/{run.run_id}'
+print(url)
+```
+
+This example introduces the following new features in the pipeline:
+
+* Some Python **packages to install** are added at component runtime, using the `packages_to_install` argument on the `@dsl.component` decorator, as follows:
+
+    `@dsl.component(packages_to_install=['pandas==1.3.5'])`
+
+    To use a library after installing it, you must include its import statements within the scope of the component function, so that the library is imported at component runtime.
+
+* **Input and output artifacts** of types `Dataset` and `Model` are introduced in the component signature to describe the input and output artifacts of the components. This is done using the type annotation generics `Input[]` and `Output[]` for input and output artifacts respectively.
+
+    Within the scope of a component, artifacts can be read (for inputs) and written (for outputs) via the `.path` attribute. The KFP backend ensures that *input* artifact files are copied *to* the executing pod's local file system from the remote storage at runtime, so that the component function can read input artifacts from the local file system. By comparison, *output* artifact files are copied *from* the local file system of the pod to remote storage, when the component finishes running. This way, the output artifacts persist outside the pod. In both cases, the component author needs to interact with the local file system only to create persistent artifacts.
+
+    The arguments for the parameters annotated with `Output[]` are not passed to components by the pipeline author. The KFP backend passes this artifact during component runtime, so that component authors don't need to be concerned about the path to which the output artifacts are written. After an output artifact is written, the backend executing the component recognizes the KFP artifact types (`Dataset` or `Model`), and organizes them on the Dashboard.
+
+    An output artifact can be passed as an input to a downstream component using the `.outputs` attribute of the source task and the output artifact parameter name, as follows:
+
+    `create_dataset_task.outputs['iris_dataset']`
+
+* One of the **DSL control flow features**, `dsl.ParallelFor`, is used. It is a context manager that lets pipeline authors create tasks. These tasks execute in parallel in a loop. Using `dsl.ParallelFor` to iterate over the `neighbors` pipeline argument lets you execute the `train_model` component with different arguments and test multiple hyperparameters in one pipeline run. Other control flow features include `dsl.Condition` and `dsl.ExitHandler`.
+
+Congratulations! You now have a KFP deployment, an end-to-end ML pipeline, and an introduction to the UI. That's just the beginning of KFP pipeline and Dashboard features.
@@ -1,7 +1,7 @@
 +++
-title = "Caching"
+title = "Use Caching"
 description = "How to use caching in Kubeflow Pipelines."
-weight = 13
+weight = 5
 +++
 
 Kubeflow Pipelines support caching to eliminate redundant executions and improve
@@ -1,7 +1,6 @@
 +++
-title = "Command Line Interface"
-description = "Interact with KFP via the CLI"
-weight = 10
+title = "Interact with KFP via the CLI"
+weight = 4
 +++
 
 {{% kfp-v2-keywords %}}
@@ -158,5 +157,5 @@ For more information about the arguments and flags supported by the `kfp compone
 
 [cli-reference-docs]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/cli.html
 [kfp-sdk-api-ref]: https://kubeflow-pipelines.readthedocs.io/en/stable/index.html
-[lightweight-python-component]: /docs/components/pipelines/v2/components/lightweight-python-components
-[containerized-python-components]: /docs/components/pipelines/v2/components/containerized-python-components
+[lightweight-python-component]: /docs/components/pipelines/user-guides/components/lightweight-python-components/
+[containerized-python-components]: /docs/components/pipelines/user-guides/components/containerized-python-components
@@ -1,7 +1,7 @@
 +++
 title = "Compile a Pipeline"
 description = "Compile pipelines and components to YAML"
-weight = 7
+weight = 2
 +++
 
 {{% kfp-v2-keywords %}}
@@ -113,5 +113,5 @@ While IR YAML is not intended to be easily human readable, you can still inspect
 [component-spec]: https://github.com/kubeflow/pipelines/blob/41b69fd90da812005965f2209b64fd1278f1cdc9/api/v2alpha1/pipeline_spec.proto#L85-L96
 [executor-spec]: https://github.com/kubeflow/pipelines/blob/41b69fd90da812005965f2209b64fd1278f1cdc9/api/v2alpha1/pipeline_spec.proto#L788-L803
 [dag-spec]: https://github.com/kubeflow/pipelines/blob/41b69fd90da812005965f2209b64fd1278f1cdc9/api/v2alpha1/pipeline_spec.proto#L98-L105
-[data-types]: /docs/components/pipelines/v2/data-types
+[data-types]: /docs/components/pipelines/user-guides/data-handling/data-types
 [compiler-compile]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/compiler.html#kfp.compiler.Compiler.compile
@@ -1,7 +1,7 @@
 +++
 title = "Connect the Pipelines SDK to Kubeflow Pipelines"
 description = "How to connect the Pipelines SDK to Kubeflow Pipelines in various ways"
-weight = 25
+weight = 7
 +++
 
 How to connect Pipelines SDK to Kubeflow Pipelines will depend on __what kind__ of Kubeflow deployment you have, and __from where you are running your code__.
@@ -14,8 +14,8 @@ How to connect Pipelines SDK to Kubeflow Pipelines will depend on __what kind__
 
 {{% alert title="Tip" color="info" %}}
 Before you begin, you will need to:
-* [Deploy Kubeflow Pipelines](/docs/components/pipelines/installation/overview/)
-* [Install the Kubeflow Pipelines SDK](/docs/components/pipelines/sdk/install-sdk/)
+* [Deploy Kubeflow Pipelines](/docs/components/pipelines/legacy-v1/overview/)
+* [Install the Kubeflow Pipelines SDK](/docs/components/pipelines/legacy-v1/sdk/install-sdk/)
 {{% /alert %}}
 
 ## Full Kubeflow <sub>(from inside cluster)</sub>
@@ -309,7 +309,7 @@ This information only applies to _Standalone Kubeflow Pipelines_.
 When running inside the Kubernetes cluster, you may connect Pipelines SDK directly to the `ml-pipeline-ui` service via [cluster-internal service DNS resolution](https://kubernetes.io/docs/concepts/services-networking/service/#discovering-services).
 
 {{% alert title="Tip" color="info" %}}
-In [standalone deployments](/docs/components/pipelines/installation/standalone-deployment/) of Kubeflow Pipelines, there is no authentication enforced on the `ml-pipeline-ui` service.
+In [standalone deployments](/docs/components/pipelines/legacy-v1/installation/standalone-deployment/) of Kubeflow Pipelines, there is no authentication enforced on the `ml-pipeline-ui` service.
 {{% /alert %}}
 
 For example, when running in the __same namespace__ as Kubeflow:
@@ -350,7 +350,7 @@ This information only applies to _Standalone Kubeflow Pipelines_.
 When running outside the Kubernetes cluster, you may connect Pipelines SDK to the `ml-pipeline-ui` service by using [kubectl port-forwarding](https://kubernetes.io/docs/tasks/access-application-cluster/port-forward-access-application-cluster/).
 
 {{% alert title="Tip" color="info" %}}
-In [standalone deployments](/docs/components/pipelines/installation/standalone-deployment/) of Kubeflow Pipelines, there is no authentication enforced on the `ml-pipeline-ui` service.
+In [standalone deployments](/docs/components/pipelines/legacy-v1/installation/standalone-deployment/) of Kubeflow Pipelines, there is no authentication enforced on the `ml-pipeline-ui` service.
 {{% /alert %}}
 
 __Step 1:__ run the following command on your external system to initiate port-forwarding:
@@ -375,6 +375,6 @@ print(client.list_experiments())
 
 ## Next Steps
 
-* [Using the Kubeflow Pipelines SDK](/docs/components/pipelines/tutorials/sdk-examples/)
+* [Using the Kubeflow Pipelines SDK](/docs/components/pipelines/legacy-v1/tutorials/sdk-examples/)
 * [Kubeflow Pipelines SDK Reference](https://kubeflow-pipelines.readthedocs.io/en/stable/)
-* [Experiment with the Kubeflow Pipelines API](/docs/components/pipelines/tutorials/api-pipelines/)
+* [Experiment with the Kubeflow Pipelines API](/docs/components/pipelines/legacy-v1/tutorials/api-pipelines/)
@@ -1,7 +1,6 @@
 +++
-title = "Control Flow"
-description = "Create pipelines with control flow"
-weight = 2
+title = "Create pipelines with control flow"
+weight = 9
 +++
 
 {{% kfp-v2-keywords %}}
@@ -209,8 +208,8 @@ def my_pipeline(text: str = 'message'):
 
 Note that the component used for the caller task (`print_op` in the example above) requires a default value for all inputs it consumes from an upstream task. The default value is applied if the upstream task fails to produce the outputs that are passed to the caller task. Specifying default values ensures that the caller task always succeeds, regardless of the status of the upstream task.
 
-[data-passing]: /docs/components/pipelines/v2/pipelines/pipeline-basics#data-passing-and-task-dependencies
-[pipeline-basics]: /docs/components/pipelines/v2/pipelines/pipeline-basics
+[data-passing]: /docs/components/pipelines/user-guides/components/compose-components-into-pipelines#data-passing-and-task-dependencies
+[pipeline-basics]: /docs/components/pipelines/user-guides/components/compose-components-into-pipelines
 [dsl-condition]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/dsl.html#kfp.dsl.Condition
 [dsl-exithandler]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/dsl.html#kfp.dsl.ExitHandler
 [dsl-parallelfor]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/dsl.html#kfp.dsl.ParallelFor
@ -1,7 +1,6 @@
|
|||
+++
|
||||
title = "Local Execution"
|
||||
description = "Execute KFP pipelines locally"
|
||||
weight = 9
|
||||
title = "Execute KFP pipelines locally"
|
||||
weight = 3
|
||||
+++
|
||||
|
||||
{{% kfp-v2-keywords %}}
|
||||
|
|
@ -133,9 +132,9 @@ Local execution comes with several limitations:
|
|||
While local pipeline execution has full support for sequential and nested pipelines, it does not yet support `dsl.Condition`, `dsl.ParallelFor`, or `dsl.ExitHandler`.

-[lightweight-python-component]: /docs/components/pipelines/v2/components/lightweight-python-components
-[containerized-python-components]: /docs/components/pipelines/v2/components/containerized-python-components
-[container-components]: /docs/components/pipelines/v2/components/container-components
+[lightweight-python-component]: /docs/components/pipelines/user-guides/components/lightweight-python-components/
+[containerized-python-components]: /docs/components/pipelines/user-guides/components/containerized-python-components
+[container-components]: /docs/components/pipelines/user-guides/components/container-components

@@ -1,7 +1,6 @@
 +++
-title = "Platform-specific Features"
-description = "Author tasks with platform-specific functionality"
-weight = 12
+title = "Author Tasks with Platform-Specific Functionality"
+weight = 10
 +++

@@ -106,14 +105,14 @@ Finally, we can schedule deletion of the PVC after `task2` finishes to clean up
For the full pipeline and more information, see a [similar example][full-example] in the [`kfp-kubernetes` documentation][kfp-kubernetes-docs].

-[ir-yaml]: /docs/components/pipelines/v2/compile-a-pipeline#ir-yaml
-[oss-be]: /docs/components/pipelines/v2/installation/
+[ir-yaml]: /docs/components/pipelines/user-guides/core-functions/compile-a-pipeline#ir-yaml
+[oss-be]: /docs/components/pipelines/operator-guides/installation/
[kfp-kubernetes-pypi]: https://pypi.org/project/kfp-kubernetes/
-[task-level-config-methods]: /docs/components/pipelines/v2/pipelines/pipeline-basics/#task-configurations
+[task-level-config-methods]: /docs/components/pipelines/user-guides/components/compose-components-into-pipelines/#task-configurations
[kfp-kubernetes-docs]: https://kfp-kubernetes.readthedocs.io/
[persistent-volume]: https://kubernetes.io/docs/concepts/storage/persistent-volumes/
[storage-class]: https://kubernetes.io/docs/concepts/storage/storage-classes/
[access-mode]: https://kubernetes.io/docs/concepts/storage/persistent-volumes/#access-modes
[full-example]: https://kfp-kubernetes.readthedocs.io/en/kfp-kubernetes-0.0.1/#persistentvolumeclaim-dynamically-create-pvc-mount-then-delete
-[authoring-components]: http://localhost:1313/docs/components/pipelines/v2/components/
-[authoring-pipelines]: http://localhost:1313/docs/components/pipelines/v2/pipelines/
+[authoring-components]: /docs/components/pipelines/user-guides/components/
+[authoring-pipelines]: /docs/components/pipelines/user-guides/

@@ -1,7 +1,7 @@
 +++
 title = "Run a Pipeline"
 description = "Execute a pipeline on the KFP backend"
-weight = 8
+weight = 1
 +++

 {{% kfp-v2-keywords %}}

@@ -77,7 +77,7 @@ kfp run create --experiment-name my-experiment --package-file path/to/pipeline.y
For more information about the `kfp run create` command, see [Command Line Interface][kfp-run-create-reference-docs] in the [KFP SDK reference documentation][kfp-sdk-api-ref]. For a summary of the available commands in the KFP CLI, see [Command-line Interface][kfp-cli].

[kfp-sdk-api-ref]: https://kubeflow-pipelines.readthedocs.io/en/master/index.html
-[compile-a-pipeline]: /docs/components/pipelines/v2/compile-a-pipeline/
+[compile-a-pipeline]: /docs/components/pipelines/user-guides/core-functions/compile-a-pipeline/
[kfp-sdk-api-ref-client]: https://kubeflow-pipelines.readthedocs.io/en/master/source/client.html
-[kfp-cli]: /docs/components/pipelines/v2/cli/
+[kfp-cli]: /docs/components/pipelines/user-guides/core-functions/cli/
[kfp-run-create-reference-docs]: https://kubeflow-pipelines.readthedocs.io/en/master/source/cli.html#kfp-run-create

@@ -0,0 +1,4 @@
+++
title = "Data Handling"
weight = 4
+++

@@ -1,7 +1,6 @@
 +++
-title = "Artifacts"
-description = "Create, use, pass, and track ML artifacts"
-weight = 2
+title = "Create, use, pass, and track ML artifacts"
+weight = 3
 +++

 {{% kfp-v2-keywords %}}

@@ -196,7 +195,7 @@ def augment_and_train(dataset: Dataset) -> Model:

The [KFP SDK compiler][compiler] will type check artifact usage according to the rules described in [Type Checking][type-checking].

-Please see [Pipeline Basics](pipelines) for comprehensive documentation on how to author a pipeline.
+Please see [Pipeline Basics][pipelines] for comprehensive documentation on how to author a pipeline.

### Lists of artifacts

@@ -241,14 +240,14 @@ On the [KFP open source][oss-be] UI, `ClassificationMetrics`, `SlicedClassificat
[dsl-slicedclassificationmetrics]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/dsl.html#kfp.dsl.SlicedClassificationMetrics
[dsl-html]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/dsl.html#kfp.dsl.HTML
[dsl-markdown]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/dsl.html#kfp.dsl.Markdown
-[type-checking]: /docs/components/pipelines/v2/compile-a-pipeline#type-checking
-[oss-be]: /docs/components/pipelines/v2/installation/
-[pipelines]: /docs/components/pipelines/v2/pipelines/pipelines-basics/
-[container-components]: /docs/components/pipelines/v2/components/lightweight-python-components/
-[python-components]: /docs/components/pipelines/v2/components/container-components
+[type-checking]: /docs/components/pipelines/user-guides/core-functions/compile-a-pipeline#type-checking
+[oss-be]: /docs/components/pipelines/operator-guides/installation/
+[pipelines]: /docs/components/pipelines/user-guides/components/compose-components-into-pipelines/
+[container-components]: /docs/components/pipelines/user-guides/components/lightweight-python-components/
+[python-components]: /docs/components/pipelines/user-guides/components/container-components
[dsl-parallelfor]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/dsl.html#kfp.dsl.ParallelFor
[dsl-collected]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/dsl.html#kfp.dsl.Collected
-[parallel-looping]: https://www.kubeflow.org/docs/components/pipelines/v2/pipelines/control-flow/#parallel-looping-dslparallelfor
-[traditional-artifact-syntax]: /docs/components/pipelines/v2/data-types/artifacts/#traditional-artifact-syntax
-[multiple-outputs]: /docs/components/pipelines/v2/data-types/parameters/#multiple-output-parameters
-[pythonic-artifact-syntax]: /docs/components/pipelines/v2/data-types/artifacts/#new-pythonic-artifact-syntax
+[parallel-looping]: /docs/components/pipelines/user-guides/core-functions/control-flow/#parallel-looping-dslparallelfor
+[traditional-artifact-syntax]: /docs/components/pipelines/user-guides/data-handling/artifacts/#traditional-artifact-syntax
+[multiple-outputs]: /docs/components/pipelines/user-guides/data-handling/parameters/#multiple-output-parameters
+[pythonic-artifact-syntax]: /docs/components/pipelines/user-guides/data-handling/artifacts/#new-pythonic-artifact-syntax

@@ -1,7 +1,7 @@
 +++
 title = "Data Types"
 description = "Component and pipeline I/O types"
-weight = 6
+weight = 1
 +++

 {{% kfp-v2-keywords %}}

@@ -14,6 +14,6 @@ So far [Hello World pipeline][hello-world] and the examples in [Components][comp

KFP automatically tracks the way parameters and artifacts are passed between components and stores this data passing history in [ML Metadata][ml-metadata]. This enables out-of-the-box ML artifact lineage tracking and easily reproducible pipeline executions. Furthermore, KFP's strongly-typed components provide a data contract between tasks in a pipeline.

-[hello-world]: /docs/components/pipelines/v2/hello-world
-[components]: /docs/components/pipelines/v2/components
+[hello-world]: /docs/components/pipelines/getting-started
+[components]: /docs/components/pipelines/user-guides/components
[ml-metadata]: https://github.com/google/ml-metadata

@@ -1,7 +1,6 @@
 +++
-title = "Parameters"
-description = "Pass small amounts of data between components"
-weight = 1
+title = "Pass small amounts of data between components"
+weight = 2
 +++

 {{% kfp-v2-keywords %}}

@@ -167,10 +166,10 @@ def my_pipeline() -> NamedTuple('pipeline_outputs', c=int, d=str):

[ml-metadata]: https://github.com/google/ml-metadata
-[lightweight-python-components]: /docs/components/pipelines/v2/components/lightweight-python-components
-[containerized-python-components]: /docs/components/pipelines/v2/components/containerized-python-components
-[container-component]: /docs/components/pipelines/v2/components/container-components
-[container-component-outputs]: /docs/components/pipelines/v2/components/container-components#create-component-outputs
+[lightweight-python-components]: /docs/components/pipelines/user-guides/components/lightweight-python-components/
+[containerized-python-components]: /docs/components/pipelines/user-guides/components/containerized-python-components
+[container-component]: /docs/components/pipelines/user-guides/components/container-components
+[container-component-outputs]: /docs/components/pipelines/user-guides/components/container-components#create-component-outputs
[pipelinetask]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/dsl.html#kfp.dsl.PipelineTask
[dsl-outputpath]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/dsl.html#kfp.dsl.OutputPath
[ml-metadata]: https://github.com/google/ml-metadata

@@ -1,7 +1,7 @@
 +++
 title = "Migrate from KFP SDK v1"
 description = "v1 to v2 migration instructions and breaking changes"
-weight = 3
+weight = 1
 +++

 {{% kfp-v2-keywords %}}

@@ -409,7 +409,7 @@ def flip_coin(rand: int, result: dsl.OutputPath(str)):

`VolumeOp` and `ResourceOp` expose direct access to Kubernetes resources within a pipeline definition. There is no support for these features on non-Kubernetes platforms.

-KFP v2 enables support for [platform-specific features](/docs/components/pipelines/v2/platform-specific-features/) via KFP SDK extension libraries. Kubernetes-specific features are supported in KFP v2 via the [`kfp-kubernetes`](https://kfp-kubernetes.readthedocs.io/) extension library.
+KFP v2 enables support for [platform-specific features](/docs/components/pipelines/user-guides/core-functions/platform-specific-features/) via KFP SDK extension libraries. Kubernetes-specific features are supported in KFP v2 via the [`kfp-kubernetes`](https://kfp-kubernetes.readthedocs.io/) extension library.

#### v1 component YAML support

@@ -575,27 +575,27 @@ def training_pipeline(number_of_epochs: int = 1):

If you believe we missed a breaking change or an important migration step, please [create an issue][new-issue] describing the change in the [kubeflow/pipelines repository][pipelines-repo].

-[artifacts]: /docs/components/pipelines/v2/data-types/artifacts
-[cli]: /docs/components/pipelines/v2/cli/
-[compile]: /docs/components/pipelines/v2/compile-a-pipeline
+[artifacts]: /docs/components/pipelines/user-guides/data-handling/artifacts
+[cli]: /docs/components/pipelines/user-guides/core-functions/cli/
+[compile]: /docs/components/pipelines/user-guides/core-functions/compile-a-pipeline
[compiler-compile]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/compiler.html#kfp.compiler.Compiler.compile
[components-load-component-from-file]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/components.html#kfp.components.load_component_from_file
-[container-components]: https://www.kubeflow.org/docs/components/pipelines/v2/components/containerized-python-components/
-[containerized-python-components]: /docs/components/pipelines/v2/components/containerized-python-components/
+[container-components]: /docs/components/pipelines/user-guides/components/containerized-python-components/
+[containerized-python-components]: /docs/components/pipelines/user-guides/components/containerized-python-components/
[create-custom-training-job-from-component]: https://cloud.google.com/vertex-ai/docs/pipelines/customjob-component
[dsl-collected]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/dsl.html#kfp.dsl.Collected
[dsl-component]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/dsl.html#kfp.dsl.component
[dsl-container-component]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/dsl.html#kfp.dsl.container_component
[dsl-parallelfor]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/dsl.html#kfp.dsl.ParallelFor
[gcpc]: https://cloud.google.com/vertex-ai/docs/pipelines/components-introduction
-[ir-yaml]: /docs/components/pipelines/v2/compile-a-pipeline/#ir-yaml
-[lightweight-python-components]: /docs/components/pipelines/v2/components/lightweight-python-components/
-[load]: /docs/components/pipelines/v2/load-and-share-components/
+[ir-yaml]: /docs/components/pipelines/user-guides/core-functions/compile-a-pipeline/#ir-yaml
+[lightweight-python-components]: /docs/components/pipelines/user-guides/components/lightweight-python-components/
+[load]: /docs/components/pipelines/user-guides/components/load-and-share-components/
[new-issue]: https://github.com/kubeflow/pipelines/issues/new
-[oss-be-v1]: /docs/components/pipelines/v1/
-[oss-be-v2]: /docs/components/pipelines/v2/installation/
-[parallelfor-control-flow]: /docs/components/pipelines/v2/pipelines/control-flow/#parallel-looping-dslparallelfor
-[parameters]: /docs/components/pipelines/v2/data-types/parameters
+[oss-be-v1]: /docs/components/pipelines/legacy-v1/
+[oss-be-v2]: /docs/components/pipelines/operator-guides/installation/
+[parallelfor-control-flow]: /docs/components/pipelines/user-guides/core-functions/control-flow/#parallel-looping-dslparallelfor
+[parameters]: /docs/components/pipelines/user-guides/data-handling/parameters
[pipelines-repo]: https://github.com/kubeflow/pipelines
[semver-minor-version]: https://semver.org/#:~:text=MINOR%20version%20when%20you%20add%20functionality%20in%20a%20backwards%20compatible%20manner
[v1-component-yaml-example]: https://github.com/kubeflow/pipelines/blob/01c87f8a032e70a6ca92cdbefa974a7da387f204/sdk/python/test_data/v1_component_yaml/add_component.yaml

@@ -604,4 +604,4 @@ If you believe we missed a breaking change or an important migration step, pleas
[vertex-sdk]: https://cloud.google.com/vertex-ai/docs/pipelines/run-pipeline#vertex-ai-sdk-for-python
[argo]: https://argoproj.github.io/argo-workflows/
[dsl-pipelinetask-set-env-variable]: https://kubeflow-pipelines.readthedocs.io/en/2.0.0b13/source/dsl.html#kfp.dsl.PipelineTask.set_env_variable
-[task-configuration-methods]: https://www.kubeflow.org/docs/components/pipelines/v2/pipelines/pipeline-basics/#task-configurations
+[task-configuration-methods]: /docs/components/pipelines/user-guides/components/compose-components-into-pipelines/#task-configurations

@@ -1,7 +0,0 @@
+++
title = "v2"
description = "Kubeflow Pipelines v2 Documentation"
weight = 16
+++

{{% kfp-v2-keywords %}}

@@ -1,8 +0,0 @@
+++
title = "Administration"
description = "Guidance on managing your Kubeflow Pipelines instances"
weight = 5
+++

When hosting Kubeflow and Kubeflow Pipelines, there are customizations and best practices
you can apply to facilitate the administration of the Kubeflow Pipelines control plane.

@@ -1,353 +0,0 @@
+++
title = "Quickstart"
description = "Get started with Kubeflow Pipelines"
weight = 2
+++

{{% kfp-v2-keywords %}}

<style type="text/css">
summary::marker {
  font-size: 1.5rem;
}
summary {
  margin-bottom: 1.5rem;
}
</style>

<!-- TODO: add UI screenshots for final pipeline -->
This tutorial helps you get started with deploying a KFP standalone instance, using the KFP dashboard, and creating a pipeline with the KFP SDK.

Before you begin, you need the following prerequisites:

* **An existing Kubernetes cluster**: If you don't have a Kubernetes cluster, see [Installation][installation] for instructions about how to get one.

* **The [kubectl](https://kubernetes.io/docs/tasks/tools/) command-line tool**: Install and configure your [kubectl context](https://kubernetes.io/docs/tasks/access-application-cluster/configure-access-multiple-clusters/) to connect with your cluster.

After you complete the prerequisites, click each section to view the instructions:

<details>
<summary><a name="kfp_qs_deployment"></a><h2 style="display:inline;">Deploy a KFP standalone instance into your cluster</h2></summary>

This step demonstrates how to deploy a KFP standalone instance into an existing Kubernetes cluster.

Run the following script after replacing `PIPELINE_VERSION` with the desired version of KFP (releases are listed [here][releases]):

```shell
export PIPELINE_VERSION={{% pipelines/latest-version %}}

kubectl apply -k "github.com/kubeflow/pipelines/manifests/kustomize/cluster-scoped-resources?ref=$PIPELINE_VERSION"
kubectl wait --for condition=established --timeout=60s crd/applications.app.k8s.io
kubectl apply -k "github.com/kubeflow/pipelines/manifests/kustomize/env/dev?ref=$PIPELINE_VERSION"
```

After you deploy Kubeflow Pipelines, obtain your KFP endpoint by following [these instructions][installation].
<!-- TODO: add more precise section link and descriptive link text (with more context) when available -->
</details>

<details>
<summary><a name="kfp_qs_basic_pipeline"></a><h2 style="display:inline;">Run a basic pipeline using the KFP Dashboard and SDK</h2></summary>

### **KFP Dashboard** ###
Kubeflow Pipelines offers a few samples that you can use to try out Kubeflow Pipelines quickly. The steps below show you how to run a basic sample that includes some Python operations, but doesn't include a machine learning (ML) workload:

1. Click the name of the sample, **[Tutorial] Data passing in python components**, on the pipelines UI:
    <img src="/docs/images/v2/click-pipeline-example.png"
    alt="Pipelines UI"
    class="mt-3 mb-3 border border-info rounded">

2. Click **Create run**:
    <img src="/docs/images/v2/pipelines-start-run.png"
    alt="Creating a run on the pipelines UI"
    class="mt-3 mb-3 border border-info rounded">

3. Follow the prompts to create a **run**. The sample supplies default values for all the parameters you need. The following screenshot assumes you are now creating a run named _My first run_:
    <img src="/docs/images/v2/pipelines-start-run-details.png"
    alt="Details page of creating a run on the pipelines UI"
    class="mt-3 mb-3 border border-info rounded">

4. Click **Start** to run the pipeline.

5. Explore the graph and other aspects of your run by clicking on the nodes (components) of the graph and the other UI elements:
    <img src="/docs/images/v2/pipelines-basic-run.png"
    alt="Run results on the pipelines UI"
    class="mt-3 mb-3 border border-info rounded">

You can find the [source code for the **Data passing in python components** tutorial](https://github.com/kubeflow/pipelines/tree/2.0.0/samples/tutorials/Data%20passing%20in%20python%20components) in the Kubeflow Pipelines repo.

### **KFP SDK** ###
This section shows how to use the KFP SDK to compose a pipeline and submit it for execution by KFP.

* Run the following script to install the KFP SDK:

    ```shell
    pip install kfp
    ```

The following simple pipeline adds two integers, and then adds another integer to the result to come up with a final sum.

```python
from kfp import dsl
from kfp import client


@dsl.component
def addition_component(num1: int, num2: int) -> int:
    return num1 + num2


@dsl.pipeline(name='addition-pipeline')
def my_pipeline(a: int, b: int, c: int = 10):
    add_task_1 = addition_component(num1=a, num2=b)
    add_task_2 = addition_component(num1=add_task_1.output, num2=c)


endpoint = '<KFP_ENDPOINT>'
kfp_client = client.Client(host=endpoint)
run = kfp_client.create_run_from_pipeline_func(
    my_pipeline,
    arguments={
        'a': 1,
        'b': 2
    },
)
url = f'{endpoint}/#/runs/details/{run.run_id}'
print(url)
```

The above code consists of the following parts:

* In the first part, the following lines create a [Lightweight Python Component][lightweight-python-component] by using the `@dsl.component` decorator:

    ```python
    @dsl.component
    def addition_component(num1: int, num2: int) -> int:
        return num1 + num2
    ```

    The `@dsl.component` decorator transforms a Python function into a component, which can be used within a pipeline. You are required to specify the type annotations on the parameters as well as the return value, as these inform the KFP executor how to serialize and deserialize the data passed between components. The type annotations and return value also enable the KFP compiler to type check any data that is passed between pipeline tasks.

* In the second part, the following lines [create a pipeline][pipelines] by using the `@dsl.pipeline` decorator:

    ```python
    @dsl.pipeline(name='addition-pipeline')
    def my_pipeline(a: int, b: int, c: int = 10):
        ...
    ```

    Like the component decorator, the `@dsl.pipeline` decorator transforms a Python function into a pipeline that can be executed by the KFP backend. The pipeline can have arguments. These arguments also require type annotations. In this example, the argument `c` has a default value of `10`.

* In the third part, the following lines connect the components together to form a computational directed acyclic graph (DAG) within the body of the pipeline function:

    ```python
    add_task_1 = addition_component(num1=a, num2=b)
    add_task_2 = addition_component(num1=add_task_1.output, num2=c)
    ```

    This example instantiates two different addition tasks from the same component named `addition_component`, by passing different arguments to the component function for each task, as follows:
    * The first task accepts pipeline parameters `a` and `b` as input arguments.
    * The second task accepts `add_task_1.output`, which is the output from `add_task_1`, as the first input argument. The pipeline parameter `c` is the second input argument.

    You must always pass component arguments as keyword arguments.

* In the fourth part, the following lines instantiate a KFP client using the endpoint obtained in the [deployment step](#kfp_qs_deployment) and submit the pipeline to the KFP backend with the required pipeline arguments:

    ```python
    endpoint = '<KFP_ENDPOINT>'
    kfp_client = client.Client(host=endpoint)
    run = kfp_client.create_run_from_pipeline_func(
        my_pipeline,
        arguments={
            'a': 1,
            'b': 2
        },
    )
    url = f'{endpoint}/#/runs/details/{run.run_id}'
    print(url)
    ```

    In this example, replace `endpoint` with the KFP endpoint URL you obtained in the [deployment step](#kfp_qs_deployment).

    Alternatively, you can compile the pipeline to [IR YAML][ir-yaml] for use at another time:

    ```python
    from kfp import compiler

    compiler.Compiler().compile(pipeline_func=my_pipeline, package_path='pipeline.yaml')
    ```

To view the pipeline run on the KFP Dashboard, go to the URL printed above.

To view the details of each task, including input and output, click the appropriate task node.
<!-- TODO: add logs to this list when available in v2 -->

<img src="/docs/images/pipelines/addition_pipeline_ui.png"
alt="Pipelines Dashboard"
class="mt-3 mb-3 border border-info rounded">

</details>

<details>
<summary><a name="kfp_qs_advanced_ml"></a><h2 style="display:inline;">Build a more advanced ML pipeline</h2></summary>

This step demonstrates how to build a more advanced machine learning (ML) pipeline that leverages additional KFP pipeline composition features.

The following ML pipeline creates a dataset, normalizes the features of the dataset as a preprocessing step, and trains a simple ML model on the data using different hyperparameters:

```python
from typing import List

from kfp import client
from kfp import dsl
from kfp.dsl import Dataset
from kfp.dsl import Input
from kfp.dsl import Model
from kfp.dsl import Output


@dsl.component(packages_to_install=['pandas==1.3.5'])
def create_dataset(iris_dataset: Output[Dataset]):
    import pandas as pd

    csv_url = 'https://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data'
    col_names = [
        'Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width', 'Labels'
    ]
    df = pd.read_csv(csv_url, names=col_names)

    with open(iris_dataset.path, 'w') as f:
        df.to_csv(f)


@dsl.component(packages_to_install=['pandas==1.3.5', 'scikit-learn==1.0.2'])
def normalize_dataset(
    input_iris_dataset: Input[Dataset],
    normalized_iris_dataset: Output[Dataset],
    standard_scaler: bool,
    min_max_scaler: bool,
):
    if standard_scaler is min_max_scaler:
        raise ValueError(
            'Exactly one of standard_scaler or min_max_scaler must be True.')

    import pandas as pd
    from sklearn.preprocessing import MinMaxScaler
    from sklearn.preprocessing import StandardScaler

    with open(input_iris_dataset.path) as f:
        df = pd.read_csv(f)
    labels = df.pop('Labels')

    if standard_scaler:
        scaler = StandardScaler()
    if min_max_scaler:
        scaler = MinMaxScaler()

    df = pd.DataFrame(scaler.fit_transform(df))
    df['Labels'] = labels
    with open(normalized_iris_dataset.path, 'w') as f:
        df.to_csv(f)


@dsl.component(packages_to_install=['pandas==1.3.5', 'scikit-learn==1.0.2'])
def train_model(
    normalized_iris_dataset: Input[Dataset],
    model: Output[Model],
    n_neighbors: int,
):
    import pickle

    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    with open(normalized_iris_dataset.path) as f:
        df = pd.read_csv(f)

    y = df.pop('Labels')
    X = df

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    clf = KNeighborsClassifier(n_neighbors=n_neighbors)
    clf.fit(X_train, y_train)
    with open(model.path, 'wb') as f:
        pickle.dump(clf, f)


@dsl.pipeline(name='iris-training-pipeline')
def my_pipeline(
    standard_scaler: bool,
    min_max_scaler: bool,
    neighbors: List[int],
):
    create_dataset_task = create_dataset()

    normalize_dataset_task = normalize_dataset(
        input_iris_dataset=create_dataset_task.outputs['iris_dataset'],
        standard_scaler=True,
        min_max_scaler=False)

    with dsl.ParallelFor(neighbors) as n_neighbors:
        train_model(
            normalized_iris_dataset=normalize_dataset_task
            .outputs['normalized_iris_dataset'],
            n_neighbors=n_neighbors)


endpoint = '<KFP_UI_URL>'
kfp_client = client.Client(host=endpoint)
run = kfp_client.create_run_from_pipeline_func(
    my_pipeline,
    arguments={
        'min_max_scaler': True,
        'standard_scaler': False,
        'neighbors': [3, 6, 9]
    },
)
url = f'{endpoint}/#/runs/details/{run.run_id}'
print(url)
```

This example introduces the following new features in the pipeline:

* Some Python **packages to install** are added at component runtime, using the `packages_to_install` argument on the `@dsl.component` decorator, as follows:

    `@dsl.component(packages_to_install=['pandas==1.3.5'])`

    To use a library after installing it, you must include its import statements within the scope of the component function, so that the library is imported at component runtime.

* **Input and output artifacts** of types `Dataset` and `Model` are introduced in the component signature to describe the input and output artifacts of the components. This is done using the type annotation generics `Input[]` and `Output[]` for input and output artifacts respectively.

    Within the scope of a component, artifacts can be read (for inputs) and written (for outputs) via the `.path` attribute. The KFP backend ensures that *input* artifact files are copied *to* the executing pod's local file system from the remote storage at runtime, so that the component function can read input artifacts from the local file system. By comparison, *output* artifact files are copied *from* the local file system of the pod to remote storage, when the component finishes running. This way, the output artifacts persist outside the pod. In both cases, the component author needs to interact with the local file system only to create persistent artifacts.

    The arguments for the parameters annotated with `Output[]` are not passed to components by the pipeline author. The KFP backend passes this artifact during component runtime, so that component authors don't need to be concerned about the path to which the output artifacts are written. After an output artifact is written, the backend executing the component recognizes the KFP artifact types (`Dataset` or `Model`), and organizes them on the Dashboard.

    An output artifact can be passed as an input to a downstream component using the `.outputs` attribute of the source task and the output artifact parameter name, as follows:

    `create_dataset_task.outputs['iris_dataset']`

* One of the **DSL control flow features**, `dsl.ParallelFor`, is used. It is a context manager that lets pipeline authors create tasks that execute in parallel in a loop. Using `dsl.ParallelFor` to iterate over the `neighbors` pipeline argument lets you execute the `train_model` component with different arguments and test multiple hyperparameters in one pipeline run. Other control flow features include `dsl.Condition` and `dsl.ExitHandler`.
</details>
Congratulations! You now have a KFP deployment, an end-to-end ML pipeline, and an introduction to the UI. That's just the beginning of what KFP pipelines and the Dashboard have to offer.
<!-- TODO: Add some more content to direct the user to what comes next. -->
## Next steps
* See [Installation][installation] for additional ways to deploy KFP
* See [Pipelines][pipelines] to learn more about features available when authoring pipelines
[pipelines]: /docs/components/pipelines/v2/pipelines/
[installation]: /docs/components/pipelines/v2/installation/
[ir-yaml]: /docs/components/pipelines/v2/compile-a-pipeline/#ir-yaml
[lightweight-python-component]: /docs/components/pipelines/v2/components/lightweight-python-components
[releases]: https://github.com/kubeflow/pipelines/releases?q=Version&expanded=true
@@ -1,9 +0,0 @@
+++
title = "Pipelines"
description = "Author KFP pipelines"
weight = 5
+++
{{% kfp-v2-keywords %}}
A *pipeline* is a definition of a workflow containing one or more tasks, including how tasks relate to each other to form a computational graph. Pipelines may have inputs which can be passed to tasks within the pipeline and may surface outputs created by tasks within the pipeline. Pipelines can themselves be used as components within other pipelines.
@@ -38,9 +38,9 @@ Once Feast is installed within the same Kubernetes cluster as Kubeflow, users ca
Feast APIs can roughly be grouped into the following sections:
* __Feature definition and management__: Feast provides both a [Python SDK](https://docs.feast.dev/getting-started/quickstart) and a [CLI](https://docs.feast.dev/reference/feast-cli-commands) for interacting with Feast Core. Feast Core allows users to define and register features and entities, along with their associated metadata and schemas. The Python SDK is typically used from within a Jupyter notebook by end users to administer Feast, but ML teams may opt to version control feature specifications in order to follow a GitOps-based approach.
-* __Model training__: The Feast Python SDK can be used to trigger the [creation of training datasets](https://docs.feast.dev/how-to-guides/feast-snowflake-gcp-aws/build-a-training-dataset). The most natural place to use this SDK is to create a training dataset as part of a [Kubeflow Pipeline](/docs/components/pipelines/introduction) prior to model training.
+* __Model training__: The Feast Python SDK can be used to trigger the [creation of training datasets](https://docs.feast.dev/how-to-guides/feast-snowflake-gcp-aws/build-a-training-dataset). The most natural place to use this SDK is to create a training dataset as part of a [Kubeflow Pipeline](/docs/components/pipelines/overview) prior to model training.
-* __Model serving__: The Feast Python SDK can also be used for [online feature retrieval](https://docs.feast.dev/how-to-guides/feast-snowflake-gcp-aws/read-features-from-the-online-store). This client is used to retrieve feature values for inference with [Model Serving](/docs/components/pipelines/introduction) systems like KFServing, TFX, or Seldon.
+* __Model serving__: The Feast Python SDK can also be used for [online feature retrieval](https://docs.feast.dev/how-to-guides/feast-snowflake-gcp-aws/read-features-from-the-online-store). This client is used to retrieve feature values for inference with [Model Serving](/docs/components/pipelines/overview) systems like KFServing, TFX, or Seldon.
## Examples
@@ -33,4 +33,4 @@ workflow.
## Next steps
Work through one of the
-[Kubeflow Pipelines samples](/docs/components/pipelines/tutorials/build-pipeline/).
+[Kubeflow Pipelines samples](/docs/components/pipelines/legacy-v1/tutorials/build-pipeline/).
@@ -54,7 +54,7 @@
inkscape:pagecheckerboard="false" />
<a
id="a3946"
-xlink:href="https://www.kubeflow.org/docs/components/pipelines/introduction/"
+xlink:href="https://www.kubeflow.org/docs/components/pipelines/overview/"
target="_blank"
transform="matrix(0.49363962,0,0,0.49963206,1.4381833,0.08830732)">
<image