Machine Learning Pipelines for Kubeflow


Overview of the Kubeflow Pipelines service

Kubeflow is a machine learning (ML) toolkit that is dedicated to making deployments of ML workflows on Kubernetes simple, portable, and scalable.

Kubeflow pipelines are reusable end-to-end ML workflows built using the Kubeflow Pipelines SDK.
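For readers new to the SDK, the following is a minimal, illustrative sketch (not taken from this repository's samples) of a two-step pipeline assembled from lightweight Python components; the add function, the pipeline name, and the output archive name are all placeholders.

```python
import kfp.compiler
import kfp.components as comp
import kfp.dsl as dsl

def add(a: float, b: float) -> float:
    """Return the sum of two numbers."""
    return a + b

# Wrap the plain Python function as a pipeline component (a containerized step).
add_op = comp.func_to_container_op(add)

@dsl.pipeline(
    name='Addition pipeline',
    description='Chains two add steps to compute (a + b) + c.'
)
def add_pipeline(a: float = 1.0, b: float = 2.0, c: float = 3.0):
    first = add_op(a, b)
    add_op(first.output, c)

# Compile the pipeline into an archive that can be uploaded via the UI or API.
kfp.compiler.Compiler().compile(add_pipeline, 'add_pipeline.zip')
```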

The Kubeflow Pipelines service has the following goals:

  • End-to-end orchestration: enabling and simplifying the orchestration of end-to-end machine learning pipelines.
  • Easy experimentation: making it easy for you to try numerous ideas and techniques and to manage your various trials/experiments.
  • Easy re-use: enabling you to re-use components and pipelines to quickly assemble end-to-end solutions without having to rebuild each time (see the sketch after this list).
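
Re-use in practice usually means packaging a step as a component specification (a component.yaml file) that any pipeline can load. The snippet below is a self-contained, hypothetical sketch: the Echo component and pipeline are invented for illustration, while load_component_from_text (and its siblings load_component_from_file and load_component_from_url) are the SDK helpers for wiring shared components into a pipeline.

```python
import kfp.components as comp
import kfp.dsl as dsl

# A toy component specification; real teams would typically share this as a
# component.yaml file and load it with load_component_from_file/_url instead.
echo_op = comp.load_component_from_text("""
name: Echo
description: Prints a message.
inputs:
- {name: message, type: String}
implementation:
  container:
    image: alpine:3.9
    command: [echo, {inputValue: message}]
""")

@dsl.pipeline(
    name='Re-use example',
    description='Uses the same shared component twice without rebuilding it.'
)
def reuse_pipeline(msg: str = 'hello'):
    echo_op(message=msg)            # first use of the shared component
    echo_op(message='hello again')  # second use, same component definition
```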

Documentation

Get started with your first pipeline and read further information in the Kubeflow Pipelines overview.

See the various ways you can use the Kubeflow Pipelines SDK.
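One common way to use the SDK is to submit runs programmatically through its client. The sketch below assumes the add_pipeline.zip archive compiled in the earlier example; the endpoint, experiment name, and run name are placeholders for your own deployment.

```python
import kfp

# Placeholder endpoint: point this at your Kubeflow Pipelines API service,
# e.g. via kubectl port-forward to localhost, or omit host when running in-cluster.
client = kfp.Client(host='http://localhost:8080')

experiment = client.create_experiment(name='demo-experiment')
run = client.run_pipeline(
    experiment_id=experiment.id,
    job_name='add-pipeline-run',
    pipeline_package_path='add_pipeline.zip',   # archive produced by the compiler
    params={'a': '1', 'b': '5', 'c': '7'})      # pipeline parameter overrides
print('Started run: %s' % run.id)
```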

See the Kubeflow Pipelines API doc for the API specification.

Consult the Python SDK reference docs when writing pipelines using the Python SDK.

Blog posts

Acknowledgments

Kubeflow Pipelines uses Argo under the hood to orchestrate Kubernetes resources. The Argo community has been very supportive, and we are very grateful.