pipelines/components
arena Make wget quieter (#2069) 2019-09-09 14:32:54 -07:00
aws [Component] Add VPC Interface Endpoint Support for SageMaker (#2299) 2019-10-03 18:11:56 -07:00
deprecated Remove dataflow components (#2161) 2019-09-23 11:12:27 -07:00
diagnostics/diagnose_me [Component] Add quota check in diagnose me component. (#3062) 2020-02-13 11:52:36 -08:00
filesystem Components - Filesystem (#2659) 2019-11-27 11:43:03 -08:00
gcp Release 85945e1092 (#2960) 2020-02-02 15:33:20 -08:00
git/clone Components - Git clone (#2658) 2019-11-26 20:29:19 -08:00
google-cloud/storage Components - Google Cloud Storage (#2532) 2019-11-07 18:06:19 -08:00
ibm-components update watson ML default framework version (#2398) 2019-10-15 14:36:09 -07:00
kubeflow Bump tensorflow in /components/kubeflow/dnntrainer/src (#2923) 2020-02-10 10:03:54 -08:00
local Release 85945e1092 (#2960) 2020-02-02 15:33:20 -08:00
nuclio add nuclio components (to build/deploy, delete, invoke functions) (#1295) 2019-05-08 01:58:33 -07:00
sample/keras/train_classifier SDK - Hiding Argo's workflow.uid placeholder behind DSL (#1683) 2019-10-07 18:33:11 -07:00
tfx Samples - Updated the TFX-KFP pipeline (#2867) 2020-01-17 16:22:07 -08:00
OWNERS add jiaxiao to the component owners (#2804) 2020-01-07 12:48:18 -08:00
README.md move old gcp components to deprecated folder (#2031) 2019-09-06 16:29:20 -07:00
build_image.sh common build image script (#815) 2019-02-13 10:37:19 -08:00
license.sh Initial commit of the kubeflow/pipeline project. 2018-11-02 14:02:31 -07:00
release.sh Build - Fix building TF images (#2736) 2019-12-16 14:25:38 -08:00
test_load_all_components.sh Test loading all component.yaml definitions (#1045) 2019-04-02 12:25:18 -07:00
third_party_licenses.csv Another fix of licenses (#2984) 2020-02-04 21:55:54 -08:00

README.md

Kubeflow pipeline components

Kubeflow pipeline components are implementations of Kubeflow pipeline tasks. Each task takes one or more artifacts as input and may produce one or more artifacts as output.

Example: XGBoost DataProc components
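
To make the task/artifact relationship concrete, here is a minimal sketch, assuming the kfp SDK (1.x API) is installed. The preprocess function, base image, and bucket path are hypothetical placeholders, not components from this directory.

```python
# Minimal sketch of two chained tasks, assuming the kfp SDK 1.x API.
# The function, base image, and bucket path below are hypothetical placeholders.
from kfp import dsl
from kfp.components import func_to_container_op


def preprocess(raw_data_path: str) -> str:
    """Consumes one input artifact path and returns the path of the produced artifact."""
    processed_path = raw_data_path + '.processed'
    # ... read raw_data_path, transform it, and write processed_path here ...
    return processed_path


# Wrap the function as a reusable pipeline component (task factory).
preprocess_op = func_to_container_op(preprocess, base_image='python:3.7')


@dsl.pipeline(name='example-pipeline', description='Tasks passing artifacts downstream.')
def example_pipeline(raw_data_path: str = 'gs://my-bucket/raw.csv'):
    first = preprocess_op(raw_data_path)
    # The output artifact of one task becomes the input of the next.
    second = preprocess_op(first.output)
```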

Each task usually includes the following parts:

  • Client code: the code that talks to endpoints to submit jobs. For example, code that calls the Google Dataproc API to submit a Spark job.
  • Runtime code: the code that does the actual job and usually runs in the cluster. For example, Spark code that transforms raw data into preprocessed data.
  • Container: a container image that runs the client code (see the sketch after this list).
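
How the container part surfaces in a pipeline can be sketched as follows, again assuming the kfp 1.x SDK. The image name, command, and output path are hypothetical placeholders rather than an actual component from this repository.

```python
# Hypothetical sketch: a pipeline task backed by a container image that runs the
# client code (kfp 1.x dsl.ContainerOp). Image, command, and paths are placeholders.
from kfp import dsl


def mytask_op(input_path: str) -> dsl.ContainerOp:
    """Returns a pipeline task; the image is expected to package the client code."""
    return dsl.ContainerOp(
        name='mytask',
        image='gcr.io/my-project/mytask-client:latest',
        command=['python', 'mytask.py'],
        arguments=['--input', input_path, '--output', '/tmp/output.txt'],
        # Expose the file written by the client code as the task's output artifact.
        file_outputs={'output': '/tmp/output.txt'},
    )
```

Many components in this directory declare the same information declaratively in a component.yaml file instead of constructing ContainerOp objects directly in Python.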

Note the naming convention for client code and runtime code. For a task named "mytask" (see the sketch below):

  • The mytask.py program contains the client code.
  • The mytask directory contains all the runtime code.
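
For illustration, a client program following this convention might look like the sketch below. The service endpoint and its job API are invented for the example; real components talk to specific services such as the Dataproc API.

```python
# mytask.py -- hypothetical client code for a task named "mytask".
# The endpoint and its /jobs API are invented for illustration; the runtime code
# referenced by --main-file would live in the mytask/ directory.
import argparse
import time

import requests


def submit_and_wait(endpoint: str, job_spec: dict) -> str:
    """Submits a job to a service endpoint and polls until it reaches a terminal state."""
    job = requests.post(f'{endpoint}/jobs', json=job_spec).json()
    while True:
        status = requests.get(f"{endpoint}/jobs/{job['id']}").json()
        if status['state'] in ('SUCCEEDED', 'FAILED'):
            return status['state']
        time.sleep(30)


if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('--endpoint', required=True)
    parser.add_argument('--main-file', required=True,
                        help='Entry point of the runtime code packaged from mytask/.')
    args = parser.parse_args()
    print('Job finished with state', submit_and_wait(
        args.endpoint, {'main_file': args.main_file}))
```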

See how to use the Kubeflow Pipelines SDK and build your own components.
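
As a starting point, a pipeline that reuses a shared component definition and compiles it for Kubeflow Pipelines might look like this minimal sketch (kfp 1.x API assumed; the component.yaml path and parameter are illustrative):

```python
# Minimal sketch of consuming a shared component with the kfp 1.x SDK.
# The component.yaml path and the pipeline parameter are illustrative.
from kfp import compiler, components, dsl

# Load a reusable component from its component.yaml definition.
my_op = components.load_component_from_file('my_component/component.yaml')


@dsl.pipeline(name='component-demo')
def demo_pipeline(input_path: str = 'gs://my-bucket/data.csv'):
    # Arguments must match the inputs declared in the component.yaml.
    my_op(input_path)


if __name__ == '__main__':
    # Produces an Argo workflow spec that can be uploaded to a Kubeflow Pipelines cluster.
    compiler.Compiler().compile(demo_pipeline, 'demo_pipeline.yaml')
```

components.load_component_from_url works the same way when pointing at a published component.yaml.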