parent 98d59a3100
commit ac4521780e

@@ -26,7 +26,7 @@ To get started experimenting with the KFP Tekton Compiler, please [follow these
 If you would like to make code contributions take a look at the [Developer Guide](sdk/python/README.md) and go through
 the list of [open issues](https://github.com/kubeflow/kfp-tekton/issues).
 
-We are currently using [Kubeflow Pipelines 0.2.2](https://github.com/kubeflow/pipelines/releases/tag/0.2.2) and
+We are currently using [Kubeflow Pipelines 0.5.0](https://github.com/kubeflow/pipelines/releases/tag/0.5.0) and
 [Tekton 0.11.3](https://github.com/tektoncd/pipeline/releases/tag/v0.11.3) for this project.
 
 The [KFP, Argo and Tekton Feature Comparison](https://docs.google.com/spreadsheets/d/1LFUy86MhVrU2cRhXNsDU-OBzB4BlkT9C0ASD3hoXqpo/edit#gid=979402121)

sdk/README.md

@@ -4,26 +4,53 @@ There is an [SDK](https://www.kubeflow.org/docs/pipelines/sdk/sdk-overview/)
 for `Kubeflow Pipeline` for end users to define end to end machine learning and data pipelines.
 The output of the KFP SDK compiler is YAML for [Argo](https://github.com/argoproj/argo).
 
-We are updating the `Compiler` of the KFP SDK to generate `Tekton` YAML. Please go through these steps to ensure you are setup properly to use the updated compiler.
+We are updating the `Compiler` of the KFP SDK to generate `Tekton` YAML. Please go through the steps below
+to ensure you are set up properly to use the KFP-Tekton compiler.
 
 
+## Table of Contents
+
+<!-- START of ToC generated by running ./tools/mdtoc.sh sdk/README.md -->
+
+- [Project Prerequisites](#project-prerequisites)
+- [Tested Pipelines](#tested-pipelines)
+- [How to use the KFP-Tekton Compiler](#how-to-use-the-kfp-tekton-compiler)
+- [Build Tekton from Master](#build-tekton-from-master)
+- [Additional Features](#additional-features)
+  - [1. Compile Kubeflow Pipelines as a Tekton PipelineRun](#1-compile-kubeflow-pipelines-as-a-tekton-pipelinerun)
+  - [2. Compile Kubeflow Pipelines with Artifacts Enabled](#2-compile-kubeflow-pipelines-with-artifacts-enabled)
+- [List of Available Features](#list-of-available-features)
+- [Troubleshooting](#troubleshooting)
+
+<!-- END of ToC generated by running ./tools/mdtoc.sh sdk/README.md -->
+
 ## Project Prerequisites
 
 - Python: `3.7.5`
-- Kubeflow Pipelines: [`0.2.2`](https://github.com/kubeflow/pipelines/releases/tag/0.2.2)
+- Kubeflow Pipelines: [`0.5.0`](https://github.com/kubeflow/pipelines/releases/tag/0.5.0)
 - Tekton: [`0.11.3`](https://github.com/tektoncd/pipeline/releases/tag/v0.11.3)
+- For KFP, we should not modify the default work directory for any component. Therefore, please run the command below to disable the [home and work directory overwrite](https://github.com/tektoncd/pipeline/blob/master/docs/install.md#customizing-the-pipelines-controller-behavior) from the Tekton default:
+  ```shell
+  kubectl patch cm feature-flags -n tekton-pipelines -p '{"data":{"disable-home-env-overwrite":"true","disable-working-directory-overwrite":"true"}}'
+  ```
 - Tekton CLI: [`0.8.0`](https://github.com/tektoncd/cli/releases/tag/v0.8.0)
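The patch payload above is plain JSON; as a rough illustration (not part of the project, function name is hypothetical), it could be built programmatically before shelling out to `kubectl`:

```python
import json

def feature_flags_patch(disable_home=True, disable_workdir=True):
    # build the strategic-merge patch body passed to `kubectl patch cm feature-flags`
    data = {
        "disable-home-env-overwrite": str(disable_home).lower(),
        "disable-working-directory-overwrite": str(disable_workdir).lower(),
    }
    return json.dumps({"data": data})
```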
 
 Follow the instructions for [installing project prerequisites](/sdk/python/README.md#development-prerequisites)
 and take note of some important caveats.
 
 
 ## Tested Pipelines
 
-We are running the tests over approximately 80+ Pipelines spread across different Kubeflow Pipelines repository, specifically pipelines in KFP compiler test data, KFP core samples and 3rd-party contributed pipelines folders.
+We are [testing the compiler](/sdk/python/tests/README.md) on more than 80 pipelines found in the Kubeflow Pipelines
+repository, specifically the pipelines in the KFP compiler `testdata` folder, the KFP core samples and the samples
+contributed by third parties.
 
-## Steps
+A report card of Kubeflow Pipelines samples that are currently supported by the `kfp-tekton` compiler can be found
+[here](/sdk/python/tests/test_kfp_samples_report.txt). As you work on a PR that enables another of the missing features,
+please ensure that your code changes are improving the number of successfully compiled KFP pipeline samples.
 
 
 ## How to use the KFP-Tekton Compiler
 
-1. Clone the `kfp-tekton` repo:
+1. Clone the kfp-tekton repo:
    - `git clone https://github.com/kubeflow/kfp-tekton.git`
    - `cd kfp-tekton`

@@ -50,15 +77,12 @@ We are running the tests over approximately 80+ Pipelines spread across differen
 - `tkn pipeline start parallel-pipeline --showlog`
 
 You should see messages asking for default URLs like below. Press `enter` and take the defaults
 
 ```bash
-? Value for param `url1` of type `string`? (Default is `gs://ml-pipeline-playgro
-? Value for param `url1` of type `string`? (Default is `gs://ml-pipeline-playground/shakespeare1.txt`) gs://ml-pipeline-
-playground/shakespeare1.txt
-? Value for param `url2` of type `string`? (Default is `gs://ml-pipeline-playgro? Value for param `url2` of type `string`? (Default
-is `gs://ml-pipeline-playground/shakespeare2.txt`) gs://ml-pipeline-playground/shakespeare2.txt
+? Value for param `url1` of type `string`? (Default is `gs://ml-pipeline-playground/shakespeare1.txt`) gs://ml-pipeline-playground/shakespeare1.txt
+? Value for param `url2` of type `string`? (Default is `gs://ml-pipeline-playground/shakespeare2.txt`) gs://ml-pipeline-playground/shakespeare2.txt
 
 Pipelinerun started: parallel-pipeline-run-th4x6
 ```
 
 We will see the logs of the running Tekton Pipeline streamed, similar to the one below
@@ -66,45 +90,63 @@ We are running the tests over approximately 80+ Pipelines spread across differen
 ```bash
 Waiting for logs to be available...
 
-[gcs-download-2 : gcs-download-2] I find thou art no less than fame hath bruited And more than may be gatherd by thy shape Let my
-presumption not provoke thy wrath
-[gcs-download : gcs-download] With which he yoketh your rebellious necks Razeth your cities and subverts your towns And in a moment makes them desolate
-[echo : echo] Text 1: With which he yoketh your rebellious necks Razeth your cities and subverts your towns And in a moment makes them desolate
-[echo : echo] Text 2: I find thou art no less than fame hath bruited And more than may be gatherd by thy shape Let my presumption not
-provoke thy wrath
+[gcs-download-2 : gcs-download-2] I find thou art no less than fame hath bruited And more than may be gatherd by thy shape Let my presumption not provoke thy wrath
+[gcs-download : gcs-download] With which he yoketh your rebellious necks Razeth your cities and subverts your towns And in a moment makes them desolate
+[echo : echo] Text 1: With which he yoketh your rebellious necks Razeth your cities and subverts your towns And in a moment makes them desolate
+[echo : echo] Text 2: I find thou art no less than fame hath bruited And more than may be gatherd by thy shape Let my presumption not provoke thy wrath
 ```
 
 
 ## Build Tekton from Master
 
-In order to utilize the latest features and functions the team has been driving in Tekton, we suggest that Tekton must be built from [master](https://github.com/tektoncd/pipeline/blob/master/DEVELOPMENT.md#install-pipeline). Features that require special builds different from the 'Tested Version' will be listed below.
+In order to utilize the latest features and functions of the `kfp-tekton` compiler, we suggest installing Tekton from a
+nightly build or building it from the [master](https://github.com/tektoncd/pipeline/blob/master/DEVELOPMENT.md#install-pipeline) branch.
+Features that require special builds different from the 'Tested Version' will be listed below.
 
-## Test Kubeflow Pipelines with Tekton
-
-Please [refer to the instructions here](./python/tests/README.md) as you work on a PR test sample Kubeflow Pipelines in their test data folder to ensure your PR is improving the number of successful samples
-## Experimental features
+## Additional Features
 
-### 1. Compile Kubeflow Pipelines as Tekton pipelineRun
+### 1. Compile Kubeflow Pipelines as a Tekton PipelineRun
 
-By default, Tekton pipelineRun is generated by the `tkn` CLI so that users can interactively change their pipeline parameters during each execution. However, `tkn` CLI is lagging several important features when generating pipelineRun. Therefore, we added support for generating pipelineRun using `dsl-compile-tekton` with all the latest kfp-tekton compiler features. The comparison between Tekton pipeline and Argo workflow is described in our [design docs](https://docs.google.com/document/d/1oXOdiItI4GbEe_qzyBmMAqfLBjfYX1nM94WHY3EPa94/edit#heading=h.f38y0bqkxo87).
+By default, a Tekton [`PipelineRun`](https://github.com/tektoncd/pipeline/blob/master/docs/pipelineruns.md#overview)
+is generated by the `tkn` CLI so that users can interactively change their pipeline parameters during each execution.
+However, the `tkn` CLI is lacking several important features when generating a `PipelineRun`.
+Therefore, we added support for generating a `PipelineRun` using `dsl-compile-tekton` with all the latest `kfp-tekton` compiler
+features. The comparison between Tekton pipelines and Argo workflows is described in our
+[design docs](https://docs.google.com/document/d/1oXOdiItI4GbEe_qzyBmMAqfLBjfYX1nM94WHY3EPa94/edit#heading=h.f38y0bqkxo87).
 
-Compiling Kubeflow Pipelines into Tekton pipelineRun is currently under the experimental stage. [Here](https://github.com/tektoncd/pipeline/blob/master/docs/pipelineruns.md) is the list of supported features in pipelineRun.
+Compiling Kubeflow Pipelines into a Tekton `PipelineRun` is currently in the experimental stage.
+[Here](https://github.com/tektoncd/pipeline/blob/master/docs/pipelineruns.md) is the list of supported features in `PipelineRun`.
 
-As of today, the below pipelineRun features are available within `dsl-compile-tekton`:
+As of today, the below `PipelineRun` features are available within `dsl-compile-tekton`:
 - Affinity
 - Node Selector
 - Tolerations
 
-To compile Kubeflow Pipelines as Tekton pipelineRun, simply add the `--generate-pipelinerun` as part of your `dsl-compile-tekton` commands. e.g.
-- `dsl-compile-tekton --py sdk/python/tests/compiler/testdata/tolerations.py --output pipeline.yaml --generate-pipelinerun`
+To compile Kubeflow Pipelines as a Tekton `PipelineRun`, simply add the `--generate-pipelinerun` parameter to your `dsl-compile-tekton`
+commands, e.g.:
-### 2. Compile Kubeflow Pipelines with artifact enabled
 
+    dsl-compile-tekton \
+      --py sdk/python/tests/compiler/testdata/tolerations.py \
+      --output pipeline.yaml \
+      --generate-pipelinerun
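Conceptually, `--generate-pipelinerun` wraps the compiled `Pipeline` in a `PipelineRun` manifest with the parameters pinned, so the result can be applied with `kubectl` instead of being started interactively with `tkn`. A rough, hypothetical sketch of such a wrapper (illustrative only, not the compiler's actual output):

```python
def wrap_in_pipelinerun(pipeline_name, params):
    # embed a reference to the compiled Pipeline in a PipelineRun manifest,
    # pinning the parameter values so no interactive prompt is needed
    return {
        "apiVersion": "tekton.dev/v1beta1",
        "kind": "PipelineRun",
        "metadata": {"name": "%s-run" % pipeline_name},
        "spec": {
            "pipelineRef": {"name": pipeline_name},
            "params": [{"name": k, "value": v} for k, v in sorted(params.items())],
        },
    }
```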
|
||||
|
||||
 
-### 2. Compile Kubeflow Pipelines with artifact enabled
+### 2. Compile Kubeflow Pipelines with Artifacts Enabled
 
-Prerequisite: Install [Kubeflow Pipeline](https://www.kubeflow.org/docs/pipelines/installation/).
+**Prerequisite**: Install [Kubeflow Pipeline](https://www.kubeflow.org/docs/pipelines/installation/).
 
-By default, artifacts are disabled because it's dependent on Kubeflow Pipeline's minio setup. When artifacts are enabled, all the output parameters are also treated as artifacts and persist to the default object storage. Enabling artifacts also allow files to be downloaded or stored as artifact inputs/outputs. Since artifacts are depending on the Kubeflow Pipeline's setup by default, the generated Tekton pipeline must be deployed to the same namespace as Kubeflow Pipeline.
+By default, _artifacts_ are disabled because they are dependent on Kubeflow Pipeline's
+[Minio](https://docs.minio.io/) storage. When artifacts are enabled, all the output parameters are
+also treated as artifacts and persisted to the default object storage. Enabling artifacts
+also allows files to be downloaded or stored as artifact inputs/outputs.
+Since artifacts are dependent on the Kubeflow Pipeline's deployment, the generated Tekton pipeline
+must be deployed to the same namespace as Kubeflow Pipelines.
 
-To compile Kubeflow Pipelines as Tekton pipelineRun, simply add the `--enable-artifacts` as part of your `dsl-compile-tekton` commands. Then, run the pipeline on the same namespace as Kubeflow pipeline using the `-n` flag. e.g.
+To compile Kubeflow Pipelines with artifacts enabled, simply add the `--enable-artifacts` argument
+to your `dsl-compile-tekton` commands. Then, run the pipeline in the same namespace that is used by
+Kubeflow Pipelines (typically `kubeflow`) by using the `-n` flag, e.g.:
 
 ```shell
 dsl-compile-tekton --py sdk/python/tests/compiler/testdata/artifact_location.py --output pipeline.yaml --enable-artifacts
 kubectl apply -f pipeline.yaml -n kubeflow
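Under the hood, enabling artifacts amounts to appending an extra step to each task that copies the declared outputs to object storage. A simplified, hypothetical sketch of that transformation (the function name and image are illustrative assumptions, not the compiler's actual code):

```python
def add_copy_artifacts_step(template, output_paths):
    # append a final step that uploads each declared output file to storage
    step = {
        "name": "copy-artifacts",
        "image": "minio/mc",  # assumption: a MinIO client image
        "command": ["sh", "-c"],
        "args": [" && ".join("mc cp %s storage/artifacts/" % p
                             for p in output_paths)],
    }
    template.setdefault("spec", {}).setdefault("steps", []).append(step)
    return template
```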
@@ -125,9 +167,12 @@ Waiting for logs to be available...
 [generate-output : copy-artifacts] Total: 0 B, Transferred: 6 B, Speed: 504 B/s
 ```
 
-## List of available features
-If you want to understand how each feature is implemented and its current status, please visit the [FEATURES](FEATURES.md) docs.
+## List of Available Features
+
+To understand how each feature is implemented and its current status, please visit the [FEATURES](FEATURES.md) doc.
 
 
 ## Troubleshooting
 
-- When you encounter permission issues related to ServiceAccount, refer to [Servince Account and RBAC doc](sa-and-rbac.md)
+When you encounter permission issues related to ServiceAccount, refer to the [Service Account and RBAC doc](sa-and-rbac.md).

@@ -54,10 +54,17 @@ or simply run:
 **Note**, if your container runtime does not support image-reference:tag@digest (like cri-o used in OpenShift 4.x),
 use `release.notags.yaml` instead.
 
+**Note**, for KFP, we should not modify the default work directory for any component. Therefore, please run the following
+command to disable the [home and work directory overwrite](https://github.com/tektoncd/pipeline/blob/master/docs/install.md#customizing-the-pipelines-controller-behavior) from the Tekton default:
+
+    kubectl patch cm feature-flags -n tekton-pipelines \
+        -p '{"data":{"disable-home-env-overwrite":"true","disable-working-directory-overwrite":"true"}}'
+
 Optionally, for convenience, set the default namespace to `tekton-pipelines`:
 
     kubectl config set-context --current --namespace=tekton-pipelines
 
 
 #### Tekton CLI
 
 Follow the instructions [here](https://github.com/tektoncd/cli#installing-tkn).
@@ -94,7 +101,7 @@ the Tekton YAML instead of Argo YAML. Since the KFP SDK was not designed and imp
 _monkey-patching_ was used to replace non-class methods and functions at runtime.
 
 In order for the _monkey patch_ to work properly, the `kfp-tekton` compiler source code has to be aligned with a
-specific version of the `kfp` SDK compiler. As of now that version is [`0.2.2`](https://github.com/kubeflow/pipelines/releases/tag/0.2.2).
+specific version of the `kfp` SDK compiler. As of now that version is [`0.5.0`](https://github.com/kubeflow/pipelines/releases/tag/0.5.0).
 
 
 ## Adding New Code

@@ -39,9 +39,6 @@ def convert_k8s_obj_to_json(k8s_obj):
     if k8s_obj is None:
         return None
     elif isinstance(k8s_obj, PRIMITIVE_TYPES):
-        # if re.search(r'^{{.*}}$', k8s_obj):
-        #     k8s_obj = k8s_obj.replace('{{', '$(')
-        #     k8s_obj = k8s_obj.replace('}}', ')')
         return k8s_obj
     elif isinstance(k8s_obj, list):
         return [convert_k8s_obj_to_json(sub_obj)
@@ -64,10 +61,8 @@ def convert_k8s_obj_to_json(k8s_obj):
     # and attributes which value is not None.
     # Convert attribute name to json key in
     # model definition for request.
-    attr_types = (k8s_obj.swagger_types if hasattr(k8s_obj, "swagger_types")
-                  else k8s_obj.openapi_types)
     obj_dict = {k8s_obj.attribute_map[attr]: getattr(k8s_obj, attr)
-                for attr, _ in iteritems(attr_types)
+                for attr in k8s_obj.attribute_map
                 if getattr(k8s_obj, attr) is not None}
 
     return {key: convert_k8s_obj_to_json(val)
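The resulting conversion logic can be sketched in isolation like this (a simplified stand-in for the kfp helper, exercised against a fake Kubernetes model that exposes `attribute_map`):

```python
PRIMITIVE_TYPES = (int, float, bool, str, bytes)

def to_json(obj):
    # recursively convert a Kubernetes model object into plain JSON types
    if obj is None:
        return None
    if isinstance(obj, PRIMITIVE_TYPES):
        return obj
    if isinstance(obj, list):
        return [to_json(item) for item in obj]
    if isinstance(obj, dict):
        return {key: to_json(value) for key, value in obj.items()}
    # Kubernetes models expose `attribute_map` (python name -> JSON key);
    # attributes whose value is None are skipped
    return {obj.attribute_map[attr]: to_json(getattr(obj, attr))
            for attr in obj.attribute_map
            if getattr(obj, attr) is not None}
```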
@@ -11,17 +11,20 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-from collections import OrderedDict
-
-from kfp.compiler._k8s_helper import convert_k8s_obj_to_json
-from kfp.compiler._op_to_template import _process_obj, _inputs_to_json, _outputs_to_json
-from kfp import dsl
-from kfp.dsl._container_op import BaseOp
-from typing import List, Text, Dict, Any
-from kfp.dsl import ArtifactLocation
-import json
+
 import re
 import textwrap
 import yaml
-import re
 
+from collections import OrderedDict
+from typing import List, Text, Dict, Any
+
+from kfp import dsl
+from kfp.compiler._k8s_helper import convert_k8s_obj_to_json
+from kfp.compiler._op_to_template import _process_obj, _inputs_to_json, _outputs_to_json
+from kfp.dsl import ArtifactLocation
+from kfp.dsl._container_op import BaseOp
+
+from .. import tekton_api_version
@@ -482,10 +485,6 @@ def _op_to_template(op: BaseOp, enable_artifacts=False):
     if copy_artifacts_step:
         template['spec']['steps'].append(copy_artifacts_step)
 
-    # **********************************************************
-    # NOTE: the following features are still under development
-    # **********************************************************
-
     # metadata
     if processed_op.pod_annotations or processed_op.pod_labels:
         template.setdefault('metadata', {})  # Tekton change, don't wipe out existing metadata
|
|||
template.setdefault('metadata', {}).setdefault('annotations', {})['pipelines.kubeflow.org/task_display_name'] = processed_op.display_name
|
||||
|
||||
if isinstance(op, dsl.ContainerOp) and op._metadata:
|
||||
import json
|
||||
template.setdefault('metadata', {}).setdefault('annotations', {})['pipelines.kubeflow.org/component_spec'] = json.dumps(op._metadata.to_dict(), sort_keys=True)
|
||||
|
||||
if isinstance(op, dsl.ContainerOp) and op.execution_options:
|
||||
if op.execution_options.caching_strategy.max_cache_staleness:
|
||||
template.setdefault('metadata', {}).setdefault('annotations', {})['pipelines.kubeflow.org/max_cache_staleness'] = str(op.execution_options.caching_strategy.max_cache_staleness)
|
||||
|
||||
return template
|
||||
|
|
|
|||
|
|
@@ -29,6 +29,7 @@ from kfp import dsl
 from kfp.compiler._default_transformers import add_pod_env
 from kfp.compiler._k8s_helper import sanitize_k8s_name
 from kfp.compiler.compiler import Compiler
 # from kfp.components._yaml_utils import dump_yaml
 from kfp.components.structures import InputSpec
 from kfp.dsl._metadata import _extract_pipeline_metadata
+from kfp.compiler._k8s_helper import convert_k8s_obj_to_json
@@ -627,6 +628,7 @@ class TektonCompiler(Compiler) :
         package_path: file path to be written. If not specified, a yaml_text string
           will be returned.
         """
         # yaml_text = dump_yaml(workflow)
+        yaml.Dumper.ignore_aliases = lambda *args : True
         yaml_text = yaml.dump_all(workflow, default_flow_style=False)  # Tekton change
@@ -694,3 +696,35 @@ class TektonCompiler(Compiler) :
                           params_list,
                           pipeline_conf)
         TektonCompiler._write_workflow(workflow=workflow, package_path=package_path)  # Tekton change
+        _validate_workflow(workflow)
+
+def _validate_workflow(workflow: List[Dict[Text, Any]]):  # Tekton change, signature
+    # TODO: Tekton pipeline parameter validation
+    # workflow = workflow.copy()
+    # # Working around Argo lint issue
+    # for argument in workflow['spec'].get('arguments', {}).get('parameters', []):
+    #     if 'value' not in argument:
+    #         argument['value'] = ''
+    #
+    # yaml_text = dump_yaml(workflow)
+    # if '{{pipelineparam' in yaml_text:
+    #     raise RuntimeError(
+    #         '''Internal compiler error: Found unresolved PipelineParam.
+    #         Please create a new issue at https://github.com/kubeflow/pipelines/issues attaching the pipeline code and the pipeline package.'''
+    #     )
+
+    # TODO: Tekton lint, if a tool exists for it
+    # # Running Argo lint if available
+    # import shutil
+    # import subprocess
+    # argo_path = shutil.which('argo')
+    # if argo_path:
+    #     result = subprocess.run([argo_path, 'lint', '/dev/stdin'], input=yaml_text.encode('utf-8'), stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+    #     if result.returncode:
+    #         raise RuntimeError(
+    #             '''Internal compiler error: Compiler has produced Argo-incompatible workflow.
+    #             Please create a new issue at https://github.com/kubeflow/pipelines/issues attaching the pipeline code and the pipeline package.
+    #             Error: {}'''.format(result.stderr.decode('utf-8'))
+    #         )
+    pass
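Once implemented, the parameter check sketched in the TODO comments could look roughly like this (an illustrative sketch using `json` for serialization instead of the compiler's YAML dump):

```python
import json

def validate_workflow(workflow):
    # serialize the workflow and fail on unresolved pipeline parameters
    text = json.dumps(workflow)
    if '{{pipelineparam' in text:
        raise RuntimeError('Internal compiler error: found unresolved PipelineParam.')
    return True
```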
@@ -1 +1 @@
-kfp>=0.2.2
+kfp==0.5.0
@@ -24,7 +24,7 @@ LICENSE = "Apache 2.0"
 HOMEPAGE = "https://github.com/kubeflow/kfp-tekton/"
 
 REQUIRES = [
-    'kfp>=0.2.2',
+    'kfp==0.5.0',
 ]
 
@@ -47,8 +47,8 @@ def find_version(*file_path_parts):
 setup(
     name=NAME,
     version=find_version("kfp_tekton", "__init__.py"),
-    description="KubeFlow Pipelines compiler generating Tekton YAML (instead of Argo YAML).",
-    long_description="Extension of Kubeflow Pipelines compiler generating Tekton YAML (instead of Argo YAML).",
+    description="Kubeflow Pipelines DSL compiler generating Tekton YAML (instead of Argo YAML).",
+    long_description="Extension of the Kubeflow Pipelines compiler generating Tekton YAML (instead of Argo YAML).",
     author="kubeflow.org",
     license=LICENSE,
     url=HOMEPAGE,
@@ -24,7 +24,7 @@ or run this command from the project root directory:
 You should see an output similar to the one below, outlining which test scripts have passed and which are failing:
 
 ```YAML
-KFP version: 0.2.2
+KFP version: 0.5.0
 
 SUCCESS: add_pod_env.py
 SUCCESS: artifact_location.py
@@ -35,6 +35,7 @@ SUCCESS: compose.py
 SUCCESS: default_value.py
 SUCCESS: input_artifact_raw_value.py
+FAILURE: loop_over_lightweight_output.py
 FAILURE: parallelfor_item_argument_resolving.py
 SUCCESS: param_op_transform.py
 SUCCESS: param_substitutions.py
 SUCCESS: pipelineparams.py
@@ -57,16 +58,20 @@ FAILURE: withparam_global_dict.py
 FAILURE: withparam_output.py
 FAILURE: withparam_output_dict.py
 
-Success: 25
-Failure: 5
-Total: 30
+Compilation status for testdata DSL scripts:
+
+Success: 25
+Failure: 6
+Total: 31
+
+Overall success rate: 25/31 = 81%
 
 Compilation status report: sdk/python/tests/test_kfp_samples_report.txt
 Accumulated compiler logs: temp/test_kfp_samples_output.txt
 Compiled Tekton YAML files: temp/tekton_compiler_output/
 ```
 
-The goal is to have all the 30 tests pass before we can have a degree of confidence that the compiler can handle
+The goal is to have all the `31` tests pass before we can have a degree of confidence that the compiler can handle
 a fair number of pipelines.
 
 
@@ -85,22 +90,22 @@ This will include all `core/samples`, 3rd-party contributed samples, tutorials,
 Compilation status for testdata DSL scripts:
 
 Success: 25
-Failure: 5
-Total: 30
+Failure: 6
+Total: 31
 
 Compilation status for core samples:
 
 Success: 18
-Failure: 5
-Total: 23
+Failure: 4
+Total: 22
 
 Compilation status for 3rd-party contributed samples:
 
-Success: 23
-Failure: 5
-Total: 28
+Success: 25
+Failure: 7
+Total: 32
 
-Overall success rate: 69/84 = 82%
+Overall success rate: 71/88 = 81%
 ```
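The overall rate shown above is plain arithmetic over the per-folder counts, e.g.:

```python
def success_rate(successes, total):
    # report the share of successfully compiled samples as a whole percentage
    return round(successes / total * 100)
```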
 
 When the `--print-error-details` flag is used, a summary of all the compilation errors is appended to the console
@@ -111,23 +116,21 @@ output -- sorted by their respective number of occurrences:
 ```YAML
 ...
 
-Overall success rate: 69/84 = 82%
+Overall success rate: 71/88 = 81%
 
 Occurrences of NotImplementedError:
-    7: dynamic params are not yet implemented
+    8  dynamic params are not yet implemented
 
 Occurrences of other Errors:
     2  ValueError: These Argo variables are not supported in Tekton Pipeline: {{workflow.uid}}
     2  ValueError: These Argo variables are not supported in Tekton Pipeline: {{pod.name}}, {{workflow.name}}
-    1  ValueError: These Argo variables are not supported in Tekton Pipeline: {{workflow.uid}}, {{pod.name}}
-    1  ValueError: These Argo variables are not supported in Tekton Pipeline: {{workflow.name}}, {{pod.name}}
+    1  ValueError: These Argo variables are not supported in Tekton Pipeline: {{workflow.name}}
+    1  ValueError: These Argo variables are not supported in Tekton Pipeline: {{pod.name}}, {{workflow.uid}}
+    1  ValueError: These Argo variables are not supported in Tekton Pipeline: {{pod.name}}, {{workflow.name}}
     1  ValueError: There are multiple pipelines: ['flipcoin_pipeline', 'flipcoin_exit_pipeline']. Please specify --function.
     1  ValueError: A function with @dsl.pipeline decorator is required in the py file.
 ```
 
 ## Disclaimer
 
-**Note:** The reports above were created for the pipeline scripts found in KFP version `0.2.2` since the
-`kfp_tekton` compiler code is still based on the `kfp` SDK compiler version `0.2.2`. We are working on
-upgrading the `kfp_tekton` compiler code to be based on `kfp` version `0.5.0`
-([issue #133](https://github.com/kubeflow/kfp-tekton/issues/133)).
+**Note:** The reports above were created for the pipeline scripts found in KFP version `0.5.0` since the
+`kfp_tekton` compiler code is currently based on the `kfp` SDK compiler version `0.5.0`.

@@ -12,20 +12,27 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
 import logging
 import os
 import re
 import shutil
 import tempfile
 import textwrap
 import unittest
 import yaml
-import re
-import textwrap
+from os import environ as env
 
 from kfp_tekton import compiler
 
 
 # after code changes that change the YAML output, temporarily set this flag to True
 # in order to generate new "golden" YAML files
-GENERATE_GOLDEN_YAML = False
+GENERATE_GOLDEN_YAML = env.get("GENERATE_GOLDEN_YAML", "False") == "True"
+
+if GENERATE_GOLDEN_YAML:
+    logging.warning(
+        "The environment variable 'GENERATE_GOLDEN_YAML' was set to 'True'. Test cases will regenerate "
+        "the 'golden' YAML files instead of verifying the YAML produced by compiler.")
 
 # License header for Kubeflow project
 LICENSE_HEADER = textwrap.dedent("""\
@@ -329,6 +336,8 @@ class TestTektonCompiler(unittest.TestCase):
         """
         if GENERATE_GOLDEN_YAML:
+            with open(golden_yaml_file, 'w') as f:
+                f.write(LICENSE_HEADER)
             with open(golden_yaml_file, 'a+') as f:
                 yaml.dump_all(compiled_workflow, f, default_flow_style=False)
         else:
             with open(golden_yaml_file, 'r') as f:
@@ -11,6 +11,7 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
+
 apiVersion: tekton.dev/v1beta1
 kind: Task
 metadata:

@@ -11,6 +11,7 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
+
 apiVersion: tekton.dev/v1beta1
 kind: Task
 metadata:

@@ -11,6 +11,7 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
+
 apiVersion: tekton.dev/v1alpha1
 kind: Condition
 metadata:
@@ -14,6 +14,22 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
+# *************************************************************************************************
+#
+# Usage, running this script directly vs running it with Make.
+#
+# e.g. compile report for all KFP samples with error statistics and no file listing:
+#
+#  - running the script directly:
+#
+#        VIRTUAL_ENV=temp/.venv ./test_kfp_samples.sh -a -e -s
+#
+#  - with Make you can run:
+#
+#        make report VENV=temp/.venv ALL_SAMPLES="TRUE" PRINT_ERRORS="TRUE" SKIP_FILES="TRUE"
+#
+# *************************************************************************************************
+
 function help {
     bold=$(tput bold)
     normal=$(tput sgr0)
@@ -36,7 +52,7 @@ function help {
 # process command line parameters
 while (( $# > 0 )); do
   case "$1" in
-    -v|--kfp-version) KFP_VERSION="$2"; shift 2 ;; # KFP SDK version, default: 0.2.2
+    -v|--kfp-version) KFP_VERSION="$2"; shift 2 ;; # KFP SDK version, default: 0.5.0
     -a|--include-all-samples) ALL_SAMPLES="TRUE"; shift 1 ;; # Compile all DSL scripts in KFP repo
     -s|--dont-list-files) SKIP_FILES="TRUE"; shift 1 ;; # Suppress compile status for each DSL file
     -e|--print-error-details) PRINT_ERRORS="TRUE"; shift 1 ;; # Print summary of compilation errors
@@ -47,7 +63,7 @@ while (( $# > 0 )); do
 done
 
 # define global variables
-KFP_VERSION=${KFP_VERSION:-0.2.2}
+KFP_VERSION=${KFP_VERSION:-0.5.0}
 KFP_REPO_URL="https://github.com/kubeflow/pipelines.git"
 SCRIPT_DIR="$(cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd)"
 PROJECT_DIR="${TRAVIS_BUILD_DIR:-$(cd "${SCRIPT_DIR%/sdk/python/tests}"; pwd)}"
@@ -60,6 +76,12 @@ COMPILER_OUTPUTS_FILE="${TEMP_DIR}/test_kfp_samples_output.txt"
 CONFIG_FILE="${PROJECT_DIR}/sdk/python/tests/config.yaml"
 REPLACE_EXCEPTIONS="FALSE" # "TRUE" | "FALSE"
 
+# show values of the _DIR variables; use `set` in POSIX mode in a sub-shell to suppress function definitions (https://stackoverflow.com/a/1305273/5601796)
+#(set -o posix ; set) | grep "_DIR" | sort
+
+# show which virtual environment is being used
+#env | grep VIRTUAL_ENV
+
 mkdir -p "${TEMP_DIR}"
 mkdir -p "${TEKTON_COMPILED_YAML_DIR}"
@@ -113,9 +135,9 @@ if [[ "${ALL_SAMPLES}" == "TRUE" ]]; then
   pip show ai_pipeline_params >/dev/null 2>&1 || pip install ai_pipeline_params
   pip show kfp-azure-databricks >/dev/null 2>&1 || pip install -e "${KFP_CLONE_DIR}/samples/contrib/azure-samples/kfp-azure-databricks"
   pip show kfp-arena >/dev/null 2>&1 || pip install "http://kubeflow.oss-cn-beijing.aliyuncs.com/kfp-arena/kfp-arena-0.6.tar.gz"
+  pip show fire >/dev/null 2>&1 || pip install fire
+  pip show tfx >/dev/null 2>&1 || pip install tfx
 
   # reinstall KFP with the desired version to get all of its dependencies with their respective desired versions
   # pip uninstall -q -y kfp
   pip install -q -e "${KFP_CLONE_DIR}/sdk/python"
 fi
@@ -7,6 +7,7 @@ SUCCESS: compose.py
 SUCCESS: default_value.py
 SUCCESS: input_artifact_raw_value.py
+FAILURE: loop_over_lightweight_output.py
 FAILURE: parallelfor_item_argument_resolving.py
 SUCCESS: param_op_transform.py
 SUCCESS: param_substitutions.py
 SUCCESS: pipelineparams.py
@@ -18,7 +18,8 @@
 #
 # 1. Find the paragraph headings with grep (2nd and 3rd level headings starting with "##" and "###")
 # 2. Extract the heading's text with sed and transform into '|' separated records of the form '###|Full Text|Full Text'
-# 3. Generate the TcC lines with awk by replacing '#' with ' ', converting spaces to dashes '-' and lower-casing caps
+# 3. Generate the ToC lines with awk by replacing '#' with ' ', converting spaces to dashes '-',
+#    removing special chars from hyperlink, and lower-casing caps
 # 4. Replace leading 2 spaces since our ToC does not include 1st level headings
 #
 # Inspired by https://medium.com/@acrodriguez/one-liner-to-generate-a-markdown-toc-f5292112fd14
@@ -29,5 +30,5 @@ SEP="|"
 
 grep -E "^#{2,3}" "${1}" | grep -v "Table of Contents" | \
     sed -E "s/(#+) (.+)/\1${SEP}\2${SEP}\2/g" | \
-    awk -F "${SEP}" '{ gsub(/#/," ",$1); gsub(/[ ]/,"-",$3); print $1 "- [" $2 "](#" tolower($3) ")" }' | \
+    awk -F "${SEP}" '{ gsub(/#/," ",$1); gsub(/[ ]/,"-",$3); gsub(/[`.]/,"",$3); print $1 "- [" $2 "](#" tolower($3) ")" }' | \
    sed -e 's/^ //g'
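The anchor transformation the added `gsub(/[`.]/,"",$3)` performs (spaces to dashes, strip backticks and dots, lowercase) can be sketched in Python as an illustrative re-implementation, not the script itself:

```python
import re

def toc_anchor(heading_text):
    # GitHub-style anchor: lowercase, spaces to dashes,
    # drop special characters such as backticks and dots
    anchor = heading_text.lower().replace(" ", "-")
    return re.sub(r"[`.]", "", anchor)
```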