chore(release): bumped version to 2.0.0-alpha.3
parent 36e308002f
commit f604406238

182  CHANGELOG.md

@@ -1,5 +1,187 @@
# Changelog

## [2.0.0-alpha.3](https://github.com/kubeflow/pipelines/compare/2.0.0-alpha.2...2.0.0-alpha.3) (2022-07-13)

### ⚠ BREAKING CHANGES

* **sdk:** make CLI output consistent, readable, and usable ([\#7739](https://github.com/kubeflow/pipelines/issues/7739))

### Features
* Upgrade argo-workflow to v3.3.8 ([\#8009](https://github.com/kubeflow/pipelines/issues/8009)) ([8bee292](https://github.com/kubeflow/pipelines/commit/8bee2922c296b2e49e485fd0803cb9c74f49549b))
* **api:** add runtime_config.parameter_values with string typed values ([\#7734](https://github.com/kubeflow/pipelines/issues/7734)) ([5b64733](https://github.com/kubeflow/pipelines/commit/5b64733ef640f04658729495e01547660cde10da))
* **backend:** Add ExecutionSpec Interface ([\#7531](https://github.com/kubeflow/pipelines/issues/7531)) ([8cf022c](https://github.com/kubeflow/pipelines/commit/8cf022c84bb61c369347da240716d33102d31898))
* **backend:** Expand ExecutionSpec Interface ([\#7766](https://github.com/kubeflow/pipelines/issues/7766)) ([06c9310](https://github.com/kubeflow/pipelines/commit/06c93101603636a17b7177885589f4a4c9244c4e))
* **backend:** update minio-go to support irsa ([\#7946](https://github.com/kubeflow/pipelines/issues/7946)) ([7923ba3](https://github.com/kubeflow/pipelines/commit/7923ba37fd197a2179163dbe23406a4a233a294b))
* **backend:** use cert-manager for cache server cert ([\#7843](https://github.com/kubeflow/pipelines/issues/7843)) ([784f9fa](https://github.com/kubeflow/pipelines/commit/784f9fac24c156fdabaf356e71e1ae6f793dc049))
* **component:** Support BigQuery drop model job components ([affbc09](https://github.com/kubeflow/pipelines/commit/affbc09ad8155c933efcc993fbca7c53de396082))
* **component:** Support BigQuery ML advanced weights job component ([f6b560a](https://github.com/kubeflow/pipelines/commit/f6b560af8a9c7db07d12cf76e19e86d4afea5ce2))
* **component:** Support BigQuery ML centroids job components ([45cfb91](https://github.com/kubeflow/pipelines/commit/45cfb915a9ed9cfa36f5455d4fca37489dc1c173))
* **component:** Support BigQuery ML reconstruction loss and trial info job components ([d2f33c8](https://github.com/kubeflow/pipelines/commit/d2f33c84df34c9a13b8befb91a2a6aae65d4774b))
* **component:** Support BigQuery ML weights job component ([4183344](https://github.com/kubeflow/pipelines/commit/41833448b25aae6fd6ac7a12018dc17378bed2d7))
* **components:** Add display name to import model evaluation component ([9b12f6e](https://github.com/kubeflow/pipelines/commit/9b12f6e2b54c7ff3aa0004e78a0b55b4337fee07))
* **components:** Outputting system.artifact and google.BQTable as a replacement for the google.VertexBatchPredictionJob artifact. ([18b3ab8](https://github.com/kubeflow/pipelines/commit/18b3ab8ac3692c3479ec8c0a50cb1180af83a242))
* **conformance:** adapt remaining tests to KF env ([\#7670](https://github.com/kubeflow/pipelines/issues/7670)) ([bcbdbd0](https://github.com/kubeflow/pipelines/commit/bcbdbd050fe70560734915ca9949e1c1e5a425d6))
* **conformance:** containerize KFP conformance test. ([\#7738](https://github.com/kubeflow/pipelines/issues/7738)) ([3867496](https://github.com/kubeflow/pipelines/commit/38674969837666a2481abe82827b72699356a0bf))
* **frontend:** Add ability to filter by visualization type ([\#7906](https://github.com/kubeflow/pipelines/issues/7906)) ([f5a1264](https://github.com/kubeflow/pipelines/commit/f5a126421e194239f6438dab438e4b98d134a09b))
* **frontend:** Add loading and error states for HTML and Markdown displays ([\#7989](https://github.com/kubeflow/pipelines/issues/7989)) ([6cbc10b](https://github.com/kubeflow/pipelines/commit/6cbc10b1a70b4b242a6038f8b8d8fb292f05f641))
* **frontend:** Add V2 Run Comparison page ([\#7793](https://github.com/kubeflow/pipelines/issues/7793)) ([f822eea](https://github.com/kubeflow/pipelines/commit/f822eea8840db67f19583e66eee56d9222f62da7))
* **frontend:** Create metrics tabs ([\#7905](https://github.com/kubeflow/pipelines/issues/7905)) ([e683290](https://github.com/kubeflow/pipelines/commit/e6832909e8b774a3aef02481a24f0dac4ee387f8))
* **frontend:** Create sections and run list for KFPv2 Run Comparison page ([\#7882](https://github.com/kubeflow/pipelines/issues/7882)) ([146dae7](https://github.com/kubeflow/pipelines/commit/146dae79cfa6ba2b45c524a7ca58dd610ea6ed8c))
* **frontend:** Create two-level dropdown ([\#7933](https://github.com/kubeflow/pipelines/issues/7933)) ([832858b](https://github.com/kubeflow/pipelines/commit/832858b682727cae73ab08e59e70e83e3d48614d))
* **frontend:** Display HTML and Markdown files ([\#7981](https://github.com/kubeflow/pipelines/issues/7981)) ([cfe3278](https://github.com/kubeflow/pipelines/commit/cfe3278605ac4f7edbef4c0cb82f743e6fa3febd))
* **frontend:** Display multi-level dropdown on KFPv2 Run Comparison page ([\#7943](https://github.com/kubeflow/pipelines/issues/7943)) ([53d8f7e](https://github.com/kubeflow/pipelines/commit/53d8f7ea6c2c2b0db148bf63bed2c6c023b0d8f6))
* **frontend:** Display two-panel layout and confusion matrices ([\#7966](https://github.com/kubeflow/pipelines/issues/7966)) ([617ad83](https://github.com/kubeflow/pipelines/commit/617ad83b6f34f88905530c71d2ca092ae8817b5a))
* **frontend:** NewRunV2 page and NewRunParametersV2 page. ([\#7769](https://github.com/kubeflow/pipelines/issues/7769)) ([e246ffb](https://github.com/kubeflow/pipelines/commit/e246ffb936546ae9a98123960d8a7677dd9bc9b5))
* **frontend:** Request MLMD information for KFPv2 Run Comparison ([\#7897](https://github.com/kubeflow/pipelines/issues/7897)) ([3bcdeb6](https://github.com/kubeflow/pipelines/commit/3bcdeb65cc9acff259d48e330ab42923cf8b7240))
* **frontend:** Update compare page and banner states based on run versions and count ([\#7844](https://github.com/kubeflow/pipelines/issues/7844)) ([468d780](https://github.com/kubeflow/pipelines/commit/468d78046528f35b623b9c2eb3cc10fe5558e657))
* **google-cloud:** add explanation_metadata_artifact input arg to custom Batch Prediction component. ([ad364d9](https://github.com/kubeflow/pipelines/commit/ad364d999b5fabd38cecbbfd6a3fcc646c7e35b2))
* **google-cloud:** Add GCPResources to ImportModelEvaluation component. ([ba82969](https://github.com/kubeflow/pipelines/commit/ba8296991a2213a59582b3afac9c4be333208de3))
* **google-cloud:** Add new components 'evaluation_data_sampler' and 'evaluation_data_splitter'. ([d6f9265](https://github.com/kubeflow/pipelines/commit/d6f92654163ec0ab0111804c5d56fe5a29cd0802))
* **google-cloud:** Change evaluation preprocessing component output type. ([059afb0](https://github.com/kubeflow/pipelines/commit/059afb0654ebe6d9948d5203512003dd53646293))
* **google-cloud:** Modify GCPC evaluation templates. ([176d3ff](https://github.com/kubeflow/pipelines/commit/176d3ff1f40a46a2e951b76a8e5b0ea45c287088))
* **google-cloud:** Release eval changes with v0.3 ([d2fd41b](https://github.com/kubeflow/pipelines/commit/d2fd41b07eeef8d30daee03de08aed9561ee63b4))
* **google-cloud:** Release notes for 1.0.12 and 1.0.13 ([e42d9d2](https://github.com/kubeflow/pipelines/commit/e42d9d2609369b96973c821dca11fe5b2565e705))
* **google-cloud:** Update AutoML pipelines to reference next GCPC package release (1.0.8). ([c849871](https://github.com/kubeflow/pipelines/commit/c849871e595b9fb6dcbca788663f0411a96c9bb5))
* **IR:** add is_optional to ParameterSpec proto ([\#7704](https://github.com/kubeflow/pipelines/issues/7704)) ([f960a52](https://github.com/kubeflow/pipelines/commit/f960a52c8285d603b605b4b4c732b5cd471f6e47))
* **IR:** adds IteratorPolicy to PipelineTaskSpec proto for support of parallelism setting on ParallelFor ([\#7804](https://github.com/kubeflow/pipelines/issues/7804)) ([a86ae9b](https://github.com/kubeflow/pipelines/commit/a86ae9b89b87a4c34f7877b670ba2b848a8573fa))
* **kfp:** implement config file support for registry client ([\#7908](https://github.com/kubeflow/pipelines/issues/7908)) ([4f01b7e](https://github.com/kubeflow/pipelines/commit/4f01b7e602fb5335d3847c4cf3b500072cd30fda))
* **sdk:** Add default registry context ([\#7948](https://github.com/kubeflow/pipelines/issues/7948)) ([aab7fda](https://github.com/kubeflow/pipelines/commit/aab7fda89b99a8324c7360c1f981332a014175b7))
* **sdk:** Add function to sdk client for terminating run ([\#7835](https://github.com/kubeflow/pipelines/issues/7835)) ([9ffcb2e](https://github.com/kubeflow/pipelines/commit/9ffcb2e9dbd3612eb8ee4b3c540507ddc55a5699))
* **sdk:** add retry policy support to kfp v2 ([\#7867](https://github.com/kubeflow/pipelines/issues/7867)) ([850a750](https://github.com/kubeflow/pipelines/commit/850a7504966dde25f691218d76337c83ee8e8143))
* **sdk:** add support for IfPresentPlaceholder and ConcatPlaceholder strings ([\#7795](https://github.com/kubeflow/pipelines/issues/7795)) ([de0b824](https://github.com/kubeflow/pipelines/commit/de0b824be1638815c840a09860bedf18949ed228))
* **sdk:** compile JSON with formatting ([\#7712](https://github.com/kubeflow/pipelines/issues/7712)) ([2570922](https://github.com/kubeflow/pipelines/commit/2570922a7fd1cb71d5178a6f4898958a66e450ae))
* **sdk:** enable compilation of primitive components ([\#7580](https://github.com/kubeflow/pipelines/issues/7580)) ([62972ec](https://github.com/kubeflow/pipelines/commit/62972eccf970f0a3f50c54a541518a7ce7edafaa))
* **sdk:** enable component compilation via cli ([\#7649](https://github.com/kubeflow/pipelines/issues/7649)) ([c6125ff](https://github.com/kubeflow/pipelines/commit/c6125ffc44df0eced27ae83451c6243b8fc8d73f))
* **sdk:** enable component compilation via component decorator ([\#7554](https://github.com/kubeflow/pipelines/issues/7554)) ([6935a47](https://github.com/kubeflow/pipelines/commit/6935a47e72e5ba7cb4ac87755d5680306930ab98))
* **sdk:** enable read in component using IR ([\#7689](https://github.com/kubeflow/pipelines/issues/7689)) ([cbc2ac5](https://github.com/kubeflow/pipelines/commit/cbc2ac52662c65c054f136949ff9b90ae67d59fe))
* **sdk:** Implement Registry client ([\#7597](https://github.com/kubeflow/pipelines/issues/7597)) ([25e4c58](https://github.com/kubeflow/pipelines/commit/25e4c5882003f82704be1bf2677bbe75724b343c))
* **sdk:** make CLI output consistent, readable, and usable ([\#7739](https://github.com/kubeflow/pipelines/issues/7739)) ([b0db428](https://github.com/kubeflow/pipelines/commit/b0db42816577f92810172ca1e516aac82d3d1c92))
* **sdk:** make client return values consistent ([\#7659](https://github.com/kubeflow/pipelines/issues/7659)) ([1ae6e11](https://github.com/kubeflow/pipelines/commit/1ae6e11bab73a4197153cf11f0e0b2788451ef98))
* **sdk:** only use IR when saving component ([\#7690](https://github.com/kubeflow/pipelines/issues/7690)) ([a6ec449](https://github.com/kubeflow/pipelines/commit/a6ec449273b2b785362eb529b3a25c5f618b651e))
* **sdk:** throw informative exception on uncalled `@pipeline` decorator ([\#7913](https://github.com/kubeflow/pipelines/issues/7913)) ([ecb55bd](https://github.com/kubeflow/pipelines/commit/ecb55bde74471ea03d549fc5eedd2d0574fd9abc))
* **sdk:** validate pipeline resource name before submission ([\#7713](https://github.com/kubeflow/pipelines/issues/7713)) ([101d243](https://github.com/kubeflow/pipelines/commit/101d243f4813dfecfd202e09f46c2b9223096653))
* **sdk/client:** implements overriding caching options at submission ([\#7912](https://github.com/kubeflow/pipelines/issues/7912)) ([b95c5ab](https://github.com/kubeflow/pipelines/commit/b95c5aba85f25294ceb0e779b8290d7c338f2f77))
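Several of the entries above are easier to picture with a concrete value in hand, in particular the **api** item that adds `runtime_config.parameter_values` with string-typed values (#7734). The sketch below is illustrative only — it is not KFP source code, and the parameter names are invented — but it shows what "typed" parameter values look like at the protobuf level, using the well-known `google.protobuf.Value` helpers from `structpb` and printing each value's JSON form.

```go
package main

import (
	"fmt"

	"google.golang.org/protobuf/encoding/protojson"
	"google.golang.org/protobuf/types/known/structpb"
)

func main() {
	// A LIST-typed value, built from plain Go values.
	layers, err := structpb.NewList([]interface{}{64, 32, 16})
	if err != nil {
		panic(err)
	}

	// Hypothetical pipeline parameters; each entry is a google.protobuf.Value.
	parameterValues := map[string]*structpb.Value{
		"learning_rate": structpb.NewNumberValue(0.01),
		"optimizer":     structpb.NewStringValue("adam"),
		"use_gpu":       structpb.NewBoolValue(true),
		"hidden_layers": structpb.NewListValue(layers),
	}

	// protojson shows the JSON form each typed value takes on the wire.
	for name, value := range parameterValues {
		b, _ := protojson.Marshal(value)
		fmt.Printf("%s = %s\n", name, b)
	}
}
```

The same `Value`/`ListValue` shapes are what the regenerated HTTP-client models later in this commit (`ProtobufValue`, `ProtobufListValue`) describe on the REST side.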
### Bug Fixes
* Update proxy agent image to fix CVE-2022-1292 ([\#8019](https://github.com/kubeflow/pipelines/issues/8019)) ([5724849](https://github.com/kubeflow/pipelines/commit/572484951f02957b34f4ae8b555c94ccc90f9bc9))
* **components:** fix launch_crd.py to be compatible with kubernetes python SDK API. Fixes [\#7984](https://github.com/kubeflow/pipelines/issues/7984) ([\#7985](https://github.com/kubeflow/pipelines/issues/7985)) ([c7bf68e](https://github.com/kubeflow/pipelines/commit/c7bf68edde91b7691ef9753581864a6661ac1f7f))
* Update GCP marketplace deployer base image ([\#8018](https://github.com/kubeflow/pipelines/issues/8018)) ([1d871fe](https://github.com/kubeflow/pipelines/commit/1d871fe9d285f162ea2f05f3594746f0ceff289e))
* Update mysql image to fix CVE-2022-1292 ([\#8017](https://github.com/kubeflow/pipelines/issues/8017)) ([3bbc1d9](https://github.com/kubeflow/pipelines/commit/3bbc1d9afb97f6d5958da29da108562f6861c268))
* **backend:** "Started at", "Finished at" and "Duration" display error. ([\#7876](https://github.com/kubeflow/pipelines/issues/7876)) ([d11b097](https://github.com/kubeflow/pipelines/commit/d11b097a08c55543ff6d856fd58896422c250784))
* **backend:** change downloaded IR from JSON to YAML. Fixes [\#7673](https://github.com/kubeflow/pipelines/issues/7673) ([\#7768](https://github.com/kubeflow/pipelines/issues/7768)) ([c7b56d6](https://github.com/kubeflow/pipelines/commit/c7b56d6ca6f1de4729c975a16a9d7df1e12c15e0))
* **backend:** fixes healthz response by adding json content type. Fixes [\#7525](https://github.com/kubeflow/pipelines/issues/7525) ([\#7532](https://github.com/kubeflow/pipelines/issues/7532)) ([bd38cb5](https://github.com/kubeflow/pipelines/commit/bd38cb5ae8a043e9965bd569a5a1930d2a306a4f))
* **backend:** Upgrade grpc-gateway patch version to enable correct swagger config ([\#7788](https://github.com/kubeflow/pipelines/issues/7788)) ([cf1b873](https://github.com/kubeflow/pipelines/commit/cf1b873a9aec8f2eb733ff2fa8707a0de2fcf438))
* **cache:** Change kfp cache cert names to match cert-manager ([\#7538](https://github.com/kubeflow/pipelines/issues/7538)) ([2a82a0f](https://github.com/kubeflow/pipelines/commit/2a82a0f1590bc7ad0119cf946b7fe70bbd263e80))
* **components:** Pin pip version to fix apache-beam[gcp] installation issue ([\#8016](https://github.com/kubeflow/pipelines/issues/8016)) ([bb83581](https://github.com/kubeflow/pipelines/commit/bb835818bae41ca42ebf92769714f1e3aafecc54))
* **components:** Preserve empty features in explanation_spec ([4c70132](https://github.com/kubeflow/pipelines/commit/4c70132952e8b0d7f0e456d74358749a9eba2c3f))
* **components/google-cloud:** custom job util should add the nfs_mounts as a sub node of worker_pool_spec. ([6296c18](https://github.com/kubeflow/pipelines/commit/6296c18c7c9ec37489f36efc2cd121a800155da6))
* **deps:** update dependency npm to v8 [security] ([\#7885](https://github.com/kubeflow/pipelines/issues/7885)) ([624f197](https://github.com/kubeflow/pipelines/commit/624f197d521a97d293866c99c0073b9427c987e9))
* **frontend:** Change "Layer" to unclickable. ([\#7737](https://github.com/kubeflow/pipelines/issues/7737)) ([b7cf1fb](https://github.com/kubeflow/pipelines/commit/b7cf1fbc1cb551aad33fa2a720606af3de1b7971))
* **frontend:** Fix run comparison filter ([\#7833](https://github.com/kubeflow/pipelines/issues/7833)) ([f41b545](https://github.com/kubeflow/pipelines/commit/f41b5458d0fc766104eb614dceaf2724e6504f6e))
* **frontend:** Fix Run Comparison refresh button ([\#7872](https://github.com/kubeflow/pipelines/issues/7872)) ([85c7901](https://github.com/kubeflow/pipelines/commit/85c7901874f19b6e7ecf6e1b152f53ab429e6edc))
* **frontend:** Move all V2 flag check to V2_ALPHA ([\#7773](https://github.com/kubeflow/pipelines/issues/7773)) ([6cc33f5](https://github.com/kubeflow/pipelines/commit/6cc33f576b618c6477f16705ec45b4708393e7c4))
* **ir:** fix kfp-pipeline-spec proto imports ([\#7754](https://github.com/kubeflow/pipelines/issues/7754)) ([436d7af](https://github.com/kubeflow/pipelines/commit/436d7afe83c98b7290ade66646d3582a80d3dcdf))
* **manifests:** Update snapshot to 2.0.0-alpha.2 ([\#7852](https://github.com/kubeflow/pipelines/issues/7852)) ([d099864](https://github.com/kubeflow/pipelines/commit/d099864bd91d56e973446468b02a1a7745373905))
* **sdk:** Fix corner cases and implement validation ([\#7763](https://github.com/kubeflow/pipelines/issues/7763)) ([eaa8ec7](https://github.com/kubeflow/pipelines/commit/eaa8ec78076f78dddad048515a8a862de075c481))
* **sdk:** fix extract docstring in load component ([\#7921](https://github.com/kubeflow/pipelines/issues/7921)) ([a020c9b](https://github.com/kubeflow/pipelines/commit/a020c9b01cc54dfbf8074ae54613ebde54399e4f))
* **sdk:** fix ifpresent and concat placeholder compile ([\#7930](https://github.com/kubeflow/pipelines/issues/7930)) ([f877a1f](https://github.com/kubeflow/pipelines/commit/f877a1ffc8f67d442ca414f0271a751e0c531249))
* **sdk:** fix load v1 if present placeholder ([\#7765](https://github.com/kubeflow/pipelines/issues/7765)) ([0cf8173](https://github.com/kubeflow/pipelines/commit/0cf817364cc7b3156ea4b4e16fee3f193d8a6744))
* **sdk:** fix placeholder read/write logic ([\#7928](https://github.com/kubeflow/pipelines/issues/7928)) ([ad458d5](https://github.com/kubeflow/pipelines/commit/ad458d535c55abb230ed04c9403100d81bc868be))
* **test:** chmod +x test_load_all_components.sh ([\#7865](https://github.com/kubeflow/pipelines/issues/7865)) ([95d0f7f](https://github.com/kubeflow/pipelines/commit/95d0f7fd38dd624b1670c39c93936d28be0ec58f))
* **test:** Make exit handler test install KFP SDK from source ([\#7785](https://github.com/kubeflow/pipelines/issues/7785)) ([d25bbc8](https://github.com/kubeflow/pipelines/commit/d25bbc88b1ecaa26ac3ef24039ced74661284ff2))
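One of the fixes above, the **backend** healthz change (#7532), is purely about response headers: the endpoint's JSON body needs an explicit `application/json` content type so clients parse it correctly. A generic, self-contained sketch of that pattern (not the actual KFP handler, and the payload shown is illustrative):

```go
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// healthz writes a small JSON status document. Setting Content-Type explicitly
// is what tells clients to treat the body as JSON rather than plain text.
func healthz(w http.ResponseWriter, r *http.Request) {
	w.Header().Set("Content-Type", "application/json")
	_ = json.NewEncoder(w).Encode(map[string]bool{"healthy": true})
}

func main() {
	http.HandleFunc("/apis/v1beta1/healthz", healthz)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```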
### Other Pull Requests
* Fix the problem that AutoML Tabular pipeline could fail when using large number of features. (e.g., > 500 features). ([9f6689e](https://github.com/kubeflow/pipelines/commit/9f6689e5f0cfe21b595cba229d3b1aed14ee3342))
* Move new bq components from experimental to v1 ([527d9e1](https://github.com/kubeflow/pipelines/commit/527d9e1b38a33b263747bdeff68d53a9a2266b79))
* remove kms key name from the drop model interface. ([660e847](https://github.com/kubeflow/pipelines/commit/660e84704daccdf54712652f451c0085560012bf))
* Internal change ([6372420](https://github.com/kubeflow/pipelines/commit/63724207bceb189a7e1a78a9b8b374fefc14135e))
* feat(frontend) Support LIST, STRUCT type in RuntimeConfig parameters ([\#7991](https://github.com/kubeflow/pipelines/issues/7991)) ([2d81e7b](https://github.com/kubeflow/pipelines/commit/2d81e7b6244d4e3c6cf550b14417f2607ffbaf83))
* chore(components/pytorch):kserve migration ([\#7615](https://github.com/kubeflow/pipelines/issues/7615)) ([49c3587](https://github.com/kubeflow/pipelines/commit/49c3587591fc291dc4b0f85ea7339d330ffe451c))
* add registry json to package data ([\#7987](https://github.com/kubeflow/pipelines/issues/7987)) ([7253203](https://github.com/kubeflow/pipelines/commit/7253203239f5f1442a4d4521fe5f6b3bfe7517aa))
* feat(frontend) Convert string-type RuntimeConfig parameters to real-type (currently support num, bool, str) ([\#7919](https://github.com/kubeflow/pipelines/issues/7919)) ([b01000a](https://github.com/kubeflow/pipelines/commit/b01000a1a1449db56b2f75d779454b0c7b50f5ef))
* Fix AutoML Tabular pipeline always runs evaluation. ([f2d3d4d](https://github.com/kubeflow/pipelines/commit/f2d3d4df632675f366dd13d292d451afaaec094d))
* feat(components):Add pipeline metadata to import evaluation component ([1fe3641](https://github.com/kubeflow/pipelines/commit/1fe36417349c3555c3fd42093ec362e0131f1008))
* Use FTE image directly to launch FTE component ([e9221ec](https://github.com/kubeflow/pipelines/commit/e9221ec613714cd32651dea0a440cf63031f3cdd))
* Add automl_tabular_pipeline pipeline for Tabular Workflow. Pin Tabular Workflow images to 20220629_0725_RC00. ([5373af5](https://github.com/kubeflow/pipelines/commit/5373af5d001a8b9f074fae15531cdb17978ba03e))
* Make generate analyze/transform data and split materialized data as components ([9d3ac26](https://github.com/kubeflow/pipelines/commit/9d3ac2692dcbf6272ad6c83afa5977b871655404))
* For built-in algorithms, the transform_config input is expected to be a GCS file path. The file contains transformations to apply. ([736f7fe](https://github.com/kubeflow/pipelines/commit/736f7fe760f9d560f7174de65ab4192fabf4ea3e))
* weight_column_name -> weight_column and target_column_name -> target_column for Tables v1 and skip_architecture_search pipelines ([a4a7462](https://github.com/kubeflow/pipelines/commit/a4a7462ece971acebcea7616c698ed3629357d89))
* Make calculation logic in SDK helper function run inside a component for Tables v1 and skip_architecture_search pipelines ([9b23034](https://github.com/kubeflow/pipelines/commit/9b23034993a5792f54cf3ca8db89f5c789ad9ba5))
* Use feature transform engine docker image for related components ([7dc6448](https://github.com/kubeflow/pipelines/commit/7dc644882df299ce3ae76f735fdf2cc90072b7f8))
* add algorithm to pipeline, also switch the default alg to be AMI. ([4ca9993](https://github.com/kubeflow/pipelines/commit/4ca999356cfeb705ed9a52d2ba315dcbe7728ac8))
* Update skip_architecture_search pipeline ([a8aa75d](https://github.com/kubeflow/pipelines/commit/a8aa75dad116ac9a9ea50c35acf329c233f973a8))
* google-cloud-pipeline-components-1.0.11 release notes ([aedd748](https://github.com/kubeflow/pipelines/commit/aedd748f394b7a6f42d57ba5535e8b1f32b8b81f))
* Added dataflow_service_account to specify custom service account to run dataflow jobs for stats_and_example_gen and transform components. ([745abc2](https://github.com/kubeflow/pipelines/commit/745abc2cff3fc108d931632aec91d7728c1fa2d7))
* Update Tabular workflow to reference 1.0.10 launcher image. ([419c8fa](https://github.com/kubeflow/pipelines/commit/419c8fa1496502b68ca4f1436b1f6a5ab99db00f))
* Change JSON to primitive types for Tables v1, built-in algorithm and internal pipelines ([daf40bb](https://github.com/kubeflow/pipelines/commit/daf40bb265536344ec971f62f6a144ce176ab1a8))
* move generating feature ranking to utils to be available in sdk ([50ea394](https://github.com/kubeflow/pipelines/commit/50ea394b20979ba340987d2ae8e0f79f3ee51063))
* The added field transformations_path enables users to specify transformations in a file and provide a path to the file as an input. ([5e39259](https://github.com/kubeflow/pipelines/commit/5e39259b0738c504185b16fcdb18fd2daca9576b))
* add algorithm parameter to be configurable by users. ([c5df3e8](https://github.com/kubeflow/pipelines/commit/c5df3e88cc48580e7c4ffba03a163e9fddbaf3fd))
* Add a `study_spec_parameters_override` parameter to Tables v1 and stage_1 component pipelines and update handling logic ([7f886db](https://github.com/kubeflow/pipelines/commit/7f886dbf301cf114bc7d41fef91c902990c6fa89))
* Add explanation metadata to upload model component for L2L pipelines ([5898a19](https://github.com/kubeflow/pipelines/commit/5898a1954d07fe71b9f64b05c33f7a653c1ee09c))
* Use 'unmanaged_container_model' instead of 'model' in infra validator component for all pipelines ([768d7a9](https://github.com/kubeflow/pipelines/commit/768d7a933c346748916161a31fb804195d17732f))
* Remove use of `input_directionary_to_parameter` in v1 Tables and component-specific pipelines only and read from executor_input instead ([958a181](https://github.com/kubeflow/pipelines/commit/958a181ec57f3df225e049fc0a66548603746099))
* Feature transform engine config planner to generate training schema & instance baseline ([b0a7ba5](https://github.com/kubeflow/pipelines/commit/b0a7ba5ef408f44ee415f5863dac52f6cae105f9))
* Decouple transform config planner from metadata ([fcb8f5f](https://github.com/kubeflow/pipelines/commit/fcb8f5fb4f96d5d102c2d66f1c228908276c60c3))
* Make Tables ensemble also output explanation_metadata artifact. ([ed8e7e9](https://github.com/kubeflow/pipelines/commit/ed8e7e9b65da52eeae0263744e8f7f3716a4bcc6))
* Internal change ([26fb337](https://github.com/kubeflow/pipelines/commit/26fb3376d0e90e6643c7cc29a1175e988243a980))
* Internal change ([5b74b21](https://github.com/kubeflow/pipelines/commit/5b74b218d1a3a74c7496a21e1f7db1a03bc62fc6))
* Merge pull request [\#7776](https://github.com/kubeflow/pipelines/issues/7776) from evanseabrook:fix/empty-explanation-features ([e3fc7cd](https://github.com/kubeflow/pipelines/commit/e3fc7cd365c23c810a9199f90715d2a140bd68a7))
* Fix AutoML Tables pipeline and builtin pipelines on VPC-SC environment. ([023680d](https://github.com/kubeflow/pipelines/commit/023680dc8d2b15119b57bf8350e54ae0e53bdbd3))
* Make stats-gen component to support running with example-gen only mode ([f85dbe6](https://github.com/kubeflow/pipelines/commit/f85dbe67ef4d65a95bf7a4fbdd1d8dd8ddd672e8))
* Remove trailing whitespace ([cf0a2ca](https://github.com/kubeflow/pipelines/commit/cf0a2ca445a1daa16bbc53de116b47c0e5e3f3ab))
* KFP component for ml.detect_anomalies ([d05110e](https://github.com/kubeflow/pipelines/commit/d05110e0b0c300879614b3bb28c06fee71470de6))
* Adding ML.GLOBAL_EXPLAIN KFP BQ Component. ([b1c803b](https://github.com/kubeflow/pipelines/commit/b1c803bfc3ae5ec4a90fbf228a54c5f8f0cf5b5f))
* Add distill + evaluation pipeline for Tables ([83e618e](https://github.com/kubeflow/pipelines/commit/83e618e1f4cdd98608f23336e207a003dd475b61))
* KFP component for ml.forecast ([79b3335](https://github.com/kubeflow/pipelines/commit/79b33356f34360013e7ea5bfa8a47821a40fd644))
* KFP component for ml.explain_forecast ([44d9575](https://github.com/kubeflow/pipelines/commit/44d9575884d786a2900063878c27901a15781056))
* Add ML.ARIMA_EVALUATE in component.yaml ([dd8ccd2](https://github.com/kubeflow/pipelines/commit/dd8ccd20f517ee816cc442e67d9cdf8a65f3e228))
* Adding ML.Recommend KFP BQ component. ([56c9111](https://github.com/kubeflow/pipelines/commit/56c91111ec67c1fd2b42a5d7788b13ca6ef5954e))
* Add ML.ARIMA_COEFFICIENTS in component.yaml ([7af13b6](https://github.com/kubeflow/pipelines/commit/7af13b621788b34052a281117ca6725decfaed21))
* Adding ML.FEATURE_IMPORTANCE KFP BQ Component. ([b067fc6](https://github.com/kubeflow/pipelines/commit/b067fc63eb6b71e25d504e330c861db384e83c09))
* Adding ML.PRINCIPAL_COMPONENTS and ML.PRINCIPAL_COMPONENT_INFO KFP BQ component. ([e7dbbcb](https://github.com/kubeflow/pipelines/commit/e7dbbcb3b43006fc60c335f814b87b776f4cee96))
* Adding ML.ROC_CURVE KFP BQ Component. ([640ef6f](https://github.com/kubeflow/pipelines/commit/640ef6fdefa1687232c111531bba01ff51f04f82))
* Merge distill_skip_evaluation and skip_evaluation pipelines with default pipeline using dsl.Condition ([e95b9f5](https://github.com/kubeflow/pipelines/commit/e95b9f5f4663a0713d4e2565c686e71aa96643c1))
* Adding ML.FEATURE_INFO KFP BQ Component. ([cdd5760](https://github.com/kubeflow/pipelines/commit/cdd5760e22d55a5f96b5dbf2e5bcb260bda5c62e))
* Adding ML.CONFUSION_MATRIX KFP BQ Component. ([8b9fe02](https://github.com/kubeflow/pipelines/commit/8b9fe02cc0fba6073b260c6e6f7570b81a8ec49e))
* Wide and Deep and Tabnet models both now use the Feature Transform Engine pipeline instead of the Transform component. ([1365f83](https://github.com/kubeflow/pipelines/commit/1365f83e242a2325766dedf9049642b93e33c90b))
* fix(frontend) Regenerate the Typescript client library from swagger to match the backend changes in [\#7788](https://github.com/kubeflow/pipelines/issues/7788) ([\#7847](https://github.com/kubeflow/pipelines/issues/7847)) ([2c065d3](https://github.com/kubeflow/pipelines/commit/2c065d3374b53823410c3b33a663e9bda5b97b32))
* add additional experiments in distillation pipeline. ([db9ee59](https://github.com/kubeflow/pipelines/commit/db9ee59f18f1625dcd6f53004ab1b66571b9a071))
* Adding ML.TRAINING_INFO KFP and ML.EXPLAIN_PREDICT BQ Component. ([a6d0b86](https://github.com/kubeflow/pipelines/commit/a6d0b86edfe0b56aa2b7d86ad5dde440ee7720b4))
* FTE now outputs training schema. ([be317d0](https://github.com/kubeflow/pipelines/commit/be317d0bcbbe65f8a0994e9e562fbf64bcdb4a21))
* FTE transform config passed as path to config file instead of directly as string to FTE. ([b83327f](https://github.com/kubeflow/pipelines/commit/b83327f876d1cc9e9d34f688cd84d40fe89346d6))
* Add model eval component to built-in algorithm default pipelines ([8829400](https://github.com/kubeflow/pipelines/commit/8829400a67c8d4be500d9d269af18ab7ebb3d2cf))
* add README with project description ([\#7782](https://github.com/kubeflow/pipelines/issues/7782)) ([2c27475](https://github.com/kubeflow/pipelines/commit/2c27475fbc64b919b49882ac0978e61cd5996fb9))
* Renamed "Feature Transform Engine Transform Configuration" component to "Transform Configuration Planner" for clarity. ([e38df3a](https://github.com/kubeflow/pipelines/commit/e38df3a8cdf1f31565f4990a5027796048045e41))
* feat[deployment]: add startup probe to pipeline API deployment ([\#7741](https://github.com/kubeflow/pipelines/issues/7741)) ([dd98b1e](https://github.com/kubeflow/pipelines/commit/dd98b1eaf803aad77dbbb23a32bef259c9f1f539))
* Update default machine type to c2-standard-16 for built-in algorithms Custom and HyperparameterTuning Jobs ([93ea81b](https://github.com/kubeflow/pipelines/commit/93ea81bf07181f22e9f5fd855d1416ffb31248c5))
* Removed default location setting from AutoML components and documentation. ([0932905](https://github.com/kubeflow/pipelines/commit/0932905ff58a5effae510a9eeb30bbe86fc3adf1))
* Add window config to ARIMA pipeline. ([f77ebc8](https://github.com/kubeflow/pipelines/commit/f77ebc879741d4f5f5a6b7eede6d996788df3a45))
* Stats Gen and Feature Transform Engine pipeline integration. ([1e105d3](https://github.com/kubeflow/pipelines/commit/1e105d3ba07a094075c7ac4372cb3d35544a491d))
* Uses BigQuery batch queries in ARIMA pipeline after first 50 queries. ([9e7de60](https://github.com/kubeflow/pipelines/commit/9e7de60a3fb0c16e84598ab7a4cfbb0ff4782a98))
* Typo fix ([99b278f](https://github.com/kubeflow/pipelines/commit/99b278f536fbf1f49f832deb5fccfcdfc1e1cd7d))
* Allow ARIMA pipeline to overwrite output tables. ([6efb7e5](https://github.com/kubeflow/pipelines/commit/6efb7e5cb2a6ec43598df93df44faefdc9319ff4))
* Replace custom copy_table component with BQ first-party query component. ([59c8c6f](https://github.com/kubeflow/pipelines/commit/59c8c6f9c82aa09be0c944b0c98e6e0949c1abe6))
* support vpc in feature selection ([a446ec6](https://github.com/kubeflow/pipelines/commit/a446ec6d34e55bbd48f8caca866ca55687d0d1b6))
* Fix failure when running distillation pipeline due to unexpected additional_experiments pipeline parameter. ([076b050](https://github.com/kubeflow/pipelines/commit/076b050e316d8f0afe37ab9fa089cd90274259f1))
* fix cli upload pipeline version ([\#7722](https://github.com/kubeflow/pipelines/issues/7722)) ([49cdb42](https://github.com/kubeflow/pipelines/commit/49cdb42c129dce653b542d4323d7087f67097e2e))
* Create Python-based component to set study_spec_parameters for Wide & Deep HyperparameterTuningJobs ([4d4b88b](https://github.com/kubeflow/pipelines/commit/4d4b88b9c208b24289e81fab1dd0381ed541b1e2))
* Create Python-based component to set study_spec_parameters for TabNet HyperparameterTuningJobs ([8fa1235](https://github.com/kubeflow/pipelines/commit/8fa12353d5cfe17ca8c5ea2f0ba7695cd443bb13))
* Add default Wide & Deep study_spec_parameters configs and add helper function to utils.py to get parameters ([638a38e](https://github.com/kubeflow/pipelines/commit/638a38e6ff7b3f06bb8f32cbdde9152c85a88561))
* Internal testing change ([669374b](https://github.com/kubeflow/pipelines/commit/669374be7d7e1f9465078fff555c1ad5d84c7147))
* Update google-cloud-pipeline-components version and release notes ([10d9fe3](https://github.com/kubeflow/pipelines/commit/10d9fe387232e0563718eced45cf380fa96ad5e4))
* Updates Workbench Executor component for Pipelines to support kernel_spec. ([b40f450](https://github.com/kubeflow/pipelines/commit/b40f45088e6d20343aa7b915015cd14cc6d5bfa7))
* Add default TabNet study_spec_parameters configs for different dataset sizes and search space modes and helper function to get the parameters ([f877b6e](https://github.com/kubeflow/pipelines/commit/f877b6e95769f5ee9372f5a110fea1dec9d11c5e))
* Add init file to container/experimental directory to recognize as a python module. ([8c20a2c](https://github.com/kubeflow/pipelines/commit/8c20a2c570aac422106e866d63a8c1e40cc2a98f))
* add E2E test cases for classification type feature attribution pipeline. ([0eb9d3b](https://github.com/kubeflow/pipelines/commit/0eb9d3b0cc6dc1b696c7d1d1288b8794ca535b89))
* Minor changes to the feature transform engine and transform configuration component specs to support their integration. ([ab2a84e](https://github.com/kubeflow/pipelines/commit/ab2a84e15539c05353c29866f38aec51ba0c5a58))
## [2.0.0-alpha.2](https://github.com/kubeflow/pipelines/compare/2.0.0-alpha.1...2.0.0-alpha.2) (2022-05-05)
@@ -36,10 +36,7 @@ func request_AuthService_Authorize_0(ctx context.Context, marshaler runtime.Mars
    var protoReq AuthorizeRequest
    var metadata runtime.ServerMetadata

-   if err := req.ParseForm(); err != nil {
-       return nil, metadata, status.Errorf(codes.InvalidArgument, "%v", err)
-   }
-   if err := runtime.PopulateQueryParameters(&protoReq, req.Form, filter_AuthService_Authorize_0); err != nil {
+   if err := runtime.PopulateQueryParameters(&protoReq, req.URL.Query(), filter_AuthService_Authorize_0); err != nil {
        return nil, metadata, status.Errorf(codes.InvalidArgument, "%v", err)
    }

@@ -110,7 +107,7 @@ func RegisterAuthServiceHandlerClient(ctx context.Context, mux *runtime.ServeMux
}

var (
-   pattern_AuthService_Authorize_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "auth"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_AuthService_Authorize_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "auth"}, ""))
)

var (
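The hunks above, and the generated gateway files for the other services that follow, all show the same regenerated-code change: query parameters are now populated from `req.URL.Query()` instead of `req.ParseForm()`/`req.Form`, and the HTTP route patterns no longer pass `runtime.AssumeColonVerbOpt(true)`. As a small, standard-library-only illustration of the first point (nothing here is KFP code), `req.Form` merges the URL query string with a form-encoded request body, while `req.URL.Query()` contains only the query string:

```go
package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
	"strings"
)

func main() {
	// A request with both a query string and a form-encoded body.
	req := httptest.NewRequest(http.MethodPost,
		"/apis/v1beta1/experiments?page_size=10",
		strings.NewReader("name=exp1"))
	req.Header.Set("Content-Type", "application/x-www-form-urlencoded")

	// What the regenerated code reads: the query string only.
	fmt.Println("URL.Query():", req.URL.Query()) // map[page_size:[10]]

	// What the old code read after ParseForm: query string and body merged.
	_ = req.ParseForm()
	fmt.Println("Form:       ", req.Form) // map[name:[exp1] page_size:[10]]
}
```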
@@ -80,10 +80,7 @@ func request_ExperimentService_ListExperiment_0(ctx context.Context, marshaler r
    var protoReq ListExperimentsRequest
    var metadata runtime.ServerMetadata

-   if err := req.ParseForm(); err != nil {
-       return nil, metadata, status.Errorf(codes.InvalidArgument, "%v", err)
-   }
-   if err := runtime.PopulateQueryParameters(&protoReq, req.Form, filter_ExperimentService_ListExperiment_0); err != nil {
+   if err := runtime.PopulateQueryParameters(&protoReq, req.URL.Query(), filter_ExperimentService_ListExperiment_0); err != nil {
        return nil, metadata, status.Errorf(codes.InvalidArgument, "%v", err)
    }

@@ -335,17 +332,17 @@ func RegisterExperimentServiceHandlerClient(ctx context.Context, mux *runtime.Se
}

var (
-   pattern_ExperimentService_CreateExperiment_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "experiments"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_ExperimentService_CreateExperiment_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "experiments"}, ""))

-   pattern_ExperimentService_GetExperiment_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "experiments", "id"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_ExperimentService_GetExperiment_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "experiments", "id"}, ""))

-   pattern_ExperimentService_ListExperiment_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "experiments"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_ExperimentService_ListExperiment_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "experiments"}, ""))

-   pattern_ExperimentService_DeleteExperiment_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "experiments", "id"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_ExperimentService_DeleteExperiment_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "experiments", "id"}, ""))

-   pattern_ExperimentService_ArchiveExperiment_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "experiments", "id"}, "archive", runtime.AssumeColonVerbOpt(true)))
+   pattern_ExperimentService_ArchiveExperiment_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "experiments", "id"}, "archive"))

-   pattern_ExperimentService_UnarchiveExperiment_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "experiments", "id"}, "unarchive", runtime.AssumeColonVerbOpt(true)))
+   pattern_ExperimentService_UnarchiveExperiment_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "experiments", "id"}, "unarchive"))
)

var (
@@ -100,7 +100,7 @@ func RegisterHealthzServiceHandlerClient(ctx context.Context, mux *runtime.Serve
}

var (
-   pattern_HealthzService_GetHealthz_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "healthz"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_HealthzService_GetHealthz_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "healthz"}, ""))
)

var (
@@ -80,10 +80,7 @@ func request_JobService_ListJobs_0(ctx context.Context, marshaler runtime.Marsha
    var protoReq ListJobsRequest
    var metadata runtime.ServerMetadata

-   if err := req.ParseForm(); err != nil {
-       return nil, metadata, status.Errorf(codes.InvalidArgument, "%v", err)
-   }
-   if err := runtime.PopulateQueryParameters(&protoReq, req.Form, filter_JobService_ListJobs_0); err != nil {
+   if err := runtime.PopulateQueryParameters(&protoReq, req.URL.Query(), filter_JobService_ListJobs_0); err != nil {
        return nil, metadata, status.Errorf(codes.InvalidArgument, "%v", err)
    }

@@ -335,17 +332,17 @@ func RegisterJobServiceHandlerClient(ctx context.Context, mux *runtime.ServeMux,
}

var (
-   pattern_JobService_CreateJob_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "jobs"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_JobService_CreateJob_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "jobs"}, ""))

-   pattern_JobService_GetJob_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "jobs", "id"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_JobService_GetJob_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "jobs", "id"}, ""))

-   pattern_JobService_ListJobs_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "jobs"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_JobService_ListJobs_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "jobs"}, ""))

-   pattern_JobService_EnableJob_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3, 2, 4}, []string{"apis", "v1beta1", "jobs", "id", "enable"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_JobService_EnableJob_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3, 2, 4}, []string{"apis", "v1beta1", "jobs", "id", "enable"}, ""))

-   pattern_JobService_DisableJob_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3, 2, 4}, []string{"apis", "v1beta1", "jobs", "id", "disable"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_JobService_DisableJob_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3, 2, 4}, []string{"apis", "v1beta1", "jobs", "id", "disable"}, ""))

-   pattern_JobService_DeleteJob_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "jobs", "id"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_JobService_DeleteJob_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "jobs", "id"}, ""))
)

var (
@@ -118,10 +118,7 @@ func request_PipelineService_ListPipelines_0(ctx context.Context, marshaler runt
    var protoReq ListPipelinesRequest
    var metadata runtime.ServerMetadata

-   if err := req.ParseForm(); err != nil {
-       return nil, metadata, status.Errorf(codes.InvalidArgument, "%v", err)
-   }
-   if err := runtime.PopulateQueryParameters(&protoReq, req.Form, filter_PipelineService_ListPipelines_0); err != nil {
+   if err := runtime.PopulateQueryParameters(&protoReq, req.URL.Query(), filter_PipelineService_ListPipelines_0); err != nil {
        return nil, metadata, status.Errorf(codes.InvalidArgument, "%v", err)
    }

@@ -236,10 +233,7 @@ func request_PipelineService_ListPipelineVersions_0(ctx context.Context, marshal
    var protoReq ListPipelineVersionsRequest
    var metadata runtime.ServerMetadata

-   if err := req.ParseForm(); err != nil {
-       return nil, metadata, status.Errorf(codes.InvalidArgument, "%v", err)
-   }
-   if err := runtime.PopulateQueryParameters(&protoReq, req.Form, filter_PipelineService_ListPipelineVersions_0); err != nil {
+   if err := runtime.PopulateQueryParameters(&protoReq, req.URL.Query(), filter_PipelineService_ListPipelineVersions_0); err != nil {
        return nil, metadata, status.Errorf(codes.InvalidArgument, "%v", err)
    }

@@ -622,29 +616,29 @@ func RegisterPipelineServiceHandlerClient(ctx context.Context, mux *runtime.Serv
}

var (
-   pattern_PipelineService_CreatePipeline_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "pipelines"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_PipelineService_CreatePipeline_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "pipelines"}, ""))

-   pattern_PipelineService_GetPipeline_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "pipelines", "id"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_PipelineService_GetPipeline_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "pipelines", "id"}, ""))

-   pattern_PipelineService_GetPipelineByName_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3, 2, 4, 1, 0, 4, 1, 5, 5}, []string{"apis", "v1beta1", "namespaces", "namespace", "pipelines", "name"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_PipelineService_GetPipelineByName_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3, 2, 4, 1, 0, 4, 1, 5, 5}, []string{"apis", "v1beta1", "namespaces", "namespace", "pipelines", "name"}, ""))

-   pattern_PipelineService_ListPipelines_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "pipelines"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_PipelineService_ListPipelines_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "pipelines"}, ""))

-   pattern_PipelineService_DeletePipeline_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "pipelines", "id"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_PipelineService_DeletePipeline_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "pipelines", "id"}, ""))

-   pattern_PipelineService_GetTemplate_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3, 2, 4}, []string{"apis", "v1beta1", "pipelines", "id", "templates"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_PipelineService_GetTemplate_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3, 2, 4}, []string{"apis", "v1beta1", "pipelines", "id", "templates"}, ""))

-   pattern_PipelineService_CreatePipelineVersion_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "pipeline_versions"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_PipelineService_CreatePipelineVersion_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "pipeline_versions"}, ""))

-   pattern_PipelineService_GetPipelineVersion_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "pipeline_versions", "version_id"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_PipelineService_GetPipelineVersion_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "pipeline_versions", "version_id"}, ""))

-   pattern_PipelineService_ListPipelineVersions_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "pipeline_versions"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_PipelineService_ListPipelineVersions_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "pipeline_versions"}, ""))

-   pattern_PipelineService_DeletePipelineVersion_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "pipeline_versions", "version_id"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_PipelineService_DeletePipelineVersion_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "pipeline_versions", "version_id"}, ""))

-   pattern_PipelineService_GetPipelineVersionTemplate_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3, 2, 4}, []string{"apis", "v1beta1", "pipeline_versions", "version_id", "templates"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_PipelineService_GetPipelineVersionTemplate_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3, 2, 4}, []string{"apis", "v1beta1", "pipeline_versions", "version_id", "templates"}, ""))

-   pattern_PipelineService_UpdatePipelineDefaultVersion_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3, 2, 4, 1, 0, 4, 1, 5, 5}, []string{"apis", "v1beta1", "pipelines", "pipeline_id", "default_version", "version_id"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_PipelineService_UpdatePipelineDefaultVersion_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3, 2, 4, 1, 0, 4, 1, 5, 5}, []string{"apis", "v1beta1", "pipelines", "pipeline_id", "default_version", "version_id"}, ""))
)

var (
@@ -144,9 +144,9 @@ func RegisterReportServiceHandlerClient(ctx context.Context, mux *runtime.ServeM
}

var (
-   pattern_ReportService_ReportWorkflow_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "workflows"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_ReportService_ReportWorkflow_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "workflows"}, ""))

-   pattern_ReportService_ReportScheduledWorkflow_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "scheduledworkflows"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_ReportService_ReportScheduledWorkflow_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "scheduledworkflows"}, ""))
)

var (
@@ -80,10 +80,7 @@ func request_RunService_ListRuns_0(ctx context.Context, marshaler runtime.Marsha
    var protoReq ListRunsRequest
    var metadata runtime.ServerMetadata

-   if err := req.ParseForm(); err != nil {
-       return nil, metadata, status.Errorf(codes.InvalidArgument, "%v", err)
-   }
-   if err := runtime.PopulateQueryParameters(&protoReq, req.Form, filter_RunService_ListRuns_0); err != nil {
+   if err := runtime.PopulateQueryParameters(&protoReq, req.URL.Query(), filter_RunService_ListRuns_0); err != nil {
        return nil, metadata, status.Errorf(codes.InvalidArgument, "%v", err)
    }

@@ -553,25 +550,25 @@ func RegisterRunServiceHandlerClient(ctx context.Context, mux *runtime.ServeMux,
}

var (
-   pattern_RunService_CreateRun_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "runs"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_RunService_CreateRun_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "runs"}, ""))

-   pattern_RunService_GetRun_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "runs", "run_id"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_RunService_GetRun_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "runs", "run_id"}, ""))

-   pattern_RunService_ListRuns_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "runs"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_RunService_ListRuns_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1beta1", "runs"}, ""))

-   pattern_RunService_ArchiveRun_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "runs", "id"}, "archive", runtime.AssumeColonVerbOpt(true)))
+   pattern_RunService_ArchiveRun_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "runs", "id"}, "archive"))

-   pattern_RunService_UnarchiveRun_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "runs", "id"}, "unarchive", runtime.AssumeColonVerbOpt(true)))
+   pattern_RunService_UnarchiveRun_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "runs", "id"}, "unarchive"))

-   pattern_RunService_DeleteRun_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "runs", "id"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_RunService_DeleteRun_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "runs", "id"}, ""))

-   pattern_RunService_ReportRunMetrics_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "runs", "run_id"}, "reportMetrics", runtime.AssumeColonVerbOpt(true)))
+   pattern_RunService_ReportRunMetrics_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "runs", "run_id"}, "reportMetrics"))

-   pattern_RunService_ReadArtifact_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3, 2, 4, 1, 0, 4, 1, 5, 5, 2, 6, 1, 0, 4, 1, 5, 7}, []string{"apis", "v1beta1", "runs", "run_id", "nodes", "node_id", "artifacts", "artifact_name"}, "read", runtime.AssumeColonVerbOpt(true)))
+   pattern_RunService_ReadArtifact_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3, 2, 4, 1, 0, 4, 1, 5, 5, 2, 6, 1, 0, 4, 1, 5, 7}, []string{"apis", "v1beta1", "runs", "run_id", "nodes", "node_id", "artifacts", "artifact_name"}, "read"))

-   pattern_RunService_TerminateRun_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3, 2, 4}, []string{"apis", "v1beta1", "runs", "run_id", "terminate"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_RunService_TerminateRun_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3, 2, 4}, []string{"apis", "v1beta1", "runs", "run_id", "terminate"}, ""))

-   pattern_RunService_RetryRun_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3, 2, 4}, []string{"apis", "v1beta1", "runs", "run_id", "retry"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_RunService_RetryRun_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3, 2, 4}, []string{"apis", "v1beta1", "runs", "run_id", "retry"}, ""))
)

var (
@@ -53,10 +53,7 @@ func request_TaskService_ListTasks_0(ctx context.Context, marshaler runtime.Mars
    var protoReq ListTasksRequest
    var metadata runtime.ServerMetadata

-   if err := req.ParseForm(); err != nil {
-       return nil, metadata, status.Errorf(codes.InvalidArgument, "%v", err)
-   }
-   if err := runtime.PopulateQueryParameters(&protoReq, req.Form, filter_TaskService_ListTasks_0); err != nil {
+   if err := runtime.PopulateQueryParameters(&protoReq, req.URL.Query(), filter_TaskService_ListTasks_0); err != nil {
        return nil, metadata, status.Errorf(codes.InvalidArgument, "%v", err)
    }

@@ -147,9 +144,9 @@ func RegisterTaskServiceHandlerClient(ctx context.Context, mux *runtime.ServeMux
}

var (
-   pattern_TaskService_CreateTask_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1alpha1", "tasks"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_TaskService_CreateTask_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1alpha1", "tasks"}, ""))

-   pattern_TaskService_ListTasks_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1alpha1", "tasks"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_TaskService_ListTasks_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2}, []string{"apis", "v1alpha1", "tasks"}, ""))
)

var (
@@ -125,7 +125,7 @@ func RegisterVisualizationServiceHandlerClient(ctx context.Context, mux *runtime
}

var (
-   pattern_VisualizationService_CreateVisualization_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "visualizations", "namespace"}, "", runtime.AssumeColonVerbOpt(true)))
+   pattern_VisualizationService_CreateVisualization_0 = runtime.MustPattern(runtime.NewPattern(1, []int{2, 0, 2, 1, 2, 2, 1, 0, 4, 1, 5, 3}, []string{"apis", "v1beta1", "visualizations", "namespace"}, ""))
)

var (
@@ -8,7 +8,9 @@ package job_model

import (
	strfmt "github.com/go-openapi/strfmt"

	"github.com/go-openapi/errors"
	"github.com/go-openapi/swag"
	"github.com/go-openapi/validate"
)

// PipelineSpecRuntimeConfig The runtime config of a PipelineSpec.

@@ -18,7 +20,7 @@ type PipelineSpecRuntimeConfig struct {

	// The runtime parameters of the PipelineSpec. The parameters will be
	// used to replace the placeholders
	// at runtime.
	Parameters map[string]interface{} `json:"parameters,omitempty"`
	Parameters map[string]ProtobufValue `json:"parameters,omitempty"`

	// A path in a object store bucket which will be treated as the root
	// output directory of the pipeline. It is used by the system to

@@ -28,6 +30,37 @@ type PipelineSpecRuntimeConfig struct {

// Validate validates this pipeline spec runtime config
func (m *PipelineSpecRuntimeConfig) Validate(formats strfmt.Registry) error {
	var res []error

	if err := m.validateParameters(formats); err != nil {
		res = append(res, err)
	}

	if len(res) > 0 {
		return errors.CompositeValidationError(res...)
	}
	return nil
}

func (m *PipelineSpecRuntimeConfig) validateParameters(formats strfmt.Registry) error {

	if swag.IsZero(m.Parameters) { // not required
		return nil
	}

	for k := range m.Parameters {

		if err := validate.Required("parameters"+"."+k, "body", m.Parameters[k]); err != nil {
			return err
		}
		if val, ok := m.Parameters[k]; ok {
			if err := val.Validate(formats); err != nil {
				return err
			}
		}

	}

	return nil
}
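The switch from map[string]interface{} to map[string]ProtobufValue means runtime parameters are now validated entry by entry. The sketch below shows how client code might build and validate the regenerated model; the import path and the parameter names are assumptions based on the repository layout, not part of this diff.

```go
package main

import (
	"fmt"

	"github.com/go-openapi/strfmt"
	"github.com/go-openapi/swag"

	// Assumed import path, following the go_http_client layout in this repository.
	jobmodel "github.com/kubeflow/pipelines/backend/api/go_http_client/job_model"
)

func main() {
	// Parameter names and values are placeholders for illustration.
	cfg := &jobmodel.PipelineSpecRuntimeConfig{
		Parameters: map[string]jobmodel.ProtobufValue{
			"learning_rate": {NumberValue: 0.01},
			"dataset_uri":   {StringValue: "gs://example-bucket/data.csv"},
			"use_gpu":       {BoolValue: true},
		},
	}

	// Validate now walks every parameter, as generated in validateParameters above.
	if err := cfg.Validate(strfmt.Default); err != nil {
		fmt.Println("invalid runtime config:", err)
		return
	}

	b, err := swag.WriteJSON(cfg)
	if err != nil {
		fmt.Println("marshal failed:", err)
		return
	}
	fmt.Println(string(b))
}
```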
@ -0,0 +1,82 @@
|
|||
// Code generated by go-swagger; DO NOT EDIT.
|
||||
|
||||
package job_model
|
||||
|
||||
// This file was generated by the swagger tool.
|
||||
// Editing this file might prove futile when you re-run the swagger generate command
|
||||
|
||||
import (
|
||||
"strconv"
|
||||
|
||||
strfmt "github.com/go-openapi/strfmt"
|
||||
|
||||
"github.com/go-openapi/errors"
|
||||
"github.com/go-openapi/swag"
|
||||
)
|
||||
|
||||
// ProtobufListValue `ListValue` is a wrapper around a repeated field of values.
|
||||
//
|
||||
// The JSON representation for `ListValue` is JSON array.
|
||||
// swagger:model protobufListValue
|
||||
type ProtobufListValue struct {
|
||||
|
||||
// Repeated field of dynamically typed values.
|
||||
Values []*ProtobufValue `json:"values"`
|
||||
}
|
||||
|
||||
// Validate validates this protobuf list value
|
||||
func (m *ProtobufListValue) Validate(formats strfmt.Registry) error {
|
||||
var res []error
|
||||
|
||||
if err := m.validateValues(formats); err != nil {
|
||||
res = append(res, err)
|
||||
}
|
||||
|
||||
if len(res) > 0 {
|
||||
return errors.CompositeValidationError(res...)
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func (m *ProtobufListValue) validateValues(formats strfmt.Registry) error {
|
||||
|
||||
if swag.IsZero(m.Values) { // not required
|
||||
return nil
|
||||
}
|
||||
|
||||
for i := 0; i < len(m.Values); i++ {
|
||||
if swag.IsZero(m.Values[i]) { // not required
|
||||
continue
|
||||
}
|
||||
|
||||
if m.Values[i] != nil {
|
||||
if err := m.Values[i].Validate(formats); err != nil {
|
||||
if ve, ok := err.(*errors.Validation); ok {
|
||||
return ve.ValidateName("values" + "." + strconv.Itoa(i))
|
||||
}
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// MarshalBinary interface implementation
|
||||
func (m *ProtobufListValue) MarshalBinary() ([]byte, error) {
|
||||
if m == nil {
|
||||
return nil, nil
|
||||
}
|
||||
return swag.WriteJSON(m)
|
||||
}
|
||||
|
||||
// UnmarshalBinary interface implementation
|
||||
func (m *ProtobufListValue) UnmarshalBinary(b []byte) error {
|
||||
var res ProtobufListValue
|
||||
if err := swag.ReadJSON(b, &res); err != nil {
|
||||
return err
|
||||
}
|
||||
*m = res
|
||||
return nil
|
||||
}
|
||||
|
|
@ -0,0 +1,83 @@
|
|||
// Code generated by go-swagger; DO NOT EDIT.
|
||||
|
||||
package job_model
|
||||
|
||||
// This file was generated by the swagger tool.
|
||||
// Editing this file might prove futile when you re-run the swagger generate command
|
||||
|
||||
import (
|
||||
strfmt "github.com/go-openapi/strfmt"
|
||||
|
||||
"github.com/go-openapi/errors"
|
||||
"github.com/go-openapi/swag"
|
||||
"github.com/go-openapi/validate"
|
||||
)
|
||||
|
||||
// ProtobufStruct `Struct` represents a structured data value, consisting of fields
|
||||
// which map to dynamically typed values. In some languages, `Struct`
|
||||
// might be supported by a native representation. For example, in
|
||||
// scripting languages like JS a struct is represented as an
|
||||
// object. The details of that representation are described together
|
||||
// with the proto support for the language.
|
||||
//
|
||||
// The JSON representation for `Struct` is JSON object.
|
||||
// swagger:model protobufStruct
|
||||
type ProtobufStruct struct {
|
||||
|
||||
// Unordered map of dynamically typed values.
|
||||
Fields map[string]ProtobufValue `json:"fields,omitempty"`
|
||||
}
|
||||
|
||||
// Validate validates this protobuf struct
|
||||
func (m *ProtobufStruct) Validate(formats strfmt.Registry) error {
|
||||
var res []error
|
||||
|
||||
if err := m.validateFields(formats); err != nil {
|
||||
res = append(res, err)
|
||||
}
|
||||
|
||||
if len(res) > 0 {
|
||||
return errors.CompositeValidationError(res...)
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func (m *ProtobufStruct) validateFields(formats strfmt.Registry) error {
|
||||
|
||||
if swag.IsZero(m.Fields) { // not required
|
||||
return nil
|
||||
}
|
||||
|
||||
for k := range m.Fields {
|
||||
|
||||
if err := validate.Required("fields"+"."+k, "body", m.Fields[k]); err != nil {
|
||||
return err
|
||||
}
|
||||
if val, ok := m.Fields[k]; ok {
|
||||
if err := val.Validate(formats); err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// MarshalBinary interface implementation
|
||||
func (m *ProtobufStruct) MarshalBinary() ([]byte, error) {
|
||||
if m == nil {
|
||||
return nil, nil
|
||||
}
|
||||
return swag.WriteJSON(m)
|
||||
}
|
||||
|
||||
// UnmarshalBinary interface implementation
|
||||
func (m *ProtobufStruct) UnmarshalBinary(b []byte) error {
|
||||
var res ProtobufStruct
|
||||
if err := swag.ReadJSON(b, &res); err != nil {
|
||||
return err
|
||||
}
|
||||
*m = res
|
||||
return nil
|
||||
}
|
||||
|
|
@ -0,0 +1,133 @@
|
|||
// Code generated by go-swagger; DO NOT EDIT.
|
||||
|
||||
package job_model
|
||||
|
||||
// This file was generated by the swagger tool.
|
||||
// Editing this file might prove futile when you re-run the swagger generate command
|
||||
|
||||
import (
|
||||
strfmt "github.com/go-openapi/strfmt"
|
||||
|
||||
"github.com/go-openapi/errors"
|
||||
"github.com/go-openapi/swag"
|
||||
)
|
||||
|
||||
// ProtobufValue `Value` represents a dynamically typed value which can be either
|
||||
// null, a number, a string, a boolean, a recursive struct value, or a
|
||||
// list of values. A producer of value is expected to set one of that
|
||||
// variants, absence of any variant indicates an error.
|
||||
//
|
||||
// The JSON representation for `Value` is JSON value.
|
||||
// swagger:model protobufValue
|
||||
type ProtobufValue struct {
|
||||
|
||||
// Represents a boolean value.
|
||||
BoolValue bool `json:"bool_value,omitempty"`
|
||||
|
||||
// Represents a repeated `Value`.
|
||||
ListValue *ProtobufListValue `json:"list_value,omitempty"`
|
||||
|
||||
// Represents a null value.
|
||||
NullValue ProtobufNullValue `json:"null_value,omitempty"`
|
||||
|
||||
// Represents a double value.
|
||||
NumberValue float64 `json:"number_value,omitempty"`
|
||||
|
||||
// Represents a string value.
|
||||
StringValue string `json:"string_value,omitempty"`
|
||||
|
||||
// Represents a structured value.
|
||||
StructValue *ProtobufStruct `json:"struct_value,omitempty"`
|
||||
}
|
||||
|
||||
// Validate validates this protobuf value
|
||||
func (m *ProtobufValue) Validate(formats strfmt.Registry) error {
|
||||
var res []error
|
||||
|
||||
if err := m.validateListValue(formats); err != nil {
|
||||
res = append(res, err)
|
||||
}
|
||||
|
||||
if err := m.validateNullValue(formats); err != nil {
|
||||
res = append(res, err)
|
||||
}
|
||||
|
||||
if err := m.validateStructValue(formats); err != nil {
|
||||
res = append(res, err)
|
||||
}
|
||||
|
||||
if len(res) > 0 {
|
||||
return errors.CompositeValidationError(res...)
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func (m *ProtobufValue) validateListValue(formats strfmt.Registry) error {
|
||||
|
||||
if swag.IsZero(m.ListValue) { // not required
|
||||
return nil
|
||||
}
|
||||
|
||||
if m.ListValue != nil {
|
||||
if err := m.ListValue.Validate(formats); err != nil {
|
||||
if ve, ok := err.(*errors.Validation); ok {
|
||||
return ve.ValidateName("list_value")
|
||||
}
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (m *ProtobufValue) validateNullValue(formats strfmt.Registry) error {
|
||||
|
||||
if swag.IsZero(m.NullValue) { // not required
|
||||
return nil
|
||||
}
|
||||
|
||||
if err := m.NullValue.Validate(formats); err != nil {
|
||||
if ve, ok := err.(*errors.Validation); ok {
|
||||
return ve.ValidateName("null_value")
|
||||
}
|
||||
return err
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (m *ProtobufValue) validateStructValue(formats strfmt.Registry) error {
|
||||
|
||||
if swag.IsZero(m.StructValue) { // not required
|
||||
return nil
|
||||
}
|
||||
|
||||
if m.StructValue != nil {
|
||||
if err := m.StructValue.Validate(formats); err != nil {
|
||||
if ve, ok := err.(*errors.Validation); ok {
|
||||
return ve.ValidateName("struct_value")
|
||||
}
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// MarshalBinary interface implementation
|
||||
func (m *ProtobufValue) MarshalBinary() ([]byte, error) {
|
||||
if m == nil {
|
||||
return nil, nil
|
||||
}
|
||||
return swag.WriteJSON(m)
|
||||
}
|
||||
|
||||
// UnmarshalBinary interface implementation
|
||||
func (m *ProtobufValue) UnmarshalBinary(b []byte) error {
|
||||
var res ProtobufValue
|
||||
if err := swag.ReadJSON(b, &res); err != nil {
|
||||
return err
|
||||
}
|
||||
*m = res
|
||||
return nil
|
||||
}
|
||||
|
|
@ -8,7 +8,9 @@ package run_model
|
|||
import (
|
||||
strfmt "github.com/go-openapi/strfmt"
|
||||
|
||||
"github.com/go-openapi/errors"
|
||||
"github.com/go-openapi/swag"
|
||||
"github.com/go-openapi/validate"
|
||||
)
|
||||
|
||||
// PipelineSpecRuntimeConfig The runtime config of a PipelineSpec.
|
||||
|
|
@ -18,7 +20,7 @@ type PipelineSpecRuntimeConfig struct {
|
|||
// The runtime parameters of the PipelineSpec. The parameters will be
|
||||
// used to replace the placeholders
|
||||
// at runtime.
|
||||
Parameters map[string]interface{} `json:"parameters,omitempty"`
|
||||
Parameters map[string]ProtobufValue `json:"parameters,omitempty"`
|
||||
|
||||
// A path in a object store bucket which will be treated as the root
|
||||
// output directory of the pipeline. It is used by the system to
|
||||
|
|
@ -28,6 +30,37 @@ type PipelineSpecRuntimeConfig struct {
|
|||
|
||||
// Validate validates this pipeline spec runtime config
|
||||
func (m *PipelineSpecRuntimeConfig) Validate(formats strfmt.Registry) error {
|
||||
var res []error
|
||||
|
||||
if err := m.validateParameters(formats); err != nil {
|
||||
res = append(res, err)
|
||||
}
|
||||
|
||||
if len(res) > 0 {
|
||||
return errors.CompositeValidationError(res...)
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func (m *PipelineSpecRuntimeConfig) validateParameters(formats strfmt.Registry) error {
|
||||
|
||||
if swag.IsZero(m.Parameters) { // not required
|
||||
return nil
|
||||
}
|
||||
|
||||
for k := range m.Parameters {
|
||||
|
||||
if err := validate.Required("parameters"+"."+k, "body", m.Parameters[k]); err != nil {
|
||||
return err
|
||||
}
|
||||
if val, ok := m.Parameters[k]; ok {
|
||||
if err := val.Validate(formats); err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
|
|
|
|||
|
|
@ -0,0 +1,82 @@
|
|||
// Code generated by go-swagger; DO NOT EDIT.
|
||||
|
||||
package run_model
|
||||
|
||||
// This file was generated by the swagger tool.
|
||||
// Editing this file might prove futile when you re-run the swagger generate command
|
||||
|
||||
import (
|
||||
"strconv"
|
||||
|
||||
strfmt "github.com/go-openapi/strfmt"
|
||||
|
||||
"github.com/go-openapi/errors"
|
||||
"github.com/go-openapi/swag"
|
||||
)
|
||||
|
||||
// ProtobufListValue `ListValue` is a wrapper around a repeated field of values.
|
||||
//
|
||||
// The JSON representation for `ListValue` is JSON array.
|
||||
// swagger:model protobufListValue
|
||||
type ProtobufListValue struct {
|
||||
|
||||
// Repeated field of dynamically typed values.
|
||||
Values []*ProtobufValue `json:"values"`
|
||||
}
|
||||
|
||||
// Validate validates this protobuf list value
|
||||
func (m *ProtobufListValue) Validate(formats strfmt.Registry) error {
|
||||
var res []error
|
||||
|
||||
if err := m.validateValues(formats); err != nil {
|
||||
res = append(res, err)
|
||||
}
|
||||
|
||||
if len(res) > 0 {
|
||||
return errors.CompositeValidationError(res...)
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func (m *ProtobufListValue) validateValues(formats strfmt.Registry) error {
|
||||
|
||||
if swag.IsZero(m.Values) { // not required
|
||||
return nil
|
||||
}
|
||||
|
||||
for i := 0; i < len(m.Values); i++ {
|
||||
if swag.IsZero(m.Values[i]) { // not required
|
||||
continue
|
||||
}
|
||||
|
||||
if m.Values[i] != nil {
|
||||
if err := m.Values[i].Validate(formats); err != nil {
|
||||
if ve, ok := err.(*errors.Validation); ok {
|
||||
return ve.ValidateName("values" + "." + strconv.Itoa(i))
|
||||
}
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// MarshalBinary interface implementation
|
||||
func (m *ProtobufListValue) MarshalBinary() ([]byte, error) {
|
||||
if m == nil {
|
||||
return nil, nil
|
||||
}
|
||||
return swag.WriteJSON(m)
|
||||
}
|
||||
|
||||
// UnmarshalBinary interface implementation
|
||||
func (m *ProtobufListValue) UnmarshalBinary(b []byte) error {
|
||||
var res ProtobufListValue
|
||||
if err := swag.ReadJSON(b, &res); err != nil {
|
||||
return err
|
||||
}
|
||||
*m = res
|
||||
return nil
|
||||
}
|
||||
|
|
@ -0,0 +1,83 @@
|
|||
// Code generated by go-swagger; DO NOT EDIT.
|
||||
|
||||
package run_model
|
||||
|
||||
// This file was generated by the swagger tool.
|
||||
// Editing this file might prove futile when you re-run the swagger generate command
|
||||
|
||||
import (
|
||||
strfmt "github.com/go-openapi/strfmt"
|
||||
|
||||
"github.com/go-openapi/errors"
|
||||
"github.com/go-openapi/swag"
|
||||
"github.com/go-openapi/validate"
|
||||
)
|
||||
|
||||
// ProtobufStruct `Struct` represents a structured data value, consisting of fields
|
||||
// which map to dynamically typed values. In some languages, `Struct`
|
||||
// might be supported by a native representation. For example, in
|
||||
// scripting languages like JS a struct is represented as an
|
||||
// object. The details of that representation are described together
|
||||
// with the proto support for the language.
|
||||
//
|
||||
// The JSON representation for `Struct` is JSON object.
|
||||
// swagger:model protobufStruct
|
||||
type ProtobufStruct struct {
|
||||
|
||||
// Unordered map of dynamically typed values.
|
||||
Fields map[string]ProtobufValue `json:"fields,omitempty"`
|
||||
}
|
||||
|
||||
// Validate validates this protobuf struct
|
||||
func (m *ProtobufStruct) Validate(formats strfmt.Registry) error {
|
||||
var res []error
|
||||
|
||||
if err := m.validateFields(formats); err != nil {
|
||||
res = append(res, err)
|
||||
}
|
||||
|
||||
if len(res) > 0 {
|
||||
return errors.CompositeValidationError(res...)
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func (m *ProtobufStruct) validateFields(formats strfmt.Registry) error {
|
||||
|
||||
if swag.IsZero(m.Fields) { // not required
|
||||
return nil
|
||||
}
|
||||
|
||||
for k := range m.Fields {
|
||||
|
||||
if err := validate.Required("fields"+"."+k, "body", m.Fields[k]); err != nil {
|
||||
return err
|
||||
}
|
||||
if val, ok := m.Fields[k]; ok {
|
||||
if err := val.Validate(formats); err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// MarshalBinary interface implementation
|
||||
func (m *ProtobufStruct) MarshalBinary() ([]byte, error) {
|
||||
if m == nil {
|
||||
return nil, nil
|
||||
}
|
||||
return swag.WriteJSON(m)
|
||||
}
|
||||
|
||||
// UnmarshalBinary interface implementation
|
||||
func (m *ProtobufStruct) UnmarshalBinary(b []byte) error {
|
||||
var res ProtobufStruct
|
||||
if err := swag.ReadJSON(b, &res); err != nil {
|
||||
return err
|
||||
}
|
||||
*m = res
|
||||
return nil
|
||||
}
|
||||
|
|
@ -0,0 +1,133 @@
|
|||
// Code generated by go-swagger; DO NOT EDIT.
|
||||
|
||||
package run_model
|
||||
|
||||
// This file was generated by the swagger tool.
|
||||
// Editing this file might prove futile when you re-run the swagger generate command
|
||||
|
||||
import (
|
||||
strfmt "github.com/go-openapi/strfmt"
|
||||
|
||||
"github.com/go-openapi/errors"
|
||||
"github.com/go-openapi/swag"
|
||||
)
|
||||
|
||||
// ProtobufValue `Value` represents a dynamically typed value which can be either
|
||||
// null, a number, a string, a boolean, a recursive struct value, or a
|
||||
// list of values. A producer of value is expected to set one of that
|
||||
// variants, absence of any variant indicates an error.
|
||||
//
|
||||
// The JSON representation for `Value` is JSON value.
|
||||
// swagger:model protobufValue
|
||||
type ProtobufValue struct {
|
||||
|
||||
// Represents a boolean value.
|
||||
BoolValue bool `json:"bool_value,omitempty"`
|
||||
|
||||
// Represents a repeated `Value`.
|
||||
ListValue *ProtobufListValue `json:"list_value,omitempty"`
|
||||
|
||||
// Represents a null value.
|
||||
NullValue ProtobufNullValue `json:"null_value,omitempty"`
|
||||
|
||||
// Represents a double value.
|
||||
NumberValue float64 `json:"number_value,omitempty"`
|
||||
|
||||
// Represents a string value.
|
||||
StringValue string `json:"string_value,omitempty"`
|
||||
|
||||
// Represents a structured value.
|
||||
StructValue *ProtobufStruct `json:"struct_value,omitempty"`
|
||||
}
|
||||
|
||||
// Validate validates this protobuf value
|
||||
func (m *ProtobufValue) Validate(formats strfmt.Registry) error {
|
||||
var res []error
|
||||
|
||||
if err := m.validateListValue(formats); err != nil {
|
||||
res = append(res, err)
|
||||
}
|
||||
|
||||
if err := m.validateNullValue(formats); err != nil {
|
||||
res = append(res, err)
|
||||
}
|
||||
|
||||
if err := m.validateStructValue(formats); err != nil {
|
||||
res = append(res, err)
|
||||
}
|
||||
|
||||
if len(res) > 0 {
|
||||
return errors.CompositeValidationError(res...)
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func (m *ProtobufValue) validateListValue(formats strfmt.Registry) error {
|
||||
|
||||
if swag.IsZero(m.ListValue) { // not required
|
||||
return nil
|
||||
}
|
||||
|
||||
if m.ListValue != nil {
|
||||
if err := m.ListValue.Validate(formats); err != nil {
|
||||
if ve, ok := err.(*errors.Validation); ok {
|
||||
return ve.ValidateName("list_value")
|
||||
}
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (m *ProtobufValue) validateNullValue(formats strfmt.Registry) error {
|
||||
|
||||
if swag.IsZero(m.NullValue) { // not required
|
||||
return nil
|
||||
}
|
||||
|
||||
if err := m.NullValue.Validate(formats); err != nil {
|
||||
if ve, ok := err.(*errors.Validation); ok {
|
||||
return ve.ValidateName("null_value")
|
||||
}
|
||||
return err
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (m *ProtobufValue) validateStructValue(formats strfmt.Registry) error {
|
||||
|
||||
if swag.IsZero(m.StructValue) { // not required
|
||||
return nil
|
||||
}
|
||||
|
||||
if m.StructValue != nil {
|
||||
if err := m.StructValue.Validate(formats); err != nil {
|
||||
if ve, ok := err.(*errors.Validation); ok {
|
||||
return ve.ValidateName("struct_value")
|
||||
}
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// MarshalBinary interface implementation
|
||||
func (m *ProtobufValue) MarshalBinary() ([]byte, error) {
|
||||
if m == nil {
|
||||
return nil, nil
|
||||
}
|
||||
return swag.WriteJSON(m)
|
||||
}
|
||||
|
||||
// UnmarshalBinary interface implementation
|
||||
func (m *ProtobufValue) UnmarshalBinary(b []byte) error {
|
||||
var res ProtobufValue
|
||||
if err := swag.ReadJSON(b, &res); err != nil {
|
||||
return err
|
||||
}
|
||||
*m = res
|
||||
return nil
|
||||
}
|
||||
|
|
@@ -3,8 +3,8 @@ This file contains REST API specification for Kubeflow Pipelines. The file is au

This Python package is automatically generated by the [OpenAPI Generator](https://openapi-generator.tech) project:

- API version: 2.0.0-alpha.2
- Package version: 2.0.0-alpha.2
- API version: 2.0.0-alpha.3
- Package version: 2.0.0-alpha.3
- Build package: org.openapitools.codegen.languages.PythonClientCodegen
For more information, please visit [https://www.google.com](https://www.google.com)
@@ -14,7 +14,7 @@

from __future__ import absolute_import

__version__ = "2.0.0-alpha.2"
__version__ = "2.0.0-alpha.3"

# import apis into sdk package
from kfp_server_api.api.experiment_service_api import ExperimentServiceApi
@@ -78,7 +78,7 @@ class ApiClient(object):
        self.default_headers[header_name] = header_value
        self.cookie = cookie
        # Set default User-Agent.
        self.user_agent = 'OpenAPI-Generator/2.0.0-alpha.2/python'
        self.user_agent = 'OpenAPI-Generator/2.0.0-alpha.3/python'
        self.client_side_validation = configuration.client_side_validation

    def __enter__(self):
@@ -351,8 +351,8 @@ conf = kfp_server_api.Configuration(
        return "Python SDK Debug Report:\n"\
               "OS: {env}\n"\
               "Python Version: {pyversion}\n"\
               "Version of the API: 2.0.0-alpha.2\n"\
               "SDK Package Version: 2.0.0-alpha.2".\
               "Version of the API: 2.0.0-alpha.3\n"\
               "SDK Package Version: 2.0.0-alpha.3".\
               format(env=sys.platform, pyversion=sys.version)

    def get_host_settings(self):
@@ -13,7 +13,7 @@
from setuptools import setup, find_packages  # noqa: H301

NAME = "kfp-server-api"
VERSION = "2.0.0-alpha.2"
VERSION = "2.0.0-alpha.3"
# To install the library, run the following
#
# python setup.py install
@ -268,7 +268,7 @@
|
|||
"parameters": {
|
||||
"type": "object",
|
||||
"additionalProperties": {
|
||||
"type": "object"
|
||||
"$ref": "#/definitions/protobufValue"
|
||||
},
|
||||
"description": "The runtime parameters of the PipelineSpec. The parameters will be\nused to replace the placeholders\nat runtime."
|
||||
},
|
||||
|
|
@ -551,6 +551,19 @@
|
|||
},
|
||||
"description": "`Any` contains an arbitrary serialized protocol buffer message along with a\nURL that describes the type of the serialized message.\n\nProtobuf library provides support to pack/unpack Any values in the form\nof utility functions or additional generated methods of the Any type.\n\nExample 1: Pack and unpack a message in C++.\n\n Foo foo = ...;\n Any any;\n any.PackFrom(foo);\n ...\n if (any.UnpackTo(\u0026foo)) {\n ...\n }\n\nExample 2: Pack and unpack a message in Java.\n\n Foo foo = ...;\n Any any = Any.pack(foo);\n ...\n if (any.is(Foo.class)) {\n foo = any.unpack(Foo.class);\n }\n\n Example 3: Pack and unpack a message in Python.\n\n foo = Foo(...)\n any = Any()\n any.Pack(foo)\n ...\n if any.Is(Foo.DESCRIPTOR):\n any.Unpack(foo)\n ...\n\n Example 4: Pack and unpack a message in Go\n\n foo := \u0026pb.Foo{...}\n any, err := anypb.New(foo)\n if err != nil {\n ...\n }\n ...\n foo := \u0026pb.Foo{}\n if err := any.UnmarshalTo(foo); err != nil {\n ...\n }\n\nThe pack methods provided by protobuf library will by default use\n'type.googleapis.com/full.type.name' as the type URL and the unpack\nmethods only use the fully qualified type name after the last '/'\nin the type URL, for example \"foo.bar.com/x/y.z\" will yield type\nname \"y.z\".\n\n\nJSON\n====\nThe JSON representation of an `Any` value uses the regular\nrepresentation of the deserialized, embedded message, with an\nadditional field `@type` which contains the type URL. Example:\n\n package google.profile;\n message Person {\n string first_name = 1;\n string last_name = 2;\n }\n\n {\n \"@type\": \"type.googleapis.com/google.profile.Person\",\n \"firstName\": \u003cstring\u003e,\n \"lastName\": \u003cstring\u003e\n }\n\nIf the embedded message type is well-known and has a custom JSON\nrepresentation, that representation will be embedded adding a field\n`value` which holds the custom JSON in addition to the `@type`\nfield. Example (for message [google.protobuf.Duration][]):\n\n {\n \"@type\": \"type.googleapis.com/google.protobuf.Duration\",\n \"value\": \"1.212s\"\n }"
|
||||
},
|
||||
"protobufListValue": {
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"values": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"$ref": "#/definitions/protobufValue"
|
||||
},
|
||||
"description": "Repeated field of dynamically typed values."
|
||||
}
|
||||
},
|
||||
"description": "`ListValue` is a wrapper around a repeated field of values.\n\nThe JSON representation for `ListValue` is JSON array."
|
||||
},
|
||||
"protobufNullValue": {
|
||||
"type": "string",
|
||||
"enum": [
|
||||
|
|
@ -558,6 +571,51 @@
|
|||
],
|
||||
"default": "NULL_VALUE",
|
||||
"description": "`NullValue` is a singleton enumeration to represent the null value for the\n`Value` type union.\n\n The JSON representation for `NullValue` is JSON `null`.\n\n - NULL_VALUE: Null value."
|
||||
},
|
||||
"protobufStruct": {
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"fields": {
|
||||
"type": "object",
|
||||
"additionalProperties": {
|
||||
"$ref": "#/definitions/protobufValue"
|
||||
},
|
||||
"description": "Unordered map of dynamically typed values."
|
||||
}
|
||||
},
|
||||
"description": "`Struct` represents a structured data value, consisting of fields\nwhich map to dynamically typed values. In some languages, `Struct`\nmight be supported by a native representation. For example, in\nscripting languages like JS a struct is represented as an\nobject. The details of that representation are described together\nwith the proto support for the language.\n\nThe JSON representation for `Struct` is JSON object."
|
||||
},
|
||||
"protobufValue": {
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"null_value": {
|
||||
"$ref": "#/definitions/protobufNullValue",
|
||||
"description": "Represents a null value."
|
||||
},
|
||||
"number_value": {
|
||||
"type": "number",
|
||||
"format": "double",
|
||||
"description": "Represents a double value."
|
||||
},
|
||||
"string_value": {
|
||||
"type": "string",
|
||||
"description": "Represents a string value."
|
||||
},
|
||||
"bool_value": {
|
||||
"type": "boolean",
|
||||
"format": "boolean",
|
||||
"description": "Represents a boolean value."
|
||||
},
|
||||
"struct_value": {
|
||||
"$ref": "#/definitions/protobufStruct",
|
||||
"description": "Represents a structured value."
|
||||
},
|
||||
"list_value": {
|
||||
"$ref": "#/definitions/protobufListValue",
|
||||
"description": "Represents a repeated `Value`."
|
||||
}
|
||||
},
|
||||
"description": "`Value` represents a dynamically typed value which can be either\nnull, a number, a string, a boolean, a recursive struct value, or a\nlist of values. A producer of value is expected to set one of that\nvariants, absence of any variant indicates an error.\n\nThe JSON representation for `Value` is JSON value."
|
||||
}
|
||||
},
|
||||
"securityDefinitions": {
|
||||
|
|
|
|||
|
|
@@ -2,7 +2,7 @@
  "swagger": "2.0",
  "info": {
    "title": "Kubeflow Pipelines API",
    "version": "2.0.0-alpha.2",
    "version": "2.0.0-alpha.3",
    "description": "This file contains REST API specification for Kubeflow Pipelines. The file is autogenerated from the swagger definition.",
    "contact": {
      "name": "google",
@@ -1499,7 +1499,7 @@
        "parameters": {
          "type": "object",
          "additionalProperties": {
            "type": "object"
            "$ref": "#/definitions/protobufValue"
          },
          "description": "The runtime parameters of the PipelineSpec. The parameters will be\nused to replace the placeholders\nat runtime."
        },
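With parameters now referencing the protobufValue definition, each runtime parameter sent over the REST API is a JSON object carrying exactly one of the typed fields (string_value, number_value, bool_value, struct_value, list_value, null_value) rather than an arbitrary object. A minimal sketch of building such a fragment follows; the parameter names are placeholders, and only the "parameters" shape is taken from this schema.

```go
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// Each parameter is a protobufValue object; the field names
	// "number_value" and "string_value" come from the protobufValue
	// definition added in this swagger file.
	runtimeConfig := map[string]interface{}{
		"parameters": map[string]interface{}{
			"learning_rate": map[string]interface{}{"number_value": 0.01},
			"dataset_uri":   map[string]interface{}{"string_value": "gs://example-bucket/data.csv"},
		},
	}

	b, err := json.MarshalIndent(runtimeConfig, "", "  ")
	if err != nil {
		fmt.Println("marshal failed:", err)
		return
	}
	fmt.Println(string(b))
}
```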
@ -1856,6 +1856,19 @@
|
|||
},
|
||||
"description": "`Any` contains an arbitrary serialized protocol buffer message along with a\nURL that describes the type of the serialized message.\n\nProtobuf library provides support to pack/unpack Any values in the form\nof utility functions or additional generated methods of the Any type.\n\nExample 1: Pack and unpack a message in C++.\n\n Foo foo = ...;\n Any any;\n any.PackFrom(foo);\n ...\n if (any.UnpackTo(&foo)) {\n ...\n }\n\nExample 2: Pack and unpack a message in Java.\n\n Foo foo = ...;\n Any any = Any.pack(foo);\n ...\n if (any.is(Foo.class)) {\n foo = any.unpack(Foo.class);\n }\n\n Example 3: Pack and unpack a message in Python.\n\n foo = Foo(...)\n any = Any()\n any.Pack(foo)\n ...\n if any.Is(Foo.DESCRIPTOR):\n any.Unpack(foo)\n ...\n\n Example 4: Pack and unpack a message in Go\n\n foo := &pb.Foo{...}\n any, err := anypb.New(foo)\n if err != nil {\n ...\n }\n ...\n foo := &pb.Foo{}\n if err := any.UnmarshalTo(foo); err != nil {\n ...\n }\n\nThe pack methods provided by protobuf library will by default use\n'type.googleapis.com/full.type.name' as the type URL and the unpack\nmethods only use the fully qualified type name after the last '/'\nin the type URL, for example \"foo.bar.com/x/y.z\" will yield type\nname \"y.z\".\n\n\nJSON\n====\nThe JSON representation of an `Any` value uses the regular\nrepresentation of the deserialized, embedded message, with an\nadditional field `@type` which contains the type URL. Example:\n\n package google.profile;\n message Person {\n string first_name = 1;\n string last_name = 2;\n }\n\n {\n \"@type\": \"type.googleapis.com/google.profile.Person\",\n \"firstName\": <string>,\n \"lastName\": <string>\n }\n\nIf the embedded message type is well-known and has a custom JSON\nrepresentation, that representation will be embedded adding a field\n`value` which holds the custom JSON in addition to the `@type`\nfield. Example (for message [google.protobuf.Duration][]):\n\n {\n \"@type\": \"type.googleapis.com/google.protobuf.Duration\",\n \"value\": \"1.212s\"\n }"
|
||||
},
|
||||
"protobufListValue": {
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"values": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"$ref": "#/definitions/protobufValue"
|
||||
},
|
||||
"description": "Repeated field of dynamically typed values."
|
||||
}
|
||||
},
|
||||
"description": "`ListValue` is a wrapper around a repeated field of values.\n\nThe JSON representation for `ListValue` is JSON array."
|
||||
},
|
||||
"protobufNullValue": {
|
||||
"type": "string",
|
||||
"enum": [
|
||||
|
|
@ -1864,6 +1877,51 @@
|
|||
"default": "NULL_VALUE",
|
||||
"description": "`NullValue` is a singleton enumeration to represent the null value for the\n`Value` type union.\n\n The JSON representation for `NullValue` is JSON `null`.\n\n - NULL_VALUE: Null value."
|
||||
},
|
||||
"protobufStruct": {
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"fields": {
|
||||
"type": "object",
|
||||
"additionalProperties": {
|
||||
"$ref": "#/definitions/protobufValue"
|
||||
},
|
||||
"description": "Unordered map of dynamically typed values."
|
||||
}
|
||||
},
|
||||
"description": "`Struct` represents a structured data value, consisting of fields\nwhich map to dynamically typed values. In some languages, `Struct`\nmight be supported by a native representation. For example, in\nscripting languages like JS a struct is represented as an\nobject. The details of that representation are described together\nwith the proto support for the language.\n\nThe JSON representation for `Struct` is JSON object."
|
||||
},
|
||||
"protobufValue": {
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"null_value": {
|
||||
"$ref": "#/definitions/protobufNullValue",
|
||||
"description": "Represents a null value."
|
||||
},
|
||||
"number_value": {
|
||||
"type": "number",
|
||||
"format": "double",
|
||||
"description": "Represents a double value."
|
||||
},
|
||||
"string_value": {
|
||||
"type": "string",
|
||||
"description": "Represents a string value."
|
||||
},
|
||||
"bool_value": {
|
||||
"type": "boolean",
|
||||
"format": "boolean",
|
||||
"description": "Represents a boolean value."
|
||||
},
|
||||
"struct_value": {
|
||||
"$ref": "#/definitions/protobufStruct",
|
||||
"description": "Represents a structured value."
|
||||
},
|
||||
"list_value": {
|
||||
"$ref": "#/definitions/protobufListValue",
|
||||
"description": "Represents a repeated `Value`."
|
||||
}
|
||||
},
|
||||
"description": "`Value` represents a dynamically typed value which can be either\nnull, a number, a string, a boolean, a recursive struct value, or a\nlist of values. A producer of value is expected to set one of that\nvariants, absence of any variant indicates an error.\n\nThe JSON representation for `Value` is JSON value."
|
||||
},
|
||||
"JobMode": {
|
||||
"type": "string",
|
||||
"enum": [
|
||||
|
|
|
|||
|
|
@ -409,7 +409,7 @@
|
|||
"parameters": {
|
||||
"type": "object",
|
||||
"additionalProperties": {
|
||||
"type": "object"
|
||||
"$ref": "#/definitions/protobufValue"
|
||||
},
|
||||
"description": "The runtime parameters of the PipelineSpec. The parameters will be\nused to replace the placeholders\nat runtime."
|
||||
},
|
||||
|
|
@ -766,6 +766,19 @@
|
|||
},
|
||||
"description": "`Any` contains an arbitrary serialized protocol buffer message along with a\nURL that describes the type of the serialized message.\n\nProtobuf library provides support to pack/unpack Any values in the form\nof utility functions or additional generated methods of the Any type.\n\nExample 1: Pack and unpack a message in C++.\n\n Foo foo = ...;\n Any any;\n any.PackFrom(foo);\n ...\n if (any.UnpackTo(\u0026foo)) {\n ...\n }\n\nExample 2: Pack and unpack a message in Java.\n\n Foo foo = ...;\n Any any = Any.pack(foo);\n ...\n if (any.is(Foo.class)) {\n foo = any.unpack(Foo.class);\n }\n\n Example 3: Pack and unpack a message in Python.\n\n foo = Foo(...)\n any = Any()\n any.Pack(foo)\n ...\n if any.Is(Foo.DESCRIPTOR):\n any.Unpack(foo)\n ...\n\n Example 4: Pack and unpack a message in Go\n\n foo := \u0026pb.Foo{...}\n any, err := anypb.New(foo)\n if err != nil {\n ...\n }\n ...\n foo := \u0026pb.Foo{}\n if err := any.UnmarshalTo(foo); err != nil {\n ...\n }\n\nThe pack methods provided by protobuf library will by default use\n'type.googleapis.com/full.type.name' as the type URL and the unpack\nmethods only use the fully qualified type name after the last '/'\nin the type URL, for example \"foo.bar.com/x/y.z\" will yield type\nname \"y.z\".\n\n\nJSON\n====\nThe JSON representation of an `Any` value uses the regular\nrepresentation of the deserialized, embedded message, with an\nadditional field `@type` which contains the type URL. Example:\n\n package google.profile;\n message Person {\n string first_name = 1;\n string last_name = 2;\n }\n\n {\n \"@type\": \"type.googleapis.com/google.profile.Person\",\n \"firstName\": \u003cstring\u003e,\n \"lastName\": \u003cstring\u003e\n }\n\nIf the embedded message type is well-known and has a custom JSON\nrepresentation, that representation will be embedded adding a field\n`value` which holds the custom JSON in addition to the `@type`\nfield. Example (for message [google.protobuf.Duration][]):\n\n {\n \"@type\": \"type.googleapis.com/google.protobuf.Duration\",\n \"value\": \"1.212s\"\n }"
|
||||
},
|
||||
"protobufListValue": {
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"values": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"$ref": "#/definitions/protobufValue"
|
||||
},
|
||||
"description": "Repeated field of dynamically typed values."
|
||||
}
|
||||
},
|
||||
"description": "`ListValue` is a wrapper around a repeated field of values.\n\nThe JSON representation for `ListValue` is JSON array."
|
||||
},
|
||||
"protobufNullValue": {
|
||||
"type": "string",
|
||||
"enum": [
|
||||
|
|
@ -773,6 +786,51 @@
|
|||
],
|
||||
"default": "NULL_VALUE",
|
||||
"description": "`NullValue` is a singleton enumeration to represent the null value for the\n`Value` type union.\n\n The JSON representation for `NullValue` is JSON `null`.\n\n - NULL_VALUE: Null value."
|
||||
},
|
||||
"protobufStruct": {
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"fields": {
|
||||
"type": "object",
|
||||
"additionalProperties": {
|
||||
"$ref": "#/definitions/protobufValue"
|
||||
},
|
||||
"description": "Unordered map of dynamically typed values."
|
||||
}
|
||||
},
|
||||
"description": "`Struct` represents a structured data value, consisting of fields\nwhich map to dynamically typed values. In some languages, `Struct`\nmight be supported by a native representation. For example, in\nscripting languages like JS a struct is represented as an\nobject. The details of that representation are described together\nwith the proto support for the language.\n\nThe JSON representation for `Struct` is JSON object."
|
||||
},
|
||||
"protobufValue": {
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"null_value": {
|
||||
"$ref": "#/definitions/protobufNullValue",
|
||||
"description": "Represents a null value."
|
||||
},
|
||||
"number_value": {
|
||||
"type": "number",
|
||||
"format": "double",
|
||||
"description": "Represents a double value."
|
||||
},
|
||||
"string_value": {
|
||||
"type": "string",
|
||||
"description": "Represents a string value."
|
||||
},
|
||||
"bool_value": {
|
||||
"type": "boolean",
|
||||
"format": "boolean",
|
||||
"description": "Represents a boolean value."
|
||||
},
|
||||
"struct_value": {
|
||||
"$ref": "#/definitions/protobufStruct",
|
||||
"description": "Represents a structured value."
|
||||
},
|
||||
"list_value": {
|
||||
"$ref": "#/definitions/protobufListValue",
|
||||
"description": "Represents a repeated `Value`."
|
||||
}
|
||||
},
|
||||
"description": "`Value` represents a dynamically typed value which can be either\nnull, a number, a string, a boolean, a recursive struct value, or a\nlist of values. A producer of value is expected to set one of that\nvariants, absence of any variant indicates an error.\n\nThe JSON representation for `Value` is JSON value."
|
||||
}
|
||||
},
|
||||
"securityDefinitions": {
|
||||
|
|
|
|||
|
|
@@ -12,7 +12,7 @@ metadata:
spec:
  descriptor:
    type: Kubeflow Pipelines
    version: 2.0.0-alpha.2
    version: 2.0.0-alpha.3
    description: |-
      Reusable end-to-end ML workflow
    maintainers:
@@ -1,9 +1,9 @@
x-google-marketplace:
  schemaVersion: v2
  applicationApiVersion: v1beta1
  publishedVersion: 2.0.0-alpha.2
  publishedVersion: 2.0.0-alpha.3
  publishedVersionMetadata:
    releaseNote: Based on 2.0.0-alpha.2 version.
    releaseNote: Based on 2.0.0-alpha.3 version.
    releaseTypes:
      - Feature
    recommended: false
@@ -8,4 +8,4 @@ commonLabels:
  app: cache-deployer
images:
  - name: gcr.io/ml-pipeline/cache-deployer
    newTag: 2.0.0-alpha.2
    newTag: 2.0.0-alpha.3
@@ -10,4 +10,4 @@ commonLabels:
  app: cache-server
images:
  - name: gcr.io/ml-pipeline/cache-server
    newTag: 2.0.0-alpha.2
    newTag: 2.0.0-alpha.3
@@ -11,7 +11,7 @@ data:
    until the changes take effect. A quick way to restart all deployments in a
    namespace: `kubectl rollout restart deployment -n <your-namespace>`.
  appName: pipeline
  appVersion: 2.0.0-alpha.2
  appVersion: 2.0.0-alpha.3
  dbHost: mysql
  dbPort: "3306"
  mlmdDb: metadb
@@ -9,4 +9,4 @@ resources:
  - metadata-grpc-sa.yaml
images:
  - name: gcr.io/ml-pipeline/metadata-envoy
    newTag: 2.0.0-alpha.2
    newTag: 2.0.0-alpha.3
@@ -37,14 +37,14 @@ resources:
  - kfp-launcher-configmap.yaml
images:
  - name: gcr.io/ml-pipeline/api-server
    newTag: 2.0.0-alpha.2
    newTag: 2.0.0-alpha.3
  - name: gcr.io/ml-pipeline/persistenceagent
    newTag: 2.0.0-alpha.2
    newTag: 2.0.0-alpha.3
  - name: gcr.io/ml-pipeline/scheduledworkflow
    newTag: 2.0.0-alpha.2
    newTag: 2.0.0-alpha.3
  - name: gcr.io/ml-pipeline/frontend
    newTag: 2.0.0-alpha.2
    newTag: 2.0.0-alpha.3
  - name: gcr.io/ml-pipeline/viewer-crd-controller
    newTag: 2.0.0-alpha.2
    newTag: 2.0.0-alpha.3
  - name: gcr.io/ml-pipeline/visualization-server
    newTag: 2.0.0-alpha.2
    newTag: 2.0.0-alpha.3
@@ -7,4 +7,4 @@ resources:
  - metadata-writer-sa.yaml
images:
  - name: gcr.io/ml-pipeline/metadata-writer
    newTag: 2.0.0-alpha.2
    newTag: 2.0.0-alpha.3
@@ -2,7 +2,7 @@ apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
images:
  - name: gcr.io/ml-pipeline/inverse-proxy-agent
    newTag: 2.0.0-alpha.2
    newTag: 2.0.0-alpha.3
resources:
  - proxy-configmap.yaml
  - proxy-deployment.yaml