Commit Graph

5 Commits

Author SHA1 Message Date
jingzhang36 56d519e9f7
Add archive experiment feature in backend (#3359)
* add new field in db schema and api schema

* auto generated types for experiment storage state

* add archive and unarchive methods to backend for experiments.

* auto generated archive/unarchive methods for experiments

* add archive and unarchive to client

* set proper storage state when creating experiment

* retrieve storage state when we get/list experiment(s)

* change expectation in test to have storage state

* add storage state in resource manager test

* revise experiment server test

* revise api converter test

* integration test of experiment archive

* archiving/unarchiving an experiment affects the storage state of the runs in it

* test all the runs in an archived/unarchived experiment

* test that all runs are archived/unarchived with their experiment in the experiment server

* integration test

* integration test: value type mismatch in assertion

* remove unused import; set default value for storage state

* autogen code for frontend

* reorder the fields in api experiment schema

* switch the position of the two enum values to verify a hypothesis

* Put a placeholder to prevent any valid item from taking the value 0

* Get rid of the placeholder since the cause of the issue related to value 0 was found and fixed.

* The returned api experiment now has a storage state field

* create experiment response doesn't contain storage state

* Cleanup needs to clean runs and pipelines now

* add a missing client

* use resource reference as filter instead of experiment uuid

* use the same namespace in the archive unit test

* Leave archive/unarchive experiment integration test to a separate PR

* also need to update jobs when experiments are archived

* Change the unarchiving logic: when an experiment is unarchived, the jobs/runs in it stay archived

* add unit test for the job status in archived/unarchived experiment

* change archive state to a 3-value enum; add experiment integration test

* make archive state a 3-value enum to avoid the 0 value being mapped to available; add integration test (see the sketch after this commit)

* run swagger autogen

* fix an expected value

* fix experiment server test

* add job check in experiment server test

* update job CRDs

* fix a typo

* remove accidentally included irrelevant changes
2020-04-16 07:31:09 +08:00
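
The 3-value enum and 0-value placeholder bullets above come down to one point: an unset enum field defaults to 0, so if AVAILABLE sat at value 0 an experiment with no storage state set would silently read as available. A minimal Go sketch of that idea, assuming illustrative names (StorageState, StorageStateUnspecified, Experiment) rather than the exact types generated in this commit:

```go
package main

import "fmt"

// StorageState mirrors the idea of the archive-state enum: the zero value is
// a deliberate placeholder, so an unset field is never read as AVAILABLE.
type StorageState int32

const (
	StorageStateUnspecified StorageState = 0 // zero value: state not set
	StorageStateAvailable   StorageState = 1
	StorageStateArchived    StorageState = 2
)

func (s StorageState) String() string {
	switch s {
	case StorageStateAvailable:
		return "STORAGESTATE_AVAILABLE"
	case StorageStateArchived:
		return "STORAGESTATE_ARCHIVED"
	default:
		return "STORAGESTATE_UNSPECIFIED"
	}
}

// Experiment stands in for the API object that gained a storage state field;
// archive/unarchive flips the field between the two non-zero values.
type Experiment struct {
	Name         string
	StorageState StorageState
}

func main() {
	e := Experiment{Name: "demo"}
	fmt.Println(e.StorageState) // STORAGESTATE_UNSPECIFIED, not AVAILABLE
	e.StorageState = StorageStateArchived
	fmt.Println(e.StorageState) // STORAGESTATE_ARCHIVED
}
```
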
Chen Sun 2f1dbbedd2
Experiment API change (#3198) 2020-03-06 09:17:23 -08:00
jingzhang36 4c491823a0
Update auto-generated files' license date (generated by swagger-gen) (#2998) 2020-02-07 14:44:50 +08:00
Ajay Gopinathan 163545b370 Use Bazel to build the entire backend and perform API code generation (#609)
* Use Bazel to build the entire backend.

This also uses Bazel to generate code from the API definition in the
proto files.

The Makefile is replaced with a script that uses Bazel to first generate
the code, and then copies the generated files back into the source tree.

Most of the BUILD files were generated automatically using Gazelle.

* Fix indentation in generate_api.sh

* Clean up WORKSPACE

* Add README for building/testing backend.

Also fix the missing licenses in the generated proto files.

* Add license to files under go_http_client
2019-01-04 17:17:20 -08:00
Pascal Vicaire 633e2ddcc8 Initial commit of the kubeflow/pipeline project. 2018-11-02 14:02:31 -07:00