* Add initial CRD types for Viewer resource, and generate corresponding
code.
* Use controller-runtime to scaffold out a controller main
* Start adding a deployment
* Clean up and separate reconciler logic into its own package for future testing.
* Clean up with comments
* Run dep ensure
* Update auto-generation script; only deepcopy funcs are needed for the Viewer CRD types
* Clean up previously generated but unused Viewer client code
* [WIP] Adding tests
* More tests
* Completed unit tests for reconciler with logic for max viewers
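The max-viewers logic above can be sketched as an oldest-first eviction over the existing Viewer objects. This is only an illustration under assumed types — `viewer` and `trimToMax` are hypothetical names, not the reconciler's actual API:

```go
package main

import "fmt"

// viewer is a simplified stand-in for the Viewer CRD object; the slice is
// assumed to be ordered oldest-first (hypothetical shape, for illustration).
type viewer struct {
	Name string
}

// trimToMax splits viewers into those to keep and those to evict so that at
// most max remain, evicting the oldest first. A sketch of the reconciler's
// max-viewers logic, not the actual implementation.
func trimToMax(ordered []viewer, max int) (keep, evict []viewer) {
	if len(ordered) <= max {
		return ordered, nil
	}
	cut := len(ordered) - max
	return ordered[cut:], ordered[:cut]
}

func main() {
	keep, evict := trimToMax([]viewer{{"old"}, {"mid"}, {"new"}}, 2)
	fmt.Println("keep:", keep, "evict:", evict) // the oldest viewer is evicted
}
```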
* Add CRD definition, sample instance and update README.
* Fix merge conflict
* Fix README typo for kube and add direct port-forwarding instructions.
* Add tests for when persistent volume is used with Tensorboard viewer.
Also add a sample YAML to show how to mount and use a GCE persistent
disk in the viewer CRD.
* Remove vendor directory
* Use Bazel to build the entire backend.
This also uses Bazel to generate code from the API definition in the
proto files.
The Makefile is replaced with a script that uses Bazel to first generate
the code, and then copies it back into the source tree.
Most of the BUILD files were generated automatically using Gazelle.
* Fix indentation in generate_api.sh
* Clean up WORKSPACE
* Add README for building/testing backend.
Also fix the missing licenses in the generated proto files.
* Add license to files under go_http_client
* Make all ListXXX operations use POST instead of GET.
Generate new swagger definitions and use these to generate the frontend
APIs using `npm run apis`.
This is to support filtering in List requests, as the current
grpc-gateway swagger generator tool does not support repeated fields in
requests used in GET endpoints.
* Use base64-encoded JSON-stringified version of Filter instead.
This lets us keep filter as a simple parameter in the ListXXX requests,
and gets around having to use POST for List requests.
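The round trip described above can be sketched in a few lines: JSON-stringify the filter, base64-encode it with `StdEncoding` so it survives as a plain query parameter, and reverse both steps on the server. The `filter` struct and function names here are simplified stand-ins for the real proto message, not the repo's actual types:

```go
package main

import (
	"encoding/base64"
	"encoding/json"
	"fmt"
)

// filter is a simplified stand-in for the api.Filter proto message
// (hypothetical field, for illustration only).
type filter struct {
	Predicates []string `json:"predicates"`
}

// encodeFilter JSON-stringifies the filter and base64-encodes it with
// StdEncoding so it can travel as a simple query parameter.
func encodeFilter(f filter) (string, error) {
	b, err := json.Marshal(f)
	if err != nil {
		return "", err
	}
	return base64.StdEncoding.EncodeToString(b), nil
}

// decodeFilter reverses the encoding on the server side.
func decodeFilter(s string) (filter, error) {
	var f filter
	b, err := base64.StdEncoding.DecodeString(s)
	if err != nil {
		return f, err
	}
	err = json.Unmarshal(b, &f)
	return f, err
}

func main() {
	enc, _ := encodeFilter(filter{Predicates: []string{"name=foo"}})
	dec, _ := decodeFilter(enc)
	fmt.Println(enc, dec.Predicates)
}
```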
* refactor filter parsing to parseAPIFilter and add tests
* Hack to ensure correct Swagger definitions are generated for Filter.
* Fix merge conflicts with master after rebase
* fix indentation
* Fix hack so frontend APIs compile.
* print failing experiments
* try print again.
* revert experiment_api_test
* Use StdEncoding for base64 encoding
* Fix nil pointer dereference caused by err variable shadowing
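The shadowing bug fixed above is a common Go pitfall: using `:=` inside an inner block declares a new `err`, so the outer `err` stays nil and error handling is silently skipped. A minimal reproduction and fix (the `doWork` helper is hypothetical):

```go
package main

import (
	"errors"
	"fmt"
)

// doWork stands in for any call that can fail (hypothetical helper).
func doWork() (string, error) { return "", errors.New("boom") }

// shadowed reproduces the bug: := inside the if creates a NEW err,
// so the outer err never sees the failure.
func shadowed() error {
	var err error
	if true {
		s, err := doWork() // BUG: shadows the outer err
		_ = s
		_ = err
	}
	return err // still nil, so callers proceed as if doWork succeeded
}

// fixed assigns to the existing variables instead of redeclaring them.
func fixed() error {
	var s string
	var err error
	if true {
		s, err = doWork() // plain assignment, not redeclaration
		_ = s
	}
	return err // now carries the failure
}

func main() {
	fmt.Println("shadowed err:", shadowed(), "fixed err:", fixed())
}
```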
* add vendor to gitignore
* switch to go module
* prune go mod
* turn on go mod for test
* enable go module in docker image
* fix images
* debug
* update image
* skip integration tests when unit test flag is set to true
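The gating described here can be sketched with a test flag: when the unit-test flag is set, integration tests are skipped. The flag name `unitTestOnly` is hypothetical; the repo may use a different one:

```go
package main

import (
	"flag"
	"fmt"
)

// unitTestOnly is a hypothetical flag name standing in for the repo's
// unit-test flag; when true, integration tests are skipped.
var unitTestOnly = flag.Bool("unitTestOnly", false,
	"run only unit tests and skip integration tests")

// skipIntegration reports whether integration tests should be skipped.
func skipIntegration() bool {
	return *unitTestOnly
}

func main() {
	flag.Parse()
	if skipIntegration() {
		fmt.Println("skipping integration tests")
		return
	}
	fmt.Println("running integration tests")
}
```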
* wip
* add StorageState enum to proto
* add StorageState to model
* archive proto/model changes
* wip archive endpoint
* wip adding tests
* archive test
* unarchive proto and implementation
* cleanup
* make storage state required, with a default value
* remove unspecified value from storage state enum
* pr comments
* fix archive/unarchive endpoints, add api integration test
* typo
* WIP: Add filter package with tests.
* Add tests for IN predicate.
* Add listing functions
* Try updating list experiments
* Cleanup and finalize list API.
Add tests for list package, and let ExperimentStore use this new API.
Update tests for the latter as well.
* Add comments; rename BuildSQL to AddToSelect for flexibility
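The idea behind adding filter predicates to a SELECT can be sketched with plain strings (the real code uses a query builder; `predicate` and `addToSelect` here are simplified, hypothetical shapes):

```go
package main

import (
	"fmt"
	"strings"
)

// predicate is a simplified filter predicate (hypothetical shape).
type predicate struct {
	Key, Op string
	Value   interface{}
}

// addToSelect appends a WHERE clause per predicate and collects the args to
// bind, in the spirit of the filter package's AddToSelect. A sketch only,
// not the actual implementation.
func addToSelect(base string, preds []predicate) (string, []interface{}) {
	if len(preds) == 0 {
		return base, nil
	}
	clauses := make([]string, 0, len(preds))
	args := make([]interface{}, 0, len(preds))
	for _, p := range preds {
		clauses = append(clauses, fmt.Sprintf("%s %s ?", p.Key, p.Op))
		args = append(args, p.Value)
	}
	return base + " WHERE " + strings.Join(clauses, " AND "), args
}

func main() {
	sql, args := addToSelect("SELECT * FROM experiments",
		[]predicate{{"name", "=", "exp1"}, {"id", ">", 5}})
	fmt.Println(sql, args)
}
```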
* Run dep ensure
* Add filter proto to all other resources
* Add filtering for pipeline server
* Add filtering for job server
* Add filtering for run server
* Try to fix integration tests
This change pins the versions of the libraries that were used to
generate the proto definitions using dep. The Makefile is then modified
so that the tool and library versions used to build the proto generated
files are from the vendor directory. This is a hacky, short-term
solution to ensure a reproducible build while we work on switching to
Bazel.
The versions in the Gopkg.toml file were chosen based on my experiments
that generated proto files that did not change from what is already
checked in.
The code generator should not be run from HEAD, as it will generate code
that diverges from the pinned version of client-go, and also any
previously generated CRD controller clients.
This change pins both code generator and client-go to the specified
kubernetes release, and ensures the update-codegen.sh script uses the
code-generator specified in the vendor directory rather than HEAD. This
ensures the build is always reproducible.
* The pipeline function now takes direct default values rather than dsl.PipelineParam. This simplifies the sample code a lot.
* Remove extraneous parenthesis.
* Follow up on CR comments.
* Change Dockerfile (not done).
* Fix dockerfile.
* Fix Dockerfile again.
* Remove unneeded installation of packages in Dockerfile.
* First integration test for the ML Pipeline CLI (Pipeline List).
* Fixing an issue with an undefined variable
* Adding the --debug flag to help with debugging.
* Changing the namespace to Kubeflow.