* Added multi-user pipelines backend
* corrected typo
* updating code based on review
* fixes for pipelines server
* reverting this back
* removing unnecessary info logging
* chore(backend): tidy go.mod and update tools.go
* go install, instead of go get
* fix problems reported by go vet
* simplify some ide reported redundant syntax
* license is not required for generated code
* remove licenses for generated code
* cleanup
* remove license more
* rm unused BUILD.bazel files
* fixed generate_api.sh
* reimport error.proto
* initial work on exposing the default version of pipeline
* update description
* added missing files
* updated api build, unsure if this is correct ...
* updated after feedback
* clean up
* remove empty line
* started to make the integration test
* added integration test
* fixed build and feedback
* updated the tests
* Updated the test
* new test
* typo
* updated the pipeline default
* updated the pipeline version
* formatting
* error in comparison
* list experiment desc
* changes should be made in proto
* add comments and descriptions
* comments/descriptions in run.proto
* comments in job.proto and pipeline.proto
* try starting a new line
* newline doesn't help
* add swagger gen'ed file
* address comments
* regenerate json and client via swagger
* address comments
* regenerate go_http_client and swagger from proto
* two periods
* re-generate
* Open the version API in the BE for a later FE PR to use, including
auto-generated BE and FE code.
* format FE
* re-generate
* remove an unnecessary auto-generated file
* format
* add version api
* unit tests
* remove debug fmt
* remove unused func
* remove another unused method
* formatting
* remove unused consts
* some comments
* build
* unit tests
* unit tests
* formatting
* unit tests
* run from pipeline version
* pipeline version as resource type
* run store and resource reference store
* formatting and removing debug traces
* run server test
* job created from pipeline version
* variable names
* address comments
* Get pipeline version template is used on the pipeline details page, which fetches the pipeline version file.
* a temp revert
* address comment
* address comment
* add comment
* get pipeline version template
* verify pipeline version in resource reference
* add unit test for create run from pipeline version
* unit test for create job from pipeline version
* remove some comments
* reformat
* reformat again
* Remove an unrelated change
* change method name
* Add necessary data types/tables for pipeline version. Mostly based
on Yang's branch at https://github.com/IronPan/pipelines/tree/kfpci/.
Backward compatible.
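The new data types described above might look roughly like the following Go sketch. These struct and field names are illustrative assumptions, not the actual checked-in model definitions; the key point is that the pipeline table only gains a nullable default-version reference, which is what keeps existing rows valid:

```go
package main

import "fmt"

// PipelineVersion is an illustrative sketch of the new versions table.
type PipelineVersion struct {
	UUID           string
	Name           string
	PipelineId     string // foreign key back to the owning pipeline
	PipelineSpec   string // the spec now lives on the version
	CodeSourceUrl  string
	CreatedAtInSec int64
}

// Pipeline keeps its old columns and only gains a pointer to its
// default version; a nil value is tolerated in code, which is what
// makes the change backward compatible with pre-migration rows.
type Pipeline struct {
	UUID             string
	Name             string
	DefaultVersionId *string // nullable until backfilled
}

func main() {
	p := Pipeline{UUID: "p1", Name: "demo"}
	if p.DefaultVersionId == nil {
		fmt.Println("no default version yet; fall back to pipeline spec")
	}
}
```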
* Modified comment
* Modify api converter in accordance with the new pipeline (version) definition
* Change pipeline_store for DefaultVersionId field
* Add pipeline spec to pipeline version
* fix model converter
* fix a comment
* Add foreign key, pagination of list request, refactor code source
* Refactor code source
* Foreign key
* Change code source and package source type
* Fix ; separator
* Add versions table and modify existing pipeline apis
* Remove api pipeline definition change and leave it for later PR
* Add comment
* Make schema changing and data backfilling a single transaction
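A minimal sketch of running the schema change and the backfill through one transaction handle. The table/column names and helper are assumptions for illustration (and note that on MySQL a DDL statement causes an implicit commit, so in practice the transaction mainly protects the data backfill); the `execer` interface stands in for `*sql.Tx` so the flow can be exercised without a live database:

```go
package main

import "fmt"

// execer abstracts *sql.Tx (which actually returns (sql.Result, error)
// from Exec); simplified here so the migration can run against a fake.
type execer interface {
	Exec(query string, args ...any) error
}

// migratePipelineVersions runs the schema change and the raw backfill
// query through the same transaction handle, so a failed backfill can
// roll the whole migration back. Statements are illustrative.
func migratePipelineVersions(tx execer) error {
	stmts := []string{
		`ALTER TABLE pipelines ADD COLUMN DefaultVersionId varchar(255)`,
		// Raw backfill: seed each pipeline's default version so
		// existing rows stay valid (backward compatible).
		`UPDATE pipelines SET DefaultVersionId = UUID WHERE DefaultVersionId IS NULL`,
	}
	for _, s := range stmts {
		if err := tx.Exec(s); err != nil {
			return fmt.Errorf("migration failed, rolling back: %w", err)
		}
	}
	return nil
}

// fakeTx records executed statements, standing in for a real *sql.Tx.
type fakeTx struct{ queries []string }

func (f *fakeTx) Exec(q string, args ...any) error {
	f.queries = append(f.queries, q)
	return nil
}

func main() {
	tx := &fakeTx{}
	if err := migratePipelineVersions(tx); err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println(len(tx.queries), "statements ran on one transaction handle")
}
```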
* Tolerate null default version id in code
* fix status
* Revise delete pipeline func
* Use raw query to migrate data
* No need to update versions status
* rename and minor changes
* accidentally removed a where clause
* Fix a model name prefix
* Refine comments
* Revise if condition
* Address comments
* address more comments
* Rearrange pipeline and version related parts inside CreatePipeline, to make them more separate.
* Add package url to pipeline version. Required when calling CreatePipelineVersionRequest
* Single code source url; remove pipeline id as sorting field; reformat
* resolve remote branch and local branch diff
* remove unused func
* Remove an empty line
* Fix API package names and regenerate checked-in proto files. Also bump version of GRPC gateway used.
* Fix BUILD.bazel file for api as well.
* Update Bazel version
* add count to protos and libs
* close db rows before second query
* count -> total_size
* int32 -> int
* move scan count row to util
* add comments
* add logs when transactions fail
* dedup from and where clauses
* simplify job count query
* job count queries
* run count queries
* add job_store total size test
* added tests for list util
* pr comments
* list_utils -> list
* fix clients and fake clients to support TotalSize
* added TotalSize checks in api integration tests
* Use Bazel to build the entire backend.
This also uses Bazel to generate code from the API definition in the
proto files.
The Makefile is replaced with a script that uses Bazel to first generate
the code, and then copy it back into the source tree.
Most of the BUILD files were generated automatically using Gazelle.
* Fix indentation in generate_api.sh
* Clean up WORKSPACE
* Add README for building/testing backend.
Also fix the missing licenses in the generated proto files.
* Add license to files under go_http_client
* Make all ListXXX operations use POST instead of GET.
Generate new swagger definitions and use these to generate the frontend
APIs using `npm run apis`.
This is to support filtering in List requests, as the current
grpc-gateway swagger generator tool does not support repeated fields in
requests used in GET endpoints.
* Use base64-encoded JSON-stringified version of Filter instead.
This lets us keep filter as a simple parameter in the ListXXX requests,
and gets around having to use POST for List requests.
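The encoding described above can be sketched in Go. The `Filter` shape here is a stand-in for the proto-generated message, not the real API surface; the point is the JSON-stringify-then-base64 (`StdEncoding`) round trip that lets the filter travel as one ordinary query parameter:

```go
package main

import (
	"encoding/base64"
	"encoding/json"
	"fmt"
)

// Filter is a stand-in for the proto-generated filter message;
// field names here are illustrative.
type Filter struct {
	Predicates []Predicate `json:"predicates"`
}

type Predicate struct {
	Key   string `json:"key"`
	Op    string `json:"op"`
	Value string `json:"value"`
}

// encodeFilter JSON-stringifies the filter and base64-encodes it with
// StdEncoding, so it fits in a single query parameter of a GET ListXXX
// request instead of forcing the endpoint to use POST.
func encodeFilter(f Filter) (string, error) {
	b, err := json.Marshal(f)
	if err != nil {
		return "", err
	}
	return base64.StdEncoding.EncodeToString(b), nil
}

// decodeFilter reverses encodeFilter on the server side.
func decodeFilter(s string) (Filter, error) {
	var f Filter
	b, err := base64.StdEncoding.DecodeString(s)
	if err != nil {
		return f, err
	}
	err = json.Unmarshal(b, &f)
	return f, err
}

func main() {
	f := Filter{Predicates: []Predicate{{Key: "name", Op: "EQUALS", Value: "my-run"}}}
	enc, _ := encodeFilter(f)
	dec, _ := decodeFilter(enc)
	fmt.Println(enc)
	fmt.Println(dec.Predicates[0].Value)
}
```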
* refactor filter parsing to parseAPIFilter and add tests
* Hack to ensure correct Swagger definitions are generated for Filter.
* Fix merge conflicts with master after rebase
* fix indentation
* Fix hack so frontend apis compile.
* print failing experiments
* try print again.
* revert experiment_api_test
* Use StdEncoding for base64 encoding
* Fix nil pointer dereference error caused by err variable shadowing
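The shadowing bug fixed above is a classic Go pitfall worth a minimal illustration (the names below are invented for the example, not from the actual fix): `:=` inside a block declares a new `err`, so the outer `err` stays nil and a later nil-check silently passes:

```go
package main

import (
	"errors"
	"fmt"
)

var errNotFound = errors.New("not found")

// lookup is a stand-in for any call that can fail.
func lookup(ok bool) (string, error) {
	if ok {
		return "value", nil
	}
	return "", errNotFound
}

// shadowed shows the bug: `:=` in the if-statement declares a NEW err
// that shadows the outer one, so the failure never propagates and the
// caller goes on to dereference a value that was never populated.
func shadowed(ok bool) error {
	var err error
	if v, err := lookup(ok); err == nil {
		_ = v
	}
	// BUG: this is the outer err, which is always nil here.
	return err
}

// fixed assigns to the outer err with `=`, so failures propagate.
func fixed(ok bool) error {
	var err error
	var v string
	v, err = lookup(ok)
	_ = v
	return err
}

func main() {
	fmt.Println(shadowed(false)) // <nil> — the failure is lost
	fmt.Println(fixed(false))    // not found
}
```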
* WIP: Add filter package with tests.
* Add tests for IN predicate.
* Add listing functions
* Try updating list experiments
* Cleanup and finalize list API.
Add tests for list package, and let ExperimentStore use this new API.
Update tests for the latter as well.
* Add comments. BuildSQL -> AddToSelect for flexibility
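The `BuildSQL -> AddToSelect` rename reflects the design choice that the filter augments a SELECT the caller already started rather than owning the whole query. A simplified sketch of that shape (the real filter package works against a query builder, not raw strings, and these types are illustrative):

```go
package main

import (
	"fmt"
	"strings"
)

// predicate and filter are simplified stand-ins for the real filter
// package types.
type predicate struct {
	column, op string
	value      any
}

type filter struct{ preds []predicate }

// AddToSelect appends this filter's conditions to a base SELECT,
// returning the SQL and the positional args. Taking the caller's query
// as input keeps the filter composable with pagination and sorting
// clauses added elsewhere.
func (f filter) AddToSelect(base string) (string, []any) {
	if len(f.preds) == 0 {
		return base, nil
	}
	var conds []string
	var args []any
	for _, p := range f.preds {
		conds = append(conds, fmt.Sprintf("%s %s ?", p.column, p.op))
		args = append(args, p.value)
	}
	return base + " WHERE " + strings.Join(conds, " AND "), args
}

func main() {
	f := filter{preds: []predicate{{"Name", "=", "exp1"}, {"Status", "=", "READY"}}}
	sql, args := f.AddToSelect("SELECT * FROM experiments")
	fmt.Println(sql)
	fmt.Println(args)
}
```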
* Run dep ensure
* Add filter proto to all other resources
* Add filtering for pipeline server
* Add filtering for job server
* Add filtering for run server
* Try to fix integration tests