Compare commits

...

184 Commits

Author SHA1 Message Date
Manaswini Das 3bfb7cdef7
fix: remove scrollbars from model cards and sort versions by date (#1498)
Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>
2025-08-19 15:41:05 +00:00
Matteo Mortari 4949f66e95
feat: add securityContext to async-upload Job sample (#1472)
Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-08-19 13:24:05 +00:00
dependabot[bot] b293d53867
build(deps-dev): bump schemathesis from 4.0.25 to 4.1.0 in /clients/python (#1491)
Bumps [schemathesis](https://github.com/schemathesis/schemathesis) from 4.0.25 to 4.1.0.
- [Release notes](https://github.com/schemathesis/schemathesis/releases)
- [Changelog](https://github.com/schemathesis/schemathesis/blob/master/CHANGELOG.md)
- [Commits](https://github.com/schemathesis/schemathesis/compare/v4.0.25...v4.1.0)

---
updated-dependencies:
- dependency-name: schemathesis
  dependency-version: 4.1.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-19 12:18:04 +00:00
dependabot[bot] a420f612fd
build(deps): bump huggingface-hub from 0.33.5 to 0.34.4 in /jobs/async-upload (#1497)
Bumps [huggingface-hub](https://github.com/huggingface/huggingface_hub) from 0.33.5 to 0.34.4.
- [Release notes](https://github.com/huggingface/huggingface_hub/releases)
- [Commits](https://github.com/huggingface/huggingface_hub/compare/v0.33.5...v0.34.4)

---
updated-dependencies:
- dependency-name: huggingface-hub
  dependency-version: 0.34.4
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-19 07:50:04 +00:00
dependabot[bot] 07d7ca9c19
build(deps): bump huggingface-hub from 0.34.3 to 0.34.4 in /clients/python (#1493)
Bumps [huggingface-hub](https://github.com/huggingface/huggingface_hub) from 0.34.3 to 0.34.4.
- [Release notes](https://github.com/huggingface/huggingface_hub/releases)
- [Commits](https://github.com/huggingface/huggingface_hub/compare/v0.34.3...v0.34.4)

---
updated-dependencies:
- dependency-name: huggingface-hub
  dependency-version: 0.34.4
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-19 07:49:04 +00:00
dependabot[bot] 525fca9bfb
build(deps-dev): bump coverage from 7.10.3 to 7.10.4 in /clients/python (#1492)
Bumps [coverage](https://github.com/nedbat/coveragepy) from 7.10.3 to 7.10.4.
- [Release notes](https://github.com/nedbat/coveragepy/releases)
- [Changelog](https://github.com/nedbat/coveragepy/blob/master/CHANGES.rst)
- [Commits](https://github.com/nedbat/coveragepy/compare/7.10.3...7.10.4)

---
updated-dependencies:
- dependency-name: coverage
  dependency-version: 7.10.4
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-19 07:14:03 +00:00
dependabot[bot] e4e7e03afc
build(deps): bump boto3 from 1.40.6 to 1.40.11 in /clients/python (#1490)
Bumps [boto3](https://github.com/boto/boto3) from 1.40.6 to 1.40.11.
- [Release notes](https://github.com/boto/boto3/releases)
- [Commits](https://github.com/boto/boto3/compare/1.40.6...1.40.11)

---
updated-dependencies:
- dependency-name: boto3
  dependency-version: 1.40.11
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-19 07:13:03 +00:00
dependabot[bot] dca71b4c82
build(deps-dev): bump ruff from 0.12.8 to 0.12.9 in /clients/python (#1488)
Bumps [ruff](https://github.com/astral-sh/ruff) from 0.12.8 to 0.12.9.
- [Release notes](https://github.com/astral-sh/ruff/releases)
- [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md)
- [Commits](https://github.com/astral-sh/ruff/compare/0.12.8...0.12.9)

---
updated-dependencies:
- dependency-name: ruff
  dependency-version: 0.12.9
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-19 07:09:04 +00:00
dependabot[bot] 3b4d11b9bb
build(deps): bump actions/checkout from 4 to 5 (#1489)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4 to 5.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v4...v5)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: '5'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-18 21:22:04 +00:00
dependabot[bot] 99a27c2377
build(deps): bump tmp from 0.2.3 to 0.2.4 in /clients/ui/frontend (#1432)
Bumps [tmp](https://github.com/raszi/node-tmp) from 0.2.3 to 0.2.4.
- [Changelog](https://github.com/raszi/node-tmp/blob/master/CHANGELOG.md)
- [Commits](https://github.com/raszi/node-tmp/compare/v0.2.3...v0.2.4)

---
updated-dependencies:
- dependency-name: tmp
  dependency-version: 0.2.4
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-18 21:21:03 +00:00
dependabot[bot] 6eaeb71afa
build(deps): bump github.com/brianvoe/gofakeit/v7 from 7.1.2 to 7.3.0 in /clients/ui/bff (#1421)
Bumps [github.com/brianvoe/gofakeit/v7](https://github.com/brianvoe/gofakeit) from 7.1.2 to 7.3.0.
- [Release notes](https://github.com/brianvoe/gofakeit/releases)
- [Commits](https://github.com/brianvoe/gofakeit/compare/v7.1.2...v7.3.0)

---
updated-dependencies:
- dependency-name: github.com/brianvoe/gofakeit/v7
  dependency-version: 7.3.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-18 21:20:04 +00:00
dependabot[bot] 1674840dc1
build(deps): bump k8s.io/client-go from 0.33.3 to 0.33.4 in /clients/ui/bff (#1484)
Bumps [k8s.io/client-go](https://github.com/kubernetes/client-go) from 0.33.3 to 0.33.4.
- [Changelog](https://github.com/kubernetes/client-go/blob/master/CHANGELOG.md)
- [Commits](https://github.com/kubernetes/client-go/compare/v0.33.3...v0.33.4)

---
updated-dependencies:
- dependency-name: k8s.io/client-go
  dependency-version: 0.33.4
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-18 21:19:04 +00:00
dependabot[bot] 9900fdc69a
build(deps): bump k8s.io/client-go from 0.33.3 to 0.33.4 (#1481)
Bumps [k8s.io/client-go](https://github.com/kubernetes/client-go) from 0.33.3 to 0.33.4.
- [Changelog](https://github.com/kubernetes/client-go/blob/master/CHANGELOG.md)
- [Commits](https://github.com/kubernetes/client-go/compare/v0.33.3...v0.33.4)

---
updated-dependencies:
- dependency-name: k8s.io/client-go
  dependency-version: 0.33.4
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-18 21:18:04 +00:00
dependabot[bot] 6a255e8e35
build(deps): bump esbuild and @module-federation/enhanced in /clients/ui/frontend (#1335)
Bumps [esbuild](https://github.com/evanw/esbuild) to 0.25.5 and updates ancestor dependency [@module-federation/enhanced](https://github.com/module-federation/core/tree/HEAD/packages/enhanced). These dependencies need to be updated together.


Updates `esbuild` from 0.17.19 to 0.25.5
- [Release notes](https://github.com/evanw/esbuild/releases)
- [Changelog](https://github.com/evanw/esbuild/blob/main/CHANGELOG-2023.md)
- [Commits](https://github.com/evanw/esbuild/compare/v0.17.19...v0.25.5)

Updates `@module-federation/enhanced` from 0.13.1 to 0.17.0
- [Release notes](https://github.com/module-federation/core/releases)
- [Changelog](https://github.com/module-federation/core/blob/main/packages/enhanced/CHANGELOG.md)
- [Commits](https://github.com/module-federation/core/commits/v0.17.0/packages/enhanced)

---
updated-dependencies:
- dependency-name: esbuild
  dependency-version: 0.25.5
  dependency-type: indirect
- dependency-name: "@module-federation/enhanced"
  dependency-version: 0.17.0
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-18 20:13:03 +00:00
Matteo Mortari 4248ee077a
chore: bump MR py version to 0.3.0 (#1463)
see https://github.com/kubeflow/model-registry/pull/1267#pullrequestreview-2985779111

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-08-18 14:30:03 +00:00
Matteo Mortari 772cb25693
test: refresh async-upload Job integration testing (#1474)
* test: refresh integration testing

- remove unused parameter name f/up #1375
- describe applied

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

* Show async-upload Job ci/GHA failing

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

* use correct image

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

---------

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-08-18 14:29:05 +00:00
Mike Turley cb567bae15
[Rebase of abandoned #1396] changes to the version details page format (#1480)
fix lint tests



address comments



address comment



versions cards rebase

Signed-off-by: Taj010 <arsyed@redhat.com>
Signed-off-by: Mike Turley <mike.turley@alum.cs.umass.edu>
Co-authored-by: Taj010 <arsyed@redhat.com>
2025-08-18 14:00:05 +00:00
Matteo Mortari 390d4835cd
docs: update RELEASE.md for kustomize catalog image too (#1469)
Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-08-18 10:07:04 +00:00
Dhiraj Bokde b1cedab627
fix: add image pull policy always to make sure docker compose up uses the latest image, fixes 1478 (#1479)
Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>
2025-08-18 05:44:03 +00:00
Matteo Mortari 7499971950
ci: followup #1458 for build-and-push-image defaults (#1473)
Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-08-14 14:04:16 +00:00
Alessio Pragliola 228b62d77e
chore: update go.work.sum file (#1471)
Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
2025-08-14 12:31:14 +00:00
dependabot[bot] a30065bc2c
build(deps): bump google.golang.org/protobuf from 1.36.6 to 1.36.7 (#1447)
Bumps google.golang.org/protobuf from 1.36.6 to 1.36.7.

---
updated-dependencies:
- dependency-name: google.golang.org/protobuf
  dependency-version: 1.36.7
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-14 10:54:13 +00:00
Alessio Pragliola 34550a5f07
fix: possible int overflow (#1470)
Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
2025-08-14 10:43:13 +00:00
Matteo Mortari 41b53ae73d
ci: clamp GHA with base permissions: block (#1458)
Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-08-14 10:20:13 +00:00
Chris Hambridge dfc1a224de
ci: Run fuzz as a workflow_dispatch (#1459)
* Provide simple workflow dispatch mechanism to avoid GHA vulnerabilities.

Signed-off-by: Chris Hambridge <chambrid@redhat.com>
2025-08-14 08:25:14 +00:00
Robert Sun e4dd95ec87
fixed filtering in model registry views (#1451)
* fixed filtering in model registry views

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* extracted utils function

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* moved filter toolbar to shared folder

Signed-off-by: rsun19 <robertssun1234@gmail.com>

---------

Signed-off-by: rsun19 <robertssun1234@gmail.com>
2025-08-13 20:08:12 +00:00
Matteo Mortari eb3bd81c4f
chore: add mturley to module clients/ui (#1462)
@mturley is providing extensive contributions to the UI efforts
in terms of code, architecture, and coordination with the UI
team

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-08-13 18:52:13 +00:00
Alessio Pragliola 266023f9dc
feat: mlmd removal from codebase (#1267)
* feat: remove most of MLMD references

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: post rebase

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: add function from mlmd deleted file

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* chore: removed pending mlmd references

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: remove unused function

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* chore: remove unused files

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

---------

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
2025-08-13 13:31:14 +00:00
Manaswini Das dd26f5dc51
Structure MR BFF server to use secure HTTP calls (#1438)
* Structure MR BFF server to use secure HTTP calls

Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>

* add configurable TLS verification for MR HTTP client

Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>

* Add instructions to disable TLS for local

Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>

---------

Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>
2025-08-13 11:57:13 +00:00
Matteo Mortari 29ab55cfef
ci: remove GHAs from #1454 (#1457)
as it's blocking merge on the Approval label,
which does skip the optional Fuzz test

Removing it, as we need another approach anyway

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-08-13 08:41:12 +00:00
Chris Hambridge 9573e69c0a
ci: Update the GHA workflow to utilize labels instead of comments (#1454)
* ci: Update the GHA workflow to utilize labels instead of comments

* Use label "run-fuzz-test" to trigger GHA flow

Signed-off-by: Chris Hambridge <chambrid@redhat.com>

* ci: fix the image version variable to work with the deploy_on_kind expectation

Signed-off-by: Chris Hambridge <chambrid@redhat.com>

---------

Signed-off-by: Chris Hambridge <chambrid@redhat.com>
2025-08-12 19:26:12 +00:00
Yulia Krimerman 7c50427c02
Version selector improvements (#1443)
Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>
2025-08-12 15:14:11 +00:00
Dhiraj Bokde 655a9d5eee
Add support for Experiment tracking in Model Registry, fixes #1224 (#1318)
* feat: initial version of experiments and runs API

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* feat: experiments and runs initial implementation (wip)

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: fixed failing unit tests for experiments and runs

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: added experiment and experimentrun tests

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* feat: added DataSet, Metric, and Parameter types

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* feat: added implementation of DataSet, Metric, and Param, including service tests

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: replace int properties for timestamps with string because mlmd type properties only support int32, not int64

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* feat: add support for artifactType query param to filter artifact types in artifact queries

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: add metrics history endpoint and metric history storage for experiment run metrics

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: fix artifactType query param type in generated service

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: fix go lint error in unit test

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: filter out metric history from artifacts endpoints

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: fix metric history name to use last update time to avoid name conflicts

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* feat: add filterQuery param on all context types to search by properties and custom properties

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* feat: initial version of experiment tracking implemented on embedmd, rebased on main

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* feat: add support for filterQuery parameter for all ListResponse endpoints for embedmd datastore

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: add support for stepIds query parameter in embedmd datastore

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* feat: refactor embedmd db service to use generic repository implementation to reduce code duplication

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: add support for artifactType query parameter for embedmd datastore

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: use mysql 8.3 in unit tests

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: refactor name mapping and default name handling in embedmd datastore

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* feat: support updating metrics and parameters by name, fix ignoring metric history when retrieving all artifacts for runs and versions

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: add missing generated openapi python client files for PR github action check

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: fix failing shared db tests

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: add support for metric and parameter description, add missing type property migration

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* chore: update files from main

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: added missing godoc comments in pkg/api/api.go

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: replace ambiguous ArtifactListResponse return type from GetExperimentRunMetricHistory with MetricListResponse

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: fixed incorrect artifactType in dataset response, added tests to verify all artifact types

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* feat: add validation for endTimeSinceEpoch property on experiment run updates

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* Replace value type validation map with a switch in query_translator.go

Co-authored-by: Paul Boyd <paul@camelot.email>
Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: add service e2e tests for filterQuery, fix name query param handling, fix DB tests that didn't use parent id prefix

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* chore: code cleanup, replace interface{} with any, added vetting for internal/db/filter

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* chore: added flag vF for fixed string grep exclude

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: copied orderby and parameters back to registry and catalog to have different values

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: fixed mlmd query translator handling of escaped backslashes

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* chore: add test to verify parseCustomPropertyField won't panic with a property name ending in dot

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: sync generated python client code

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: readiness probe tests and new types

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* chore: refactor readiness_test

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: ensure parentResourceId is used to filter resource lookup by params, add unit tests for duplicate child resource lookups

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: throw an error if a metric value is missing, add test to validate

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: fix http status error code for invalid ids

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: more id validation, fixed filterQuery passing to DB layer

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: fix failing unit test

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: validate experiment id when listing runs

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: fix failing validation test after fixing http status codes

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: avoid duplicate key errors if externalid is set in metric when creating metric history entries

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: add fuzzer tests for experiment runs and new artifact types

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* chore: code cleanup and format fuzzer tests

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: log error in fuzzer test

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

* fix: handle null artifact names correctly on create

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>

---------

Signed-off-by: Dhiraj Bokde <dhirajsb@users.noreply.github.com>
Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
Signed-off-by: Alessio Pragliola <83355398+Al-Pragliola@users.noreply.github.com>
Co-authored-by: Alessio Pragliola <seth.pro@gmail.com>
Co-authored-by: Alessio Pragliola <83355398+Al-Pragliola@users.noreply.github.com>
Co-authored-by: Paul Boyd <paul@camelot.email>
2025-08-12 09:26:11 +00:00
Adysen Rothman 60ab78af95
add filter support to model catalog (#1436)
* add filter support to model catalog

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* add filter support to yaml

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* update tests

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* clearer model verbiage & logging

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* rm sample rhec catalog --> example in readme

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

---------

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>
2025-08-12 09:11:12 +00:00
dependabot[bot] 36e658f17a
build(deps-dev): bump schemathesis from 4.0.21 to 4.0.25 in /clients/python (#1450)
Bumps [schemathesis](https://github.com/schemathesis/schemathesis) from 4.0.21 to 4.0.25.
- [Release notes](https://github.com/schemathesis/schemathesis/releases)
- [Changelog](https://github.com/schemathesis/schemathesis/blob/master/CHANGELOG.md)
- [Commits](https://github.com/schemathesis/schemathesis/compare/v4.0.21...v4.0.25)

---
updated-dependencies:
- dependency-name: schemathesis
  dependency-version: 4.0.25
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-12 09:07:11 +00:00
Chris Hambridge fb8d493f12
ci: Enable execution of make test-fuzz via pull request comment (#1435)
* ci: Enable execution of make test-fuzz via pull request comment

* Provide simplified mechanism for executing `make test-fuzz` for specific pull requests

Signed-off-by: Chris Hambridge <chambrid@redhat.com>

* ci: Fix comment check and update status checks section.

Signed-off-by: Chris Hambridge <chambrid@redhat.com>

* Update .github/workflows/test-fuzz.yml

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

---------

Signed-off-by: Chris Hambridge <chambrid@redhat.com>
Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
Co-authored-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-08-12 09:05:12 +00:00
dependabot[bot] 9be856091f
build(deps-dev): bump ruff from 0.12.7 to 0.12.8 in /clients/python (#1449)
Bumps [ruff](https://github.com/astral-sh/ruff) from 0.12.7 to 0.12.8.
- [Release notes](https://github.com/astral-sh/ruff/releases)
- [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md)
- [Commits](https://github.com/astral-sh/ruff/compare/0.12.7...0.12.8)

---
updated-dependencies:
- dependency-name: ruff
  dependency-version: 0.12.8
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-12 09:03:12 +00:00
dependabot[bot] b9085ccc56
build(deps): bump boto3 from 1.40.3 to 1.40.6 in /clients/python (#1448)
Bumps [boto3](https://github.com/boto/boto3) from 1.40.3 to 1.40.6.
- [Release notes](https://github.com/boto/boto3/releases)
- [Commits](https://github.com/boto/boto3/compare/1.40.3...1.40.6)

---
updated-dependencies:
- dependency-name: boto3
  dependency-version: 1.40.6
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-12 09:02:10 +00:00
dependabot[bot] 1da137c913
build(deps-dev): bump coverage from 7.10.2 to 7.10.3 in /clients/python (#1446)
Bumps [coverage](https://github.com/nedbat/coveragepy) from 7.10.2 to 7.10.3.
- [Release notes](https://github.com/nedbat/coveragepy/releases)
- [Changelog](https://github.com/nedbat/coveragepy/blob/master/CHANGES.rst)
- [Commits](https://github.com/nedbat/coveragepy/compare/7.10.2...7.10.3)

---
updated-dependencies:
- dependency-name: coverage
  dependency-version: 7.10.3
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-12 09:01:12 +00:00
dependabot[bot] 1c9b6ffabc
build(deps-dev): bump types-python-dateutil from 2.9.0.20250708 to 2.9.0.20250809 in /clients/python (#1445)
Bumps [types-python-dateutil](https://github.com/typeshed-internal/stub_uploader) from 2.9.0.20250708 to 2.9.0.20250809.
- [Commits](https://github.com/typeshed-internal/stub_uploader/commits)

---
updated-dependencies:
- dependency-name: types-python-dateutil
  dependency-version: 2.9.0.20250809
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-12 09:00:12 +00:00
Matteo Mortari 58680c990d
chore: bump MR py version to 0.2.23 (#1440)
Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-08-12 08:34:11 +00:00
Juntao Wang 16866854a7
Add model details card to version details page (#1397)
Signed-off-by: Juntao Wang <juntwang@redhat.com>
2025-08-11 10:06:42 +00:00
Pushpa Padti ca4b82ba11
Add latest version column to models list (#1388)
* Add latest version column to models list

Signed-off-by: ppadti <ppadti@redhat.com>

* Fix typo

Signed-off-by: ppadti <ppadti@redhat.com>

* Remove custom css

Signed-off-by: ppadti <ppadti@redhat.com>

* update route

Signed-off-by: ppadti <ppadti@redhat.com>

---------

Signed-off-by: ppadti <ppadti@redhat.com>
2025-08-08 16:43:56 +00:00
Yulia Krimerman 6d7b314a4b
Initial Model Catalog skeleton (#1373)
* initial Model Catalog skeleton

Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>

* signed

Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>

* Hide MC as standalone

Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>

* lint

Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>

* final version

Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>

* added tests

Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>

* removed unit tests, added todo and util

Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>

* updated the tests

Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>

* added removed TODO

Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>

* fix new imports

Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>

---------

Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>
2025-08-08 06:17:56 +00:00
Manaswini Das 0666a7c3dc
Add NamespaceContext (#1376)
* Add ProjectContext

Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>

* Add namespaces and remove project reference

Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>

* Add BFF changes

Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>

* Add unit tests

Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>

* Cleaning up project references

Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>

* Fix linting error

Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>

* Change to displayName

Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>

* Fix Go test

Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>

* Fix Go test

Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>

* Fix library imports and unit tests

Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>

---------

Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>
2025-08-08 06:08:56 +00:00
Arsheen Taj Syed 96b840330f
Model Registry Content Description changes (#1368)
* description and content edits for MR

Signed-off-by: Taj010 <arsyed@redhat.com>

* change registered to created

Signed-off-by: Taj010 <arsyed@redhat.com>

* update in model cards and its test

Signed-off-by: Taj010 <arsyed@redhat.com>

---------

Signed-off-by: Taj010 <arsyed@redhat.com>
Signed-off-by: Arsheen Taj Syed  <87820563+Taj010@users.noreply.github.com>
2025-08-08 06:05:56 +00:00
Robert Sun 77db33d3e1
added versions card to model details (#1392)
* added versions card to model details

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* fixed linting errors

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* added archive handling

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* reverted filter changes

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* addressed comments

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* added tests

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* modularized all versions button

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* moved button to components

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* removed deployments from routes

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* edited shared component

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* addressed comment

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* added underline to truncate

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* revert custom underline

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* fixed import

Signed-off-by: rsun19 <robertssun1234@gmail.com>

---------

Signed-off-by: rsun19 <robertssun1234@gmail.com>
2025-08-08 06:04:56 +00:00
Lucas Fernandez e419bba715
Refactor codebase to adapt new mod arch libraries (#1428)
Signed-off-by: lucferbux <lferrnan@redhat.com>
2025-08-07 18:05:55 +00:00
Eric Dobroveanu 348ec21a63
Add HuggingFace model-catalog source minimal skeleton (#1412)
Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>
2025-08-07 14:34:55 +00:00
dependabot[bot] 9c8852767f
build(deps): bump github.com/onsi/gomega from 1.37.0 to 1.38.0 (#1419)
Bumps [github.com/onsi/gomega](https://github.com/onsi/gomega) from 1.37.0 to 1.38.0.
- [Release notes](https://github.com/onsi/gomega/releases)
- [Changelog](https://github.com/onsi/gomega/blob/master/CHANGELOG.md)
- [Commits](https://github.com/onsi/gomega/compare/v1.37.0...v1.38.0)

---
updated-dependencies:
- dependency-name: github.com/onsi/gomega
  dependency-version: 1.38.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-07 14:26:55 +00:00
Adysen Rothman 4ec88777b9
update catalog source docs (#1413)
Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>
2025-08-07 14:24:55 +00:00
dependabot[bot] 220c23c9be
build(deps-dev): bump aiohttp from 3.12.14 to 3.12.15 in /jobs/async-upload (#1427)
---
updated-dependencies:
- dependency-name: aiohttp
  dependency-version: 3.12.15
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-07 13:24:55 +00:00
Chris Hambridge 585ea7c495
fix: Handle rolling update when running e2e tests on an existing kind cluster (#1433)
* Fixed kubectl output handling in deploy_on_kind.sh to properly handle multiple pods with the same image.

Signed-off-by: Chris Hambridge <chambrid@redhat.com>
2025-08-07 13:00:55 +00:00
dependabot[bot] 3844d7c287
build(deps-dev): bump mypy from 1.16.1 to 1.17.1 in /clients/python (#1426)
Bumps [mypy](https://github.com/python/mypy) from 1.16.1 to 1.17.1.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/v1.16.1...v1.17.1)

---
updated-dependencies:
- dependency-name: mypy
  dependency-version: 1.17.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-07 12:59:55 +00:00
dependabot[bot] 30c1fc1452
build(deps): bump boto3 from 1.39.15 to 1.40.3 in /clients/python (#1425)
Bumps [boto3](https://github.com/boto/boto3) from 1.39.15 to 1.40.3.
- [Release notes](https://github.com/boto/boto3/releases)
- [Commits](https://github.com/boto/boto3/compare/1.39.15...1.40.3)

---
updated-dependencies:
- dependency-name: boto3
  dependency-version: 1.40.3
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-07 12:58:55 +00:00
dependabot[bot] 2b02a22307
build(deps-dev): bump ruff from 0.12.5 to 0.12.7 in /clients/python (#1424)
Bumps [ruff](https://github.com/astral-sh/ruff) from 0.12.5 to 0.12.7.
- [Release notes](https://github.com/astral-sh/ruff/releases)
- [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md)
- [Commits](https://github.com/astral-sh/ruff/compare/0.12.5...0.12.7)

---
updated-dependencies:
- dependency-name: ruff
  dependency-version: 0.12.7
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-07 12:57:55 +00:00
dependabot[bot] 4a8756613d
build(deps): bump aiohttp from 3.12.14 to 3.12.15 in /clients/python (#1423)
---
updated-dependencies:
- dependency-name: aiohttp
  dependency-version: 3.12.15
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-07 12:56:55 +00:00
Matteo Mortari b3d4b1ef5e
doc: update link for Model Catalog Swagger UI to KF website (#1418)
Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-08-06 13:22:54 +00:00
Matteo Mortari f1c84d6715
ci: add Dependabot to jobs/async-upload module (#1411)
Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-08-06 13:21:55 +00:00
Chris Hambridge bff6a47ad2
Add links to existing READMEs and add README for the model catalog (#1416)
* Add links to existing READMEs and add README for the model catalog

* Add links to existing READMEs to make components more discoverable
* Add a README for the model catalog

Signed-off-by: Chris Hambridge <chambrid@redhat.com>

* Update RHEC source as a reference implementation

Signed-off-by: Chris Hambridge <chambrid@redhat.com>

* Update catalog/README.md

Co-authored-by: Paul Boyd <paul@pboyd.io>
Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

---------

Signed-off-by: Chris Hambridge <chambrid@redhat.com>
Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
Co-authored-by: Matteo Mortari <matteo.mortari@gmail.com>
Co-authored-by: Paul Boyd <paul@pboyd.io>
2025-08-06 11:34:54 +00:00
dependabot[bot] 3ddb84d3ae
build(deps): bump github.com/testcontainers/testcontainers-go/modules/postgres from 0.37.0 to 0.38.0 (#1407)
Bumps [github.com/testcontainers/testcontainers-go/modules/postgres](https://github.com/testcontainers/testcontainers-go) from 0.37.0 to 0.38.0.
- [Release notes](https://github.com/testcontainers/testcontainers-go/releases)
- [Commits](https://github.com/testcontainers/testcontainers-go/compare/v0.37.0...v0.38.0)

---
updated-dependencies:
- dependency-name: github.com/testcontainers/testcontainers-go/modules/postgres
  dependency-version: 0.38.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-05 16:46:53 +00:00
dependabot[bot] c2f5a7c776
build(deps): bump github.com/spf13/pflag from 1.0.6 to 1.0.7 (#1406)
Bumps [github.com/spf13/pflag](https://github.com/spf13/pflag) from 1.0.6 to 1.0.7.
- [Release notes](https://github.com/spf13/pflag/releases)
- [Commits](https://github.com/spf13/pflag/compare/v1.0.6...v1.0.7)

---
updated-dependencies:
- dependency-name: github.com/spf13/pflag
  dependency-version: 1.0.7
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-05 16:45:53 +00:00
dependabot[bot] 3df6225763
build(deps): bump github.com/go-sql-driver/mysql from 1.9.2 to 1.9.3 (#1404)
Bumps [github.com/go-sql-driver/mysql](https://github.com/go-sql-driver/mysql) from 1.9.2 to 1.9.3.
- [Release notes](https://github.com/go-sql-driver/mysql/releases)
- [Changelog](https://github.com/go-sql-driver/mysql/blob/v1.9.3/CHANGELOG.md)
- [Commits](https://github.com/go-sql-driver/mysql/compare/v1.9.2...v1.9.3)

---
updated-dependencies:
- dependency-name: github.com/go-sql-driver/mysql
  dependency-version: 1.9.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-05 16:44:54 +00:00
dependabot[bot] 71db5939a5
build(deps): bump google.golang.org/grpc from 1.73.0 to 1.74.2 (#1410)
Bumps [google.golang.org/grpc](https://github.com/grpc/grpc-go) from 1.73.0 to 1.74.2.
- [Release notes](https://github.com/grpc/grpc-go/releases)
- [Commits](https://github.com/grpc/grpc-go/compare/v1.73.0...v1.74.2)

---
updated-dependencies:
- dependency-name: google.golang.org/grpc
  dependency-version: 1.74.2
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-05 16:43:53 +00:00
dependabot[bot] 39e91a6980
build(deps-dev): bump schemathesis from 4.0.9 to 4.0.20 in /clients/python (#1402)
Bumps [schemathesis](https://github.com/schemathesis/schemathesis) from 4.0.9 to 4.0.20.
- [Release notes](https://github.com/schemathesis/schemathesis/releases)
- [Changelog](https://github.com/schemathesis/schemathesis/blob/master/CHANGELOG.md)
- [Commits](https://github.com/schemathesis/schemathesis/compare/v4.0.9...v4.0.20)

---
updated-dependencies:
- dependency-name: schemathesis
  dependency-version: 4.0.20
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-05 12:32:53 +00:00
dependabot[bot] e5293f97f8
build(deps-dev): bump types-python-dateutil from 2.9.0.20250516 to 2.9.0.20250708 in /clients/python (#1403)
Bumps [types-python-dateutil](https://github.com/typeshed-internal/stub_uploader) from 2.9.0.20250516 to 2.9.0.20250708.
- [Commits](https://github.com/typeshed-internal/stub_uploader/commits)

---
updated-dependencies:
- dependency-name: types-python-dateutil
  dependency-version: 2.9.0.20250708
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-05 07:05:54 +00:00
dependabot[bot] 77b7a58360
build(deps-dev): bump coverage from 7.9.2 to 7.10.2 in /clients/python (#1401)
Bumps [coverage](https://github.com/nedbat/coveragepy) from 7.9.2 to 7.10.2.
- [Release notes](https://github.com/nedbat/coveragepy/releases)
- [Changelog](https://github.com/nedbat/coveragepy/blob/master/CHANGES.rst)
- [Commits](https://github.com/nedbat/coveragepy/compare/7.9.2...7.10.2)

---
updated-dependencies:
- dependency-name: coverage
  dependency-version: 7.10.2
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-05 07:04:53 +00:00
dependabot[bot] 875e349d5b
build(deps-dev): bump furo from 2024.8.6 to 2025.7.19 in /clients/python (#1399)
Bumps [furo](https://github.com/pradyunsg/furo) from 2024.8.6 to 2025.7.19.
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2024.08.06...2025.07.19)

---
updated-dependencies:
- dependency-name: furo
  dependency-version: 2025.7.19
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-05 06:54:53 +00:00
Matteo Mortari a56dc20249
chore: add Area label for Jobs/async-upload (#1398)
* chore: add Area label for Jobs/async-upload

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

* Update .github/labeler.yml

Co-authored-by: Alessio Pragliola <83355398+Al-Pragliola@users.noreply.github.com>
Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

---------

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
Co-authored-by: Alessio Pragliola <83355398+Al-Pragliola@users.noreply.github.com>
2025-08-04 11:58:53 +00:00
Adysen Rothman 949f73e6b9
add query for list models & source enablement (#1391)
* add query for list models & source enablement

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* add enablement/disablement to sources

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* update tests

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* better bool readability

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* skip disabled sources

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* hasenabled check

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* small fix

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

---------

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>
2025-08-04 11:04:52 +00:00
Alessio Pragliola 32bf6ec955
feat: added a generic readyz endpoint (#1390)
* feat: added a generic readyz endpoint

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* chore: remove database health check as default

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* chore: refactor the dirty schema handler to use the general

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* chore: remove unnecessary readyz handler

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* chore: remove unnecessary readinessdirty handler

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* chore: remove duplicate health check

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* feat: improve test coverage

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* feat: add tests to cover edge cases

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: wrong readyz endpoint

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* feat: improve probes

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* feat: introduce consts for string literals

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

---------

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
2025-08-04 10:55:52 +00:00
Manaswini Das 8fc73138c2
Fix artifacts fetched twice on model version details page (#1333)
* Fix artifacts fetched twice on model version details page

Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>

* Fix Cypress tests

Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>

---------

Signed-off-by: manaswinidas <dasmanaswini10@gmail.com>
2025-08-04 07:47:52 +00:00
Arsheen Taj Syed a70a022d1d
Prompt Changes for Registration from MR (#1371)
* prompt changes for registration from MR

Signed-off-by: Taj010 <arsyed@redhat.com>

* add test to verify the redirection

Signed-off-by: Taj010 <arsyed@redhat.com>

---------

Signed-off-by: Taj010 <arsyed@redhat.com>
2025-08-04 07:46:53 +00:00
Sidney Glinton 91f18fb72e
fix(client/tests) Upgrade Pytest Asyncio and fix affected tests (#1394)
* chore: update pytest-asyncio

Signed-off-by: syntaxsdev <sglinton@redhat.com>

* fix: adjust uvloop for pytest-asyncio>=1.0

Signed-off-by: syntaxsdev <sglinton@redhat.com>

* fix: test skip logic

Signed-off-by: syntaxsdev <sglinton@redhat.com>

* fix: upgrade to pytest-asyncio>=1.1.0, more in-depth workaround for actual loop swap

Signed-off-by: syntaxsdev <sglinton@redhat.com>

* lint:

Signed-off-by: syntaxsdev <sglinton@redhat.com>

* feat: swap patch for monkeypatch

Signed-off-by: syntaxsdev <sglinton@redhat.com>

---------

Signed-off-by: syntaxsdev <sglinton@redhat.com>
2025-08-01 15:40:50 +00:00
Yulia Krimerman 81dfa00643
Bumped saas and mod-arch-shared versions (#1393)
Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>
2025-07-31 14:31:49 +00:00
Jon Burdo ee86a5bc80
add support to async job for http source (#1375)
* add support to async job for http source

This allows individual files to be specified:

  http://example.com/some-file.bin
  https://example.com/some-file.bin

Tar or zip archives will be extracted when they are downloaded:

  https://example.com/some-file.tar.gz
  https://example.com/some-file.zip

Signed-off-by: Jon Burdo <jon@jonburdo.com>

* update dependencies

Signed-off-by: Jon Burdo <jon@jonburdo.com>

* add async job e2e test for http uri type

Signed-off-by: Jon Burdo <jon@jonburdo.com>

* ruff format

Signed-off-by: Jon Burdo <jon@jonburdo.com>

* allow custom MODEL_SYNC_DESTINATION_OCI_BASE_IMAGE value in e2e test

Signed-off-by: Jon Burdo <jon@jonburdo.com>

* clean up e2e test

Signed-off-by: Jon Burdo <jon@jonburdo.com>

* update poetry.lock

Signed-off-by: Jon Burdo <jon@jonburdo.com>

* remove headers env var logic, mimetype error

Signed-off-by: Jon Burdo <jon@jonburdo.com>

* move mimetype checks

Signed-off-by: Jon Burdo <jon@jonburdo.com>

* add test for unpack_archive_file

Signed-off-by: Jon Burdo <jon@jonburdo.com>

* update http source for e2e

Signed-off-by: Jon Burdo <jon@jonburdo.com>

---------

Signed-off-by: Jon Burdo <jon@jonburdo.com>
2025-07-31 13:53:49 +00:00
dependabot[bot] b4c43a0dfc
build(deps): bump github.com/go-chi/cors from 1.2.1 to 1.2.2 (#1382)
Bumps [github.com/go-chi/cors](https://github.com/go-chi/cors) from 1.2.1 to 1.2.2.
- [Release notes](https://github.com/go-chi/cors/releases)
- [Commits](https://github.com/go-chi/cors/compare/v1.2.1...v1.2.2)

---
updated-dependencies:
- dependency-name: github.com/go-chi/cors
  dependency-version: 1.2.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-30 14:31:49 +00:00
dependabot[bot] f97d9c682c
build(deps): bump k8s.io/client-go from 0.33.2 to 0.33.3 (#1381)
Bumps [k8s.io/client-go](https://github.com/kubernetes/client-go) from 0.33.2 to 0.33.3.
- [Changelog](https://github.com/kubernetes/client-go/blob/master/CHANGELOG.md)
- [Commits](https://github.com/kubernetes/client-go/compare/v0.33.2...v0.33.3)

---
updated-dependencies:
- dependency-name: k8s.io/client-go
  dependency-version: 0.33.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-30 14:28:48 +00:00
Lucas Fernandez 223e8bd8e1
refactor: reorganize dependencies in package.json (#1387)
Signed-off-by: Lucas Fernandez <lferrnan@redhat.com>
2025-07-30 14:04:48 +00:00
dependabot[bot] 2706506f21
build(deps): bump github.com/docker/docker from 28.2.2+incompatible to 28.3.3+incompatible (#1385)
Bumps [github.com/docker/docker](https://github.com/docker/docker) from 28.2.2+incompatible to 28.3.3+incompatible.
- [Release notes](https://github.com/docker/docker/releases)
- [Commits](https://github.com/docker/docker/compare/v28.2.2...v28.3.3)

---
updated-dependencies:
- dependency-name: github.com/docker/docker
  dependency-version: 28.3.3+incompatible
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-30 13:32:47 +00:00
dependabot[bot] 7a98228f9d
build(deps): bump gorm.io/gorm from 1.30.0 to 1.30.1 (#1380)
Bumps [gorm.io/gorm](https://github.com/go-gorm/gorm) from 1.30.0 to 1.30.1.
- [Release notes](https://github.com/go-gorm/gorm/releases)
- [Commits](https://github.com/go-gorm/gorm/compare/v1.30.0...v1.30.1)

---
updated-dependencies:
- dependency-name: gorm.io/gorm
  dependency-version: 1.30.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-30 13:30:48 +00:00
Juntao Wang 31188d675c
Refactor details page for the registered model (#1384)
* Refactor details page for the registered model

Signed-off-by: Juntao Wang <juntwang@redhat.com>

* remove unnecessary comments

Signed-off-by: Juntao Wang <juntwang@redhat.com>

---------

Signed-off-by: Juntao Wang <juntwang@redhat.com>
2025-07-30 09:32:48 +00:00
dependabot[bot] fa02ee6bb4
build(deps): bump k8s.io/client-go from 0.33.2 to 0.33.3 in /clients/ui/bff (#1383)
Bumps [k8s.io/client-go](https://github.com/kubernetes/client-go) from 0.33.2 to 0.33.3.
- [Changelog](https://github.com/kubernetes/client-go/blob/master/CHANGELOG.md)
- [Commits](https://github.com/kubernetes/client-go/compare/v0.33.2...v0.33.3)

---
updated-dependencies:
- dependency-name: k8s.io/client-go
  dependency-version: 0.33.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-29 12:47:48 +00:00
Robert Sun 8c9d1903a7
deleted model serving references in frontend code (#1367)
* deleted model serving references in frontend code

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* deleted deployment test

Signed-off-by: rsun19 <robertssun1234@gmail.com>

---------

Signed-off-by: rsun19 <robertssun1234@gmail.com>
2025-07-29 11:19:47 +00:00
Robert Sun a81ed96d2b
feat: added unit test files to frontend (#1312)
* added unit test files

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* fixed lint errors

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* updated uri function

Signed-off-by: rsun19 <robertssun1234@gmail.com>

---------

Signed-off-by: rsun19 <robertssun1234@gmail.com>
2025-07-29 11:07:47 +00:00
dependabot[bot] 44b45989b2
build(deps): bump docker/build-push-action from 5 to 6 (#1311)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 5 to 6.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v5...v6)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-29 09:10:47 +00:00
dependabot[bot] 724345d688
build(deps-dev): bump starlette from 0.40.0 to 0.47.2 in /clients/python (#1362)
Bumps [starlette](https://github.com/encode/starlette) from 0.40.0 to 0.47.2.
- [Release notes](https://github.com/encode/starlette/releases)
- [Changelog](https://github.com/encode/starlette/blob/master/docs/release-notes.md)
- [Commits](https://github.com/encode/starlette/compare/0.40.0...0.47.2)

---
updated-dependencies:
- dependency-name: starlette
  dependency-version: 0.47.2
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-29 09:08:47 +00:00
dependabot[bot] 41084e7c3d
build(deps): bump huggingface-hub from 0.33.1 to 0.34.2 in /clients/python (#1379)
Bumps [huggingface-hub](https://github.com/huggingface/huggingface_hub) from 0.33.1 to 0.34.2.
- [Release notes](https://github.com/huggingface/huggingface_hub/releases)
- [Commits](https://github.com/huggingface/huggingface_hub/compare/v0.33.1...v0.34.2)

---
updated-dependencies:
- dependency-name: huggingface-hub
  dependency-version: 0.34.2
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-29 09:04:47 +00:00
dependabot[bot] 26ce0085da
build(deps): bump boto3 from 1.39.4 to 1.39.14 in /clients/python (#1378)
Bumps [boto3](https://github.com/boto/boto3) from 1.39.4 to 1.39.14.
- [Release notes](https://github.com/boto/boto3/releases)
- [Commits](https://github.com/boto/boto3/compare/1.39.4...1.39.14)

---
updated-dependencies:
- dependency-name: boto3
  dependency-version: 1.39.14
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-29 09:03:47 +00:00
dependabot[bot] 34671eaaf4
build(deps-dev): bump ruff from 0.12.3 to 0.12.5 in /clients/python (#1377)
Bumps [ruff](https://github.com/astral-sh/ruff) from 0.12.3 to 0.12.5.
- [Release notes](https://github.com/astral-sh/ruff/releases)
- [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md)
- [Commits](https://github.com/astral-sh/ruff/compare/0.12.3...0.12.5)

---
updated-dependencies:
- dependency-name: ruff
  dependency-version: 0.12.5
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-29 09:02:47 +00:00
dependabot[bot] f333dc9a70
build(deps-dev): bump ray from 2.47.1 to 2.48.0 in /clients/python (#1340)
Bumps [ray](https://github.com/ray-project/ray) from 2.47.1 to 2.48.0.
- [Release notes](https://github.com/ray-project/ray/releases)
- [Commits](https://github.com/ray-project/ray/compare/ray-2.47.1...ray-2.48.0)

---
updated-dependencies:
- dependency-name: ray
  dependency-version: 2.48.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-29 08:52:48 +00:00
Eric Dobroveanu de8a251c8a
Refactor async job config to be more pythonic and typesafe (#1374)
* refactor(async-job): create the models for the configuration object

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): ensure python 3.11 is used for async job local dev

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): ensure python 3.11 is used for async job local dev

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): wip converting functions to use new config

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* refactor(async-job): more config conversions

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* refactor(async-upload): finish cleaning up the config

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore: adjust gitignores

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): clean up some unnecessary classes

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): add comments to the base config models to clarify their purpose

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

---------

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>
2025-07-29 07:51:47 +00:00
Chanakya Thirumala Setty f86d88022e
Fix: Align PostgreSQL overlay and fix readiness probe (#1357)
* updated postgres manifests and fixed readiness bug

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* updated passwords to match

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

---------

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>
2025-07-24 16:33:01 +00:00
dependabot[bot] 883f45696b
build(deps-dev): bump axios from 1.10.0 to 1.11.0 in /clients/ui/frontend (#1369)
---
updated-dependencies:
- dependency-name: axios
  dependency-version: 1.11.0
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-24 13:07:01 +00:00
Matteo Mortari 7a2cfeb384
ci: fix root Make image/push (#1372)
* ci: fix root Make image/push

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

* to trigger ci

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

* revert 8356c82a6b

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

* fix async-job Makefile

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

* invoke script with expected IMG var

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

---------

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-07-24 08:31:01 +00:00
Eric Dobroveanu 8356c82a6b
[async-job] E2E Test with Sample Job (#1326)
* chore(async-job): add script to setup and run sample job

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore: adjust readiness probe for faster tests

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* test(async-job): convert bash-based test to python-based

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* test(async-job): add readme for integration tests

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): ensure correct make target is run in GH action

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): update lockfile and convert to use boto3

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* test(async-job): simplify the integration tests

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): remove unused job-values.yaml

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): ensure async job has a separate env var from mr service

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): adjust e2e tests to be able to build the images

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): move env vars to the top level

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

---------

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>
2025-07-24 06:39:00 +00:00
Jon Burdo c8e680d13c
Add .port-forwards.pid to .gitignore (#1370)
This is a temporary file that some Makefiles write to when running
clusters locally for testing.

Signed-off-by: Jon Burdo <jon@jonburdo.com>
2025-07-23 19:04:00 +00:00
Jon Burdo be68035427
add support to async job for huggingface source (#1365)
* add support to async job for huggingface source

This enables the model storage async job to accept a Hugging Face (hf) URI
as its source, in the following forms:

  hf://repo-name/model-name
  hf://repo-name/model-name:hash

Signed-off-by: Jon Burdo <jon@jonburdo.com>

* use uri as source type for hf

Signed-off-by: Jon Burdo <jon@jonburdo.com>

---------

Signed-off-by: Jon Burdo <jon@jonburdo.com>
2025-07-23 15:31:00 +00:00
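To make the hf:// source forms above concrete, here is a minimal, hypothetical parsing sketch in Python. It is not the job's actual implementation; the helper name, dataclass, and example values are illustrative only.

```python
from dataclasses import dataclass
from urllib.parse import urlparse


@dataclass
class HfSource:
    repo: str
    model: str
    revision: str | None = None  # the optional ":hash" suffix


def parse_hf_uri(uri: str) -> HfSource:
    """Split hf://repo-name/model-name[:hash] into its parts (illustrative only)."""
    parsed = urlparse(uri)
    if parsed.scheme != "hf":
        raise ValueError(f"not an hf:// URI: {uri}")
    repo = parsed.netloc  # "repo-name"
    model, _, revision = parsed.path.lstrip("/").partition(":")
    if not repo or not model:
        raise ValueError(f"malformed hf:// URI: {uri}")
    return HfSource(repo=repo, model=model, revision=revision or None)


# parse_hf_uri("hf://some-org/some-model:abc123")
# -> HfSource(repo='some-org', model='some-model', revision='abc123')
```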
Adysen Rothman b59b25bdba
add rhec source & pull rhec models using graphql (#1330)
* add rhec source & pull rhec models using graphql

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* replace graphql client & update source.yaml

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* add testing

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* cleanup logs and todo

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* rm temp placeholder

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* add graphql gen code to makefile

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* add makefile targets

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* resolve deps

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* add context to genqlient and rhec sources

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* work vendor

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

---------

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>
2025-07-22 15:43:58 +00:00
Matteo Mortari db98d7b190
chore: bump MR py client version to future 0.2.22 (#1360)
Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-07-22 15:34:59 +00:00
Pushpa Padti cdf362a42c
Update delete model registry logic (#1363)
Signed-off-by: ppadti <ppadti@redhat.com>
2025-07-22 11:36:58 +00:00
Dipanshu Gupta 23e6e420d4
Fixing form-data package CVE (CRITICAL) (#1361)
Signed-off-by: Dipanshu Gupta <dipgupta@redhat.com>
2025-07-22 10:30:58 +00:00
Alessio Pragliola 095bbae787
fix: build issues post go.work file introduction (#1358)
* fix: add gorm-gen to the go workspace file

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* feat: make the gha run without a paths filter

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: disable workspace for gorm-gen commands

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: bff ci build step

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

---------

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
2025-07-21 17:26:58 +00:00
dependabot[bot] 532525278a
build(deps): bump google.golang.org/grpc from 1.72.1 to 1.73.0 (#1353)
Bumps [google.golang.org/grpc](https://github.com/grpc/grpc-go) from 1.72.1 to 1.73.0.
- [Release notes](https://github.com/grpc/grpc-go/releases)
- [Commits](https://github.com/grpc/grpc-go/compare/v1.72.1...v1.73.0)

---
updated-dependencies:
- dependency-name: google.golang.org/grpc
  dependency-version: 1.73.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-21 15:02:59 +00:00
dependabot[bot] f12ff4c936
build(deps): bump k8s.io/api from 0.33.2 to 0.33.3 (#1354)
Bumps [k8s.io/api](https://github.com/kubernetes/api) from 0.33.2 to 0.33.3.
- [Commits](https://github.com/kubernetes/api/compare/v0.33.2...v0.33.3)

---
updated-dependencies:
- dependency-name: k8s.io/api
  dependency-version: 0.33.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-21 15:01:58 +00:00
Alessio Pragliola 903eb23453
chore: move to more general purpose tag for go 1.24 (#1355)
Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
2025-07-21 14:37:59 +00:00
dependabot[bot] 009bd76d71
build(deps): bump github.com/testcontainers/testcontainers-go/modules/mysql from 0.37.0 to 0.38.0 (#1352)
Bumps [github.com/testcontainers/testcontainers-go/modules/mysql](https://github.com/testcontainers/testcontainers-go) from 0.37.0 to 0.38.0.
- [Release notes](https://github.com/testcontainers/testcontainers-go/releases)
- [Commits](https://github.com/testcontainers/testcontainers-go/compare/v0.37.0...v0.38.0)

---
updated-dependencies:
- dependency-name: github.com/testcontainers/testcontainers-go/modules/mysql
  dependency-version: 0.38.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-21 14:21:00 +00:00
dependabot[bot] d83a40b536
build(deps): bump k8s.io/apimachinery from 0.33.2 to 0.33.3 (#1347)
Bumps [k8s.io/apimachinery](https://github.com/kubernetes/apimachinery) from 0.33.2 to 0.33.3.
- [Commits](https://github.com/kubernetes/apimachinery/compare/v0.33.2...v0.33.3)

---
updated-dependencies:
- dependency-name: k8s.io/apimachinery
  dependency-version: 0.33.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-21 14:16:58 +00:00
dependabot[bot] b5858325f5
build(deps): bump github.com/testcontainers/testcontainers-go from 0.37.0 to 0.38.0 (#1345)
Bumps [github.com/testcontainers/testcontainers-go](https://github.com/testcontainers/testcontainers-go) from 0.37.0 to 0.38.0.
- [Release notes](https://github.com/testcontainers/testcontainers-go/releases)
- [Commits](https://github.com/testcontainers/testcontainers-go/compare/v0.37.0...v0.38.0)

---
updated-dependencies:
- dependency-name: github.com/testcontainers/testcontainers-go
  dependency-version: 0.38.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-21 14:01:58 +00:00
dependabot[bot] 0816f1484d
build(deps): bump github.com/go-logr/logr from 1.4.2 to 1.4.3 (#1348)
Bumps [github.com/go-logr/logr](https://github.com/go-logr/logr) from 1.4.2 to 1.4.3.
- [Release notes](https://github.com/go-logr/logr/releases)
- [Changelog](https://github.com/go-logr/logr/blob/master/CHANGELOG.md)
- [Commits](https://github.com/go-logr/logr/compare/v1.4.2...v1.4.3)

---
updated-dependencies:
- dependency-name: github.com/go-logr/logr
  dependency-version: 1.4.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-21 13:58:58 +00:00
dependabot[bot] 1aa78b5fc9
build(deps): bump gorm.io/driver/postgres from 1.5.7 to 1.6.0 (#1349)
Bumps [gorm.io/driver/postgres](https://github.com/go-gorm/postgres) from 1.5.7 to 1.6.0.
- [Commits](https://github.com/go-gorm/postgres/compare/v1.5.7...v1.6.0)

---
updated-dependencies:
- dependency-name: gorm.io/driver/postgres
  dependency-version: 1.6.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-21 13:54:58 +00:00
Alessio Pragliola 1d850f1b1b
feat: address high code security alerts (#1339)
* feat: solve tls security issues

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: solve unsafe int conversion

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: pagination hardening

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: check insecure cipher after parsing

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: use sanitized params in pagination ordering

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: pagination always default

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

---------

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
2025-07-21 13:53:57 +00:00
Matteo Mortari 719ffd63b2
refactor: remove remaining Robot Framework after migration (#1338)
- Remove Robot Framework test infrastructure and files
- Update README.md to reflect KinD and Pytest for e2e testing
- Replace testing diagram with drawio version
- Remove robot-tests GitHub workflow

This concludes the migration from Robot to Pytest
by removing the remaining files and CI/GHA from the repo.

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-07-21 10:04:58 +00:00
Matt Prahl 4554d487e2
Make the openapi client a separate Go module (#1322)
This will allow Kubeflow Pipelines to use the Go client directly without
needing to import all of Model Registry and inheriting its dependencies.

This introduces the concept of a Go workspace to allow multiple Go
modules to live in the same repo.

Signed-off-by: mprahl <mprahl@users.noreply.github.com>
2025-07-21 09:37:57 +00:00
Matteo Mortari 88dfddfa5c
feat: migrate "Regression" user stories from Robot to pytest (#1332)
* feat: migrate "Regression" user stories from Robot to pytest

- Add structured error message tests for malformed model data (migrated from Robot)
- Test API returns proper error messages for malformed RegisteredModel and ModelVersion requests
  - for both `code` and `message`
- Clean ups/linting

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

* add comment to Robot

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

* linting

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

---------

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-07-21 06:32:57 +00:00
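For readers unfamiliar with the migrated "Regression" checks, a minimal pytest-style sketch of the structured-error assertion described above follows. The endpoint path, port, and malformed payload are assumptions for illustration, not the exact tests added in #1332.

```python
import requests

# Assumed local Model Registry REST endpoint; adjust host/port/path to your deployment.
BASE = "http://localhost:8080/api/model_registry/v1alpha3"


def test_malformed_registered_model_returns_structured_error():
    # Deliberately malformed body: "name" should be a string, not a number.
    resp = requests.post(f"{BASE}/registered_models", json={"name": 123})
    assert resp.status_code == 400
    body = resp.json()
    # The API is expected to return both a machine-readable code and a human-readable message.
    assert "code" in body and "message" in body
    assert body["message"]
```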
Matteo Mortari 4a33154d9b
fix: remove incorrect prepare workflow call in build-and-push-image.yml (#1336)
follow-up to https://github.com/kubeflow/model-registry/pull/1334

The build-and-push-image workflow was incorrectly calling the prepare.yml
as a step action instead of using it as a reusable workflow. This caused
the error "Can't find 'action.yml', 'action.yaml' or 'Dockerfile' under
.github/workflows/prepare.yml".

Example: https://github.com/kubeflow/model-registry/actions/runs/16379707719/job/46288348593

The prepare job is already correctly called at the top of the jobs section,
so the duplicate call in the steps was unnecessary and likely the cause
of the failure.

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-07-19 05:28:41 +00:00
Alessio Pragliola c843a03dfe
feat: remove linting/gen from dockerfile (#1334)
* feat: remove linting/gen from dockerfile

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: missing checkout in action

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: change prepare step

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* feat: make test target atomic

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* feat: make the jobs depend on prepare

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

---------

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
2025-07-18 20:27:41 +00:00
Matteo Mortari 36ba2cc295
feat: migrate remainder of User Story from Robot to pytest (and DocArtifact) (#1331)
* feat: migrate remainder of User Story from Robot to pytest (and DocArtifact)

- Migrate Robot Framework user stories to pytest with comprehensive test coverage:
  - Model name storage and validation
  - Model description storage and updates
  - Longer documentation storage via DocArtifact
- Enhance ModelVersion with registered_model_id field and improved documentation
- Add DocArtifact create() and update() methods with proper artifact type mapping
- Improve Makefile deployment with conditional image building (helpful locally)
- Add migration comments to existing Robot Framework tests per convention

This change completes the DocArtifact implementation and modernizes the test suite
by migrating user stories from Robot Framework to pytest for better maintainability,
as previously agreed.

With this PR, all Robot user stories are migrated to pytest.

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

* linting

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

* implement code review feedback

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

---------

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-07-18 12:01:40 +00:00
Jon Burdo 929b423de9
chore: fix "Reviewers" URL in pull request template (#1329)
Signed-off-by: Jon Burdo <jon@jonburdo.com>
2025-07-18 10:09:40 +00:00
Robert Sun f96d87df7e
fix: centered header buttons (#1303)
* centered header buttons

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* removed unnecessary flex property

Signed-off-by: rsun19 <robertssun1234@gmail.com>

* added todo comment

Signed-off-by: rsun19 <robertssun1234@gmail.com>

---------

Signed-off-by: rsun19 <robertssun1234@gmail.com>
2025-07-18 07:47:40 +00:00
Alessio Pragliola 277b3c2a86
chore: remove unused dep (go-enum) (#1323)
Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
2025-07-17 16:46:40 +00:00
Jon Burdo 0f0b58b42b
feat(async-job): write fatal error to termination message path (#1327)
Signed-off-by: Jon Burdo <jon@jonburdo.com>
2025-07-17 16:28:40 +00:00
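As a rough illustration of #1327 above: Kubernetes surfaces whatever a container writes to its termination message path (by default /dev/termination-log) in the Pod status, so a fatal-error helper can look roughly like the sketch below. The function name and exact behavior are assumptions, not the job's actual code.

```python
import sys

TERMINATION_MESSAGE_PATH = "/dev/termination-log"  # Kubernetes default terminationMessagePath


def fail(message: str, exit_code: int = 1) -> None:
    """Record a fatal error where Kubernetes can surface it, then exit (illustrative only)."""
    try:
        with open(TERMINATION_MESSAGE_PATH, "w") as f:
            f.write(message)
    except OSError:
        pass  # best effort: the path only exists inside a Pod
    print(message, file=sys.stderr)
    sys.exit(exit_code)
```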
Paul Boyd 3014cb90f2
chore(deps): use ubi9 base images (#1328)
Upgrade from ubi8 to ubi9.

Signed-off-by: Paul Boyd <paul@pboyd.io>
2025-07-17 15:02:41 +00:00
Arsheen Taj Syed 54f2d8a503
Added Testing Coverage for Model Registry Setting Section (#1325)
* adding the test cases for MR settings

Signed-off-by: Taj010 <arsyed@redhat.com>

* MR settings cypress-tests

Signed-off-by: Taj010 <arsyed@redhat.com>

---------

Signed-off-by: Taj010 <arsyed@redhat.com>
2025-07-17 12:12:39 +00:00
dependabot[bot] bde2a612c7
build(deps): bump k8s.io/client-go from 0.32.3 to 0.33.2 (#1233)
Bumps [k8s.io/client-go](https://github.com/kubernetes/client-go) from 0.32.3 to 0.33.2.
- [Changelog](https://github.com/kubernetes/client-go/blob/master/CHANGELOG.md)
- [Commits](https://github.com/kubernetes/client-go/compare/v0.32.3...v0.33.2)

---
updated-dependencies:
- dependency-name: k8s.io/client-go
  dependency-version: 0.33.2
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-16 18:22:39 +00:00
Alessio Pragliola 8c030e257a
feat: improve testing speed in embedmd (#1299)
* feat: improve testing speed in embedmd

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: wrong db config test values

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: test cleanup function postgresql

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: postgres test cleanup function

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

---------

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
2025-07-16 17:12:38 +00:00
Andrey Velichkevich 1f29feeae9
feat(docs): Guide to report security vulnerabilities (#1301)
Signed-off-by: Andrey Velichkevich <andrey.velichkevich@gmail.com>
2025-07-16 15:22:40 +00:00
Pushpa Padti 19df266997
Update mock version function to filter versions based on rmID (#1324)
Signed-off-by: ppadti <ppadti@redhat.com>
2025-07-16 12:25:39 +00:00
Lucas Fernandez 367a87fb56
Federated mr implementation (#1288)
Enable MR access in federated mode

Signed-off-by: lucferbux <lferrnan@redhat.com>
2025-07-15 20:55:38 +00:00
Eric Dobroveanu a7494aabab
chore(async-job): add additional logging statements for debugging (#1321)
Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>
2025-07-15 17:15:38 +00:00
Matteo Mortari 06961086fb
k8s(async-upload): fix Job Manifest default image name (#1320)
Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-07-15 14:56:39 +00:00
Paul Boyd 9e0e3f8d64
feat(catalog): Implement ListModels endpoint (#1300)
Signed-off-by: Paul Boyd <paul@pboyd.io>
2025-07-15 14:54:38 +00:00
Sidney Glinton dbebee9380
feat(async-job): add labeling to sample job (#1295)
* feat: add labeling to sample job

Signed-off-by: syntaxsdev <sglinton@redhat.com>

* feat: add MR IDs for label tracking

Signed-off-by: syntaxsdev <sglinton@redhat.com>

* Update jobs/async-upload/samples/sample_job_s3_to_oci.yaml

Co-authored-by: Matteo Mortari <matteo.mortari@gmail.com>
Signed-off-by: Sidney Glinton <sglinton@redhat.com>

* Update jobs/async-upload/samples/sample_job_s3_to_oci.yaml

Co-authored-by: Matteo Mortari <matteo.mortari@gmail.com>
Signed-off-by: Sidney Glinton <sglinton@redhat.com>

* fix: change label names to match env vars

Signed-off-by: syntaxsdev <sglinton@redhat.com>

* fix: numbering

Signed-off-by: syntaxsdev <sglinton@redhat.com>

---------

Signed-off-by: syntaxsdev <sglinton@redhat.com>
Signed-off-by: Sidney Glinton <sglinton@redhat.com>
Co-authored-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-07-15 14:40:38 +00:00
Eric Dobroveanu 412999891c
[async-job] Various fixes for config parsing, logging and sample yaml (#1310)
* chore(async-job): improve logging

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): ensure s3 endpoint url matches correct official env var docs

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): add new oci variable to ensure correct registry lookup

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): make sample more representative of the use case

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* test(async-upload): fix tests

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): correct array-type in sample secret

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): update async job to run all the way through

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): use busybox for example base image

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): use add_argument over add

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* fix(async-job): allow TLS configuration for oci destination

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* test(async-job): fix tests

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

---------

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>
2025-07-15 14:37:38 +00:00
dependabot[bot] cce093bae3
build(deps): bump aiohttp from 3.12.12 to 3.12.14 in /jobs/async-upload (#1317)
Bumps [aiohttp](https://github.com/aio-libs/aiohttp) from 3.12.12 to 3.12.14.
- [Release notes](https://github.com/aio-libs/aiohttp/releases)
- [Changelog](https://github.com/aio-libs/aiohttp/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/aiohttp/compare/v3.12.12...v3.12.14)

---
updated-dependencies:
- dependency-name: aiohttp
  dependency-version: 3.12.14
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-15 14:27:38 +00:00
Matteo Mortari d094092f92
deps(async-upload): move aiohttp as dev since used only in testing (#1319)
Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-07-15 14:13:38 +00:00
dependabot[bot] e0169c08e6
build(deps): bump boto3 from 1.39.3 to 1.39.4 in /clients/python (#1308)
Bumps [boto3](https://github.com/boto/boto3) from 1.39.3 to 1.39.4.
- [Release notes](https://github.com/boto/boto3/releases)
- [Commits](https://github.com/boto/boto3/compare/1.39.3...1.39.4)

---
updated-dependencies:
- dependency-name: boto3
  dependency-version: 1.39.4
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-15 13:29:38 +00:00
dependabot[bot] c7eb14c9cc
build(deps-dev): bump schemathesis from 4.0.4 to 4.0.9 in /clients/python (#1306)
Bumps [schemathesis](https://github.com/schemathesis/schemathesis) from 4.0.4 to 4.0.9.
- [Release notes](https://github.com/schemathesis/schemathesis/releases)
- [Changelog](https://github.com/schemathesis/schemathesis/blob/master/CHANGELOG.md)
- [Commits](https://github.com/schemathesis/schemathesis/compare/v4.0.4...v4.0.9)

---
updated-dependencies:
- dependency-name: schemathesis
  dependency-version: 4.0.9
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-15 13:28:37 +00:00
dependabot[bot] 4ad9f87b5f
build(deps-dev): bump ruff from 0.12.1 to 0.12.3 in /clients/python (#1305)
Bumps [ruff](https://github.com/astral-sh/ruff) from 0.12.1 to 0.12.3.
- [Release notes](https://github.com/astral-sh/ruff/releases)
- [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md)
- [Commits](https://github.com/astral-sh/ruff/compare/0.12.1...0.12.3)

---
updated-dependencies:
- dependency-name: ruff
  dependency-version: 0.12.3
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-15 13:27:38 +00:00
dependabot[bot] 026add59fe
build(deps): bump aiohttp from 3.12.11 to 3.12.14 in /clients/python (#1304)
Bumps [aiohttp](https://github.com/aio-libs/aiohttp) from 3.12.11 to 3.12.14.
- [Release notes](https://github.com/aio-libs/aiohttp/releases)
- [Changelog](https://github.com/aio-libs/aiohttp/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/aiohttp/compare/v3.12.11...v3.12.14)

---
updated-dependencies:
- dependency-name: aiohttp
  dependency-version: 3.12.14
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-15 13:26:37 +00:00
Juntao Wang 0fa7dbfbd6
Update ModelPropertiesDescriptionListGroup component filtering logic (#1297)
* Update ModelPropertiesDescriptionListGroup component filtering logic

Signed-off-by: Juntao Wang <juntwang@redhat.com>

* Add hasError prop to ModelRegistrySelector

Signed-off-by: Juntao Wang <juntwang@redhat.com>

* address comments

Signed-off-by: Juntao Wang <juntwang@redhat.com>

---------

Signed-off-by: Juntao Wang <juntwang@redhat.com>
2025-07-15 11:47:37 +00:00
Paul Boyd 6b11b58e43
chore: update deps that required go 1.24 (#1316)
Signed-off-by: Paul Boyd <paul@pboyd.io>
2025-07-14 20:03:37 +00:00
Alessio Pragliola 47374731d5
feat: move to go 1.24 (#1313)
* feat: move to go 1.24

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: docker image tag

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

---------

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
2025-07-14 19:02:37 +00:00
Juntao Wang 89b97016d8
Move PasswordInput to shared folder and remove deprecated Modal component (#1296)
Signed-off-by: Juntao Wang <juntwang@redhat.com>
2025-07-14 16:25:22 +00:00
Matteo Mortari 19e0bc8ecf
container: use UBI and poetry export (#1302)
* container: use UBI and poetry export

- switches to python-312-minimal with UBI
- leverage poetry for requirements export
  - install the dependencies in the container via vanilla pip
- remove stale requirements.txt from VCS/git

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

* nit: minor and cosmetic Dockerfile changes

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

* implement code review: install skopeo

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

---------

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-07-14 14:59:22 +00:00
Eric Dobroveanu aca5e43960
Expand README to list out available parameters for async job (#1298)
Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>
2025-07-12 06:38:53 +00:00
Eric Dobroveanu 000d57cb42
Adjust sample async job (#1293)
* docs(async-job): add example async job that uses mounted secret

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* docs(async-job): add sample using args instead of env

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* docs(async-job): add kitchen sink example

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore: adjust sample to be representative of a typical use case

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore: move AWS key in sample to the correct place

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

---------

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>
2025-07-12 06:30:53 +00:00
Eric Dobroveanu 050b97c579
chore(mr-client): align the hand-rolled artifact data model with the API (#1294)
Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>
2025-07-11 15:09:54 +00:00
Eric Dobroveanu 8b4ffb2dd2
async-job: S3 -> OCI synchronization (#1256)
* chore: update config get the storage related params

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore: update minio e2e deployment to also expose the console

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* test: ensure minio is launched for async-job e2e tests

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* feat(async-job): add s3-download function along with unit tests

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* feat(async-job): add upload function and unit tests

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): add perform_download method

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): add missing model-format-version param

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): add model-car formatting

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* test(async-job): add tests for upload

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* fix(async-job): correctly lookup config variables

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* refactor(async-upload): register the model separately to avoid versioning issues

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore: clean up dockerfile a little

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-upload): ensure extras are included

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-upload): include sample yaml file

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): fix positional args

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* refactor: rework model metadata loading

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* refactor(async-upload): use artifact_id and mr-raw-client to fetch artifact

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore: quick sample update

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore: ensure skopeo is included in the dockerfile

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* chore(async-job): clean up download and upload scenarios

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* test(async-upload): remove unneeded test

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* test(async-upload): fix tests

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

* ci: place container image build and load under CI/GHA

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

* ci: fix indentation

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

* make: fix dev-load-image kind load

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

* test(async-upload): fix unit tests

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>

---------

Signed-off-by: Eric Dobroveanu <edobrove@redhat.com>
Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
Co-authored-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-07-11 14:37:52 +00:00
Matteo Mortari 26d7b2fa48
ci: add GHA to build container image for async-upload Job (#1292)
Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-07-11 07:30:53 +00:00
Matteo Mortari 2abf354763
ci: add test-e2e for the async-upload Job (#1291)
* ci: add test-e2e for the async-upload Job

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

* GHA fix for mysql in kind

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>

---------

Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-07-10 19:39:53 +00:00
Alessio Pragliola c7f480f3e0
fix: force line endings to LF (#1290)
Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
2025-07-10 16:16:53 +00:00
Paul Boyd d6acd1d9c6
feat(catalog): get model artifacts endpoint (#1283)
* fix(catalog): intermittent test failure for file monitor

Signed-off-by: Paul Boyd <paul@pboyd.io>

* feat(catalog): get model artifacts endpoint

Signed-off-by: Paul Boyd <paul@pboyd.io>

---------

Signed-off-by: Paul Boyd <paul@pboyd.io>
2025-07-10 15:56:53 +00:00
Adysen Rothman fc5ab19f47
add pagination to find sources (#1287)
* add pagination to find sources

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* dedupe

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

* add testing for source pagination

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>

---------

Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>
2025-07-10 15:01:53 +00:00
Matteo Mortari b09af1c2c7
ci and fix: fix non-E2E tests, add GHA for CI (#1289)
Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-07-10 13:42:53 +00:00
Chanakya Thirumala Setty 7bdf9ba14b
Add SSL/TLS Config for Postgres (#1250)
* added ssl/tls support for postgres

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* fixed ssl error

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* fix: prevent multiple db instances in parallel (#1286)

* fix: prevent multiple db instances in parallel

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* feat: make readinessprobe tests more stable

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* feat: remove sync.once from connector

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: set readiness response to OK uppercase

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

---------

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* added ssl/tls support for postgres

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* rebased

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* fixed rebase issues

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* Update connector.go

Signed-off-by: Chanakya Thirumala Setty <66557279+Chanakya-TS@users.noreply.github.com>

---------

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>
Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
Signed-off-by: Chanakya Thirumala Setty <66557279+Chanakya-TS@users.noreply.github.com>
Co-authored-by: Alessio Pragliola <83355398+Al-Pragliola@users.noreply.github.com>
2025-07-10 13:25:53 +00:00
Alessio Pragliola 2dbb1b0387
fix: prevent multiple db instances in parallel (#1286)
* fix: prevent multiple db instances in parallel

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* feat: make readinessprobe tests more stable

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* feat: remove sync.once from connector

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

* fix: set readiness response to OK uppercase

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

---------

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
2025-07-09 15:38:52 +00:00
Lucas Fernandez 6545c37be1
Fix settings page (#1285)
Signed-off-by: lucferbux <lferrnan@redhat.com>
2025-07-09 08:52:52 +00:00
Yulia Krimerman 126945bc4e
Added URL and Fixed Registry Button alignment (#1281)
Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>
2025-07-09 07:21:51 +00:00
Paul Boyd f8905e9295
chore: add myself as reviewer (#1282)
Signed-off-by: Paul Boyd <pboyd@redhat.com>
2025-07-08 18:29:51 +00:00
dependabot[bot] 06fdaafd69
build(deps-dev): bump coverage from 7.9.1 to 7.9.2 in /clients/python (#1280)
Bumps [coverage](https://github.com/nedbat/coveragepy) from 7.9.1 to 7.9.2.
- [Release notes](https://github.com/nedbat/coveragepy/releases)
- [Changelog](https://github.com/nedbat/coveragepy/blob/master/CHANGES.rst)
- [Commits](https://github.com/nedbat/coveragepy/compare/7.9.1...7.9.2)

---
updated-dependencies:
- dependency-name: coverage
  dependency-version: 7.9.2
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-08 12:38:51 +00:00
dependabot[bot] 1e91d33421
build(deps): bump typing-extensions from 4.14.0 to 4.14.1 in /clients/python (#1279)
Bumps [typing-extensions](https://github.com/python/typing_extensions) from 4.14.0 to 4.14.1.
- [Release notes](https://github.com/python/typing_extensions/releases)
- [Changelog](https://github.com/python/typing_extensions/blob/main/CHANGELOG.md)
- [Commits](https://github.com/python/typing_extensions/compare/4.14.0...4.14.1)

---
updated-dependencies:
- dependency-name: typing-extensions
  dependency-version: 4.14.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-08 12:35:51 +00:00
dependabot[bot] d2c12d0447
build(deps): bump boto3 from 1.38.46 to 1.39.3 in /clients/python (#1278)
Bumps [boto3](https://github.com/boto/boto3) from 1.38.46 to 1.39.3.
- [Release notes](https://github.com/boto/boto3/releases)
- [Commits](https://github.com/boto/boto3/compare/1.38.46...1.39.3)

---
updated-dependencies:
- dependency-name: boto3
  dependency-version: 1.39.3
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-08 12:34:51 +00:00
dependabot[bot] 6faa8679d5
build(deps-dev): bump pytest from 8.4.0 to 8.4.1 in /clients/python (#1277)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.4.0 to 8.4.1.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.4.0...8.4.1)

---
updated-dependencies:
- dependency-name: pytest
  dependency-version: 8.4.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-08 12:31:51 +00:00
Alessio Pragliola 21197e5747
chore: remove mysql_native_password requirement (#1274)
Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
2025-07-07 21:26:50 +00:00
Pushpa Padti a3f4b6e707
Add rolebinding to manage permissions-project tab (#1273)
Signed-off-by: ppadti <ppadti@redhat.com>
2025-07-07 16:17:22 +00:00
Paul Boyd 3bf656a05d
feat(catalog): reload sources and static catalogs (#1266)
Reload sources and static catalogs after changes are made to the files
they were loaded from.

Signed-off-by: Paul Boyd <pboyd@redhat.com>
2025-07-07 15:58:23 +00:00
Lucas Fernandez c7aed7f8f9
Tweak to support federated platform (#1219)
Signed-off-by: lucferbux <lferrnan@redhat.com>
2025-07-04 15:55:19 +00:00
dependabot[bot] d2c04b069b
build(deps-dev): bump ts-jest from 29.3.4 to 29.4.0 in /clients/ui/frontend (#1261)
Bumps [ts-jest](https://github.com/kulshekhar/ts-jest) from 29.3.4 to 29.4.0.
- [Release notes](https://github.com/kulshekhar/ts-jest/releases)
- [Changelog](https://github.com/kulshekhar/ts-jest/blob/main/CHANGELOG.md)
- [Commits](https://github.com/kulshekhar/ts-jest/compare/v29.3.4...v29.4.0)

---
updated-dependencies:
- dependency-name: ts-jest
  dependency-version: 29.4.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-04 15:15:21 +00:00
Federico Mosca ef98424c1f
Add state machine tests (#1271)
* Add state machine tests

Signed-off-by: fege <fmosca@redhat.com>

* adjust if to test on the open PR

Signed-off-by: fege <fmosca@redhat.com>

* revert if and remove debugging prints

Signed-off-by: fege <fmosca@redhat.com>

* revert if and remove debugging prints

Signed-off-by: fege <fmosca@redhat.com>

* Remove db-cleanup

Signed-off-by: fege <fmosca@redhat.com>

---------

Signed-off-by: fege <fmosca@redhat.com>
2025-07-04 14:38:20 +00:00
Alessio Pragliola 185c389190
feat: reintroduce e2e order by test (#1269)
Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
2025-07-04 06:46:20 +00:00
Federico Mosca 89af055243
Add api tests stateless (#1242)
* Add api tests stateless

Signed-off-by: fege <fmosca@redhat.com>

* Fix github action failures, lint and nox

Signed-off-by: fege <fmosca@redhat.com>

* Add too_slow and remove code to force the model version id

Signed-off-by: fege <fmosca@redhat.com>

* Add filter_too_much

Signed-off-by: fege <fmosca@redhat.com>

* add hook to avoid Unsatisfiable

Signed-off-by: fege <fmosca@redhat.com>

* remove variable

Signed-off-by: fege <fmosca@redhat.com>

* add artifact_states

Signed-off-by: fege <fmosca@redhat.com>

* Add example to schema

Signed-off-by: fege <fmosca@redhat.com>

* Add example in src

Signed-off-by: fege <fmosca@redhat.com>

* register strategy for string of int64

Signed-off-by: fege <fmosca@redhat.com>

* sort imports

Signed-off-by: fege <fmosca@redhat.com>

* modify case for problematic endpoints

Signed-off-by: fege <fmosca@redhat.com>

* add more examples

Signed-off-by: fege <fmosca@redhat.com>

* pin urllib

Signed-off-by: fege <fmosca@redhat.com>

* exclude problematic endpoints and test with valid data

Signed-off-by: fege <fmosca@redhat.com>

* Add gha to run the test and mark them with fuzz

Signed-off-by: fege <fmosca@redhat.com>

* revert change in Makefile pushed by mistake

Signed-off-by: fege <fmosca@redhat.com>

* skip tests not marked with e2e or fuzz; trigger the fuzz tests on PR comment

Signed-off-by: fege <fmosca@redhat.com>

* trigger with label

Signed-off-by: fege <fmosca@redhat.com>

* correct the label name

Signed-off-by: fege <fmosca@redhat.com>

* add types and simplify the if

Signed-off-by: fege <fmosca@redhat.com>

* do not run e2e if test-fuzz label is added

Signed-off-by: fege <fmosca@redhat.com>

* unpin urllib

Signed-off-by: fege <fmosca@redhat.com>

* modify lock

Signed-off-by: fege <fmosca@redhat.com>

* fix: remove proc on label

Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>

---------

Signed-off-by: fege <fmosca@redhat.com>
Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
Co-authored-by: Alessio Pragliola <seth.pro@gmail.com>
2025-07-03 16:56:18 +00:00
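One detail in the commit above, "register strategy for string of int64", lends itself to a short sketch: IDs in the REST API appear to be int64 values serialized as JSON strings (hence the commit's wording), so the fuzz tests need a Hypothesis strategy that generates such strings. The snippet below shows only the strategy; how it gets registered with schemathesis is version-dependent and omitted here.

```python
from hypothesis import strategies as st

INT64_MIN, INT64_MAX = -(2**63), 2**63 - 1

# Generates values such as "0", "42", or "-9223372036854775808".
int64_as_string = st.integers(min_value=INT64_MIN, max_value=INT64_MAX).map(str)

if __name__ == "__main__":
    # Draw a few sample values for illustration.
    for _ in range(3):
        print(int64_as_string.example())
```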
Adysen Rothman 5f69126991
update broken link for dir rename in manifests (#1268)
Signed-off-by: Adysen Rothman <85646824+adysenrothman@users.noreply.github.com>
2025-07-03 15:54:18 +00:00
Robert Sun f22f82a20a
fix: clear all filters (#1262)
Signed-off-by: rsun19 <robertssun1234@gmail.com>
2025-07-03 07:36:18 +00:00
Robert Sun 3efb314e27
fix: added return value for id==2 in mocks version (#1264)
Signed-off-by: rsun19 <robertssun1234@gmail.com>
2025-07-03 07:35:18 +00:00
Alessio Pragliola 273cc63557
fix: prevent dsn format bugs when adding tls mode in mysql (#1265)
Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
2025-07-02 19:17:18 +00:00
Matteo Mortari 4e4447ab14
chore: bump MR py client version to 0.2.21 (#1255)
Signed-off-by: Matteo Mortari <matteo.mortari@gmail.com>
2025-07-02 10:38:18 +00:00
dependabot[bot] 11af8840ec
build(deps): bump boto3 from 1.38.41 to 1.38.46 in /clients/python (#1259)
Bumps [boto3](https://github.com/boto/boto3) from 1.38.41 to 1.38.46.
- [Release notes](https://github.com/boto/boto3/releases)
- [Commits](https://github.com/boto/boto3/compare/1.38.41...1.38.46)

---
updated-dependencies:
- dependency-name: boto3
  dependency-version: 1.38.46
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-02 08:21:17 +00:00
dependabot[bot] fdbb22956c
build(deps): bump olot from 0.1.7 to 0.1.8 in /clients/python (#1260)
Bumps olot from 0.1.7 to 0.1.8.

---
updated-dependencies:
- dependency-name: olot
  dependency-version: 0.1.8
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-02 08:19:18 +00:00
dependabot[bot] f011eff313
build(deps): bump huggingface-hub from 0.33.0 to 0.33.1 in /clients/python (#1258)
Bumps [huggingface-hub](https://github.com/huggingface/huggingface_hub) from 0.33.0 to 0.33.1.
- [Release notes](https://github.com/huggingface/huggingface_hub/releases)
- [Commits](https://github.com/huggingface/huggingface_hub/compare/v0.33.0...v0.33.1)

---
updated-dependencies:
- dependency-name: huggingface-hub
  dependency-version: 0.33.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-02 08:01:18 +00:00
dependabot[bot] 9e0beacd96
build(deps-dev): bump ruff from 0.12.0 to 0.12.1 in /clients/python (#1257)
Bumps [ruff](https://github.com/astral-sh/ruff) from 0.12.0 to 0.12.1.
- [Release notes](https://github.com/astral-sh/ruff/releases)
- [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md)
- [Commits](https://github.com/astral-sh/ruff/compare/0.12.0...0.12.1)

---
updated-dependencies:
- dependency-name: ruff
  dependency-version: 0.12.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-02 08:00:17 +00:00
Alessio Pragliola edaacadbe2
fix: add backoff also to postgresql (#1263)
Signed-off-by: Alessio Pragliola <seth.pro@gmail.com>
2025-07-02 07:58:18 +00:00
Yulia Krimerman 04ba57994b
Admin Settings - RBAC - Main view and routes 2 (#1247)
* addressed comments

Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>

* fixed merge conflicts

Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>

* enabled Create call

Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>

* addressed comments

Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>

* addressed latest comments

Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>

* fixed backend test

Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>

---------

Signed-off-by: Yulia Krimerman <juliapiterova@hotmail.com>
2025-07-02 07:25:18 +00:00
Chanakya Thirumala Setty b59a4ebd5c
Add PostgreSQL Support to Model Registry (#1204)
* Added Postgres Support

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* Fixed imports and env param

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* Fixed order of migrations and syntax errors

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* Updated logging and used constants from service.go

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* fixed logging in migrate_test.go

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* Update params.env

Signed-off-by: Chanakya Thirumala Setty <66557279+Chanakya-TS@users.noreply.github.com>
Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* Refactor database type handling to use constants from a new types package

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* Makefile update

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* Added GHA

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* mysql gens

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* fixed incorrect gens

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* fixed typekind datatype

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* updating makefile

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* fixed default null issue

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* fixed workflow

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* updated go.mod go.sum

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* fixed migrate test

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* fixed errors

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* fixed unused import

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

* added error return

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>

---------

Signed-off-by: Chanakya Thirumala Setty <cthiruma@redhat.com>
Signed-off-by: Chanakya Thirumala Setty <66557279+Chanakya-TS@users.noreply.github.com>
2025-07-01 13:50:17 +00:00
616 changed files with 74606 additions and 41779 deletions

.gitattributes (new file, +2)

@@ -0,0 +1,2 @@
# Always check-out / check-in files with LF line endings.
* text=auto eol=lf

.github/dependabot.yml

@@ -12,7 +12,9 @@ updates:
schedule:
interval: "weekly"
- package-ecosystem: "pip"
directory: "/clients/python/"
directories:
- "/clients/python/"
- "/jobs/async-upload"
schedule:
interval: "weekly"
- package-ecosystem: "docker"

.github/labeler.yml (+4)

@@ -20,6 +20,10 @@
- changed-files:
- any-glob-to-any-file: "csi/**"
"Area/Jobs/Async-upload":
- changed-files:
- any-glob-to-any-file: "jobs/async-upload/**"
"Area/Manifests":
- changed-files:
- any-glob-to-any-file: "manifests/**"

(pull request template)

@@ -18,7 +18,7 @@
- [ ] Automated tests are provided as part of the PR for major new functionalities; testing instructions have been added in the PR body (for PRs involving changes that are not immediately obvious).
- [ ] The developer has manually tested the changes and verified that the changes work.
- [ ] Code changes follow the [kubeflow contribution guidelines](https://www.kubeflow.org/docs/about/contributing/).
- [ ] **For first time contributors**: Please reach out to the [Reviewers](../OWNERS) to ensure all tests are being run, ensuring the label `ok-to-test` has been added to the PR.
- [ ] **For first time contributors**: Please reach out to the [Reviewers](https://github.com/kubeflow/model-registry/blob/main/OWNERS) to ensure all tests are being run, ensuring the label `ok-to-test` has been added to the PR.
If you have UI changes

.github/workflows/async-upload-test.yml (new file, +83)

@@ -0,0 +1,83 @@
name: Test async-upload Job
on:
push:
branches:
- "main"
paths-ignore:
- "LICENSE*"
- "**.gitignore"
- "**.md"
- "**.txt"
- ".github/ISSUE_TEMPLATE/**"
- ".github/dependabot.yml"
- "docs/**"
pull_request:
paths:
- "jobs/async-upload/**"
- ".github/workflows/**"
permissions:
contents: read
env:
# Async Job
JOB_IMG_REGISTRY: ghcr.io
JOB_IMG_ORG: kubeflow
JOB_IMG_NAME: model-registry/job/async-upload
JOB_IMG_VERSION: cicd
# MR Server
IMG_REGISTRY: ghcr.io
IMG_ORG: kubeflow
IMG_REPO: model-registry/server
IMG_VERSION: cicd
PUSH_IMAGE: false
jobs:
py-test:
runs-on: ubuntu-latest
defaults:
run:
working-directory: jobs/async-upload
steps:
- uses: actions/checkout@v5
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.10" # refers to the Container image
- name: Install Poetry
run: |
pipx install poetry
- name: Install dependencies
run: |
make install
- name: Run tests
run: |
make test
- name: Remove AppArmor profile for mysql in KinD on GHA # https://github.com/kubeflow/manifests/issues/2507
run: |
set -x
sudo apparmor_parser -R /etc/apparmor.d/usr.sbin.mysqld
- name: Run E2E tests
run: |
make test-e2e
job-test:
runs-on: ubuntu-latest
defaults:
run:
working-directory: jobs/async-upload
steps:
- uses: actions/checkout@v5
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.10" # refers to the Container image
- name: Install Poetry
run: |
pipx install poetry
- name: Remove AppArmor profile for mysql in KinD on GHA # https://github.com/kubeflow/manifests/issues/2507
run: |
set -x
sudo apparmor_parser -R /etc/apparmor.d/usr.sbin.mysqld
- name: Execute Sample Job E2E test
run: |
make test-integration

(new workflow: Build and Push async-upload container image, +63)

@@ -0,0 +1,63 @@
name: Build and Push async-upload container image
on:
push:
branches:
- 'main'
tags:
- 'v*'
paths:
- 'jobs/async-upload/**'
- '!LICENSE*'
- '!DOCKERFILE*'
- '!**.gitignore'
- '!**.md'
- '!**.txt'
env:
IMG_REGISTRY: ghcr.io
IMG_ORG: kubeflow
IMG_NAME: model-registry/job/async-upload
REGISTRY_USER: ${{ github.actor }}
REGISTRY_PWD: ${{ secrets.GITHUB_TOKEN }}
jobs:
build-and-push:
runs-on: ubuntu-latest
permissions:
contents: read
packages: write
steps:
- name: Checkout repository
uses: actions/checkout@v5
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Log in to the Container registry
uses: docker/login-action@v3
with:
registry: ${{ env.IMG_REGISTRY }}
username: ${{ env.REGISTRY_USER }}
password: ${{ env.REGISTRY_PWD }}
- name: Set main-branch environment # this is for main-sha tag image build
if: github.ref == 'refs/heads/main'
run: |
commit_sha=${{ github.sha }}
tag=main-${commit_sha:0:7}
echo "VERSION=${tag}" >> $GITHUB_ENV
- name: Set tag environment # this is for v* tag image build
if: startsWith(github.ref, 'refs/tags/v')
run: |
echo "VERSION=${{ github.ref_name }}" >> $GITHUB_ENV
- name: Build and push Docker image
uses: docker/build-push-action@v6
with:
context: ./jobs/async-upload
push: true
tags: ${{ env.IMG_REGISTRY }}/${{ env.IMG_ORG }}/${{ env.IMG_NAME }}:${{ env.VERSION }}
cache-from: type=gha
cache-to: type=gha,mode=max

(CSI image build workflow)

@@ -23,6 +23,9 @@ env:
jobs:
build-csi-image:
runs-on: ubuntu-latest
permissions:
contents: read
packages: write
steps:
# Assign context variable for various action contexts (tag, main, CI)
- name: Assigning tag context
@@ -32,7 +35,7 @@ jobs:
if: github.head_ref == '' && github.ref == 'refs/heads/main'
run: echo "BUILD_CONTEXT=main" >> $GITHUB_ENV
# checkout branch
- uses: actions/checkout@v4
- uses: actions/checkout@v5
# set image version
- name: Set main-branch environment
if: env.BUILD_CONTEXT == 'main'

.github/workflows/build-and-push-image.yml

@@ -20,9 +20,19 @@ env:
PUSH_IMAGE: true
DOCKER_USER: ${{ github.actor }}
DOCKER_PWD: ${{ secrets.GITHUB_TOKEN }}
permissions: # default workflow permission, overridden for specific job where required
contents: read
jobs:
prepare:
uses: ./.github/workflows/prepare.yml
build-image:
permissions:
contents: read
packages: write
runs-on: ubuntu-latest
needs: prepare
steps:
# Assign context variable for various action contexts (tag, main, CI)
- name: Assigning tag context
@@ -32,7 +42,7 @@ jobs:
if: github.head_ref == '' && github.ref == 'refs/heads/main'
run: echo "BUILD_CONTEXT=main" >> $GITHUB_ENV
# checkout branch
- uses: actions/checkout@v4
- uses: actions/checkout@v5
# set image version
- name: Set main-branch environment
if: env.BUILD_CONTEXT == 'main'

View File

@ -27,7 +27,7 @@ jobs:
packages: write
steps:
- name: Checkout repository
uses: actions/checkout@v4
uses: actions/checkout@v5
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
@ -74,5 +74,6 @@ jobs:
labels: ${{ steps.meta.outputs.labels }}
build-args: |
DEPLOYMENT_MODE=standalone
STYLE_THEME=mui-theme
cache-from: type=gha
cache-to: type=gha,mode=max

View File

@ -1,4 +1,5 @@
name: Build and Push UI and BFF Images
name: Build and Push UI Image
# this workflow builds an image to support local testing
on:
push:
branches:
@ -7,51 +8,72 @@ on:
- 'v*'
paths:
- 'clients/ui/**'
- '!LICENSE*'
- '!DOCKERFILE*'
- '!**.gitignore'
- '!**.md'
- '!**.txt'
env:
IMG_REGISTRY: ghcr.io
IMG_ORG: kubeflow
IMG_UI_REPO: model-registry/ui
PUSH_IMAGE: true
IMG_UI_REPO: model-registry/ui # this image is intended for local development, not production
DOCKER_USER: ${{ github.actor }}
DOCKER_PWD: ${{ secrets.GITHUB_TOKEN }}
jobs:
build-image:
build-and-push:
runs-on: ubuntu-latest
permissions:
contents: read
packages: write
steps:
# Assign context variable for various action contexts (tag, main, CI)
- name: Assigning tag context
if: github.head_ref == '' && startsWith(github.ref, 'refs/tags/v')
run: echo "BUILD_CONTEXT=tag" >> $GITHUB_ENV
# Assign context variable for various action contexts (main, CI)
- name: Assigning main context
if: github.head_ref == '' && github.ref == 'refs/heads/main'
run: echo "BUILD_CONTEXT=main" >> $GITHUB_ENV
# checkout branch
- uses: actions/checkout@v4
# set image version
- name: Set main-branch environment
if: env.BUILD_CONTEXT == 'main'
run: |
commit_sha=${{ github.event.after }}
tag=main-${commit_sha:0:7}
echo "VERSION=${tag}" >> $GITHUB_ENV
- name: Set tag environment
if: env.BUILD_CONTEXT == 'tag'
run: |
echo "VERSION=${{ github.ref_name }}" >> $GITHUB_ENV
- name: Build and Push UI Image
shell: bash
env:
IMG_REPO: ${{ env.IMG_UI_REPO }}
run: ./scripts/build_deploy.sh
- name: Tag Latest UI Image
if: env.BUILD_CONTEXT == 'main'
shell: bash
env:
IMG_REPO: ${{ env.IMG_UI_REPO }}
IMG: "${{ env.IMG_REGISTRY }}/${{ env.IMG_ORG }}/${{ env.IMG_UI_REPO }}"
BUILD_IMAGE: false # image is already built in "Build and Push UI Image" step
run: |
docker tag ${{ env.IMG }}:$VERSION ${{ env.IMG }}:latest
# BUILD_IMAGE=false skip the build, just push the tag made above
VERSION=latest ./scripts/build_deploy.sh
- name: Checkout repository
uses: actions/checkout@v5
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Log in to the Container registry
uses: docker/login-action@v3
with:
registry: ${{ env.IMG_REGISTRY }}
username: ${{ env.DOCKER_USER }}
password: ${{ env.DOCKER_PWD }}
- name: Set main-branch environment
if: github.ref == 'refs/heads/main'
run: |
commit_sha=${{ github.sha }}
tag=main-${commit_sha:0:7}
echo "VERSION=${tag}" >> $GITHUB_ENV
- name: Set tag environment
if: startsWith(github.ref, 'refs/tags/v')
run: |
echo "VERSION=${{ github.ref_name }}" >> $GITHUB_ENV
- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@v5
with:
images: "${{ env.IMG_REGISTRY }}/${{ env.IMG_ORG }}/${{ env.IMG_UI_REPO }}"
tags: |
type=ref,event=branch
type=ref,event=pr
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=sha
type=raw,value=${{ env.VERSION }},enable=${{ env.VERSION != '' }}
type=raw,value=latest,enable=${{ github.ref == 'refs/heads/main' }}
- name: Build and push Docker image
uses: docker/build-push-action@v6
with:
context: ./clients/ui
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
build-args: |
DEPLOYMENT_MODE=kubeflow
STYLE_THEME=mui-theme
cache-from: type=gha
cache-to: type=gha,mode=max

View File

@ -11,6 +11,10 @@ on:
- ".github/dependabot.yml"
- "docs/**"
- "clients/python/**"
permissions:
contents: read
env:
IMG_REGISTRY: ghcr.io
IMG_ORG: kubeflow
@ -21,7 +25,7 @@ jobs:
build-and-test-image:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v5
- name: Generate Tag
shell: bash
id: tags

View File

@ -10,6 +10,11 @@ on:
- "!**.gitignore"
- "!**.md"
- "!**.txt"
permissions:
contents: read
packages: read
env:
IMG_ORG: kubeflow
IMG_REPO: model-registry/ui
@ -20,7 +25,7 @@ jobs:
runs-on: ubuntu-latest
steps:
# checkout branch
- uses: actions/checkout@v4
- uses: actions/checkout@v5
- name: Build UI Image
shell: bash
run: ./scripts/build_deploy.sh

View File

@ -13,30 +13,24 @@ on:
- ".github/ISSUE_TEMPLATE/**"
- ".github/dependabot.yml"
- "docs/**"
permissions:
contents: read
jobs:
prepare:
uses: ./.github/workflows/prepare.yml
build:
needs: prepare
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Checkout
uses: actions/checkout@v5
- name: Setup Go
uses: actions/setup-go@v5
with:
go-version: "1.23"
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: 3.9
go-version: "1.24"
- name: Build
run: make clean build
- name: Check if there are uncommitted file changes
run: |
clean=$(git status --porcelain)
if [[ -z "$clean" ]]; then
echo "Empty git status --porcelain: $clean"
else
echo "Uncommitted file changes detected: $clean"
git diff
exit 1
fi
run: make build/compile
- name: Unit tests
run: make test-cover

View File

@ -1,28 +1,53 @@
name: Check DB schema structs
on:
pull_request:
paths:
- ".github/workflows/**"
- "internal/db/schema/**"
- "internal/datastore/embedmd/mysql/migrations/**"
push:
branches:
- 'main'
tags:
- 'v*'
permissions:
contents: read
jobs:
check-schema-structs:
check-mysql-schema-structs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v5
- name: Setup Go
uses: actions/setup-go@v5
with:
go-version: "1.23.6"
- name: Generate DB schema structs
run: make gen/gorm
go-version: "1.24.4"
- name: Generate MySQL DB schema structs
run: make gen/gorm/mysql
- name: Check if there are uncommitted file changes
run: |
clean=$(git status --porcelain)
if [[ -z "$clean" ]]; then
echo "Empty git status --porcelain: $clean"
echo "MySQL schema is up to date."
else
echo "Uncommitted file changes detected: $clean"
echo "Uncommitted file changes detected after generating MySQL schema: $clean"
git diff
exit 1
fi
check-postgres-schema-structs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v5
- name: Setup Go
uses: actions/setup-go@v5
with:
go-version: "1.24.4"
- name: Generate PostgreSQL DB schema structs
run: make gen/gorm/postgres
- name: Check if there are uncommitted file changes
run: |
clean=$(git status --porcelain)
if [[ -z "$clean" ]]; then
echo "PostgreSQL schema is up to date."
else
echo "Uncommitted file changes detected after generating PostgreSQL schema: $clean"
git diff
exit 1
fi

View File

@ -4,11 +4,15 @@ on:
paths:
- ".github/workflows/**"
- "api/openapi/model-registry.yaml"
permissions:
contents: read
jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v5
- name: Validate OpenAPI spec
run: |
make openapi/validate

View File

@ -20,6 +20,9 @@ on:
- "pkg/openapi/**"
- "go.mod"
permissions:
contents: read
env:
BRANCH: ${{ github.base_ref }}
jobs:
@ -28,7 +31,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Clone the code
uses: actions/checkout@v4
uses: actions/checkout@v5
- name: Setup Go
uses: actions/setup-go@v5

View File

@ -21,6 +21,9 @@ on:
# csi build depends on base go.mod https://github.com/kubeflow/model-registry/issues/311
- "go.mod"
permissions:
contents: read
env:
IMG_REGISTRY: ghcr.io
IMG_ORG: kubeflow
@ -32,7 +35,7 @@ jobs:
build-and-test-csi-image:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v5
- name: Generate tag
shell: bash

View File

@ -6,6 +6,9 @@ on:
- main
pull_request:
permissions:
contents: read
jobs:
fossa-scan:
if: github.repository_owner == 'kubeflow' # FOSSA is not intended to run on forks.
@ -16,7 +19,7 @@ jobs:
FOSSA_API_KEY: 80871bdd477c2c97f65e9822cae99d20 # This is a push-only token that is safe to be exposed.
steps:
- name: Checkout tree
uses: actions/checkout@v4
uses: actions/checkout@v5
- name: Run FOSSA scan and upload build data
uses: fossas/fossa-action@v1.7.0

.github/workflows/prepare.yml
View File

@ -0,0 +1,27 @@
on:
workflow_call
permissions:
contents: read
jobs:
prepare:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v5
- name: Setup Go
uses: actions/setup-go@v5
with:
go-version: "1.24"
- name: Prepare
run: make clean build/prepare
- name: Check if there are uncommitted file changes
run: |
clean=$(git status --porcelain)
if [[ -z "$clean" ]]; then
echo "Empty git status --porcelain: $clean"
else
echo "Uncommitted file changes detected: $clean"
git diff
exit 1
fi

View File

@ -14,7 +14,7 @@ jobs:
FORCE_COLOR: "1"
steps:
- name: Check out the repository
uses: actions/checkout@v4
uses: actions/checkout@v5
with:
fetch-depth: 0
- name: Set up Python

View File

@ -14,6 +14,9 @@ on:
- ".github/dependabot.yml"
- "docs/**"
permissions:
contents: read
jobs:
lint:
name: ${{ matrix.session }}
@ -28,7 +31,7 @@ jobs:
FORCE_COLOR: "1"
steps:
- name: Check out the repository
uses: actions/checkout@v4
uses: actions/checkout@v5
- name: Set up Python ${{ matrix.python }}
uses: actions/setup-python@v5
with:
@ -75,7 +78,7 @@ jobs:
nodejs: ["20"]
steps:
- name: Check out the repository
uses: actions/checkout@v4
uses: actions/checkout@v5
with:
fetch-depth: 0
- name: Set up Python
@ -140,7 +143,7 @@ jobs:
IMG_REPO: model-registry
steps:
- name: Check out the repository
uses: actions/checkout@v4
uses: actions/checkout@v5
- name: Set up Python ${{ matrix.python }}
uses: actions/setup-python@v5
with:
@ -221,6 +224,15 @@ jobs:
kubectl port-forward service/distribution-registry-test-service 5001:5001 &
sleep 2
nox --python=${{ matrix.python }} --session=e2e -- --cov-report=xml
- name: Nox test fuzz (main only)
if: github.ref == 'refs/heads/main'
working-directory: clients/python
run: |
kubectl port-forward -n kubeflow service/model-registry-service 8080:8080 &
kubectl port-forward -n minio svc/minio 9000:9000 &
kubectl port-forward service/distribution-registry-test-service 5001:5001 &
sleep 2
nox --python=${{ matrix.python }} --session=fuzz
docs-build:
name: ${{ matrix.session }}
@ -235,7 +247,7 @@ jobs:
FORCE_COLOR: "1"
steps:
- name: Check out the repository
uses: actions/checkout@v4
uses: actions/checkout@v5
- name: Set up Python ${{ matrix.python }}
uses: actions/setup-python@v5
with:

View File

@ -1,61 +0,0 @@
name: run-robot-tests
run-name: Run Robot Framework tests
# Run workflow
on:
# For every push to repository
push:
# To any branch
branches:
- "*"
# For every pull request
pull_request:
# But ignore this paths
paths-ignore:
- "LICENSE*"
- "DOCKERFILE*"
- "**.gitignore"
- "**.md"
- "**.txt"
- ".github/ISSUE_TEMPLATE/**"
- ".github/dependabot.yml"
- "docs/**"
- "scripts/**"
# Define workflow jobs
jobs:
# Job runs Robot Framework tests against locally build image from current code
run-robot-tests:
# Ubuntu latest is sufficient system for run
runs-on: ubuntu-latest
# Define steps of job
steps:
# Get checkout action to get this repository
- uses: actions/checkout@v4
# Install defined Python version to run Robot Framework tests
- name: Install Python 3.9.x
# Get setup-python action to install Python
uses: actions/setup-python@v5
with:
# Set Python version to install
python-version: "3.9"
# Set architecture of Python to install
architecture: "x64"
# Install required Python packages for running Robot Framework tests
- name: Install required Python packages
# Install required Python packages using pip
run: pip install -r test/robot/requirements.txt
# Install model_registry Python package from current code
- name: Install model_registry Python package
# Install model_registry package as editable using pip
run: pip install -e clients/python
# Start docker compose with locally build image from current code
- name: Start docker compose with local image
# Start docker compose in the background
run: docker compose -f docker-compose-local.yaml up --detach
# Run Robot Framework tests in REST mode against running docker compose
- name: Run Robot Framework tests (REST mode)
# Run Robot Framework tests in REST mode from test/robot directory
run: robot test/robot
# Shutdown docker compose with locally build image from current code
- name: Shutdown docker compose with local image
# Shutdown docker compose running in the background
run: docker compose -f docker-compose-local.yaml down

.github/workflows/test-fuzz.yml
View File

@ -0,0 +1,69 @@
name: Fuzz Test
on:
workflow_dispatch:
inputs:
pr_number:
description: 'The pull request number to run fuzz tests on'
required: true
type: number
permissions:
contents: read
env:
IMG_REGISTRY: ghcr.io
IMG_ORG: kubeflow
IMG_REPO: model-registry/server
IMG_VERSION: latest
PUSH_IMAGE: false
jobs:
test-fuzz:
runs-on: ubuntu-latest
defaults:
run:
working-directory: clients/python
steps:
- name: Get PR details
id: pr
uses: actions/github-script@v7
with:
script: |
try {
const pr = await github.rest.pulls.get({
owner: context.repo.owner,
repo: context.repo.repo,
pull_number: ${{ github.event.inputs.pr_number }}
});
return {
sha: pr.data.head.sha,
ref: pr.data.head.ref
};
} catch (error) {
console.log(`Error fetching PR #${{ github.event.inputs.pr_number }}: ${error.message}`);
throw error;
}
- name: Checkout PR
uses: actions/checkout@v5
with:
ref: ${{ fromJson(steps.pr.outputs.result).sha }}
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.12"
- name: Install Poetry
run: |
pipx install poetry
- name: Remove AppArmor profile for mysql in KinD on GHA # https://github.com/kubeflow/manifests/issues/2507
run: |
set -x
sudo apparmor_parser -R /etc/apparmor.d/usr.sbin.mysqld
- name: Run Fuzz Tests
run: |
echo "Starting fuzz tests..."
make test-fuzz

View File

@ -18,11 +18,15 @@ on:
- "!DOCKERFILE*"
- "!**.gitignore"
- "!**.md"
permissions:
contents: read
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v5
- name: Setup Go
uses: actions/setup-go@v5

View File

@ -18,11 +18,15 @@ on:
- "!DOCKERFILE*"
- "!**.gitignore"
- "!**.md"
permissions:
contents: read
jobs:
test-and-build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v5
- name: Set up Node.js
uses: actions/setup-node@v4

.gitignore
View File

@ -16,9 +16,6 @@ __debug*
# Output of the go coverage tool, specifically when used with LiteIDE
*.out
# Go workspace file
go.work
# Idea files
.idea
@ -27,6 +24,9 @@ go.work
model-registry
metadata.sqlite.db
# Temporary files for running the project
.port-forwards.pid
# Ignore go vendor and code coverage files
vendor
coverage.*
@ -52,3 +52,7 @@ istio-*
# VSCode files
.vscode/
# Python
venv/
.python-version

View File

@ -1,11 +1,12 @@
# Build the model-registry binary
FROM --platform=$BUILDPLATFORM registry.access.redhat.com/ubi8/go-toolset:1.23 AS common
FROM --platform=$BUILDPLATFORM registry.access.redhat.com/ubi9/go-toolset:1.24 AS common
ARG TARGETOS
ARG TARGETARCH
WORKDIR /workspace
# Copy the Go Modules manifests
COPY ["go.mod", "go.sum", "./"]
# Copy the Go Modules manifests and workspace file
COPY ["go.mod", "go.sum", "go.work", "go.work.sum", "./"]
COPY ["pkg/openapi/go.mod", "pkg/openapi/"]
# cache deps before building and copying source so that we don't need to re-download as much
# and so that source changes don't invalidate our downloaded layer
RUN go mod download
@ -22,49 +23,15 @@ COPY templates/ templates/
COPY patches/ patches/
COPY catalog/ catalog/
###### Dev stage - start ######
# see: https://github.com/kubeflow/model-registry/pull/984#discussion_r2048732415
FROM common AS dev
USER root
RUN CGO_ENABLED=0 GOOS=${TARGETOS:-linux} GOARCH=${TARGETARCH} go build -a -o model-registry
FROM registry.access.redhat.com/ubi8/ubi-minimal:latest AS dev-build
WORKDIR /
COPY --from=dev /workspace/model-registry .
USER 65532:65532
ENTRYPOINT ["/model-registry"]
###### Dev stage - end ######
FROM common AS builder
USER root
# default NodeJS 14 is not enough for openapi-generator-cli, switch to Node JS currently supported
RUN yum remove -y nodejs npm
RUN yum module -y reset nodejs
RUN yum module -y enable nodejs:18
# install npm and java for openapi-generator-cli
RUN yum install -y nodejs npm java-11 python3
RUN make deps
# NOTE: The two instructions below are effectively equivalent to 'make clean build'
# DO NOT REMOVE THE 'build/prepare' TARGET!!!
# It ensures consistent, repeatable Dockerfile builds
# prepare the build in a separate layer
RUN make clean build/prepare
# compile separately to optimize multi-platform builds
RUN CGO_ENABLED=0 GOOS=${TARGETOS:-linux} GOARCH=${TARGETARCH} make build/compile
# Use distroless as minimal base image to package the model-registry binary
# Refer to https://github.com/GoogleContainerTools/distroless for more details
FROM registry.access.redhat.com/ubi8/ubi-minimal:latest
FROM registry.access.redhat.com/ubi9/ubi-minimal:latest
WORKDIR /
# copy the registry binary
COPY --from=builder /workspace/model-registry .

View File

@ -1,9 +1,10 @@
# Build the model-registry binary
FROM registry.access.redhat.com/ubi8/go-toolset:1.23 AS builder
FROM registry.access.redhat.com/ubi9/go-toolset:1.24 AS builder
WORKDIR /workspace
# Copy the Go Modules manifests
COPY ["go.mod", "go.sum", "./"]
# Copy the Go Modules manifests and workspace file
COPY ["go.mod", "go.sum", "go.work", "./"]
COPY ["pkg/openapi/go.mod", "pkg/openapi/"]
# cache deps before building and copying source so that we don't need to re-download as much
# and so that source changes don't invalidate our downloaded layer
RUN go mod download
@ -25,7 +26,7 @@ RUN CGO_ENABLED=1 GOOS=linux GOARCH=amd64 make clean/odh build/odh
# Use distroless as minimal base image to package the model-registry binary
# Refer to https://github.com/GoogleContainerTools/distroless for more details
FROM registry.access.redhat.com/ubi8/ubi-minimal:latest
FROM registry.access.redhat.com/ubi9/ubi-minimal:latest
WORKDIR /
# copy the registry binary
COPY --from=builder /workspace/model-registry .

Makefile
View File

@ -14,8 +14,6 @@ ENVTEST ?= $(PROJECT_BIN)/setup-envtest
# add tools bin directory
PATH := $(PROJECT_BIN):$(PATH)
MLMD_VERSION ?= 1.14.0
# docker executable
DOCKER ?= docker
# default Dockerfile
@ -31,7 +29,9 @@ IMG_REPO ?= model-registry/server
# container image build path
BUILD_PATH ?= .
# container image
ifdef IMG_REGISTRY
ifdef IMG
IMG := ${IMG}
else ifdef IMG_REGISTRY
IMG := ${IMG_REGISTRY}/${IMG_ORG}/${IMG_REPO}
else
IMG := ${IMG_ORG}/${IMG_REPO}
@ -55,44 +55,11 @@ endif
model-registry: build
# clean the ml-metadata protos and trigger a fresh new build which downloads
# ml-metadata protos based on specified MLMD_VERSION
.PHONY: update/ml_metadata
update/ml_metadata: clean/ml_metadata clean build
clean/ml_metadata:
rm -rf api/grpc/ml_metadata/proto/*.proto
api/grpc/ml_metadata/proto/metadata_source.proto:
mkdir -p api/grpc/ml_metadata/proto/
cd api/grpc/ml_metadata/proto/ && \
curl -LO "https://raw.githubusercontent.com/google/ml-metadata/v${MLMD_VERSION}/ml_metadata/proto/metadata_source.proto" && \
sed -i 's#syntax = "proto[23]";#&\noption go_package = "github.com/kubeflow/model-registry/internal/ml_metadata/proto";#' metadata_source.proto
api/grpc/ml_metadata/proto/metadata_store.proto:
mkdir -p api/grpc/ml_metadata/proto/
cd api/grpc/ml_metadata/proto/ && \
curl -LO "https://raw.githubusercontent.com/google/ml-metadata/v${MLMD_VERSION}/ml_metadata/proto/metadata_store.proto" && \
sed -i 's#syntax = "proto[23]";#&\noption go_package = "github.com/kubeflow/model-registry/internal/ml_metadata/proto";#' metadata_store.proto
api/grpc/ml_metadata/proto/metadata_store_service.proto:
mkdir -p api/grpc/ml_metadata/proto/
cd api/grpc/ml_metadata/proto/ && \
curl -LO "https://raw.githubusercontent.com/google/ml-metadata/v${MLMD_VERSION}/ml_metadata/proto/metadata_store_service.proto" && \
sed -i 's#syntax = "proto[23]";#&\noption go_package = "github.com/kubeflow/model-registry/internal/ml_metadata/proto";#' metadata_store_service.proto
internal/ml_metadata/proto/%.pb.go: api/grpc/ml_metadata/proto/%.proto
bin/protoc -I./api/grpc --go_out=./internal --go_opt=paths=source_relative \
--go-grpc_out=./internal --go-grpc_opt=paths=source_relative $<
.PHONY: gen/grpc
gen/grpc: internal/ml_metadata/proto/metadata_store.pb.go internal/ml_metadata/proto/metadata_store_service.pb.go
internal/converter/generated/converter.go: internal/converter/*.go
${GOVERTER} gen github.com/kubeflow/model-registry/internal/converter/
.PHONY: gen/converter
gen/converter: gen/grpc internal/converter/generated/converter.go
gen/converter: internal/converter/generated/converter.go
api/openapi/model-registry.yaml: api/openapi/src/model-registry.yaml api/openapi/src/lib/*.yaml bin/yq
scripts/merge_openapi.sh model-registry.yaml
@ -137,16 +104,47 @@ start/mysql:
stop/mysql:
./scripts/teardown_mysql_db.sh
# generate the gorm structs
.PHONY: gen/gorm
gen/gorm: bin/golang-migrate start/mysql
# Start the PostgreSQL database
.PHONY: start/postgres
start/postgres:
./scripts/start_postgres_db.sh
# Stop the PostgreSQL database
.PHONY: stop/postgres
stop/postgres:
./scripts/teardown_postgres_db.sh
# generate the gorm structs for MySQL
.PHONY: gen/gorm/mysql
gen/gorm/mysql: bin/golang-migrate start/mysql
@(trap 'cd $(CURDIR) && $(MAKE) stop/mysql' EXIT; \
$(GOLANG_MIGRATE) -path './internal/datastore/embedmd/mysql/migrations' -database 'mysql://root:root@tcp(localhost:3306)/model-registry' up && \
cd gorm-gen && go run main.go --db-type mysql --dsn 'root:root@tcp(localhost:3306)/model-registry?charset=utf8mb4&parseTime=True&loc=Local')
cd gorm-gen && GOWORK=off go run main.go --db-type mysql --dsn 'root:root@tcp(localhost:3306)/model-registry?charset=utf8mb4&parseTime=True&loc=Local')
# generate the gorm structs for PostgreSQL
.PHONY: gen/gorm/postgres
gen/gorm/postgres: bin/golang-migrate start/postgres
@(trap 'cd $(CURDIR) && $(MAKE) stop/postgres' EXIT; \
$(GOLANG_MIGRATE) -path './internal/datastore/embedmd/postgres/migrations' -database 'postgres://postgres:postgres@localhost:5432/model-registry?sslmode=disable' up && \
cd gorm-gen && GOWORK=off go run main.go --db-type postgres --dsn 'postgres://postgres:postgres@localhost:5432/model-registry?sslmode=disable' && \
cd $(CURDIR) && ./scripts/remove_gorm_defaults.sh)
# generate the gorm structs (defaults to MySQL for backward compatibility)
# Use GORM_DB_TYPE=postgres to generate for PostgreSQL instead
.PHONY: gen/gorm
gen/gorm: bin/golang-migrate
ifeq ($(GORM_DB_TYPE),postgres)
$(MAKE) gen/gorm/postgres
else
$(MAKE) gen/gorm/mysql
endif
.PHONY: vet
vet:
${GO} vet ./...
@echo "Running go vet on all packages..."
@${GO} vet $$(${GO} list ./... | grep -vF github.com/kubeflow/model-registry/internal/db/filter) && \
echo "Checking filter package (parser.go excluded due to participle struct tags)..." && \
cd internal/db/filter && ${GO} build -o /dev/null . 2>&1 | grep -E "vet:|error:" || echo "✓ Filter package builds successfully"
.PHONY: clean/csi
clean/csi:
@ -164,24 +162,12 @@ clean-internal-server-openapi:
.PHONY: clean
clean: clean-pkg-openapi clean-internal-server-openapi clean/csi
rm -Rf ./model-registry internal/ml_metadata/proto/*.go internal/converter/generated/*.go
rm -Rf ./model-registry internal/converter/generated/*.go
.PHONY: clean/odh
clean/odh:
rm -Rf ./model-registry
bin/protoc:
./scripts/install_protoc.sh
bin/go-enum:
GOBIN=$(PROJECT_BIN) ${GO} install github.com/searKing/golang/tools/go-enum@v1.2.97
bin/protoc-gen-go:
GOBIN=$(PROJECT_BIN) ${GO} install google.golang.org/protobuf/cmd/protoc-gen-go@v1.31.0
bin/protoc-gen-go-grpc:
GOBIN=$(PROJECT_BIN) ${GO} install google.golang.org/grpc/cmd/protoc-gen-go-grpc@v1.3.0
bin/envtest:
GOBIN=$(PROJECT_BIN) ${GO} install sigs.k8s.io/controller-runtime/tools/setup-envtest@v0.0.0-20240320141353-395cfc7486e6
@ -199,7 +185,11 @@ bin/yq:
GOLANG_MIGRATE ?= ${PROJECT_BIN}/migrate
bin/golang-migrate:
GOBIN=$(PROJECT_PATH)/bin ${GO} install -tags 'mysql' github.com/golang-migrate/migrate/v4/cmd/migrate@v4.18.3
GOBIN=$(PROJECT_PATH)/bin ${GO} install -tags 'mysql,postgres' github.com/golang-migrate/migrate/v4/cmd/migrate@v4.18.3
GENQLIENT ?= ${PROJECT_BIN}/genqlient
bin/genqlient:
GOBIN=$(PROJECT_PATH)/bin ${GO} install github.com/Khan/genqlient@v0.7.0
OPENAPI_GENERATOR ?= ${PROJECT_BIN}/openapi-generator-cli
NPM ?= "$(shell which npm)"
@ -224,7 +214,7 @@ clean/deps:
rm -Rf bin/*
.PHONY: deps
deps: bin/protoc bin/go-enum bin/protoc-gen-go bin/protoc-gen-go-grpc bin/golangci-lint bin/goverter bin/openapi-generator-cli bin/envtest
deps: bin/golangci-lint bin/goverter bin/openapi-generator-cli bin/envtest
.PHONY: vendor
vendor:
@ -261,7 +251,7 @@ build/compile/csi:
build/csi: build/prepare/csi build/compile/csi
.PHONY: gen
gen: deps gen/grpc gen/openapi gen/openapi-server gen/converter
gen: deps gen/openapi gen/openapi-server gen/converter
${GO} generate ./...
.PHONY: lint
@ -275,15 +265,15 @@ lint/csi: bin/golangci-lint
${GOLANGCI_LINT} run internal/csi/...
.PHONY: test
test: gen bin/envtest
test: bin/envtest
KUBEBUILDER_ASSETS="$(shell $(ENVTEST) use $(ENVTEST_K8S_VERSION) -p path)" ${GO} test ./internal/... ./pkg/...
.PHONY: test-nocache
test-nocache: gen bin/envtest
test-nocache: bin/envtest
KUBEBUILDER_ASSETS="$(shell $(ENVTEST) use $(ENVTEST_K8S_VERSION) -p path)" ${GO} test ./internal/... ./pkg/... -count=1
.PHONY: test-cover
test-cover: gen bin/envtest
test-cover: bin/envtest
KUBEBUILDER_ASSETS="$(shell $(ENVTEST) use $(ENVTEST_K8S_VERSION) -p path)" ${GO} test ./internal/... ./pkg/... -coverprofile=coverage.txt
${GO} tool cover -html=coverage.txt -o coverage.html
@ -366,7 +356,7 @@ controller/vet: ## Run go vet against code.
.PHONY: controller/test
controller/test: controller/manifests controller/generate controller/fmt controller/vet bin/envtest ## Run tests.
KUBEBUILDER_ASSETS="$(shell $(ENVTEST) use $(ENVTEST_K8S_VERSION) --bin-dir $(PROJECT_BIN) -p path)" go test $$(go list ./internal/controller/... | grep -v /e2e) -coverprofile cover.out
KUBEBUILDER_ASSETS="$(shell $(ENVTEST) use $(ENVTEST_K8S_VERSION) --bin-dir $(PROJECT_BIN) -p path)" go test $$(go list ./internal/controller/... | grep -vF /e2e) -coverprofile cover.out
##@ Build

OWNERS
View File

@ -10,3 +10,4 @@ approvers:
reviewers:
- andreyvelich
- Al-Pragliola
- pboyd

View File

@ -5,7 +5,7 @@
[![FOSSA Status](https://app.fossa.com/api/projects/custom%2B162%2Fgithub.com%2Fkubeflow%2Fmodel-registry.svg?type=shield&issueType=license)](https://app.fossa.com/projects/custom%2B162%2Fgithub.com%2Fkubeflow%2Fmodel-registry?ref=badge_shield&issueType=license)
[![OpenSSF Best Practices](https://www.bestpractices.dev/projects/9937/badge)](https://www.bestpractices.dev/projects/9937)
Model registry provides a central repository for model developers to store and manage models, versions, and artifacts metadata. A Go-based application that leverages [ml_metadata](https://github.com/google/ml-metadata/) project under the hood.
Model registry provides a central repository for model developers to store and manage models, versions, and artifacts metadata.
## Red Hat's Pledge
- Red Hat drives the project's development through Open Source principles, ensuring transparency, sustainability, and community ownership.
@ -23,7 +23,7 @@ Model registry provides a central repository for model developers to store and m
- [Blog KF 1.10 introducing UI for Model Registry, CSI, and other features](https://blog.kubeflow.org/kubeflow-1.10-release/#model-registry)
2. Installation
- [installing Model Registry standalone](https://www.kubeflow.org/docs/components/model-registry/installation/#standalone-installation)
- [installing Model Registry with Kubeflow manifests](https://github.com/kubeflow/manifests/tree/master/apps/model-registry/upstream#readme)
- [installing Model Registry with Kubeflow manifests](https://github.com/kubeflow/manifests/tree/master/applications/model-registry/upstream#readme)
- [installing Model Registry using ODH Operator](https://github.com/opendatahub-io/model-registry-operator/tree/main/docs#readme)
3. Concepts
- [Logical Model](./docs/logical_model.md)
@ -45,7 +45,7 @@ Model registry provides a central repository for model developers to store and m
8. [UI](clients/ui/README.md)
## Pre-requisites:
- go >= 1.23
- go >= 1.24
- protoc v24.3 - [Protocol Buffers v24.3 Release](https://github.com/protocolbuffers/protobuf/releases/tag/v24.3)
- npm >= 10.2.0 - [Installing Node.js and npm](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm)
- Java >= 11.0
@ -62,9 +62,7 @@ Run the following command to start the OpenAPI proxy server from source:
```shell
make run/proxy
```
The proxy service implements the OpenAPI defined in [model-registry.yaml](api/openapi/model-registry.yaml) to create a Model Registry specific REST API on top of the existing ml-metadata server.
> **NOTE** The ml-metadata server must be running and accessible from the environment where model-registry starts up.
The proxy service implements the OpenAPI defined in [model-registry.yaml](api/openapi/model-registry.yaml) to create a Model Registry specific REST API.
### Model registry logical model
@ -72,8 +70,8 @@ For a high-level documentation of the Model Registry _logical model_, please che
## Model Registry Core
The model registry core is the layer which implements the core/business logic by interacting with the underlying ml-metadata server.
It provides a model registry domain-specific [api](pkg/api/api.go) that is in charge to proxy all, appropriately transformed, requests to ml-metadata using gRPC calls.
The model registry core is the layer that implements the core/business logic by interacting with the underlying datastore internal service.
It provides a model registry domain-specific [api](pkg/api/api.go) that is in charge of proxying all requests, appropriately transformed, to the datastore internal service.
### Model registry library
@ -143,8 +141,6 @@ Subsequent builds will re-use the cached tools layer.
#### Running the proxy server
> **NOTE:** ml-metadata server must be running and accessible, see more info on how to start the gRPC server in the official ml-metadata [documentation](https://github.com/google/ml-metadata).
The following command starts the proxy server:
```shell
@ -155,11 +151,11 @@ Where, `<uid>`, `<gid>`, and `<host-path>` are the same as in the migrate comman
And `<hostname>` and `<port>` are the local ip and port to use to expose the container's default `8080` listening port.
The server listens on `localhost` by default, hence the `-n 0.0.0.0` option allows the server port to be exposed.
#### Running model registry & ml-metadata
#### Running model registry
> **NOTE:** Docker compose must be installed in your environment.
There are two `docker-compose` files that make the startup of both model registry and ml-metadata easier, by simply running:
There are two `docker-compose` files that make the startup of both model registry and a MySQL database easier, by simply running:
```shell
docker compose -f docker-compose[-local].yaml up
@ -167,20 +163,38 @@ docker compose -f docker-compose[-local].yaml up
The main difference between the two docker compose files is that the `-local` one builds the model registry from source, while the other one downloads the `latest` pushed [quay.io](https://quay.io/repository/opendatahub/model-registry?tab=tags) image.
When shutting down the docker compose stack, you might want to clean up the SQLite db file generated by ML Metadata, for example `./test/config/ml-metadata/metadata.sqlite.db`
### Testing architecture
The following diagram illustrates the testing strategy for the various components of the Model Registry project:
![](/docs/Model%20Registry%20Testing%20areas.png)
![](/docs/Model%20Registry%20Testing%20areas.drawio.png)
Go layer components are tested with Unit Tests written in Go, as well as Integration Tests leveraging Testcontainers.
This verifies that the expected "Core layer" of logical data mapping, developed and implemented in Go, matches technical expectations.
Python client is also tested with Unit Tests and Integration Tests written in Python.
End-to-end testing is developed with Pytest and Robot Framework; this higher-level layer of testing is used to demonstrate *User Stories* from a high-level perspective.
End-to-end testing is developed with KinD and Pytest; this higher-level layer of testing is used to demonstrate *User Stories* from a high-level perspective.
## Related Components
### Model Catalog Service
- [Model Catalog Service](catalog/README.md) - Federated model discovery across external catalogs
### Kubernetes Components
- [Controller](cmd/controller/README.md) - Kubernetes controller for model registry CRDs
- [CSI Driver](cmd/csi/README.md) - Container Storage Interface for model artifacts
### Client Components
- [UI Backend for Frontend (BFF)](clients/ui/bff/README.md) - Go-based BFF service for the React UI
- [UI Frontend](clients/ui/frontend/README.md) - React-based frontend application
### Job Components
- [Async Upload Job](jobs/async-upload/README.md) - Background job for handling asynchronous model uploads
### Development & Deployment
- [Development Environment](devenv/README.md) - Local development setup and tools
- [Kubernetes Manifests](manifests/kustomize/README.md) - Kustomize-based Kubernetes deployment manifests
## FAQ

View File

@ -72,6 +72,7 @@ git checkout -b mr_maintainer-$TDATE-upstreamSync
pushd manifests/kustomize/base && kustomize edit set image ghcr.io/kubeflow/model-registry/server=ghcr.io/kubeflow/model-registry/server:$VVERSION && popd
pushd manifests/kustomize/options/csi && kustomize edit set image ghcr.io/kubeflow/model-registry/storage-initializer=ghcr.io/kubeflow/model-registry/storage-initializer:$VVERSION && popd
pushd manifests/kustomize/options/ui/base && kustomize edit set image model-registry-ui=ghcr.io/kubeflow/model-registry/ui:$VVERSION && popd
pushd manifests/kustomize/options/catalog && kustomize edit set image ghcr.io/kubeflow/model-registry/server=ghcr.io/kubeflow/model-registry/server:$VVERSION && popd
git add .
git commit -s

SECURITY.md
View File

@ -0,0 +1,64 @@
# Security Policy
## Supported Versions
Kubeflow Model Registry versions are expressed as `vX.Y.Z`, where X is the major version,
Y is the minor version, and Z is the patch version, following the
[Semantic Versioning](https://semver.org/) terminology.
The Kubeflow Model Registry project maintains release branches for the most recent two minor releases.
Applicable fixes, including security fixes, may be backported to those two release branches,
depending on severity and feasibility.
Users are encouraged to stay updated with the latest releases to benefit from security patches and
improvements.
## Reporting a Vulnerability
We're extremely grateful to security researchers and users who report vulnerabilities to the
Kubeflow Open Source Community. All reports are thoroughly investigated by Kubeflow project owners.
You can use the following ways to report security vulnerabilities privately:
- Using the Kubeflow Model Registry repository [GitHub Security Advisory](https://github.com/kubeflow/model-registry/security/advisories/new).
- Using our private Kubeflow Steering Committee mailing list: ksc@kubeflow.org.
Please provide detailed information to help us understand and address the issue promptly.
## Disclosure Process
**Acknowledgment**: We will acknowledge receipt of your report within 10 business days.
**Assessment**: The Kubeflow project owners will investigate the reported issue to determine its
validity and severity.
**Resolution**: If the issue is confirmed, we will work on a fix and prepare a release.
**Notification**: Once a fix is available, we will notify the reporter and coordinate a public
disclosure.
**Public Disclosure**: Details of the vulnerability and the fix will be published in the project's
release notes and communicated through appropriate channels.
## Prevention Mechanisms
Kubeflow Model Registry employs several measures to prevent security issues:
**Code Reviews**: All code changes are reviewed by maintainers to ensure code quality and security.
**Dependency Management**: Regular updates and monitoring of dependencies (e.g. Dependabot) to
address known vulnerabilities.
**Continuous Integration**: Automated testing and security checks are integrated into the CI/CD pipeline.
**Image Scanning**: Container images are scanned for vulnerabilities.
## Communication Channels
For general questions, please join the following resources:
- Kubeflow [Slack channels](https://www.kubeflow.org/docs/about/community/#kubeflow-slack-channels).
- Kubeflow discuss [mailing list](https://www.kubeflow.org/docs/about/community/#kubeflow-mailing-list).
Please **do not report** security vulnerabilities through public channels.

File diff suppressed because it is too large.

File diff suppressed because it is too large.

View File

@ -20,13 +20,12 @@ paths:
parameters:
- name: source
description: |-
Filter models by source. If not provided, models from all sources
are returned. If multiple sources are provided, models from any of
the sources are returned.
Filter models by source. This parameter is currently required and
may only be specified once.
schema:
type: string
in: query
required: false
required: true
- name: q
description: Free-form keyword search used to filter the response.
schema:
@ -138,6 +137,15 @@ paths:
required: true
components:
schemas:
ArtifactTypeQueryParam:
description: Supported artifact types for querying.
enum:
- model-artifact
- doc-artifact
- dataset-artifact
- metric
- parameter
type: string
BaseModel:
type: object
properties:
@ -298,6 +306,10 @@ components:
name:
description: The name of the catalog source.
type: string
enabled:
description: Whether the catalog source is enabled.
type: boolean
default: true
CatalogSourceList:
description: List of CatalogSource entities.
allOf:
@ -566,6 +578,43 @@ components:
type: string
in: query
required: false
filterQuery:
examples:
filterQuery:
value: "name='my-model' AND state='LIVE'"
name: filterQuery
description: |
A SQL-like query string to filter the list of entities. The query supports rich filtering capabilities with automatic type inference.
**Supported Operators:**
- Comparison: `=`, `!=`, `<>`, `>`, `<`, `>=`, `<=`
- Pattern matching: `LIKE`, `ILIKE` (case-insensitive)
- Set membership: `IN`
- Logical: `AND`, `OR`
- Grouping: `()` for complex expressions
**Data Types:**
- Strings: `"value"` or `'value'`
- Numbers: `42`, `3.14`, `1e-5`
- Booleans: `true`, `false` (case-insensitive)
**Property Access:**
- Standard properties: `name`, `id`, `state`, `createTimeSinceEpoch`
- Custom properties: Any user-defined property name
- Escaped properties: Use backticks for special characters: `` `custom-property` ``
- Type-specific access: `property.string_value`, `property.double_value`, `property.int_value`, `property.bool_value`
**Examples:**
- Basic: `name = "my-model"`
- Comparison: `accuracy > 0.95`
- Pattern: `name LIKE "%tensorflow%"`
- Complex: `(name = "model-a" OR name = "model-b") AND state = "LIVE"`
- Custom property: `framework.string_value = "pytorch"`
- Escaped property: `` `mlflow.source.type` = "notebook" ``
schema:
type: string
in: query
required: false
pageSize:
examples:
pageSize:
@ -595,6 +644,30 @@ components:
$ref: "#/components/schemas/SortOrder"
in: query
required: false
artifactType:
style: form
explode: true
examples:
artifactType:
value: model-artifact
name: artifactType
description: "Specifies the artifact type for listing artifacts."
schema:
$ref: "#/components/schemas/ArtifactTypeQueryParam"
in: query
required: false
stepIds:
style: form
explode: true
examples:
stepIds:
value: "1,2,3"
name: stepIds
description: "Comma-separated list of step IDs to filter metrics by."
schema:
type: string
in: query
required: false
securitySchemes:
Bearer:
scheme: bearer

File diff suppressed because it is too large.

View File

@ -20,13 +20,12 @@ paths:
parameters:
- name: source
description: |-
Filter models by source. If not provided, models from all sources
are returned. If multiple sources are provided, models from any of
the sources are returned.
Filter models by source. This parameter is currently required and
may only be specified once.
schema:
type: string
in: query
required: false
required: true
- name: q
description: Free-form keyword search used to filter the response.
schema:
@ -210,6 +209,10 @@ components:
name:
description: The name of the catalog source.
type: string
enabled:
description: Whether the catalog source is enabled.
type: boolean
default: true
CatalogSourceList:
description: List of CatalogSource entities.
allOf:

View File

@ -1,5 +1,14 @@
components:
schemas:
ArtifactTypeQueryParam:
description: Supported artifact types for querying.
enum:
- model-artifact
- doc-artifact
- dataset-artifact
- metric
- parameter
type: string
BaseModel:
type: object
properties:
@ -295,6 +304,43 @@ components:
type: string
in: query
required: false
filterQuery:
examples:
filterQuery:
value: "name='my-model' AND state='LIVE'"
name: filterQuery
description: |
A SQL-like query string to filter the list of entities. The query supports rich filtering capabilities with automatic type inference.
**Supported Operators:**
- Comparison: `=`, `!=`, `<>`, `>`, `<`, `>=`, `<=`
- Pattern matching: `LIKE`, `ILIKE` (case-insensitive)
- Set membership: `IN`
- Logical: `AND`, `OR`
- Grouping: `()` for complex expressions
**Data Types:**
- Strings: `"value"` or `'value'`
- Numbers: `42`, `3.14`, `1e-5`
- Booleans: `true`, `false` (case-insensitive)
**Property Access:**
- Standard properties: `name`, `id`, `state`, `createTimeSinceEpoch`
- Custom properties: Any user-defined property name
- Escaped properties: Use backticks for special characters: `` `custom-property` ``
- Type-specific access: `property.string_value`, `property.double_value`, `property.int_value`, `property.bool_value`
**Examples:**
- Basic: `name = "my-model"`
- Comparison: `accuracy > 0.95`
- Pattern: `name LIKE "%tensorflow%"`
- Complex: `(name = "model-a" OR name = "model-b") AND state = "LIVE"`
- Custom property: `framework.string_value = "pytorch"`
- Escaped property: `` `mlflow.source.type` = "notebook" ``
schema:
type: string
in: query
required: false
pageSize:
examples:
pageSize:
@ -324,6 +370,30 @@ components:
$ref: "#/components/schemas/SortOrder"
in: query
required: false
artifactType:
style: form
explode: true
examples:
artifactType:
value: model-artifact
name: artifactType
description: "Specifies the artifact type for listing artifacts."
schema:
$ref: "#/components/schemas/ArtifactTypeQueryParam"
in: query
required: false
stepIds:
style: form
explode: true
examples:
stepIds:
value: "1,2,3"
name: stepIds
description: "Comma-separated list of step IDs to filter metrics by."
schema:
type: string
in: query
required: false
securitySchemes:
Bearer:
scheme: bearer

File diff suppressed because it is too large.

View File

@ -1,5 +1,10 @@
PROJECT_BIN := $(CURDIR)/../bin
OPENAPI_GENERATOR := $(PROJECT_BIN)/openapi-generator-cli
GENQLIENT_BIN ?= $(PROJECT_BIN)/genqlient
GENQLIENT_CONFIG := internal/catalog/genqlient/genqlient.yaml
GENQLIENT_OUTPUT := internal/catalog/genqlient/generated.go
GENQLIENT_SOURCES := $(wildcard internal/catalog/genqlient/queries/*.graphql)
GRAPHQL_SCHEMA := internal/catalog/genqlient/queries/schema.graphql
.PHONY: gen/openapi-server
gen/openapi-server: internal/server/openapi/api_model_catalog_service.go
@ -16,10 +21,24 @@ pkg/openapi/client.go: ../api/openapi/catalog.yaml
--ignore-file-override ./.openapi-generator-ignore --additional-properties=isGoSubmodule=true,enumClassPrefix=true,useOneOfDiscriminatorLookup=true
gofmt -w pkg/openapi
.PHONY: gen/graphql
gen/graphql: $(GENQLIENT_OUTPUT)
$(GENQLIENT_OUTPUT): $(GENQLIENT_CONFIG) $(GENQLIENT_SOURCES) $(PROJECT_BIN)/genqlient
$(GENQLIENT_BIN) --config $(GENQLIENT_CONFIG)
.PHONY: download/graphql-schema
download/graphql-schema:
npx get-graphql-schema https://catalog.redhat.com/api/containers/graphql/ > $(GRAPHQL_SCHEMA)
.PHONY: clean-pkg-openapi
clean-pkg-openapi:
while IFS= read -r file; do rm -f "pkg/openapi/$$file"; done < pkg/openapi/.openapi-generator/FILES
.PHONY: clean-graphql
clean-graphql:
rm -f $(GENQLIENT_OUTPUT)
.PHONY: clean-internal-server-openapi
clean-internal-server-openapi:
while IFS= read -r file; do rm -f "internal/server/openapi/$$file"; done < internal/server/openapi/.openapi-generator/FILES

catalog/README.md
View File

@ -0,0 +1,159 @@
# Model Catalog Service
The Model Catalog Service provides a **read-only discovery service** for ML models across multiple catalog sources. It acts as a federated metadata aggregation layer, allowing users to search and discover models from various external catalogs through a unified REST API.
## Architecture Overview
The catalog service operates as a **metadata aggregation layer** that:
- Federates model discovery across different external catalogs
- Provides a unified REST API for model search and discovery
- Uses pluggable source providers for extensibility
- Operates without traditional database storage (file-based configuration)
### Supported Catalog Sources
- **YAML Catalog** - Static YAML files containing model metadata
- **Red Hat Ecosystem Catalog (RHEC)** - GraphQL API integration for container and model discovery. It can also serve as a reference implementation of how to extend the service with your own GraphQL providers.
## REST API
### Base URL
`/api/model_catalog/v1alpha1`
### Endpoints
| Method | Path | Description |
|--------|------|-------------|
| `GET` | `/sources` | List all catalog sources with pagination |
| `GET` | `/models` | Search models across sources (requires `source` parameter) |
| `GET` | `/sources/{source_id}/models/{model_name+}` | Get specific model details |
| `GET` | `/sources/{source_id}/models/{model_name}/artifacts` | List model artifacts |
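For a quick feel of the `/models` endpoint above, the hedged sketch below issues a search against a locally running catalog service. The host/port and the `yaml-catalog` source id are assumptions (the id is taken from the sample configuration further down), so adjust them to your deployment.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/url"
)

func main() {
	// Assumed local endpoint; adjust host and port to wherever the catalog service runs.
	base := "http://localhost:8080/api/model_catalog/v1alpha1"

	// `source` is required by the API; `q` is an optional free-form keyword search.
	params := url.Values{}
	params.Set("source", "yaml-catalog")
	params.Set("q", "example")

	resp, err := http.Get(base + "/models?" + params.Encode())
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status)
	fmt.Println(string(body))
}
```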
### OpenAPI Specification
View the complete API specification:
- [Swagger UI](https://www.kubeflow.org/docs/components/model-registry/reference/model-catalog-rest-api/#swagger-ui)
- [Swagger Playground](https://petstore.swagger.io/?url=https://raw.githubusercontent.com/kubeflow/model-registry/main/api/openapi/catalog.yaml)
## Data Models
### CatalogSource
Simple source metadata:
```json
{
"id": "string",
"name": "string"
}
```
### CatalogModel
Rich model metadata including:
- Basic info: `name`, `description`, `readme`, `maturity`
- Technical: `language[]`, `tasks[]`, `libraryName`
- Legal: `license`, `licenseLink`, `provider`
- Extensible: `customProperties` (key-value metadata)
### CatalogModelArtifact
Artifact references:
```json
{
"uri": "string",
"customProperties": {}
}
```
## Configuration
The catalog service uses **file-based configuration** instead of traditional databases:
```yaml
# catalog-sources.yaml
catalogs:
- id: "yaml-catalog"
name: "Local YAML Catalog"
type: "yaml"
properties:
path: "./models"
- id: "rhec-catalog"
name: "Red Hat Ecosystem Catalog"
type: "rhec"
properties:
# RHEC-specific configuration
```
## Development
### Prerequisites
- Go >= 1.24
- Java >= 11.0 (for OpenAPI generation)
- Node.js >= 20.0.0 (for GraphQL schema downloads)
### Building
Generate OpenAPI server code:
```bash
make gen/openapi-server
```
Generate OpenAPI client code:
```bash
make gen/openapi
```
Generate GraphQL client (for RHEC integration):
```bash
make gen/graphql
```
### Project Structure
```
catalog/
├── cmd/ # Main application entry point
├── internal/
│ ├── catalog/ # Core catalog logic and providers
│ │ ├── genqlient/ # GraphQL client generation
│ │ └── testdata/ # Test fixtures
│ └── server/openapi/ # REST API implementation
├── pkg/openapi/ # Generated OpenAPI client
├── scripts/ # Build and generation scripts
└── Makefile # Build targets
```
### Adding New Catalog Providers
1. Implement the `CatalogSourceProvider` interface:
```go
type CatalogSourceProvider interface {
GetModel(ctx context.Context, name string) (*model.CatalogModel, error)
ListModels(ctx context.Context, params ListModelsParams) (model.CatalogModelList, error)
GetArtifacts(ctx context.Context, name string) (*model.CatalogModelArtifactList, error)
}
```
2. Register your provider:
```go
catalog.RegisterCatalogType("my-catalog", func(source *CatalogSourceConfig) (CatalogSourceProvider, error) {
return NewMyCatalogProvider(source)
})
```
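To make the two steps above concrete, here is a minimal, hypothetical in-memory provider. It is only a sketch, not code from this repository: it is written as if it lived in the same package as `CatalogSourceProvider`, and the `Items`/`Size` fields on the generated `model.CatalogModelList` and `model.CatalogModelArtifactList` types are assumed to follow the shape of the other generated OpenAPI models.

```go
// Hypothetical sketch: assumed to live alongside the CatalogSourceProvider
// definitions so that ListModelsParams, CatalogSourceConfig and
// RegisterCatalogType can be used without an import.
package catalog

import (
	"context"

	model "github.com/kubeflow/model-registry/catalog/pkg/openapi"
)

// myCatalogProvider serves a fixed, in-memory set of models.
type myCatalogProvider struct {
	models map[string]*model.CatalogModel
}

func NewMyCatalogProvider(source *CatalogSourceConfig) (CatalogSourceProvider, error) {
	// A real provider would read its source-specific configuration here.
	return &myCatalogProvider{models: map[string]*model.CatalogModel{}}, nil
}

// GetModel returns nil (without an error) when the model is unknown,
// mirroring the contract documented on the interface.
func (p *myCatalogProvider) GetModel(ctx context.Context, name string) (*model.CatalogModel, error) {
	return p.models[name], nil
}

func (p *myCatalogProvider) ListModels(ctx context.Context, params ListModelsParams) (model.CatalogModelList, error) {
	// Field names on the generated list type are assumed.
	list := model.CatalogModelList{Items: []model.CatalogModel{}}
	for _, m := range p.models {
		list.Items = append(list.Items, *m)
	}
	list.Size = int32(len(list.Items))
	return list, nil
}

func (p *myCatalogProvider) GetArtifacts(ctx context.Context, name string) (*model.CatalogModelArtifactList, error) {
	if _, ok := p.models[name]; !ok {
		return nil, nil
	}
	return &model.CatalogModelArtifactList{Items: []model.CatalogModelArtifact{}}, nil
}

func init() {
	// Registration as shown above; the "my-catalog" type name is illustrative.
	RegisterCatalogType("my-catalog", func(source *CatalogSourceConfig) (CatalogSourceProvider, error) {
		return NewMyCatalogProvider(source)
	})
}
```

Returning `nil, nil` from `GetModel` and `GetArtifacts` when nothing matches mirrors the contract documented on the interface.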
### Testing
The catalog service includes comprehensive testing:
- Unit tests for core catalog logic
- Integration tests for provider implementations
- OpenAPI contract validation
### Configuration Hot Reloading
The service automatically reloads configuration when the catalog sources file changes, enabling dynamic catalog updates without service restarts.
## Integration
The catalog service is designed to complement the main Model Registry service by providing:
- External model discovery capabilities
- Unified metadata aggregation
- Read-only access to distributed model catalogs
For complete Model Registry documentation, see the [main README](../README.md).

View File

@ -5,6 +5,7 @@ import (
"fmt"
"os"
"path/filepath"
"sync"
"github.com/golang/glog"
"k8s.io/apimachinery/pkg/util/yaml"
@ -12,25 +13,10 @@ import (
model "github.com/kubeflow/model-registry/catalog/pkg/openapi"
)
type SortDirection int
const (
SortDirectionAscending SortDirection = iota
SortDirectionDescending
)
type SortField int
const (
SortByUnspecified SortField = iota
SortByName
SortByPublished
)
type ListModelsParams struct {
Query string
SortBy SortField
SortDirection SortDirection
Query string
OrderBy model.OrderByField
SortOrder model.SortOrder
}
// CatalogSourceProvider is implemented by catalog source types, e.g. YamlCatalog
@ -39,7 +25,15 @@ type CatalogSourceProvider interface {
// nothing is found with the name provided it returns nil, without an
// error.
GetModel(ctx context.Context, name string) (*model.CatalogModel, error)
// ListModels returns all models according to the parameters. If
// nothing suitable is found, it returns an empty list.
ListModels(ctx context.Context, params ListModelsParams) (model.CatalogModelList, error)
// GetArtifacts returns all artifacts for a particular model. If no
// model is found with that name, it returns nil. If the model is
// found, but has no artifacts, an empty list is returned.
GetArtifacts(ctx context.Context, name string) (*model.CatalogModelArtifactList, error)
}
// CatalogSourceConfig is a single entry from the catalog sources YAML file.
@ -75,11 +69,35 @@ type CatalogSource struct {
Metadata model.CatalogSource
}
func LoadCatalogSources(catalogsPath string) (map[string]CatalogSource, error) {
type SourceCollection struct {
sourcesMu sync.RWMutex
sources map[string]CatalogSource
}
func NewSourceCollection(sources map[string]CatalogSource) *SourceCollection {
return &SourceCollection{sources: sources}
}
func (sc *SourceCollection) All() map[string]CatalogSource {
sc.sourcesMu.RLock()
defer sc.sourcesMu.RUnlock()
return sc.sources
}
func (sc *SourceCollection) Get(name string) (src CatalogSource, ok bool) {
sc.sourcesMu.RLock()
defer sc.sourcesMu.RUnlock()
src, ok = sc.sources[name]
return
}
func (sc *SourceCollection) load(path string) error {
// Get absolute path of the catalog config file
absConfigPath, err := filepath.Abs(catalogsPath)
absConfigPath, err := filepath.Abs(path)
if err != nil {
return nil, fmt.Errorf("failed to get absolute path for %s: %v", catalogsPath, err)
return fmt.Errorf("failed to get absolute path for %s: %v", path, err)
}
// Get the directory of the config file to resolve relative paths
@ -88,12 +106,12 @@ func LoadCatalogSources(catalogsPath string) (map[string]CatalogSource, error) {
// Save current working directory
originalWd, err := os.Getwd()
if err != nil {
return nil, fmt.Errorf("failed to get current working directory: %v", err)
return fmt.Errorf("failed to get current working directory: %v", err)
}
// Change to the config directory to make relative paths work
if err := os.Chdir(configDir); err != nil {
return nil, fmt.Errorf("failed to change to config directory %s: %v", configDir, err)
return fmt.Errorf("failed to change to config directory %s: %v", configDir, err)
}
// Ensure we restore the original working directory when we're done
@ -106,34 +124,45 @@ func LoadCatalogSources(catalogsPath string) (map[string]CatalogSource, error) {
config := sourceConfig{}
bytes, err := os.ReadFile(absConfigPath)
if err != nil {
return nil, err
return err
}
if err = yaml.UnmarshalStrict(bytes, &config); err != nil {
return nil, err
return err
}
catalogs := make(map[string]CatalogSource, len(config.Catalogs))
sources := make(map[string]CatalogSource, len(config.Catalogs))
for _, catalogConfig := range config.Catalogs {
// If enabled is explicitly set to false, skip
hasEnabled := catalogConfig.HasEnabled()
if hasEnabled && *catalogConfig.Enabled == false {
continue
}
// If not explicitly set, default to enabled
if !hasEnabled {
t := true
catalogConfig.CatalogSource.Enabled = &t
}
catalogType := catalogConfig.Type
glog.Infof("reading config type %s...", catalogType)
registerFunc, ok := registeredCatalogTypes[catalogType]
if !ok {
return nil, fmt.Errorf("catalog type %s not registered", catalogType)
return fmt.Errorf("catalog type %s not registered", catalogType)
}
id := catalogConfig.GetId()
if len(id) == 0 {
return nil, fmt.Errorf("invalid catalog id %s", id)
return fmt.Errorf("invalid catalog id %s", id)
}
if _, exists := catalogs[id]; exists {
return nil, fmt.Errorf("duplicate catalog id %s", id)
if _, exists := sources[id]; exists {
return fmt.Errorf("duplicate catalog id %s", id)
}
provider, err := registerFunc(&catalogConfig)
if err != nil {
return nil, fmt.Errorf("error reading catalog type %s with id %s: %v", catalogType, id, err)
return fmt.Errorf("error reading catalog type %s with id %s: %v", catalogType, id, err)
}
catalogs[id] = CatalogSource{
sources[id] = CatalogSource{
Provider: provider,
Metadata: catalogConfig.CatalogSource,
}
@ -141,5 +170,36 @@ func LoadCatalogSources(catalogsPath string) (map[string]CatalogSource, error) {
glog.Infof("loaded config %s of type %s", id, catalogType)
}
return catalogs, nil
sc.sourcesMu.Lock()
defer sc.sourcesMu.Unlock()
sc.sources = sources
return nil
}
func LoadCatalogSources(path string) (*SourceCollection, error) {
sc := &SourceCollection{}
err := sc.load(path)
if err != nil {
return nil, err
}
go func() {
changes, err := getMonitor().Path(path)
if err != nil {
glog.Errorf("unable to watch sources file: %v", err)
// Not fatal, we just won't get automatic updates.
}
for range changes {
glog.Infof("Reloading sources %s", path)
err = sc.load(path)
if err != nil {
glog.Errorf("unable to load sources: %v", err)
}
}
}()
return sc, nil
}

View File

@ -4,6 +4,8 @@ import (
"reflect"
"sort"
"testing"
model "github.com/kubeflow/model-registry/catalog/pkg/openapi"
)
func TestLoadCatalogSources(t *testing.T) {
@ -19,7 +21,7 @@ func TestLoadCatalogSources(t *testing.T) {
{
name: "test-catalog-sources",
args: args{catalogsPath: "testdata/test-catalog-sources.yaml"},
want: []string{"catalog1", "catalog2"},
want: []string{"catalog1", "catalog3", "catalog4"},
wantErr: false,
},
}
@ -30,8 +32,8 @@ func TestLoadCatalogSources(t *testing.T) {
t.Errorf("LoadCatalogSources() error = %v, wantErr %v", err, tt.wantErr)
return
}
gotKeys := make([]string, 0, len(got))
for k := range got {
gotKeys := make([]string, 0, len(got.All()))
for k := range got.All() {
gotKeys = append(gotKeys, k)
}
sort.Strings(gotKeys)
@ -41,3 +43,60 @@ func TestLoadCatalogSources(t *testing.T) {
})
}
}
func TestLoadCatalogSourcesEnabledDisabled(t *testing.T) {
trueValue := true
type args struct {
catalogsPath string
}
tests := []struct {
name string
args args
want map[string]model.CatalogSource
wantErr bool
}{
{
name: "test-catalog-sources-enabled-and-disabled",
args: args{catalogsPath: "testdata/test-catalog-sources.yaml"},
want: map[string]model.CatalogSource{
"catalog1": {
Id: "catalog1",
Name: "Catalog 1",
Enabled: &trueValue,
},
"catalog3": {
Id: "catalog3",
Name: "Catalog 3",
Enabled: &trueValue,
},
"catalog4": {
Id: "catalog4",
Name: "Catalog 4",
Enabled: &trueValue,
},
},
wantErr: false,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
got, err := LoadCatalogSources(tt.args.catalogsPath)
if (err != nil) != tt.wantErr {
t.Errorf("LoadCatalogSources() error = %v, wantErr %v", err, tt.wantErr)
return
}
if err != nil {
return
}
gotMetadata := make(map[string]model.CatalogSource)
for id, source := range got.All() {
gotMetadata[id] = source.Metadata
}
if !reflect.DeepEqual(gotMetadata, tt.want) {
t.Errorf("LoadCatalogSources() got metadata = %#v, want %#v", gotMetadata, tt.want)
}
})
}
}

View File

@ -0,0 +1,32 @@
## Using Genqlient with the Red Hat Ecosystem Catalog
genqlient is used to retrieve metadata for the Model Catalog from a CatalogSource, in this case the Red Hat Ecosystem Catalog (RHEC), by issuing GraphQL queries against the RHEC API.
This directory contains the necessary files to generate a type-safe Go GraphQL client for the RHEC using [genqlient](https://github.com/Khan/genqlient).
### File Structure
- `genqlient.yaml`: The configuration file for `genqlient`. It specifies the location of the GraphQL schema, the GraphQL operation (query) files, and the output file for the generated code.
- `queries/`: This directory contains the GraphQL schema and query files.
- `schema.graphql`: The GraphQL schema for the RHEC API.
- `*.graphql`: Files containing the GraphQL queries.
### Generating the Client
To regenerate the client, first ensure the required tools are installed by running `make deps` from the project root. Once the tools are installed, generate the client by running the following command from the `catalog` directory:
```bash
make gen/graphql
```
This will generate the `generated.go` file in the current directory.
### Downloading the Schema
The `schema.graphql` file can be updated by downloading the latest version from the RHEC API. You can do this by running the following command from the `catalog` directory:
```bash
make download/graphql-schema
```
This will download the schema and save it to the correct location. After updating the schema, you should regenerate the client to ensure it is up to date.
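### Using the Generated Client
The following is a minimal, illustrative sketch of how the generated client can be called from Go. It assumes the RHEC GraphQL endpoint used by the catalog code (`https://catalog.redhat.com/api/containers/graphql/`) and an example repository path (`rhelai1/modelcar-granite-7b-starter`); adjust both for your use case.
```go
package main

import (
	"context"
	"fmt"
	"net/http"

	"github.com/Khan/genqlient/graphql"

	"github.com/kubeflow/model-registry/catalog/internal/catalog/genqlient"
)

func main() {
	// Build a genqlient client against the RHEC GraphQL endpoint.
	client := graphql.NewClient("https://catalog.redhat.com/api/containers/graphql/", http.DefaultClient)

	// Fetch repository metadata (registry and repository path are example values).
	resp, err := genqlient.GetRepository(context.Background(), client,
		"registry.access.redhat.com", "rhelai1/modelcar-granite-7b-starter")
	if err != nil {
		panic(err)
	}
	// The API reports errors in the response body as well as at the transport level.
	if e := resp.Get_repository_by_registry_path.Error; e.Detail != "" || e.Status != 0 {
		panic(fmt.Sprintf("RHEC error: %s (status %d)", e.Detail, e.Status))
	}
	data := resp.Get_repository_by_registry_path.Data
	fmt.Println(data.Vendor_label, data.Display_data.Short_description)
}
```
Because the generated code lives under `internal/`, it can only be imported by code within the catalog module itself; see `rhec.go` for how the catalog uses these calls in practice.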

View File

@ -0,0 +1,389 @@
// Code generated by github.com/Khan/genqlient, DO NOT EDIT.
package genqlient
import (
"context"
"time"
"github.com/Khan/genqlient/graphql"
)
// FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponse includes the requested fields of the GraphQL type ContainerImagePaginatedResponse.
type FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponse struct {
Error FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseError `json:"error"`
Total int `json:"total"`
Data []FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImage `json:"data"`
}
// GetError returns FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponse.Error, and is useful for accessing the field via an interface.
func (v *FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponse) GetError() FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseError {
return v.Error
}
// GetTotal returns FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponse.Total, and is useful for accessing the field via an interface.
func (v *FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponse) GetTotal() int {
return v.Total
}
// GetData returns FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponse.Data, and is useful for accessing the field via an interface.
func (v *FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponse) GetData() []FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImage {
return v.Data
}
// FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImage includes the requested fields of the GraphQL type ContainerImage.
// The GraphQL type's documentation follows.
//
// Metadata about images contained in RedHat and ISV repositories
type FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImage struct {
// The date when the entry was created. Value is created automatically on creation.
Creation_date time.Time `json:"creation_date"`
// The date when the entry was last updated.
Last_update_date time.Time `json:"last_update_date"`
// Published repositories associated with the container image.
Repositories []FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageRepositoriesContainerImageRepo `json:"repositories"`
// Data parsed from image metadata.
// These fields are not computed from any other source.
Parsed_data FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageParsed_dataParsedData `json:"parsed_data"`
}
// GetCreation_date returns FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImage.Creation_date, and is useful for accessing the field via an interface.
func (v *FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImage) GetCreation_date() time.Time {
return v.Creation_date
}
// GetLast_update_date returns FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImage.Last_update_date, and is useful for accessing the field via an interface.
func (v *FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImage) GetLast_update_date() time.Time {
return v.Last_update_date
}
// GetRepositories returns FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImage.Repositories, and is useful for accessing the field via an interface.
func (v *FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImage) GetRepositories() []FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageRepositoriesContainerImageRepo {
return v.Repositories
}
// GetParsed_data returns FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImage.Parsed_data, and is useful for accessing the field via an interface.
func (v *FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImage) GetParsed_data() FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageParsed_dataParsedData {
return v.Parsed_data
}
// FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageParsed_dataParsedData includes the requested fields of the GraphQL type ParsedData.
type FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageParsed_dataParsedData struct {
Labels []FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageParsed_dataParsedDataLabelsLabel `json:"labels"`
}
// GetLabels returns FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageParsed_dataParsedData.Labels, and is useful for accessing the field via an interface.
func (v *FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageParsed_dataParsedData) GetLabels() []FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageParsed_dataParsedDataLabelsLabel {
return v.Labels
}
// FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageParsed_dataParsedDataLabelsLabel includes the requested fields of the GraphQL type Label.
// The GraphQL type's documentation follows.
//
// Image label.
type FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageParsed_dataParsedDataLabelsLabel struct {
// The name of the label
Name string `json:"name"`
// Value of the label.
Value string `json:"value"`
}
// GetName returns FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageParsed_dataParsedDataLabelsLabel.Name, and is useful for accessing the field via an interface.
func (v *FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageParsed_dataParsedDataLabelsLabel) GetName() string {
return v.Name
}
// GetValue returns FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageParsed_dataParsedDataLabelsLabel.Value, and is useful for accessing the field via an interface.
func (v *FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageParsed_dataParsedDataLabelsLabel) GetValue() string {
return v.Value
}
// FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageRepositoriesContainerImageRepo includes the requested fields of the GraphQL type ContainerImageRepo.
type FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageRepositoriesContainerImageRepo struct {
// Hostname of the registry where the repository can be accessed.
Registry string `json:"registry"`
// List of container tags assigned to this layer.
Tags []FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageRepositoriesContainerImageRepoTagsContainerImageRepoTag `json:"tags"`
}
// GetRegistry returns FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageRepositoriesContainerImageRepo.Registry, and is useful for accessing the field via an interface.
func (v *FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageRepositoriesContainerImageRepo) GetRegistry() string {
return v.Registry
}
// GetTags returns FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageRepositoriesContainerImageRepo.Tags, and is useful for accessing the field via an interface.
func (v *FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageRepositoriesContainerImageRepo) GetTags() []FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageRepositoriesContainerImageRepoTagsContainerImageRepoTag {
return v.Tags
}
// FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageRepositoriesContainerImageRepoTagsContainerImageRepoTag includes the requested fields of the GraphQL type ContainerImageRepoTag.
type FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageRepositoriesContainerImageRepoTagsContainerImageRepoTag struct {
// The name of the tag.
Name string `json:"name"`
}
// GetName returns FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageRepositoriesContainerImageRepoTagsContainerImageRepoTag.Name, and is useful for accessing the field via an interface.
func (v *FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImageRepositoriesContainerImageRepoTagsContainerImageRepoTag) GetName() string {
return v.Name
}
// FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseError includes the requested fields of the GraphQL type ResponseError.
type FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseError struct {
Detail string `json:"detail"`
Status int `json:"status"`
}
// GetDetail returns FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseError.Detail, and is useful for accessing the field via an interface.
func (v *FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseError) GetDetail() string {
return v.Detail
}
// GetStatus returns FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseError.Status, and is useful for accessing the field via an interface.
func (v *FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseError) GetStatus() int {
return v.Status
}
// FindRepositoryImagesResponse is returned by FindRepositoryImages on success.
type FindRepositoryImagesResponse struct {
// List images for a repository. Exclude total for improved performance.
Find_repository_images_by_registry_path FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponse `json:"find_repository_images_by_registry_path"`
}
// GetFind_repository_images_by_registry_path returns FindRepositoryImagesResponse.Find_repository_images_by_registry_path, and is useful for accessing the field via an interface.
func (v *FindRepositoryImagesResponse) GetFind_repository_images_by_registry_path() FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponse {
return v.Find_repository_images_by_registry_path
}
// GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponse includes the requested fields of the GraphQL type ContainerRepositoryResponse.
type GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponse struct {
Error GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseError `json:"error"`
Data GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepository `json:"data"`
}
// GetError returns GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponse.Error, and is useful for accessing the field via an interface.
func (v *GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponse) GetError() GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseError {
return v.Error
}
// GetData returns GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponse.Data, and is useful for accessing the field via an interface.
func (v *GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponse) GetData() GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepository {
return v.Data
}
// GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepository includes the requested fields of the GraphQL type ContainerRepository.
// The GraphQL type's documentation follows.
//
// Contains metadata associated with Red Hat and ISV repositories
type GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepository struct {
// The date when the entry was created. Value is created automatically on creation.
Creation_date time.Time `json:"creation_date"`
// The date when the entry was last updated.
Last_update_date time.Time `json:"last_update_date"`
// The release categories of a repository.
Release_categories []string `json:"release_categories"`
// Label of the vendor that owns this repository.
Vendor_label string `json:"vendor_label"`
Display_data GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepositoryDisplay_dataRepositoryDisplayData `json:"display_data"`
}
// GetCreation_date returns GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepository.Creation_date, and is useful for accessing the field via an interface.
func (v *GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepository) GetCreation_date() time.Time {
return v.Creation_date
}
// GetLast_update_date returns GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepository.Last_update_date, and is useful for accessing the field via an interface.
func (v *GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepository) GetLast_update_date() time.Time {
return v.Last_update_date
}
// GetRelease_categories returns GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepository.Release_categories, and is useful for accessing the field via an interface.
func (v *GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepository) GetRelease_categories() []string {
return v.Release_categories
}
// GetVendor_label returns GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepository.Vendor_label, and is useful for accessing the field via an interface.
func (v *GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepository) GetVendor_label() string {
return v.Vendor_label
}
// GetDisplay_data returns GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepository.Display_data, and is useful for accessing the field via an interface.
func (v *GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepository) GetDisplay_data() GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepositoryDisplay_dataRepositoryDisplayData {
return v.Display_data
}
// GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepositoryDisplay_dataRepositoryDisplayData includes the requested fields of the GraphQL type RepositoryDisplayData.
// The GraphQL type's documentation follows.
//
// Display data for Catalog.
type GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepositoryDisplay_dataRepositoryDisplayData struct {
// The short description of the repository.
Short_description string `json:"short_description"`
// The long description of the repository.
Long_description string `json:"long_description"`
}
// GetShort_description returns GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepositoryDisplay_dataRepositoryDisplayData.Short_description, and is useful for accessing the field via an interface.
func (v *GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepositoryDisplay_dataRepositoryDisplayData) GetShort_description() string {
return v.Short_description
}
// GetLong_description returns GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepositoryDisplay_dataRepositoryDisplayData.Long_description, and is useful for accessing the field via an interface.
func (v *GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseDataContainerRepositoryDisplay_dataRepositoryDisplayData) GetLong_description() string {
return v.Long_description
}
// GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseError includes the requested fields of the GraphQL type ResponseError.
type GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseError struct {
Detail string `json:"detail"`
Status int `json:"status"`
}
// GetDetail returns GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseError.Detail, and is useful for accessing the field via an interface.
func (v *GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseError) GetDetail() string {
return v.Detail
}
// GetStatus returns GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseError.Status, and is useful for accessing the field via an interface.
func (v *GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponseError) GetStatus() int {
return v.Status
}
// GetRepositoryResponse is returned by GetRepository on success.
type GetRepositoryResponse struct {
// Get a repository by registry and path (product line/image name).
Get_repository_by_registry_path GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponse `json:"get_repository_by_registry_path"`
}
// GetGet_repository_by_registry_path returns GetRepositoryResponse.Get_repository_by_registry_path, and is useful for accessing the field via an interface.
func (v *GetRepositoryResponse) GetGet_repository_by_registry_path() GetRepositoryGet_repository_by_registry_pathContainerRepositoryResponse {
return v.Get_repository_by_registry_path
}
// __FindRepositoryImagesInput is used internally by genqlient
type __FindRepositoryImagesInput struct {
Registry string `json:"registry"`
Repository string `json:"repository"`
}
// GetRegistry returns __FindRepositoryImagesInput.Registry, and is useful for accessing the field via an interface.
func (v *__FindRepositoryImagesInput) GetRegistry() string { return v.Registry }
// GetRepository returns __FindRepositoryImagesInput.Repository, and is useful for accessing the field via an interface.
func (v *__FindRepositoryImagesInput) GetRepository() string { return v.Repository }
// __GetRepositoryInput is used internally by genqlient
type __GetRepositoryInput struct {
Registry string `json:"registry"`
Repository string `json:"repository"`
}
// GetRegistry returns __GetRepositoryInput.Registry, and is useful for accessing the field via an interface.
func (v *__GetRepositoryInput) GetRegistry() string { return v.Registry }
// GetRepository returns __GetRepositoryInput.Repository, and is useful for accessing the field via an interface.
func (v *__GetRepositoryInput) GetRepository() string { return v.Repository }
// The query executed by FindRepositoryImages.
const FindRepositoryImages_Operation = `
query FindRepositoryImages ($registry: String!, $repository: String!) {
find_repository_images_by_registry_path(registry: $registry, repository: $repository, sort_by: [{field:"creation_date",order:DESC}]) {
error {
detail
status
}
total
data {
creation_date
last_update_date
repositories {
registry
tags {
name
}
}
parsed_data {
labels {
name
value
}
}
}
}
}
`
func FindRepositoryImages(
ctx_ context.Context,
client_ graphql.Client,
registry string,
repository string,
) (data_ *FindRepositoryImagesResponse, err_ error) {
req_ := &graphql.Request{
OpName: "FindRepositoryImages",
Query: FindRepositoryImages_Operation,
Variables: &__FindRepositoryImagesInput{
Registry: registry,
Repository: repository,
},
}
data_ = &FindRepositoryImagesResponse{}
resp_ := &graphql.Response{Data: data_}
err_ = client_.MakeRequest(
ctx_,
req_,
resp_,
)
return data_, err_
}
// The query executed by GetRepository.
const GetRepository_Operation = `
query GetRepository ($registry: String!, $repository: String!) {
get_repository_by_registry_path(registry: $registry, repository: $repository) {
error {
detail
status
}
data {
creation_date
last_update_date
release_categories
vendor_label
display_data {
short_description
long_description
}
}
}
}
`
func GetRepository(
ctx_ context.Context,
client_ graphql.Client,
registry string,
repository string,
) (data_ *GetRepositoryResponse, err_ error) {
req_ := &graphql.Request{
OpName: "GetRepository",
Query: GetRepository_Operation,
Variables: &__GetRepositoryInput{
Registry: registry,
Repository: repository,
},
}
data_ = &GetRepositoryResponse{}
resp_ := &graphql.Response{Data: data_}
err_ = client_.MakeRequest(
ctx_,
req_,
resp_,
)
return data_, err_
}

View File

@ -0,0 +1,10 @@
# genqlient.yaml
schema: queries/schema.graphql
operations:
- "queries/find_repository_images.graphql"
- "queries/get_repository.graphql"
generated: generated.go
package: genqlient
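# Bind the GraphQL DateTime scalar to Go's time.Time in the generated code.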
bindings:
DateTime:
type: time.Time

View File

@ -0,0 +1,29 @@
query FindRepositoryImages($registry: String!, $repository: String!) {
find_repository_images_by_registry_path(
registry: $registry
repository: $repository
sort_by: [{ field: "creation_date", order: DESC }]
) {
error {
detail
status
}
total
data {
creation_date
last_update_date
repositories {
registry
tags {
name
}
}
parsed_data {
labels {
name
value
}
}
}
}
}

View File

@ -0,0 +1,21 @@
query GetRepository($registry: String!, $repository: String!) {
get_repository_by_registry_path(
registry: $registry
repository: $repository
) {
error {
detail
status
}
data {
creation_date
last_update_date
release_categories
vendor_label
display_data {
short_description
long_description
}
}
}
}

File diff suppressed because it is too large

View File

@ -0,0 +1,125 @@
package catalog
import (
"context"
"fmt"
"net/http"
"strings"
"time"
"github.com/golang/glog"
"github.com/kubeflow/model-registry/catalog/pkg/openapi"
model "github.com/kubeflow/model-registry/catalog/pkg/openapi"
)
type hfCatalogImpl struct {
client *http.Client
apiKey string
baseURL string
}
var _ CatalogSourceProvider = &hfCatalogImpl{}
const (
defaultHuggingFaceURL = "https://huggingface.co"
)
func (h *hfCatalogImpl) GetModel(ctx context.Context, name string) (*openapi.CatalogModel, error) {
// TODO: Implement HuggingFace model retrieval
return nil, fmt.Errorf("HuggingFace model retrieval not yet implemented")
}
func (h *hfCatalogImpl) ListModels(ctx context.Context, params ListModelsParams) (model.CatalogModelList, error) {
// TODO: Implement HuggingFace model listing
// For now, return empty list to satisfy interface
return model.CatalogModelList{
Items: []model.CatalogModel{},
PageSize: 0,
Size: 0,
}, nil
}
func (h *hfCatalogImpl) GetArtifacts(ctx context.Context, name string) (*openapi.CatalogModelArtifactList, error) {
// TODO: Implement HuggingFace model artifacts retrieval
// For now, return empty list to satisfy interface
return &openapi.CatalogModelArtifactList{
Items: []openapi.CatalogModelArtifact{},
PageSize: 0,
Size: 0,
}, nil
}
// validateCredentials checks if the HuggingFace API credentials are valid
func (h *hfCatalogImpl) validateCredentials(ctx context.Context) error {
glog.Infof("Validating HuggingFace API credentials")
// Make a simple API call to validate credentials
apiURL := h.baseURL + "/api/whoami-v2"
req, err := http.NewRequestWithContext(ctx, "GET", apiURL, nil)
if err != nil {
return fmt.Errorf("failed to create validation request: %w", err)
}
if h.apiKey != "" {
req.Header.Set("Authorization", "Bearer "+h.apiKey)
}
resp, err := h.client.Do(req)
if err != nil {
return fmt.Errorf("failed to validate HuggingFace credentials: %w", err)
}
defer resp.Body.Close()
if resp.StatusCode == http.StatusUnauthorized {
return fmt.Errorf("invalid HuggingFace API credentials")
}
if resp.StatusCode != http.StatusOK {
return fmt.Errorf("HuggingFace API validation failed with status: %d", resp.StatusCode)
}
glog.Infof("HuggingFace credentials validated successfully")
return nil
}
// newHfCatalog creates a new HuggingFace catalog source
func newHfCatalog(source *CatalogSourceConfig) (CatalogSourceProvider, error) {
apiKey, ok := source.Properties["apiKey"].(string)
if !ok || apiKey == "" {
return nil, fmt.Errorf("missing or invalid 'apiKey' property for HuggingFace catalog")
}
baseURL := defaultHuggingFaceURL
if url, ok := source.Properties["url"].(string); ok && url != "" {
baseURL = strings.TrimSuffix(url, "/")
}
// Optional model limit for future implementation
modelLimit := 100
if limit, ok := source.Properties["modelLimit"].(int); ok && limit > 0 {
modelLimit = limit
}
glog.Infof("Configuring HuggingFace catalog with URL: %s, modelLimit: %d", baseURL, modelLimit)
h := &hfCatalogImpl{
client: &http.Client{Timeout: 30 * time.Second},
apiKey: apiKey,
baseURL: baseURL,
}
// Validate credentials during initialization (as required by Jira ticket)
ctx := context.Background()
if err := h.validateCredentials(ctx); err != nil {
glog.Errorf("HuggingFace catalog credential validation failed: %v", err)
return nil, fmt.Errorf("failed to validate HuggingFace catalog credentials: %w", err)
}
glog.Infof("HuggingFace catalog source configured successfully")
return h, nil
}
func init() {
if err := RegisterCatalogType("hf", newHfCatalog); err != nil {
panic(err)
}
}

View File

@ -0,0 +1,174 @@
package catalog
import (
"context"
"net/http"
"net/http/httptest"
"strings"
"testing"
"github.com/kubeflow/model-registry/catalog/pkg/openapi"
)
func TestNewHfCatalog_MissingAPIKey(t *testing.T) {
source := &CatalogSourceConfig{
CatalogSource: openapi.CatalogSource{
Id: "test_hf",
Name: "Test HF",
},
Type: "hf",
Properties: map[string]any{
"url": "https://huggingface.co",
},
}
_, err := newHfCatalog(source)
if err == nil {
t.Fatal("Expected error for missing API key, got nil")
}
if err.Error() != "missing or invalid 'apiKey' property for HuggingFace catalog" {
t.Fatalf("Expected specific error message, got: %s", err.Error())
}
}
func TestNewHfCatalog_WithValidCredentials(t *testing.T) {
// Create mock server that returns valid response for credential validation
server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
// Check for authorization header
auth := r.Header.Get("Authorization")
if auth != "Bearer test-api-key" {
w.WriteHeader(http.StatusUnauthorized)
return
}
switch r.URL.Path {
case "/api/whoami-v2":
w.Header().Set("Content-Type", "application/json")
w.WriteHeader(http.StatusOK)
w.Write([]byte(`{"name": "test-user", "type": "user"}`))
default:
w.WriteHeader(http.StatusNotFound)
}
}))
defer server.Close()
source := &CatalogSourceConfig{
CatalogSource: openapi.CatalogSource{
Id: "test_hf",
Name: "Test HF",
},
Type: "hf",
Properties: map[string]any{
"apiKey": "test-api-key",
"url": server.URL,
"modelLimit": 10,
},
}
catalog, err := newHfCatalog(source)
if err != nil {
t.Fatalf("Failed to create HF catalog: %v", err)
}
hfCatalog := catalog.(*hfCatalogImpl)
// Test that methods return appropriate responses for stub implementation
ctx := context.Background()
// Test GetModel - should return not implemented error
model, err := hfCatalog.GetModel(ctx, "test-model")
if err == nil {
t.Fatal("Expected not implemented error, got nil")
}
if model != nil {
t.Fatal("Expected nil model, got non-nil")
}
// Test ListModels - should return empty list
listParams := ListModelsParams{
Query: "",
OrderBy: openapi.ORDERBYFIELD_NAME,
SortOrder: openapi.SORTORDER_ASC,
}
modelList, err := hfCatalog.ListModels(ctx, listParams)
if err != nil {
t.Fatalf("Failed to list models: %v", err)
}
if len(modelList.Items) != 0 {
t.Fatalf("Expected 0 models, got %d", len(modelList.Items))
}
// Test GetArtifacts - should return empty list
artifacts, err := hfCatalog.GetArtifacts(ctx, "test-model")
if err != nil {
t.Fatalf("Failed to get artifacts: %v", err)
}
if artifacts == nil {
t.Fatal("Expected artifacts list, got nil")
}
if len(artifacts.Items) != 0 {
t.Fatalf("Expected 0 artifacts, got %d", len(artifacts.Items))
}
}
func TestNewHfCatalog_InvalidCredentials(t *testing.T) {
// Create mock server that returns 401
server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
w.WriteHeader(http.StatusUnauthorized)
}))
defer server.Close()
source := &CatalogSourceConfig{
CatalogSource: openapi.CatalogSource{
Id: "test_hf",
Name: "Test HF",
},
Type: "hf",
Properties: map[string]any{
"apiKey": "invalid-key",
"url": server.URL,
},
}
_, err := newHfCatalog(source)
if err == nil {
t.Fatal("Expected error for invalid credentials, got nil")
}
if !strings.Contains(err.Error(), "invalid HuggingFace API credentials") {
t.Fatalf("Expected credential validation error, got: %s", err.Error())
}
}
func TestNewHfCatalog_DefaultConfiguration(t *testing.T) {
// Create mock server for default HuggingFace URL
server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
w.WriteHeader(http.StatusOK)
w.Write([]byte(`{"name": "test-user"}`))
}))
defer server.Close()
source := &CatalogSourceConfig{
CatalogSource: openapi.CatalogSource{
Id: "test_hf",
Name: "Test HF",
},
Type: "hf",
Properties: map[string]any{
"apiKey": "test-key",
"url": server.URL, // Override default for testing
},
}
catalog, err := newHfCatalog(source)
if err != nil {
t.Fatalf("Failed to create HF catalog with defaults: %v", err)
}
hfCatalog := catalog.(*hfCatalogImpl)
if hfCatalog.apiKey != "test-key" {
t.Fatalf("Expected apiKey 'test-key', got '%s'", hfCatalog.apiKey)
}
if hfCatalog.baseURL != server.URL {
t.Fatalf("Expected baseURL '%s', got '%s'", server.URL, hfCatalog.baseURL)
}
}

View File

@ -0,0 +1,238 @@
package catalog
import (
"fmt"
"hash/crc32"
"io"
"os"
"path/filepath"
"sync"
"sync/atomic"
"github.com/fsnotify/fsnotify"
"github.com/golang/glog"
)
// monitor sends events when the contents of a file have changed.
//
// Unfortunately, simply watching the file misses events for our primary case
// of k8s mounted configmaps because the files we're watching are actually
// symlinks which aren't modified:
//
// drwxrwxrwx 1 root root 138 Jul 2 15:45 .
// drwxr-xr-x 1 root root 116 Jul 2 15:52 ..
// drwxr-xr-x 1 root root 62 Jul 2 15:45 ..2025_07_02_15_45_09.2837733502
// lrwxrwxrwx 1 root root 32 Jul 2 15:45 ..data -> ..2025_07_02_15_45_09.2837733502
// lrwxrwxrwx 1 root root 26 Jul 2 13:18 sample-catalog.yaml -> ..data/sample-catalog.yaml
// lrwxrwxrwx 1 root root 19 Jul 2 13:18 sources.yaml -> ..data/sources.yaml
//
// Updates are written to a new directory and the ..data symlink is updated. No
// fsnotify events will ever be triggered for the YAML files.
//
// The approach taken here is to watch the directory containing the file for
// any change and then hash the contents of the file to avoid false-positives.
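//
// Typical usage mirrors LoadCatalogSources: obtain a channel with
// getMonitor().Path(file) and reload the file whenever the channel receives an
// event; the channel is closed when the monitor is closed.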
type monitor struct {
watcher *fsnotify.Watcher
closed <-chan struct{}
recordsMu sync.RWMutex
records map[string]map[string]*monitorRecord
}
var _monitor *monitor
var initMonitor sync.Once
// getMonitor returns a singleton monitor instance. Panics on failure.
func getMonitor() *monitor {
initMonitor.Do(func() {
var err error
_monitor, err = newMonitor()
if err != nil {
panic(fmt.Sprintf("Unable to create file monitor: %v", err))
}
})
if _monitor == nil {
// Panic in case someone traps the panic that occurred during
// initialization and tries to call this again.
panic("Unable to get file monitor")
}
return _monitor
}
func newMonitor() (*monitor, error) {
watcher, err := fsnotify.NewWatcher()
if err != nil {
return nil, err
}
m := &monitor{
watcher: watcher,
records: map[string]map[string]*monitorRecord{},
}
go m.monitor()
return m, nil
}
// Close stops the monitor and waits for the background goroutine to exit.
//
// All channels returned by Path() will be closed.
func (m *monitor) Close() {
select {
case <-m.closed:
// Already closed, nothing to do.
return
default:
// Fallthrough
}
m.watcher.Close()
<-m.closed
m.recordsMu.Lock()
defer m.recordsMu.Unlock()
uniqCh := make(map[chan<- struct{}]struct{})
for dir := range m.records {
for file := range m.records[dir] {
record, ok := m.records[dir][file]
if !ok {
continue
}
for _, ch := range record.channels {
uniqCh[ch] = struct{}{}
}
}
}
for ch := range uniqCh {
close(ch)
}
m.records = nil
}
// Path returns a channel that receives an event when the contents of a file
// change. The file does not need to exist before calling this method; however,
// the provided path should only be a file or a symlink (not a directory,
// device, etc.). The returned channel will be closed when the monitor is
// closed.
func (m *monitor) Path(p string) (<-chan struct{}, error) {
absPath, err := filepath.Abs(p)
if err != nil {
return nil, fmt.Errorf("abs: %w", err)
}
m.recordsMu.Lock()
defer m.recordsMu.Unlock()
dir, base := filepath.Split(absPath)
dir = filepath.Clean(dir)
err = m.watcher.Add(dir)
if err != nil {
return nil, fmt.Errorf("unable to watch directory %q: %w", dir, err)
}
if _, exists := m.records[dir]; !exists {
m.records[dir] = make(map[string]*monitorRecord, 1)
}
ch := make(chan struct{}, 1)
if _, exists := m.records[dir][base]; !exists {
m.records[dir][base] = &monitorRecord{
channels: []chan<- struct{}{ch},
}
} else {
r := m.records[dir][base]
r.channels = append(r.channels, ch)
}
m.records[dir][base].updateHash(filepath.Join(dir, base))
return ch, nil
}
func (m *monitor) monitor() {
closed := make(chan struct{})
m.closed = closed
defer close(closed)
for {
select {
case err, ok := <-m.watcher.Errors:
if !ok {
return
}
glog.Errorf("fsnotify error: %v", err)
case e, ok := <-m.watcher.Events:
if !ok {
return
}
glog.V(2).Infof("fsnotify.Event: %v", e)
switch e.Op {
case fsnotify.Create, fsnotify.Write:
// Fallthrough
default:
// Ignore fsnotify.Remove, fsnotify.Rename and fsnotify.Chmod
continue
}
func() {
m.recordsMu.RLock()
defer m.recordsMu.RUnlock()
dir := filepath.Dir(e.Name)
dc := m.records[dir]
if dc == nil {
return
}
for base, record := range dc {
path := filepath.Join(dir, base)
if !record.updateHash(path) {
continue
}
for _, ch := range record.channels {
// Send the event, ignore any that would block.
select {
case ch <- struct{}{}:
default:
glog.Errorf("monitor: missed event for path %s", path)
}
}
}
}()
}
}
}
type monitorRecord struct {
channels []chan<- struct{}
hash uint32
}
// updateHash recalculates the hash and returns true if it has changed.
func (mr *monitorRecord) updateHash(path string) bool {
newHash := mr.calculateHash(path)
oldHash := atomic.SwapUint32(&mr.hash, newHash)
return oldHash != newHash
}
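// calculateHash returns the CRC-32 (IEEE) checksum of the file contents, or 0
// if the file cannot be read.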
func (monitorRecord) calculateHash(path string) uint32 {
fh, err := os.Open(path)
if err != nil {
return 0
}
defer fh.Close()
h := crc32.NewIEEE()
_, err = io.Copy(h, fh)
if err != nil {
return 0
}
return h.Sum32()
}

View File

@ -0,0 +1,179 @@
package catalog
import (
"fmt"
"os"
"path/filepath"
"sync/atomic"
"testing"
"time"
"github.com/stretchr/testify/assert"
)
func TestMonitor(t *testing.T) {
assert := assert.New(t)
mon, err := newMonitor()
if !assert.NoError(err) {
return
}
tmpDir := t.TempDir()
fileA := filepath.Join(tmpDir, "a")
fileB := filepath.Join(tmpDir, "b")
fileC := filepath.Join(tmpDir, "c")
_watchMonitor := func(ch <-chan struct{}, err error) *monitorWatcher {
if err != nil {
t.Fatalf("watchMonitor passed error %v", err)
}
return watchMonitor(ch)
}
a := _watchMonitor(mon.Path(fileA))
b := _watchMonitor(mon.Path(fileB))
updateFile(t, fileA)
a.AssertCount(t, 1)
b.AssertCount(t, 0, "unchanged file should not have any events")
a.Reset()
updateFile(t, fileB)
b.AssertCount(t, 1)
updateFile(t, fileB)
b.AssertCount(t, 2)
a.AssertCount(t, 0, "unchanged file should not have any events")
b.Reset()
updateFile(t, fileC)
a.AssertCount(t, 0, "unchanged file should not have an event")
b.AssertCount(t, 0, "unchanged file should not have an event")
// Ensure that Close doesn't hang.
finished := make(chan struct{})
go func() {
defer close(finished)
mon.Close()
}()
assert.Eventually(func() bool {
select {
case <-finished:
return true
default:
return false
}
}, time.Second, 50*time.Millisecond)
// Verify that the monitor channels closed.
assert.True(a.Done())
assert.True(b.Done())
}
func TestMonitorSymlinks(t *testing.T) {
assert := assert.New(t)
tmpDir := t.TempDir()
mon, err := newMonitor()
if !assert.NoError(err) {
return
}
defer mon.Close()
// Watch the files on the published path.
_watchMonitor := func(ch <-chan struct{}, err error) *monitorWatcher {
if err != nil {
t.Fatalf("watchMonitor passed error %v", err)
}
return watchMonitor(ch)
}
a := _watchMonitor(mon.Path(filepath.Join(tmpDir, "a")))
b := _watchMonitor(mon.Path(filepath.Join(tmpDir, "b")))
// Set up a directory structure with symlinks like k8s does for mounted
// configmaps.
// a -> latest/a, b -> latest/b, latest -> v1
assert.NoError(os.Mkdir(filepath.Join(tmpDir, "v1"), 0777))
updateFile(t, filepath.Join(tmpDir, "v1", "a"), "foo")
updateFile(t, filepath.Join(tmpDir, "v1", "b"), "bar")
assert.NoError(os.Symlink("v1", filepath.Join(tmpDir, "latest")))
assert.NoError(os.Symlink(filepath.Join("latest", "a"), filepath.Join(tmpDir, "a")))
assert.NoError(os.Symlink(filepath.Join("latest", "b"), filepath.Join(tmpDir, "b")))
a.AssertCount(t, 1)
b.AssertCount(t, 1)
a.Reset()
b.Reset()
// Make a new version directory
os.Mkdir(filepath.Join(tmpDir, "v2"), 0777)
updateFile(t, filepath.Join(tmpDir, "v2", "a"), "UPDATED")
updateFile(t, filepath.Join(tmpDir, "v2", "b"), "bar")
a.AssertCount(t, 0)
b.AssertCount(t, 0)
a.Reset()
b.Reset()
// Update the symlink to point to the new version:
assert.NoError(os.Rename(filepath.Join(tmpDir, "latest"), filepath.Join(tmpDir, "latest_tmp")))
assert.NoError(os.Symlink(filepath.Join("v2"), filepath.Join(tmpDir, "latest")))
assert.NoError(os.Remove(filepath.Join(tmpDir, "latest_tmp")))
assert.NoError(os.RemoveAll(filepath.Join(tmpDir, "v1")))
a.AssertCount(t, 1)
b.AssertCount(t, 0)
}
type monitorWatcher struct {
count int32
done int32
}
func (mw *monitorWatcher) Reset() {
atomic.StoreInt32(&mw.count, 0)
}
func (mw *monitorWatcher) AssertCount(t *testing.T, expected int, args ...any) bool {
t.Helper()
return assert.Eventually(t, func() bool {
return int(atomic.LoadInt32(&mw.count)) == expected
}, time.Second, 10*time.Millisecond, args...)
}
func (mw *monitorWatcher) Count() int {
return int(atomic.LoadInt32(&mw.count))
}
func (mw *monitorWatcher) Done() bool {
return atomic.LoadInt32(&mw.done) != 0
}
func watchMonitor(ch <-chan struct{}) *monitorWatcher {
mw := &monitorWatcher{}
go func() {
defer atomic.StoreInt32(&mw.done, 1)
for range ch {
atomic.AddInt32(&mw.count, 1)
}
}()
return mw
}
func updateFile(t *testing.T, path string, contents ...string) {
fh, err := os.Create(path)
if err != nil {
t.Fatalf("unable to open %q: %v", path, err)
}
if len(contents) == 0 {
fmt.Fprintf(fh, "%s\n", time.Now())
} else {
for _, line := range contents {
fmt.Fprintf(fh, "%s\n", line)
}
}
fh.Close()
}

View File

@ -0,0 +1,334 @@
package catalog
import (
"context"
"fmt"
"math"
"net/http"
"sort"
"strconv"
"strings"
"sync"
"time"
"github.com/Khan/genqlient/graphql"
"github.com/kubeflow/model-registry/catalog/internal/catalog/genqlient"
"github.com/kubeflow/model-registry/catalog/pkg/openapi"
model "github.com/kubeflow/model-registry/catalog/pkg/openapi"
models "github.com/kubeflow/model-registry/catalog/pkg/openapi"
)
type rhecModel struct {
models.CatalogModel `yaml:",inline"`
Artifacts []*openapi.CatalogModelArtifact `yaml:"artifacts"`
}
// rhecCatalogConfig defines the structure of the RHEC catalog configuration.
type rhecCatalogConfig struct {
Models []string `yaml:"models"`
ExcludedModels []string `yaml:"excludedModels"`
}
type rhecCatalogImpl struct {
modelsLock sync.RWMutex
models map[string]*rhecModel
}
var _ CatalogSourceProvider = &rhecCatalogImpl{}
func (r *rhecCatalogImpl) GetModel(ctx context.Context, name string) (*openapi.CatalogModel, error) {
r.modelsLock.RLock()
defer r.modelsLock.RUnlock()
rm := r.models[name]
if rm == nil {
return nil, nil
}
cp := rm.CatalogModel
return &cp, nil
}
func (r *rhecCatalogImpl) ListModels(ctx context.Context, params ListModelsParams) (openapi.CatalogModelList, error) {
r.modelsLock.RLock()
defer r.modelsLock.RUnlock()
var filteredModels []*model.CatalogModel
for _, rm := range r.models {
cm := rm.CatalogModel
if params.Query != "" {
query := strings.ToLower(params.Query)
// Check if query matches name, description, tasks, provider, or libraryName
if !strings.Contains(strings.ToLower(cm.Name), query) &&
!strings.Contains(strings.ToLower(cm.GetDescription()), query) &&
!strings.Contains(strings.ToLower(cm.GetProvider()), query) &&
!strings.Contains(strings.ToLower(cm.GetLibraryName()), query) {
// Check tasks
foundInTasks := false
for _, task := range cm.GetTasks() { // Use GetTasks() for nil safety
if strings.Contains(strings.ToLower(task), query) {
foundInTasks = true
break
}
}
if !foundInTasks {
continue // Skip if no match in any searchable field
}
}
}
filteredModels = append(filteredModels, &cm)
}
// Sort the filtered models
sort.Slice(filteredModels, func(i, j int) bool {
a := filteredModels[i]
b := filteredModels[j]
var less bool
switch params.OrderBy {
case model.ORDERBYFIELD_CREATE_TIME:
// Convert CreateTimeSinceEpoch (string) to int64 for comparison
// Handle potential nil or conversion errors by treating as 0
aTime, _ := strconv.ParseInt(a.GetCreateTimeSinceEpoch(), 10, 64)
bTime, _ := strconv.ParseInt(b.GetCreateTimeSinceEpoch(), 10, 64)
less = aTime < bTime
case model.ORDERBYFIELD_LAST_UPDATE_TIME:
// Convert LastUpdateTimeSinceEpoch (string) to int64 for comparison
// Handle potential nil or conversion errors by treating as 0
aTime, _ := strconv.ParseInt(a.GetLastUpdateTimeSinceEpoch(), 10, 64)
bTime, _ := strconv.ParseInt(b.GetLastUpdateTimeSinceEpoch(), 10, 64)
less = aTime < bTime
case model.ORDERBYFIELD_NAME:
fallthrough
default:
// Fallback to name sort if an unknown sort field is provided
less = strings.Compare(a.Name, b.Name) < 0
}
if params.SortOrder == model.SORTORDER_DESC {
return !less
}
return less
})
count := len(filteredModels)
if count > math.MaxInt32 {
count = math.MaxInt32
}
list := model.CatalogModelList{
Items: make([]model.CatalogModel, count),
PageSize: int32(count),
Size: int32(count),
}
for i := range list.Items {
list.Items[i] = *filteredModels[i]
}
return list, nil // Return the struct value directly
}
func (r *rhecCatalogImpl) GetArtifacts(ctx context.Context, name string) (*openapi.CatalogModelArtifactList, error) {
r.modelsLock.RLock()
defer r.modelsLock.RUnlock()
rm := r.models[name]
if rm == nil {
return nil, nil
}
count := len(rm.Artifacts)
if count > math.MaxInt32 {
count = math.MaxInt32
}
list := openapi.CatalogModelArtifactList{
Items: make([]openapi.CatalogModelArtifact, count),
PageSize: int32(count),
Size: int32(count),
}
for i := range list.Items {
list.Items[i] = *rm.Artifacts[i]
}
return &list, nil
}
func fetchRepository(ctx context.Context, client graphql.Client, repository string) (*genqlient.GetRepositoryResponse, error) {
resp, err := genqlient.GetRepository(ctx, client, "registry.access.redhat.com", repository)
if err != nil {
return nil, fmt.Errorf("failed to query rhec repository: %w", err)
}
if err := resp.Get_repository_by_registry_path.Error; err.Detail != "" || err.Status != 0 {
return nil, fmt.Errorf("rhec repository query error: detail: %s, status: %d", err.Detail, err.Status)
}
return resp, nil
}
func fetchRepositoryImages(ctx context.Context, client graphql.Client, repository string) ([]genqlient.FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImage, error) {
resp, err := genqlient.FindRepositoryImages(ctx, client, "registry.access.redhat.com", repository)
if err != nil {
return nil, fmt.Errorf("failed to query rhec images: %w", err)
}
if err := resp.Find_repository_images_by_registry_path.Error; err.Detail != "" || err.Status != 0 {
return nil, fmt.Errorf("rhec images query error: detail: %s, status: %d", err.Detail, err.Status)
}
return resp.Find_repository_images_by_registry_path.Data, nil
}
func newRhecModel(repoData *genqlient.GetRepositoryResponse, imageData genqlient.FindRepositoryImagesFind_repository_images_by_registry_pathContainerImagePaginatedResponseDataContainerImage, imageTagName, repositoryName string) *rhecModel {
sourceId := "rhec"
createTime := repoData.Get_repository_by_registry_path.Data.Creation_date.Format(time.RFC3339)
lastUpdateTime := repoData.Get_repository_by_registry_path.Data.Last_update_date.Format(time.RFC3339)
description := repoData.Get_repository_by_registry_path.Data.Display_data.Short_description
readme := repoData.Get_repository_by_registry_path.Data.Display_data.Long_description
provider := repoData.Get_repository_by_registry_path.Data.Vendor_label
var maturity *string
if len(repoData.Get_repository_by_registry_path.Data.Release_categories) > 0 {
maturityStr := repoData.Get_repository_by_registry_path.Data.Release_categories[0]
maturity = &maturityStr
}
var tasks []string
for _, label := range imageData.Parsed_data.Labels {
tasks = append(tasks, label.Value)
}
imageCreationDate := imageData.Creation_date.Format(time.RFC3339)
imageLastUpdateDate := imageData.Last_update_date.Format(time.RFC3339)
modelName := repositoryName + ":" + imageTagName
return &rhecModel{
CatalogModel: openapi.CatalogModel{
Name: modelName,
CreateTimeSinceEpoch: &createTime,
LastUpdateTimeSinceEpoch: &lastUpdateTime,
Description: &description,
Readme: &readme,
Maturity: maturity,
Language: []string{},
Tasks: tasks,
Provider: &provider,
Logo: nil,
License: nil,
LicenseLink: nil,
LibraryName: nil,
SourceId: &sourceId,
},
Artifacts: []*openapi.CatalogModelArtifact{
{
Uri: "oci://registry.redhat.io/" + repositoryName + ":" + imageTagName,
CreateTimeSinceEpoch: &imageCreationDate,
LastUpdateTimeSinceEpoch: &imageLastUpdateDate,
},
},
}
}
func (r *rhecCatalogImpl) load(modelsList []string, excludedModelsList []string) error {
graphqlClient := graphql.NewClient("https://catalog.redhat.com/api/containers/graphql/", http.DefaultClient)
ctx := context.Background()
models := make(map[string]*rhecModel)
for _, repo := range modelsList {
repoData, err := fetchRepository(ctx, graphqlClient, repo)
if err != nil {
return err
}
imagesData, err := fetchRepositoryImages(ctx, graphqlClient, repo)
if err != nil {
return err
}
for _, image := range imagesData {
for _, imageRepository := range image.Repositories {
for _, imageTag := range imageRepository.Tags {
tagName := imageTag.Name
fullModelName := repo + ":" + tagName
if isModelExcluded(fullModelName, excludedModelsList) {
continue
}
model := newRhecModel(repoData, image, tagName, repo)
models[fullModelName] = model
}
}
}
}
r.modelsLock.Lock()
defer r.modelsLock.Unlock()
r.models = models
return nil
}
func isModelExcluded(modelName string, patterns []string) bool {
for _, pattern := range patterns {
if strings.HasSuffix(pattern, "*") {
if strings.HasPrefix(modelName, strings.TrimSuffix(pattern, "*")) {
return true
}
} else if modelName == pattern {
return true
}
}
return false
}
func newRhecCatalog(source *CatalogSourceConfig) (CatalogSourceProvider, error) {
modelsData, ok := source.Properties["models"]
if !ok {
return nil, fmt.Errorf("missing 'models' property for rhec catalog")
}
modelsList, ok := modelsData.([]any)
if !ok {
return nil, fmt.Errorf("'models' property should be a list")
}
models := make([]string, len(modelsList))
for i, v := range modelsList {
models[i], ok = v.(string)
if !ok {
return nil, fmt.Errorf("invalid entry in 'models' list, expected a string")
}
}
// Excluded models is an optional source property.
var excludedModels []string
if excludedModelsData, ok := source.Properties["excludedModels"]; ok {
excludedModelsList, ok := excludedModelsData.([]any)
if !ok {
return nil, fmt.Errorf("'excludedModels' property should be a list")
}
excludedModels = make([]string, len(excludedModelsList))
for i, v := range excludedModelsList {
excludedModels[i], ok = v.(string)
if !ok {
return nil, fmt.Errorf("invalid entry in 'excludedModels' list, expected a string")
}
}
}
r := &rhecCatalogImpl{
models: make(map[string]*rhecModel),
}
err := r.load(models, excludedModels)
if err != nil {
return nil, fmt.Errorf("error loading rhec catalog: %w", err)
}
return r, nil
}
func init() {
if err := RegisterCatalogType("rhec", newRhecCatalog); err != nil {
panic(err)
}
}

View File

@ -0,0 +1,393 @@
package catalog
import (
"context"
"testing"
"time"
"github.com/google/go-cmp/cmp"
"github.com/kubeflow/model-registry/catalog/pkg/openapi"
model "github.com/kubeflow/model-registry/catalog/pkg/openapi"
)
func TestRhecCatalogImpl_GetModel(t *testing.T) {
modelTime := time.Now()
createTime := modelTime.Format(time.RFC3339)
lastUpdateTime := modelTime.Add(5 * time.Minute).Format(time.RFC3339)
sourceId := "rhec"
provider := "redhat"
testModels := map[string]*rhecModel{
"model1": {
CatalogModel: openapi.CatalogModel{
Name: "model1",
CreateTimeSinceEpoch: &createTime,
LastUpdateTimeSinceEpoch: &lastUpdateTime,
Provider: &provider,
SourceId: &sourceId,
},
Artifacts: []*openapi.CatalogModelArtifact{},
},
}
r := &rhecCatalogImpl{
models: testModels,
}
tests := []struct {
name string
modelName string
want *openapi.CatalogModel
wantErr bool
}{
{
name: "get existing model",
modelName: "model1",
want: &openapi.CatalogModel{
Name: "model1",
CreateTimeSinceEpoch: &createTime,
LastUpdateTimeSinceEpoch: &lastUpdateTime,
Provider: &provider,
SourceId: &sourceId,
},
wantErr: false,
},
{
name: "get non-existent model",
modelName: "not-exist",
want: nil,
wantErr: false,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
got, err := r.GetModel(context.Background(), tt.modelName)
if (err != nil) != tt.wantErr {
t.Errorf("GetModel() error = %v, wantErr %v", err, tt.wantErr)
return
}
if diff := cmp.Diff(tt.want, got); diff != "" {
t.Errorf("GetModel() mismatch (-want +got):\n%s", diff)
}
})
}
}
func TestRhecCatalogImpl_GetArtifacts(t *testing.T) {
modelTime := time.Now()
createTime := modelTime.Format(time.RFC3339)
lastUpdateTime := modelTime.Add(5 * time.Minute).Format(time.RFC3339)
sourceId := "rhec"
provider := "redhat"
artifactCreateTime := modelTime.Add(10 * time.Minute).Format(time.RFC3339)
artifactLastUpdateTime := modelTime.Add(15 * time.Minute).Format(time.RFC3339)
testModels := map[string]*rhecModel{
"model1": {
CatalogModel: openapi.CatalogModel{
Name: "model1",
CreateTimeSinceEpoch: &createTime,
LastUpdateTimeSinceEpoch: &lastUpdateTime,
Provider: &provider,
SourceId: &sourceId,
},
Artifacts: []*openapi.CatalogModelArtifact{
{
Uri: "test-uri",
CreateTimeSinceEpoch: &artifactCreateTime,
LastUpdateTimeSinceEpoch: &artifactLastUpdateTime,
},
},
},
"model2-no-artifacts": {
CatalogModel: openapi.CatalogModel{
Name: "model2-no-artifacts",
CreateTimeSinceEpoch: &createTime,
LastUpdateTimeSinceEpoch: &lastUpdateTime,
Provider: &provider,
SourceId: &sourceId,
},
Artifacts: []*openapi.CatalogModelArtifact{},
},
}
r := &rhecCatalogImpl{
models: testModels,
}
tests := []struct {
name string
modelName string
want *openapi.CatalogModelArtifactList
wantErr bool
}{
{
name: "get artifacts for existing model",
modelName: "model1",
want: &openapi.CatalogModelArtifactList{
Items: []openapi.CatalogModelArtifact{
{
Uri: "test-uri",
CreateTimeSinceEpoch: &artifactCreateTime,
LastUpdateTimeSinceEpoch: &artifactLastUpdateTime,
},
},
PageSize: 1,
Size: 1,
},
wantErr: false,
},
{
name: "get artifacts for model with no artifacts",
modelName: "model2-no-artifacts",
want: &openapi.CatalogModelArtifactList{
Items: []openapi.CatalogModelArtifact{},
PageSize: 0,
Size: 0,
},
wantErr: false,
},
{
name: "get artifacts for non-existent model",
modelName: "not-exist",
want: nil,
wantErr: false,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
got, err := r.GetArtifacts(context.Background(), tt.modelName)
if (err != nil) != tt.wantErr {
t.Errorf("GetArtifacts() error = %v, wantErr %v", err, tt.wantErr)
return
}
if diff := cmp.Diff(tt.want, got); diff != "" {
t.Errorf("GetArtifacts() mismatch (-want +got):\n%s", diff)
}
})
}
}
func TestRhecCatalogListModels(t *testing.T) {
modelTime := time.Now()
createTime := modelTime.Format(time.RFC3339)
lastUpdateTime := modelTime.Add(5 * time.Minute).Format(time.RFC3339)
sourceId := "rhec"
provider := "redhat"
artifactCreateTime := modelTime.Add(10 * time.Minute).Format(time.RFC3339)
artifactLastUpdateTime := modelTime.Add(15 * time.Minute).Format(time.RFC3339)
testModels := map[string]*rhecModel{
"model3": {
CatalogModel: openapi.CatalogModel{
Name: "model3",
CreateTimeSinceEpoch: &createTime,
LastUpdateTimeSinceEpoch: &lastUpdateTime,
Provider: &provider,
SourceId: &sourceId,
},
Artifacts: []*openapi.CatalogModelArtifact{
{
Uri: "test-uri",
CreateTimeSinceEpoch: &artifactCreateTime,
LastUpdateTimeSinceEpoch: &artifactLastUpdateTime,
},
},
},
"model1": {
CatalogModel: openapi.CatalogModel{
Name: "model1",
CreateTimeSinceEpoch: &createTime,
LastUpdateTimeSinceEpoch: &lastUpdateTime,
Provider: &provider,
SourceId: &sourceId,
},
Artifacts: []*openapi.CatalogModelArtifact{
{
Uri: "test-uri",
CreateTimeSinceEpoch: &artifactCreateTime,
LastUpdateTimeSinceEpoch: &artifactLastUpdateTime,
},
},
},
"model1:v2": {
CatalogModel: openapi.CatalogModel{
Name: "model1:v2",
CreateTimeSinceEpoch: &createTime,
LastUpdateTimeSinceEpoch: &lastUpdateTime,
Provider: &provider,
SourceId: &sourceId,
},
Artifacts: []*openapi.CatalogModelArtifact{
{
Uri: "test-uri",
CreateTimeSinceEpoch: &artifactCreateTime,
LastUpdateTimeSinceEpoch: &artifactLastUpdateTime,
},
},
},
"model2": {
CatalogModel: openapi.CatalogModel{
Name: "model2",
CreateTimeSinceEpoch: &createTime,
LastUpdateTimeSinceEpoch: &lastUpdateTime,
Provider: &provider,
SourceId: &sourceId,
},
Artifacts: []*openapi.CatalogModelArtifact{},
},
}
r := &rhecCatalogImpl{
models: testModels,
}
tests := []struct {
name string
modelName string
params ListModelsParams
want openapi.CatalogModelList
wantErr bool
}{
{
name: "list models and sort order",
want: openapi.CatalogModelList{
Items: []openapi.CatalogModel{
{
Name: "model1",
CreateTimeSinceEpoch: &createTime,
LastUpdateTimeSinceEpoch: &lastUpdateTime,
Provider: &provider,
SourceId: &sourceId,
},
{
Name: "model1:v2",
CreateTimeSinceEpoch: &createTime,
LastUpdateTimeSinceEpoch: &lastUpdateTime,
Provider: &provider,
SourceId: &sourceId,
},
{
Name: "model2",
CreateTimeSinceEpoch: &createTime,
LastUpdateTimeSinceEpoch: &lastUpdateTime,
Provider: &provider,
SourceId: &sourceId,
},
{
Name: "model3",
CreateTimeSinceEpoch: &createTime,
LastUpdateTimeSinceEpoch: &lastUpdateTime,
Provider: &provider,
SourceId: &sourceId,
},
},
PageSize: 4,
Size: 4,
},
wantErr: false,
},
{
name: "list models with query and sort order",
modelName: "model1",
params: ListModelsParams{
Query: "model1",
OrderBy: model.ORDERBYFIELD_NAME,
SortOrder: model.SORTORDER_ASC,
},
want: openapi.CatalogModelList{
Items: []openapi.CatalogModel{
{
Name: "model1",
CreateTimeSinceEpoch: &createTime,
LastUpdateTimeSinceEpoch: &lastUpdateTime,
Provider: &provider,
SourceId: &sourceId,
},
{
Name: "model1:v2",
CreateTimeSinceEpoch: &createTime,
LastUpdateTimeSinceEpoch: &lastUpdateTime,
Provider: &provider,
SourceId: &sourceId,
},
},
PageSize: 2,
Size: 2,
},
wantErr: false,
},
{
name: "get non-existent model",
modelName: "not-exist",
params: ListModelsParams{
Query: "not-exist",
OrderBy: model.ORDERBYFIELD_NAME,
SortOrder: model.SORTORDER_ASC,
},
want: openapi.CatalogModelList{Items: []openapi.CatalogModel{}},
wantErr: false,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
got, err := r.ListModels(context.Background(), tt.params)
if (err != nil) != tt.wantErr {
t.Errorf("ListModels() error = %v, wantErr %v", err, tt.wantErr)
return
}
if diff := cmp.Diff(tt.want, got); diff != "" {
t.Errorf("ListModels() mismatch (-want +got):\n%s", diff)
}
})
}
}
func TestIsModelExcluded(t *testing.T) {
tests := []struct {
name string
modelName string
patterns []string
want bool
}{
{
name: "exact match",
modelName: "model1:v1",
patterns: []string{"model1:v1"},
want: true,
},
{
name: "wildcard match",
modelName: "model1:v2",
patterns: []string{"model1:*"},
want: true,
},
{
name: "no match",
modelName: "model2:v1",
patterns: []string{"model1:*"},
want: false,
},
{
name: "multiple patterns with match",
modelName: "model3:v1",
patterns: []string{"model2:*", "model3:v1"},
want: true,
},
{
name: "empty patterns",
modelName: "model1:v1",
patterns: []string{},
want: false,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
got := isModelExcluded(tt.modelName, tt.patterns)
if got != tt.want {
t.Errorf("isModelExcluded() = %v, want %v", got, tt.want)
}
})
}
}
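isModelExcluded itself is not visible in this hunk; a minimal sketch that would satisfy the exact-match, `*`-wildcard, multiple-pattern, and empty-pattern cases above, assuming glob-style matching via the standard library's path.Match, could be:

// Hypothetical sketch (requires the "path" import); the real implementation may differ.
func isModelExcludedSketch(modelName string, patterns []string) bool {
	for _, pattern := range patterns {
		if matched, err := path.Match(pattern, modelName); err == nil && matched {
			return true // first matching pattern excludes the model
		}
	}
	return false
}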

View File

@ -0,0 +1,4 @@
source: empty-catalog
models: []

View File

@ -2,6 +2,7 @@ catalogs:
- name: "Catalog 1"
id: catalog1
type: yaml
enabled: true
properties:
privateProp11: 54321
privateProp12: privateStringValue
@ -9,7 +10,24 @@ catalogs:
- name: "Catalog 2"
id: catalog2
type: yaml
enabled: false
properties:
privateProp21: 12345
privateProp22: privateStringValue2
yamlCatalogPath: test-yaml-catalog.yaml
- name: "Catalog 3"
id: catalog3
type: rhec
enabled: true
properties:
models:
- rhelai1/modelcar-granite-7b-starter
- name: "Catalog 4"
id: catalog4
type: rhec
enabled: true
properties:
models:
- rhelai1/modelcar-granite-7b-starter
excludedModels:
- rhelai1/modelcar-granite-7b-starter:latest

View File

@ -0,0 +1,17 @@
catalogs:
- name: "HuggingFace Test Catalog"
id: hf_test
type: hf
enabled: true
properties:
apiKey: "hf_test_api_key_here"
url: "https://huggingface.co"
modelLimit: 50
- name: "HuggingFace Invalid Credentials"
id: hf_invalid
type: hf
enabled: false # disabled so it doesn't cause startup failures in tests
properties:
apiKey: "invalid_key"
url: "https://huggingface.co"
modelLimit: 10

View File

@ -0,0 +1,40 @@
source: test-list-models
models:
- name: model-alpha
description: A model for text generation.
tasks: ["text-generation", "nlp"]
provider: IBM
libraryName: transformers
createTimeSinceEpoch: "1678886400000" # March 15, 2023 00:00:00 GMT
- name: model-beta
description: Another model for image recognition.
tasks: ["image-recognition"]
provider: Google
libraryName: tensorflow
createTimeSinceEpoch: "1681564800000" # April 15, 2023 00:00:00 GMT
- name: model-gamma
description: A specialized model for natural language processing.
tasks: ["nlp"]
provider: IBM
libraryName: pytorch
createTimeSinceEpoch: "1675209600000" # February 1, 2023 00:00:00 GMT
- name: another-model-alpha
description: A different model for text summarization.
tasks: ["text-summarization", "nlp"]
provider: Microsoft
libraryName: huggingface
createTimeSinceEpoch: "1684243200000" # May 16, 2023 00:00:00 GMT
- name: model-with-no-tasks
description: This model has no specific tasks.
tasks: []
provider: None
libraryName: custom
createTimeSinceEpoch: "1672531200000" # January 1, 2023 00:00:00 GMT
- name: Z-model
description: The last model in alphabetical order.
tasks: ["optimization"]
provider: Oracle
libraryName: scikit-learn
createTimeSinceEpoch: "1690934400000" # August 2, 2023 00:00:00 GMT

View File

@ -324,8 +324,7 @@ models:
createTimeSinceEpoch: "1733514949000"
lastUpdateTimeSinceEpoch: "1734637721000"
artifacts:
- protocol: oci
uri: oci://registry.redhat.io/rhelai1/granite-8b-code-base:1.3-1732870892
- uri: oci://registry.redhat.io/rhelai1/granite-8b-code-base:1.3-1732870892
- name: rhelai1/granite-8b-code-instruct
provider: IBM
description: |-
@ -668,5 +667,27 @@ models:
createTimeSinceEpoch: "1733514949000"
lastUpdateTimeSinceEpoch: "1734637721000"
artifacts:
- protocol: oci
uri: oci://registry.redhat.io/rhelai1/granite-8b-code-instruct:1.3-1732870892
- uri: oci://registry.redhat.io/rhelai1/granite-8b-code-instruct:1.3-1732870892
createTimeSinceEpoch: "1733514949000"
lastUpdateTimeSinceEpoch: "1734637721000"
customProperties:
foo:
string_value: bar
baz:
string_value: qux
- name: model-with-no-artifacts
provider: Test
description: A model used for testing the GetArtifacts method when no artifacts are present.
readme: |
# Model with No Artifacts
This is a test model.
language: ["en"]
license: apache-2.0
licenseLink: https://www.apache.org/licenses/LICENSE-2.0.txt
maturity: Development
libraryName: testlib
tasks:
- test-task
createTimeSinceEpoch: "1700000000000"
lastUpdateTimeSinceEpoch: "1700000000000"
artifacts: []

View File

@ -3,21 +3,23 @@ package catalog
import (
"context"
"fmt"
"math"
"os"
"path/filepath"
"sort"
"strconv"
"strings"
"sync"
"k8s.io/apimachinery/pkg/util/yaml"
"github.com/golang/glog"
model "github.com/kubeflow/model-registry/catalog/pkg/openapi"
)
type yamlArtifacts struct {
Protocol string `yaml:"protocol"`
URI string `yaml:"uri"`
}
type yamlModel struct {
model.CatalogModel `yaml:",inline"`
Artifacts []yamlArtifacts `yaml:"artifacts"`
Artifacts []*model.CatalogModelArtifact `yaml:"artifacts"`
}
type yamlCatalog struct {
@ -26,13 +28,16 @@ type yamlCatalog struct {
}
type yamlCatalogImpl struct {
models map[string]*yamlModel
source *CatalogSourceConfig
modelsLock sync.RWMutex
models map[string]*yamlModel
}
var _ CatalogSourceProvider = &yamlCatalogImpl{}
func (y *yamlCatalogImpl) GetModel(ctx context.Context, name string) (*model.CatalogModel, error) {
y.modelsLock.RLock()
defer y.modelsLock.RUnlock()
ym := y.models[name]
if ym == nil {
return nil, nil
@ -42,11 +47,135 @@ func (y *yamlCatalogImpl) GetModel(ctx context.Context, name string) (*model.Cat
}
func (y *yamlCatalogImpl) ListModels(ctx context.Context, params ListModelsParams) (model.CatalogModelList, error) {
//TODO implement me
panic("implement me")
y.modelsLock.RLock()
defer y.modelsLock.RUnlock()
var filteredModels []*model.CatalogModel
for _, ym := range y.models {
cm := ym.CatalogModel
if params.Query != "" {
query := strings.ToLower(params.Query)
// Check if query matches name, description, tasks, provider, or libraryName
if !strings.Contains(strings.ToLower(cm.Name), query) &&
!strings.Contains(strings.ToLower(cm.GetDescription()), query) &&
!strings.Contains(strings.ToLower(cm.GetProvider()), query) &&
!strings.Contains(strings.ToLower(cm.GetLibraryName()), query) {
// Check tasks
foundInTasks := false
for _, task := range cm.GetTasks() { // Use GetTasks() for nil safety
if strings.Contains(strings.ToLower(task), query) {
foundInTasks = true
break
}
}
if !foundInTasks {
continue // Skip if no match in any searchable field
}
}
}
filteredModels = append(filteredModels, &cm)
}
// Sort the filtered models
sort.Slice(filteredModels, func(i, j int) bool {
a := filteredModels[i]
b := filteredModels[j]
var less bool
switch params.OrderBy {
case model.ORDERBYFIELD_CREATE_TIME:
// Convert CreateTimeSinceEpoch (string) to int64 for comparison
// Handle potential nil or conversion errors by treating as 0
aTime, _ := strconv.ParseInt(a.GetCreateTimeSinceEpoch(), 10, 64)
bTime, _ := strconv.ParseInt(b.GetCreateTimeSinceEpoch(), 10, 64)
less = aTime < bTime
case model.ORDERBYFIELD_LAST_UPDATE_TIME:
// Convert LastUpdateTimeSinceEpoch (string) to int64 for comparison
// Handle potential nil or conversion errors by treating as 0
aTime, _ := strconv.ParseInt(a.GetLastUpdateTimeSinceEpoch(), 10, 64)
bTime, _ := strconv.ParseInt(b.GetLastUpdateTimeSinceEpoch(), 10, 64)
less = aTime < bTime
case model.ORDERBYFIELD_NAME:
fallthrough
default:
// Fallback to name sort if an unknown sort field is provided
less = strings.Compare(a.Name, b.Name) < 0
}
if params.SortOrder == model.SORTORDER_DESC {
return !less
}
return less
})
count := len(filteredModels)
if count > math.MaxInt32 {
count = math.MaxInt32
}
list := model.CatalogModelList{
Items: make([]model.CatalogModel, count),
PageSize: int32(count),
Size: int32(count),
}
for i := range list.Items {
list.Items[i] = *filteredModels[i]
}
return list, nil // Return the struct value directly
}
// TODO start background thread to watch file
func (y *yamlCatalogImpl) GetArtifacts(ctx context.Context, name string) (*model.CatalogModelArtifactList, error) {
y.modelsLock.RLock()
defer y.modelsLock.RUnlock()
ym := y.models[name]
if ym == nil {
return nil, nil
}
count := len(ym.Artifacts)
if count > math.MaxInt32 {
count = math.MaxInt32
}
list := model.CatalogModelArtifactList{
Items: make([]model.CatalogModelArtifact, count),
PageSize: int32(count),
Size: int32(count),
}
for i := range list.Items {
list.Items[i] = *ym.Artifacts[i]
}
return &list, nil
}
func (y *yamlCatalogImpl) load(path string, excludedModelsList []string) error {
bytes, err := os.ReadFile(path)
if err != nil {
return fmt.Errorf("failed to read %s file: %v", yamlCatalogPath, err)
}
var contents yamlCatalog
if err = yaml.UnmarshalStrict(bytes, &contents); err != nil {
return fmt.Errorf("failed to parse %s file: %v", yamlCatalogPath, err)
}
models := make(map[string]*yamlModel)
for i := range contents.Models {
modelName := contents.Models[i].Name
if isModelExcluded(modelName, excludedModelsList) {
continue
}
models[modelName] = &contents.Models[i]
}
y.modelsLock.Lock()
defer y.modelsLock.Unlock()
y.models = models
return nil
}
const yamlCatalogPath = "yamlCatalogPath"
@ -55,30 +184,54 @@ func newYamlCatalog(source *CatalogSourceConfig) (CatalogSourceProvider, error)
if !exists || yamlModelFile == "" {
return nil, fmt.Errorf("missing %s string property", yamlCatalogPath)
}
bytes, err := os.ReadFile(yamlModelFile)
yamlModelFile, err := filepath.Abs(yamlModelFile)
if err != nil {
return nil, fmt.Errorf("failed to read %s file: %v", yamlCatalogPath, err)
return nil, fmt.Errorf("abs: %w", err)
}
var contents yamlCatalog
if err = yaml.UnmarshalStrict(bytes, &contents); err != nil {
return nil, fmt.Errorf("failed to parse %s file: %v", yamlCatalogPath, err)
// excludedModels is an optional source property.
var excludedModels []string
if excludedModelsData, ok := source.Properties["excludedModels"]; ok {
excludedModelsList, ok := excludedModelsData.([]any)
if !ok {
return nil, fmt.Errorf("'excludedModels' property should be a list")
}
excludedModels = make([]string, len(excludedModelsList))
for i, v := range excludedModelsList {
excludedModels[i], ok = v.(string)
if !ok {
return nil, fmt.Errorf("invalid entry in 'excludedModels' list, expected a string")
}
}
}
// override catalog name from Yaml Catalog File if set
if source.Name != "" {
source.Name = contents.Source
p := &yamlCatalogImpl{
models: make(map[string]*yamlModel),
}
err = p.load(yamlModelFile, excludedModels)
if err != nil {
return nil, err
}
models := make(map[string]*yamlModel, len(contents.Models))
for i := range contents.Models {
models[contents.Models[i].Name] = &contents.Models[i]
}
go func() {
changes, err := getMonitor().Path(yamlModelFile)
if err != nil {
glog.Errorf("unable to watch YAML catalog file: %v", err)
// Not fatal, we just won't get automatic updates.
}
return &yamlCatalogImpl{
models: models,
source: source,
}, nil
for range changes {
glog.Infof("Reloading YAML catalog %s", yamlModelFile)
err = p.load(yamlModelFile, excludedModels)
if err != nil {
glog.Errorf("unable to load YAML catalog: %v", err)
}
}
}()
return p, nil
}
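For orientation, a hedged usage sketch of newYamlCatalog as the tests in this change exercise it; the file path and exclusion pattern are illustrative, while the property keys are the ones read above:

// Sketch: build a YAML-backed provider with an optional exclusion list.
cfg := &CatalogSourceConfig{
	Properties: map[string]any{
		yamlCatalogPath:  "testdata/test-yaml-catalog.yaml",
		"excludedModels": []any{"rhelai1/granite-8b-code-base:*"},
	},
}
provider, err := newYamlCatalog(cfg)
if err != nil {
	// missing or invalid properties, or an unreadable catalog file
}
_ = provider // CatalogSourceProvider: GetModel, ListModels, GetArtifacts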
func init() {

View File

@ -4,6 +4,7 @@ import (
"context"
"testing"
model "github.com/kubeflow/model-registry/catalog/pkg/openapi"
"github.com/stretchr/testify/assert"
)
@ -29,14 +30,178 @@ func TestYAMLCatalogGetModel(t *testing.T) {
assert.Nil(notFound)
}
func TestYAMLCatalogGetArtifacts(t *testing.T) {
assert := assert.New(t)
provider := testYAMLProvider(t, "testdata/test-yaml-catalog.yaml")
// Test case 1: Model with artifacts
artifacts, err := provider.GetArtifacts(context.Background(), "rhelai1/granite-8b-code-base")
if assert.NoError(err) {
assert.NotNil(artifacts)
assert.Equal(int32(1), artifacts.Size)
assert.Equal(int32(1), artifacts.PageSize)
assert.Len(artifacts.Items, 1)
assert.Equal("oci://registry.redhat.io/rhelai1/granite-8b-code-base:1.3-1732870892", artifacts.Items[0].Uri)
}
// Test case 2: Model with no artifacts
noArtifactsModel, err := provider.GetArtifacts(context.Background(), "model-with-no-artifacts")
if assert.NoError(err) {
assert.NotNil(noArtifactsModel)
assert.Equal(int32(0), noArtifactsModel.Size)
assert.Equal(int32(0), noArtifactsModel.PageSize)
assert.Len(noArtifactsModel.Items, 0)
}
// Test case 3: Model not found
notFoundArtifacts, err := provider.GetArtifacts(context.Background(), "non-existent-model")
assert.NoError(err)
assert.Nil(notFoundArtifacts)
}
func TestYAMLCatalogListModels(t *testing.T) {
assert := assert.New(t)
provider := testYAMLProvider(t, "testdata/test-list-models-catalog.yaml")
ctx := context.Background()
// Test case 1: List all models, default sort (by name ascending)
models, err := provider.ListModels(ctx, ListModelsParams{})
if assert.NoError(err) {
assert.NotNil(models)
assert.Equal(int32(6), models.Size)
assert.Equal(int32(6), models.PageSize)
assert.Len(models.Items, 6)
assert.Equal("Z-model", models.Items[0].Name) // Z-model should be first due to string comparison for alphabetical sort
assert.Equal("another-model-alpha", models.Items[1].Name)
assert.Equal("model-alpha", models.Items[2].Name)
assert.Equal("model-beta", models.Items[3].Name)
assert.Equal("model-gamma", models.Items[4].Name)
assert.Equal("model-with-no-tasks", models.Items[5].Name)
}
// Test case 2: List all models, sort by name ascending
models, err = provider.ListModels(ctx, ListModelsParams{OrderBy: model.ORDERBYFIELD_NAME, SortOrder: model.SORTORDER_ASC})
if assert.NoError(err) {
assert.Equal(int32(6), models.Size)
assert.Equal("Z-model", models.Items[0].Name)
assert.Equal("another-model-alpha", models.Items[1].Name)
}
// Test case 3: List all models, sort by name descending
models, err = provider.ListModels(ctx, ListModelsParams{OrderBy: model.ORDERBYFIELD_NAME, SortOrder: model.SORTORDER_DESC})
if assert.NoError(err) {
assert.Equal(int32(6), models.Size)
assert.Equal("model-with-no-tasks", models.Items[0].Name)
assert.Equal("model-gamma", models.Items[1].Name)
}
// Test case 4: List all models, sort by created (CreateTimeSinceEpoch) ascending
models, err = provider.ListModels(ctx, ListModelsParams{OrderBy: model.ORDERBYFIELD_CREATE_TIME, SortOrder: model.SORTORDER_ASC})
if assert.NoError(err) {
assert.Equal(int32(6), models.Size)
assert.Equal("model-with-no-tasks", models.Items[0].Name) // Jan 1, 2023
assert.Equal("model-gamma", models.Items[1].Name) // Feb 1, 2023
}
// Test case 5: List all models, sort by published (CreateTimeSinceEpoch) descending
models, err = provider.ListModels(ctx, ListModelsParams{OrderBy: model.ORDERBYFIELD_CREATE_TIME, SortOrder: model.SORTORDER_DESC})
if assert.NoError(err) {
assert.Equal(int32(6), models.Size)
assert.Equal("Z-model", models.Items[0].Name) // Aug 2, 2023
assert.Equal("another-model-alpha", models.Items[1].Name) // May 16, 2023
}
// Test case 6: Filter by query "model" (should match all 6 models)
models, err = provider.ListModels(ctx, ListModelsParams{Query: "model"})
if assert.NoError(err) {
assert.Equal(int32(6), models.Size)
assert.Equal("Z-model", models.Items[0].Name)
assert.Equal("another-model-alpha", models.Items[1].Name)
assert.Equal("model-alpha", models.Items[2].Name)
assert.Equal("model-beta", models.Items[3].Name)
assert.Equal("model-gamma", models.Items[4].Name)
assert.Equal("model-with-no-tasks", models.Items[5].Name)
}
// Test case 7: Filter by query "text" (should match model-alpha, another-model-alpha)
models, err = provider.ListModels(ctx, ListModelsParams{Query: "text"})
if assert.NoError(err) {
assert.Equal(int32(2), models.Size)
assert.Equal("another-model-alpha", models.Items[0].Name) // Alphabetical order
assert.Equal("model-alpha", models.Items[1].Name)
}
// Test case 8: Filter by query "nlp" (should match model-alpha, model-gamma, another-model-alpha)
models, err = provider.ListModels(ctx, ListModelsParams{Query: "nlp"})
if assert.NoError(err) {
assert.Equal(int32(3), models.Size)
assert.Equal("another-model-alpha", models.Items[0].Name)
assert.Equal("model-alpha", models.Items[1].Name)
assert.Equal("model-gamma", models.Items[2].Name)
}
// Test case 9: Filter by query "IBM" (should match model-alpha, model-gamma)
models, err = provider.ListModels(ctx, ListModelsParams{Query: "IBM"})
if assert.NoError(err) {
assert.Equal(int32(2), models.Size)
assert.Equal("model-alpha", models.Items[0].Name)
assert.Equal("model-gamma", models.Items[1].Name)
}
// Test case 10: Filter by query "transformers" (should match model-alpha)
models, err = provider.ListModels(ctx, ListModelsParams{Query: "transformers"})
if assert.NoError(err) {
assert.Equal(int32(1), models.Size)
assert.Equal("model-alpha", models.Items[0].Name)
}
// Test case 11: Filter by query "nonexistent" (should return empty list)
models, err = provider.ListModels(ctx, ListModelsParams{Query: "nonexistent"})
assert.NoError(err)
assert.NotNil(models)
assert.Equal(int32(0), models.Size)
assert.Equal(int32(0), models.PageSize)
assert.Len(models.Items, 0)
// Test case 12: Empty catalog
emptyProvider := testYAMLProvider(t, "testdata/empty-catalog.yaml") // empty-catalog.yaml is part of the test data and declares no models
emptyModels, err := emptyProvider.ListModels(ctx, ListModelsParams{})
assert.NoError(err)
assert.NotNil(emptyModels)
assert.Equal(int32(0), emptyModels.Size)
assert.Equal(int32(0), emptyModels.PageSize)
assert.Len(emptyModels.Items, 0)
// Test case 13: Test with excluded models
excludedProvider := testYAMLProviderWithExclusions(t, "testdata/test-list-models-catalog.yaml", []any{
"model-alpha",
})
excludedModels, err := excludedProvider.ListModels(ctx, ListModelsParams{})
if assert.NoError(err) {
assert.NotNil(excludedModels)
assert.Equal(int32(5), excludedModels.Size)
for _, m := range excludedModels.Items {
assert.NotEqual("model-alpha", m.Name)
}
}
}
func testYAMLProvider(t *testing.T, path string) CatalogSourceProvider {
return testYAMLProviderWithExclusions(t, path, nil)
}
func testYAMLProviderWithExclusions(t *testing.T, path string, excludedModels []any) CatalogSourceProvider {
properties := map[string]any{
yamlCatalogPath: path,
}
if excludedModels != nil {
properties["excludedModels"] = excludedModels
}
provider, err := newYamlCatalog(&CatalogSourceConfig{
Properties: map[string]any{
yamlCatalogPath: path,
},
Properties: properties,
})
if err != nil {
t.Fatalf("newYamlCatalog(%s) failed: %v", path, err)
t.Fatalf("newYamlCatalog(%s) with exclusions failed: %v", path, err)
}
return provider
}

View File

@ -4,6 +4,7 @@ error.go
helpers.go
impl.go
logger.go
model_artifact_type_query_param.go
model_base_model.go
model_base_resource_dates.go
model_base_resource_list.go

View File

@ -7,7 +7,6 @@ import (
"math"
"net/http"
"slices"
"strconv"
"strings"
"github.com/kubeflow/model-registry/catalog/internal/catalog"
@ -18,15 +17,53 @@ import (
// This service should implement the business logic for every endpoint for the ModelCatalogServiceAPI s.coreApi.
// Include any external packages or services that will be required by this service.
type ModelCatalogServiceAPIService struct {
sources map[string]catalog.CatalogSource
sources *catalog.SourceCollection
}
func (m *ModelCatalogServiceAPIService) GetAllModelArtifacts(context.Context, string, string) (ImplResponse, error) {
return Response(http.StatusNotImplemented, "Not implemented"), nil
// GetAllModelArtifacts retrieves all model artifacts for a given model from the specified source.
func (m *ModelCatalogServiceAPIService) GetAllModelArtifacts(ctx context.Context, sourceID string, name string) (ImplResponse, error) {
source, ok := m.sources.Get(sourceID)
if !ok {
return notFound("Unknown source"), nil
}
artifacts, err := source.Provider.GetArtifacts(ctx, name)
if err != nil {
return Response(http.StatusInternalServerError, err), err
}
return Response(http.StatusOK, artifacts), nil
}
func (m *ModelCatalogServiceAPIService) FindModels(ctx context.Context, source string, q string, pageSize string, orderBy model.OrderByField, sortOder model.SortOrder, nextPageToken string) (ImplResponse, error) {
return Response(http.StatusNotImplemented, "Not implemented"), nil
func (m *ModelCatalogServiceAPIService) FindModels(ctx context.Context, sourceID string, q string, pageSize string, orderBy model.OrderByField, sortOrder model.SortOrder, nextPageToken string) (ImplResponse, error) {
source, ok := m.sources.Get(sourceID)
if !ok {
return notFound("Unknown source"), errors.New("Unknown source")
}
p, err := newPaginator[model.CatalogModel](pageSize, orderBy, sortOrder, nextPageToken)
if err != nil {
return ErrorResponse(http.StatusBadRequest, err), err
}
listModelsParams := catalog.ListModelsParams{
Query: q,
OrderBy: p.OrderBy,
SortOrder: p.SortOrder,
}
models, err := source.Provider.ListModels(ctx, listModelsParams)
if err != nil {
return ErrorResponse(http.StatusInternalServerError, err), err
}
page, next := p.Paginate(models.Items)
models.Items = page
models.PageSize = p.PageSize
models.NextPageToken = next.Token()
return Response(http.StatusOK, models), nil
}
func (m *ModelCatalogServiceAPIService) GetModel(ctx context.Context, sourceID string, name string) (ImplResponse, error) {
@ -34,7 +71,7 @@ func (m *ModelCatalogServiceAPIService) GetModel(ctx context.Context, sourceID s
return m.GetAllModelArtifacts(ctx, sourceID, name)
}
source, ok := m.sources[sourceID]
source, ok := m.sources.Get(sourceID)
if !ok {
return notFound("Unknown source"), nil
}
@ -51,28 +88,22 @@ func (m *ModelCatalogServiceAPIService) GetModel(ctx context.Context, sourceID s
}
func (m *ModelCatalogServiceAPIService) FindSources(ctx context.Context, name string, strPageSize string, orderBy model.OrderByField, sortOrder model.SortOrder, nextPageToken string) (ImplResponse, error) {
// TODO: Implement real pagination in here by reusing the nextPageToken
// code from https://github.com/kubeflow/model-registry/pull/1205.
if len(m.sources) > math.MaxInt32 {
sources := m.sources.All()
if len(sources) > math.MaxInt32 {
err := errors.New("too many registered models")
return ErrorResponse(http.StatusInternalServerError, err), err
}
var pageSize int32 = 10
if strPageSize != "" {
pageSize64, err := strconv.ParseInt(strPageSize, 10, 32)
if err != nil {
return ErrorResponse(http.StatusBadRequest, err), err
}
pageSize = int32(pageSize64)
paginator, err := newPaginator[model.CatalogSource](strPageSize, orderBy, sortOrder, nextPageToken)
if err != nil {
return ErrorResponse(http.StatusBadRequest, err), err
}
items := make([]model.CatalogSource, 0, len(m.sources))
items := make([]model.CatalogSource, 0, len(sources))
name = strings.ToLower(name)
for _, v := range m.sources {
for _, v := range sources {
if !strings.Contains(strings.ToLower(v.Metadata.Name), name) {
continue
}
@ -87,15 +118,14 @@ func (m *ModelCatalogServiceAPIService) FindSources(ctx context.Context, name st
slices.SortStableFunc(items, cmpFunc)
total := int32(len(items))
if total > pageSize {
items = items[:pageSize]
}
pagedItems, next := paginator.Paginate(items)
res := model.CatalogSourceList{
PageSize: pageSize,
Items: items,
PageSize: paginator.PageSize,
Items: pagedItems,
Size: total,
NextPageToken: "",
NextPageToken: next.Token(),
}
return Response(http.StatusOK, res), nil
}
@ -128,7 +158,7 @@ func genCatalogCmpFunc(orderBy model.OrderByField, sortOrder model.SortOrder) (f
var _ ModelCatalogServiceAPIServicer = &ModelCatalogServiceAPIService{}
// NewModelCatalogServiceAPIService creates a default api service
func NewModelCatalogServiceAPIService(sources map[string]catalog.CatalogSource) ModelCatalogServiceAPIServicer {
func NewModelCatalogServiceAPIService(sources *catalog.SourceCollection) ModelCatalogServiceAPIServicer {
return &ModelCatalogServiceAPIService{
sources: sources,
}
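NewModelCatalogServiceAPIService now takes a *catalog.SourceCollection instead of a plain map; a usage sketch mirroring how the tests in the next file construct the service (the ID and provider below are illustrative):

// Sketch: wrap sources in a SourceCollection and hand them to the service.
var provider catalog.CatalogSourceProvider // e.g. a YAML- or rhec-backed provider (illustrative)
sources := catalog.NewSourceCollection(map[string]catalog.CatalogSource{
	"source1": {
		Metadata: model.CatalogSource{Id: "source1", Name: "Test Source 1"},
		Provider: provider,
	},
})
svc := NewModelCatalogServiceAPIService(sources)
_ = svc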

View File

@ -3,8 +3,11 @@ package openapi
import (
"context"
"net/http"
"sort"
"strconv"
"strings"
"testing"
"time"
"github.com/kubeflow/model-registry/catalog/internal/catalog"
model "github.com/kubeflow/model-registry/catalog/pkg/openapi"
@ -12,8 +15,306 @@ import (
"github.com/stretchr/testify/require"
)
// timeToMillisStringPointer converts time.Time to *string representing milliseconds since epoch.
func timeToMillisStringPointer(t time.Time) *string {
s := strconv.FormatInt(t.UnixMilli(), 10)
return &s
}
// pointerOrDefault returns the value pointed to by p, or def if p is nil.
func pointerOrDefault(p *string, def string) string {
if p == nil {
return def
}
return *p
}
func TestFindModels(t *testing.T) {
// Define common models for testing
time1 := time.Date(2023, 1, 1, 0, 0, 0, 0, time.UTC)
time2 := time.Date(2023, 1, 2, 0, 0, 0, 0, time.UTC)
time3 := time.Date(2023, 1, 3, 0, 0, 0, 0, time.UTC)
time4 := time.Date(2023, 1, 4, 0, 0, 0, 0, time.UTC)
// Updated model definitions to match OpenAPI schema (no direct Id or Published, use Name, CreateTime, LastUpdateTime)
modelA := &model.CatalogModel{Name: "Model A", CreateTimeSinceEpoch: timeToMillisStringPointer(time1), LastUpdateTimeSinceEpoch: timeToMillisStringPointer(time4)}
modelB := &model.CatalogModel{Name: "Model B", CreateTimeSinceEpoch: timeToMillisStringPointer(time2), LastUpdateTimeSinceEpoch: timeToMillisStringPointer(time3)}
modelC := &model.CatalogModel{Name: "Another Model C", CreateTimeSinceEpoch: timeToMillisStringPointer(time3), LastUpdateTimeSinceEpoch: timeToMillisStringPointer(time2)}
modelD := &model.CatalogModel{Name: "My Model D", CreateTimeSinceEpoch: timeToMillisStringPointer(time4), LastUpdateTimeSinceEpoch: timeToMillisStringPointer(time1)}
testCases := []struct {
name string
sourceID string
mockModels map[string]*model.CatalogModel
q string
pageSize string
orderBy model.OrderByField
sortOrder model.SortOrder
nextPageToken string
expectedStatus int
expectedModelList *model.CatalogModelList
}{
{
name: "Successful query with no filters",
sourceID: "source1",
mockModels: map[string]*model.CatalogModel{
"modelA": modelA, "modelB": modelB, "modelC": modelC, "modelD": modelD,
},
q: "",
pageSize: "10",
orderBy: model.ORDERBYFIELD_NAME,
sortOrder: model.SORTORDER_ASC,
expectedStatus: http.StatusOK,
expectedModelList: &model.CatalogModelList{
Items: []model.CatalogModel{*modelC, *modelA, *modelB, *modelD}, // Sorted by Name ASC: Another Model C, Model A, Model B, My Model D
Size: 4,
PageSize: 10, // Default page size
NextPageToken: "",
},
},
{
name: "Filter by query 'Model'",
sourceID: "source1",
mockModels: map[string]*model.CatalogModel{
"modelA": modelA, "modelB": modelB, "modelC": modelC, "modelD": modelD,
},
q: "Model",
pageSize: "10",
orderBy: model.ORDERBYFIELD_NAME,
sortOrder: model.SORTORDER_ASC,
expectedStatus: http.StatusOK,
expectedModelList: &model.CatalogModelList{
Items: []model.CatalogModel{*modelC, *modelA, *modelB, *modelD}, // "Another Model C" also contains "Model", so all four match; sorted by Name ASC
Size: 4,
PageSize: 10,
NextPageToken: "",
},
},
{
name: "Filter by query 'model' (case insensitive)",
sourceID: "source1",
mockModels: map[string]*model.CatalogModel{
"modelA": modelA, "modelB": modelB, "modelC": modelC, "modelD": modelD,
},
q: "model",
pageSize: "10",
orderBy: model.ORDERBYFIELD_NAME,
sortOrder: model.SORTORDER_ASC,
expectedStatus: http.StatusOK,
expectedModelList: &model.CatalogModelList{
Items: []model.CatalogModel{*modelC, *modelA, *modelB, *modelD}, // query matching is case-insensitive, so all four models match; sorted by Name ASC
Size: 4,
PageSize: 10,
NextPageToken: "",
},
},
{
name: "Page size limit",
sourceID: "source1",
mockModels: map[string]*model.CatalogModel{
"modelA": modelA, "modelB": modelB, "modelC": modelC, "modelD": modelD,
},
q: "",
pageSize: "2",
orderBy: model.ORDERBYFIELD_NAME,
sortOrder: model.SORTORDER_ASC,
expectedStatus: http.StatusOK,
expectedModelList: &model.CatalogModelList{
Items: []model.CatalogModel{*modelC, *modelA}, // First 2 after sorting by Name ASC
Size: 4, // Total size remains 4
PageSize: 2,
NextPageToken: (&stringCursor{Value: "Model A", ID: "Model A"}).String(),
},
},
{
name: "Sort by ID Descending (mocked as Name Descending)",
sourceID: "source1",
mockModels: map[string]*model.CatalogModel{
"modelA": modelA, "modelB": modelB, "modelC": modelC, "modelD": modelD,
},
q: "",
pageSize: "10",
orderBy: model.ORDERBYFIELD_ID,
sortOrder: model.SORTORDER_DESC,
expectedStatus: http.StatusOK,
expectedModelList: &model.CatalogModelList{
Items: []model.CatalogModel{*modelD, *modelB, *modelA, *modelC}, // Sorted by Name DESC
Size: 4,
PageSize: 10,
NextPageToken: "",
},
},
{
name: "Sort by CreateTime Ascending",
sourceID: "source1",
mockModels: map[string]*model.CatalogModel{
"modelA": modelA, "modelB": modelB, "modelC": modelC, "modelD": modelD,
},
q: "",
pageSize: "10",
orderBy: model.ORDERBYFIELD_CREATE_TIME,
sortOrder: model.SORTORDER_ASC,
expectedStatus: http.StatusOK,
expectedModelList: &model.CatalogModelList{
Items: []model.CatalogModel{*modelA, *modelB, *modelC, *modelD}, // Sorted by CreateTime ASC
Size: 4,
PageSize: 10,
NextPageToken: "",
},
},
{
name: "Sort by LastUpdateTime Descending",
sourceID: "source1",
mockModels: map[string]*model.CatalogModel{
"modelA": modelA, "modelB": modelB, "modelC": modelC, "modelD": modelD,
},
q: "",
pageSize: "10",
orderBy: model.ORDERBYFIELD_LAST_UPDATE_TIME,
sortOrder: model.SORTORDER_DESC,
expectedStatus: http.StatusOK,
expectedModelList: &model.CatalogModelList{
Items: []model.CatalogModel{*modelA, *modelB, *modelC, *modelD}, // sorted by LastUpdateTime DESC: modelA has the latest (time4), modelD the earliest (time1)
Size: 4,
PageSize: 10,
NextPageToken: "",
},
},
{
name: "Invalid source ID",
sourceID: "unknown-source",
mockModels: map[string]*model.CatalogModel{},
q: "",
pageSize: "10",
orderBy: model.ORDERBYFIELD_ID,
sortOrder: model.SORTORDER_ASC,
expectedStatus: http.StatusNotFound,
expectedModelList: nil,
},
{
name: "Invalid pageSize string",
sourceID: "source1",
mockModels: map[string]*model.CatalogModel{
"modelA": modelA,
},
q: "",
pageSize: "abc",
orderBy: model.ORDERBYFIELD_ID,
sortOrder: model.SORTORDER_ASC,
expectedStatus: http.StatusBadRequest,
expectedModelList: nil,
},
{
name: "Unsupported orderBy field",
sourceID: "source1",
mockModels: map[string]*model.CatalogModel{
"modelA": modelA,
},
q: "",
pageSize: "10",
orderBy: "UNSUPPORTED_FIELD",
sortOrder: model.SORTORDER_ASC,
expectedStatus: http.StatusBadRequest,
expectedModelList: nil,
},
{
name: "Unsupported sortOrder field",
sourceID: "source1",
mockModels: map[string]*model.CatalogModel{
"modelA": modelA,
},
q: "",
pageSize: "10",
orderBy: model.ORDERBYFIELD_ID,
sortOrder: "UNSUPPORTED_ORDER",
expectedStatus: http.StatusBadRequest,
expectedModelList: nil,
},
{
name: "Empty models in source",
sourceID: "source1",
mockModels: map[string]*model.CatalogModel{},
q: "",
pageSize: "10",
orderBy: model.ORDERBYFIELD_ID,
sortOrder: model.SORTORDER_ASC,
expectedStatus: http.StatusOK,
expectedModelList: &model.CatalogModelList{
Items: []model.CatalogModel{},
Size: 0,
PageSize: 10,
NextPageToken: "",
},
},
{
name: "Default sort (ID ascending) and default page size",
sourceID: "source1",
mockModels: map[string]*model.CatalogModel{
"modelB": modelB, "modelA": modelA, "modelD": modelD, "modelC": modelC,
},
q: "",
pageSize: "", // Default page size
orderBy: "", // Default order by ID
sortOrder: "", // Default sort order ASC
expectedStatus: http.StatusOK,
expectedModelList: &model.CatalogModelList{
Items: []model.CatalogModel{*modelC, *modelA, *modelB, *modelD}, // Sorted by Name ASC (as ID is mocked to use Name)
Size: 4,
PageSize: 10, // Default page size
NextPageToken: "",
},
},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
// Create mock source collection
sources := catalog.NewSourceCollection(map[string]catalog.CatalogSource{
"source1": {
Metadata: model.CatalogSource{Id: "source1", Name: "Test Source 1"},
Provider: &mockModelProvider{
models: tc.mockModels,
},
},
})
service := NewModelCatalogServiceAPIService(sources)
resp, err := service.FindModels(
context.Background(),
tc.sourceID,
tc.q,
tc.pageSize,
tc.orderBy,
tc.sortOrder,
tc.nextPageToken,
)
assert.Equal(t, tc.expectedStatus, resp.Code)
if tc.expectedStatus != http.StatusOK {
assert.NotNil(t, err)
return
}
require.NotNil(t, resp.Body)
modelList, ok := resp.Body.(model.CatalogModelList)
require.True(t, ok, "Response body should be a CatalogModelList")
assert.Equal(t, tc.expectedModelList.Size, modelList.Size)
assert.Equal(t, tc.expectedModelList.PageSize, modelList.PageSize)
if !assert.Equal(t, tc.expectedModelList.NextPageToken, modelList.NextPageToken) && tc.expectedModelList.NextPageToken != "" {
assert.Equal(t, decodeStringCursor(tc.expectedModelList.NextPageToken), decodeStringCursor(modelList.NextPageToken))
}
// Deep equality check for items
assert.Equal(t, tc.expectedModelList.Items, modelList.Items)
})
}
}
func TestFindSources(t *testing.T) {
// Setup test cases
trueValue := true
testCases := []struct {
name string
catalogs map[string]catalog.CatalogSource
@ -42,7 +343,7 @@ func TestFindSources(t *testing.T) {
name: "Single catalog",
catalogs: map[string]catalog.CatalogSource{
"catalog1": {
Metadata: model.CatalogSource{Id: "catalog1", Name: "Test Catalog 1"},
Metadata: model.CatalogSource{Id: "catalog1", Name: "Test Catalog 1", Enabled: &trueValue},
},
},
nameFilter: "",
@ -57,13 +358,13 @@ func TestFindSources(t *testing.T) {
name: "Multiple catalogs with no filter",
catalogs: map[string]catalog.CatalogSource{
"catalog1": {
Metadata: model.CatalogSource{Id: "catalog1", Name: "Test Catalog 1"},
Metadata: model.CatalogSource{Id: "catalog1", Name: "Test Catalog 1", Enabled: &trueValue},
},
"catalog2": {
Metadata: model.CatalogSource{Id: "catalog2", Name: "Test Catalog 2"},
Metadata: model.CatalogSource{Id: "catalog2", Name: "Test Catalog 2", Enabled: &trueValue},
},
"catalog3": {
Metadata: model.CatalogSource{Id: "catalog3", Name: "Another Catalog"},
Metadata: model.CatalogSource{Id: "catalog3", Name: "Another Catalog", Enabled: &trueValue},
},
},
nameFilter: "",
@ -78,13 +379,13 @@ func TestFindSources(t *testing.T) {
name: "Filter by name",
catalogs: map[string]catalog.CatalogSource{
"catalog1": {
Metadata: model.CatalogSource{Id: "catalog1", Name: "Test Catalog 1"},
Metadata: model.CatalogSource{Id: "catalog1", Name: "Test Catalog 1", Enabled: &trueValue},
},
"catalog2": {
Metadata: model.CatalogSource{Id: "catalog2", Name: "Test Catalog 2"},
Metadata: model.CatalogSource{Id: "catalog2", Name: "Test Catalog 2", Enabled: &trueValue},
},
"catalog3": {
Metadata: model.CatalogSource{Id: "catalog3", Name: "Another Catalog"},
Metadata: model.CatalogSource{Id: "catalog3", Name: "Another Catalog", Enabled: &trueValue},
},
},
nameFilter: "Test",
@ -99,13 +400,13 @@ func TestFindSources(t *testing.T) {
name: "Filter by name case insensitive",
catalogs: map[string]catalog.CatalogSource{
"catalog1": {
Metadata: model.CatalogSource{Id: "catalog1", Name: "Test Catalog 1"},
Metadata: model.CatalogSource{Id: "catalog1", Name: "Test Catalog 1", Enabled: &trueValue},
},
"catalog2": {
Metadata: model.CatalogSource{Id: "catalog2", Name: "Test Catalog 2"},
Metadata: model.CatalogSource{Id: "catalog2", Name: "Test Catalog 2", Enabled: &trueValue},
},
"catalog3": {
Metadata: model.CatalogSource{Id: "catalog3", Name: "Another Catalog"},
Metadata: model.CatalogSource{Id: "catalog3", Name: "Another Catalog", Enabled: &trueValue},
},
},
nameFilter: "test",
@ -120,13 +421,13 @@ func TestFindSources(t *testing.T) {
name: "Pagination - limit results",
catalogs: map[string]catalog.CatalogSource{
"catalog1": {
Metadata: model.CatalogSource{Id: "catalog1", Name: "Test Catalog 1"},
Metadata: model.CatalogSource{Id: "catalog1", Name: "Test Catalog 1", Enabled: &trueValue},
},
"catalog2": {
Metadata: model.CatalogSource{Id: "catalog2", Name: "Test Catalog 2"},
Metadata: model.CatalogSource{Id: "catalog2", Name: "Test Catalog 2", Enabled: &trueValue},
},
"catalog3": {
Metadata: model.CatalogSource{Id: "catalog3", Name: "Another Catalog"},
Metadata: model.CatalogSource{Id: "catalog3", Name: "Another Catalog", Enabled: &trueValue},
},
},
nameFilter: "",
@ -141,10 +442,10 @@ func TestFindSources(t *testing.T) {
name: "Default page size",
catalogs: map[string]catalog.CatalogSource{
"catalog1": {
Metadata: model.CatalogSource{Id: "catalog1", Name: "Test Catalog 1"},
Metadata: model.CatalogSource{Id: "catalog1", Name: "Test Catalog 1", Enabled: &trueValue},
},
"catalog2": {
Metadata: model.CatalogSource{Id: "catalog2", Name: "Test Catalog 2"},
Metadata: model.CatalogSource{Id: "catalog2", Name: "Test Catalog 2", Enabled: &trueValue},
},
},
nameFilter: "",
@ -159,7 +460,7 @@ func TestFindSources(t *testing.T) {
name: "Invalid page size",
catalogs: map[string]catalog.CatalogSource{
"catalog1": {
Metadata: model.CatalogSource{Id: "catalog1", Name: "Test Catalog 1"},
Metadata: model.CatalogSource{Id: "catalog1", Name: "Test Catalog 1", Enabled: &trueValue},
},
},
nameFilter: "",
@ -172,13 +473,13 @@ func TestFindSources(t *testing.T) {
name: "Sort by ID ascending",
catalogs: map[string]catalog.CatalogSource{
"catalog2": {
Metadata: model.CatalogSource{Id: "catalog2", Name: "B Catalog"},
Metadata: model.CatalogSource{Id: "catalog2", Name: "B Catalog", Enabled: &trueValue},
},
"catalog1": {
Metadata: model.CatalogSource{Id: "catalog1", Name: "A Catalog"},
Metadata: model.CatalogSource{Id: "catalog1", Name: "A Catalog", Enabled: &trueValue},
},
"catalog3": {
Metadata: model.CatalogSource{Id: "catalog3", Name: "C Catalog"},
Metadata: model.CatalogSource{Id: "catalog3", Name: "C Catalog", Enabled: &trueValue},
},
},
nameFilter: "",
@ -194,13 +495,13 @@ func TestFindSources(t *testing.T) {
name: "Sort by ID descending",
catalogs: map[string]catalog.CatalogSource{
"catalog2": {
Metadata: model.CatalogSource{Id: "catalog2", Name: "B Catalog"},
Metadata: model.CatalogSource{Id: "catalog2", Name: "B Catalog", Enabled: &trueValue},
},
"catalog1": {
Metadata: model.CatalogSource{Id: "catalog1", Name: "A Catalog"},
Metadata: model.CatalogSource{Id: "catalog1", Name: "A Catalog", Enabled: &trueValue},
},
"catalog3": {
Metadata: model.CatalogSource{Id: "catalog3", Name: "C Catalog"},
Metadata: model.CatalogSource{Id: "catalog3", Name: "C Catalog", Enabled: &trueValue},
},
},
nameFilter: "",
@ -216,13 +517,13 @@ func TestFindSources(t *testing.T) {
name: "Sort by name ascending",
catalogs: map[string]catalog.CatalogSource{
"catalog2": {
Metadata: model.CatalogSource{Id: "catalog2", Name: "B Catalog"},
Metadata: model.CatalogSource{Id: "catalog2", Name: "B Catalog", Enabled: &trueValue},
},
"catalog1": {
Metadata: model.CatalogSource{Id: "catalog1", Name: "A Catalog"},
Metadata: model.CatalogSource{Id: "catalog1", Name: "A Catalog", Enabled: &trueValue},
},
"catalog3": {
Metadata: model.CatalogSource{Id: "catalog3", Name: "C Catalog"},
Metadata: model.CatalogSource{Id: "catalog3", Name: "C Catalog", Enabled: &trueValue},
},
},
nameFilter: "",
@ -238,10 +539,10 @@ func TestFindSources(t *testing.T) {
name: "Sort by name descending",
catalogs: map[string]catalog.CatalogSource{
"catalog2": {
Metadata: model.CatalogSource{Id: "catalog2", Name: "B Catalog"},
Metadata: model.CatalogSource{Id: "catalog2", Name: "B Catalog", Enabled: &trueValue},
},
"catalog1": {
Metadata: model.CatalogSource{Id: "catalog1", Name: "A Catalog"},
Metadata: model.CatalogSource{Id: "catalog1", Name: "A Catalog", Enabled: &trueValue},
},
"catalog3": {
Metadata: model.CatalogSource{Id: "catalog3", Name: "C Catalog"},
@ -310,7 +611,7 @@ func TestFindSources(t *testing.T) {
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
// Create service with test catalogs
service := NewModelCatalogServiceAPIService(tc.catalogs)
service := NewModelCatalogServiceAPIService(catalog.NewSourceCollection(tc.catalogs))
// Call FindSources
resp, err := service.FindSources(
@ -393,7 +694,8 @@ func TestFindSources(t *testing.T) {
// Define a mock model provider
type mockModelProvider struct {
models map[string]*model.CatalogModel
models map[string]*model.CatalogModel
artifacts map[string][]model.CatalogModelArtifact
}
// Implement GetModel method for the mock provider
@ -406,11 +708,71 @@ func (m *mockModelProvider) GetModel(ctx context.Context, name string) (*model.C
}
func (m *mockModelProvider) ListModels(ctx context.Context, params catalog.ListModelsParams) (model.CatalogModelList, error) {
return model.CatalogModelList{}, nil
var filteredModels []*model.CatalogModel
for _, mdl := range m.models {
if params.Query == "" || strings.Contains(strings.ToLower(mdl.Name), strings.ToLower(params.Query)) {
filteredModels = append(filteredModels, mdl)
}
}
// Sort the filtered models
sort.SliceStable(filteredModels, func(i, j int) bool {
cmp := 0
switch params.OrderBy {
case model.ORDERBYFIELD_CREATE_TIME:
// Parse CreateTimeSinceEpoch strings to int64 for comparison
t1, _ := strconv.ParseInt(pointerOrDefault(filteredModels[i].CreateTimeSinceEpoch, "0"), 10, 64)
t2, _ := strconv.ParseInt(pointerOrDefault(filteredModels[j].CreateTimeSinceEpoch, "0"), 10, 64)
cmp = int(t1 - t2)
case model.ORDERBYFIELD_LAST_UPDATE_TIME:
// Parse LastUpdateTimeSinceEpoch strings to int64 for comparison
t1, _ := strconv.ParseInt(pointerOrDefault(filteredModels[i].LastUpdateTimeSinceEpoch, "0"), 10, 64)
t2, _ := strconv.ParseInt(pointerOrDefault(filteredModels[j].LastUpdateTimeSinceEpoch, "0"), 10, 64)
cmp = int(t1 - t2)
case model.ORDERBYFIELD_NAME:
fallthrough
default:
cmp = strings.Compare(filteredModels[i].Name, filteredModels[j].Name)
}
if params.SortOrder == model.SORTORDER_DESC {
return cmp > 0
}
return cmp < 0
})
items := make([]model.CatalogModel, len(filteredModels))
for i, mdl := range filteredModels {
items[i] = *mdl
}
return model.CatalogModelList{
Items: items,
Size: int32(len(items)),
PageSize: int32(len(items)), // Mock returns all filtered items as one "page"
NextPageToken: "",
}, nil
}
func (m *mockModelProvider) GetArtifacts(ctx context.Context, name string) (*model.CatalogModelArtifactList, error) {
artifacts, exists := m.artifacts[name]
if !exists {
return &model.CatalogModelArtifactList{
Items: []model.CatalogModelArtifact{},
Size: 0,
PageSize: 0, // no artifacts for this model: empty page
NextPageToken: "",
}, nil
}
return &model.CatalogModelArtifactList{
Items: artifacts,
Size: int32(len(artifacts)),
PageSize: int32(len(artifacts)),
NextPageToken: "",
}, nil
}
func TestGetModel(t *testing.T) {
testCases := []struct {
name string
sources map[string]catalog.CatalogSource
@ -472,7 +834,7 @@ func TestGetModel(t *testing.T) {
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
// Create service with test sources
service := NewModelCatalogServiceAPIService(tc.sources)
service := NewModelCatalogServiceAPIService(catalog.NewSourceCollection(tc.sources))
// Call GetModel
resp, _ := service.GetModel(
@ -501,3 +863,106 @@ func TestGetModel(t *testing.T) {
})
}
}
func TestGetAllModelArtifacts(t *testing.T) {
testCases := []struct {
name string
sources map[string]catalog.CatalogSource
sourceID string
modelName string
expectedStatus int
expectedArtifacts []model.CatalogModelArtifact
}{
{
name: "Existing artifacts for model in source",
sources: map[string]catalog.CatalogSource{
"source1": {
Metadata: model.CatalogSource{Id: "source1", Name: "Test Source"},
Provider: &mockModelProvider{
artifacts: map[string][]model.CatalogModelArtifact{
"test-model": {
{
Uri: "s3://bucket/artifact1",
},
{
Uri: "s3://bucket/artifact2",
},
},
},
},
},
},
sourceID: "source1",
modelName: "test-model",
expectedStatus: http.StatusOK,
expectedArtifacts: []model.CatalogModelArtifact{
{
Uri: "s3://bucket/artifact1",
},
{
Uri: "s3://bucket/artifact2",
},
},
},
{
name: "Non-existing source",
sources: map[string]catalog.CatalogSource{
"source1": {
Metadata: model.CatalogSource{Id: "source1", Name: "Test Source"},
},
},
sourceID: "source2",
modelName: "test-model",
expectedStatus: http.StatusNotFound,
expectedArtifacts: nil,
},
{
name: "Existing source, no artifacts for model",
sources: map[string]catalog.CatalogSource{
"source1": {
Metadata: model.CatalogSource{Id: "source1", Name: "Test Source"},
Provider: &mockModelProvider{
artifacts: map[string][]model.CatalogModelArtifact{},
},
},
},
sourceID: "source1",
modelName: "test-model",
expectedStatus: http.StatusOK,
expectedArtifacts: []model.CatalogModelArtifact{}, // Should be an empty slice, not nil
},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
// Create service with test sources
service := NewModelCatalogServiceAPIService(catalog.NewSourceCollection(tc.sources))
// Call GetAllModelArtifacts
resp, _ := service.GetAllModelArtifacts(
context.Background(),
tc.sourceID,
tc.modelName,
)
// Check response status
assert.Equal(t, tc.expectedStatus, resp.Code)
// If we expect an error or not found, we don't need to check the response body
if tc.expectedStatus != http.StatusOK {
return
}
// For successful responses, check the response body
require.NotNil(t, resp.Body)
// Type assertion to access the list of artifacts
artifactList, ok := resp.Body.(*model.CatalogModelArtifactList)
require.True(t, ok, "Response body should be a CatalogModelArtifactList")
// Check the artifacts
assert.Equal(t, tc.expectedArtifacts, artifactList.Items)
assert.Equal(t, int32(len(tc.expectedArtifacts)), artifactList.Size)
})
}
}

View File

@ -0,0 +1,123 @@
package openapi
import (
"encoding/base64"
"fmt"
"strconv"
"strings"
model "github.com/kubeflow/model-registry/catalog/pkg/openapi"
)
type paginator[T model.Sortable] struct {
PageSize int32
OrderBy model.OrderByField
SortOrder model.SortOrder
cursor *stringCursor
}
func newPaginator[T model.Sortable](pageSize string, orderBy model.OrderByField, sortOrder model.SortOrder, nextPageToken string) (*paginator[T], error) {
if orderBy != "" && !orderBy.IsValid() {
return nil, fmt.Errorf("unsupported order by field: %s", orderBy)
}
if sortOrder != "" && !sortOrder.IsValid() {
return nil, fmt.Errorf("unsupported sort order field: %s", sortOrder)
}
p := &paginator[T]{
PageSize: 10, // Default page size
OrderBy: orderBy,
SortOrder: sortOrder,
}
if pageSize != "" {
pageSize64, err := strconv.ParseInt(pageSize, 10, 32)
if err != nil {
return nil, fmt.Errorf("error converting page size to int32: %w", err)
}
p.PageSize = int32(pageSize64)
}
if nextPageToken != "" {
p.cursor = decodeStringCursor(nextPageToken)
}
return p, nil
}
func (p *paginator[T]) Token() string {
if p == nil || p.cursor == nil {
return ""
}
return p.cursor.String()
}
func (p *paginator[T]) Paginate(items []T) ([]T, *paginator[T]) {
startIndex := 0
if p.cursor != nil {
for i, item := range items {
itemValue := item.SortValue(p.OrderBy)
id := item.SortValue(model.ORDERBYFIELD_ID)
if id != "" && id == p.cursor.ID && itemValue == p.cursor.Value {
startIndex = i + 1
break
}
}
}
if startIndex >= len(items) {
return []T{}, nil
}
var pagedItems []T
var next *paginator[T]
endIndex := startIndex + int(p.PageSize)
if endIndex > len(items) {
endIndex = len(items)
}
pagedItems = items[startIndex:endIndex]
if endIndex < len(items) {
lastItem := pagedItems[len(pagedItems)-1]
lastItemID := lastItem.SortValue(model.ORDERBYFIELD_ID)
if lastItemID != "" {
next = &paginator[T]{
PageSize: p.PageSize,
OrderBy: p.OrderBy,
SortOrder: p.SortOrder,
cursor: &stringCursor{
Value: lastItem.SortValue(p.OrderBy),
ID: lastItemID,
},
}
}
}
return pagedItems, next
}
type stringCursor struct {
Value string
ID string
}
func (c *stringCursor) String() string {
return base64.StdEncoding.EncodeToString([]byte(fmt.Sprintf("%s:%s", c.Value, c.ID)))
}
func decodeStringCursor(encoded string) *stringCursor {
decoded, err := base64.StdEncoding.DecodeString(encoded)
if err != nil {
// Show the first page on a bad token.
return nil
}
parts := strings.SplitN(string(decoded), ":", 2)
if len(parts) != 2 {
return nil
}
return &stringCursor{
Value: parts[0],
ID: parts[1],
}
}
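A brief usage sketch of the paginator and its opaque cursor token; Paginate assumes the input slice is already sorted by the requested field, since the cursor resumes by locating the previous page's last item. Names and values below are illustrative:

// Sketch: first page, then continuation with the returned cursor token.
sortedSources := []model.CatalogSource{{Id: "a", Name: "A"}, {Id: "b", Name: "B"}, {Id: "c", Name: "C"}}
p, err := newPaginator[model.CatalogSource]("2", model.ORDERBYFIELD_NAME, model.SORTORDER_ASC, "")
if err != nil {
	// invalid page size, order-by field, or sort order
}
page1, next := p.Paginate(sortedSources) // first two items; Paginate expects a pre-sorted slice
token := next.Token()                    // base64("<sort value>:<id>"); "" when there is no next page
p2, _ := newPaginator[model.CatalogSource]("2", model.ORDERBYFIELD_NAME, model.SORTORDER_ASC, token)
page2, _ := p2.Paginate(sortedSources) // resumes after "B", returning the remaining item
_, _ = page1, page2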

View File

@ -0,0 +1,180 @@
package openapi
import (
"strconv"
"testing"
model "github.com/kubeflow/model-registry/catalog/pkg/openapi"
"github.com/stretchr/testify/assert"
)
func createCatalogSource(id int) model.CatalogSource {
return model.CatalogSource{
Id: "source" + strconv.Itoa(id),
Name: "Source " + strconv.Itoa(id),
}
}
func createCatalogSources(count int) []model.CatalogSource {
sources := make([]model.CatalogSource, count)
for i := 0; i < count; i++ {
sources[i] = createCatalogSource(i)
}
return sources
}
func TestPaginateSources(t *testing.T) {
allSources := createCatalogSources(25)
testCases := []struct {
name string
items []model.CatalogSource
pageSize string
orderBy model.OrderByField
nextPageToken string
expectedItemsCount int
expectedNextToken bool
expectedFirstID string
expectedLastID string
}{
{
name: "First page, full page",
items: allSources,
pageSize: "10",
orderBy: "ID",
nextPageToken: "",
expectedItemsCount: 10,
expectedNextToken: true,
expectedFirstID: "source0",
expectedLastID: "source9",
},
{
name: "Second page, full page",
items: allSources,
pageSize: "10",
orderBy: "ID",
nextPageToken: (&stringCursor{Value: "source9", ID: "source9"}).String(),
expectedItemsCount: 10,
expectedNextToken: true,
expectedFirstID: "source10",
expectedLastID: "source19",
},
{
name: "Last page, partial page",
items: allSources,
pageSize: "10",
orderBy: "ID",
nextPageToken: (&stringCursor{Value: "source19", ID: "source19"}).String(),
expectedItemsCount: 5,
expectedNextToken: false,
expectedFirstID: "source20",
expectedLastID: "source24",
},
{
name: "Page size larger than items",
items: allSources,
pageSize: "30",
orderBy: "ID",
nextPageToken: "",
expectedItemsCount: 25,
expectedNextToken: false,
expectedFirstID: "source0",
expectedLastID: "source24",
},
{
name: "Empty items",
items: []model.CatalogSource{},
pageSize: "10",
orderBy: "ID",
nextPageToken: "",
expectedItemsCount: 0,
expectedNextToken: false,
},
{
name: "Order by Name, first page",
items: allSources,
pageSize: "5",
orderBy: "NAME",
nextPageToken: "",
expectedItemsCount: 5,
expectedNextToken: true,
expectedFirstID: "source0",
expectedLastID: "source4",
},
{
name: "Order by Name, second page",
items: allSources,
pageSize: "5",
orderBy: "NAME",
nextPageToken: (&stringCursor{Value: "Source 4", ID: "source4"}).String(),
expectedItemsCount: 5,
expectedNextToken: true,
expectedFirstID: "source5",
expectedLastID: "source9",
},
{
name: "Invalid token",
items: allSources,
pageSize: "10",
orderBy: "ID",
nextPageToken: "invalid-token",
expectedItemsCount: 10,
expectedNextToken: true,
expectedFirstID: "source0",
expectedLastID: "source9",
},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
paginator, err := newPaginator[model.CatalogSource](tc.pageSize, tc.orderBy, "", tc.nextPageToken)
if !assert.NoError(t, err) {
return
}
pagedItems, newNextPageToken := paginator.Paginate(tc.items)
assert.Equal(t, tc.expectedItemsCount, len(pagedItems))
if tc.expectedNextToken {
assert.NotEmpty(t, newNextPageToken)
} else {
assert.Empty(t, newNextPageToken)
}
if tc.expectedItemsCount > 0 {
assert.Equal(t, tc.expectedFirstID, pagedItems[0].Id)
assert.Equal(t, tc.expectedLastID, pagedItems[len(pagedItems)-1].Id)
}
})
}
}
func TestPaginateSources_NoDuplicates(t *testing.T) {
allSources := createCatalogSources(100)
pageSize := "10"
orderBy := "ID"
seenItems := make(map[string]struct{}, len(allSources))
totalSeen := 0
paginator, err := newPaginator[model.CatalogSource](pageSize, model.OrderByField(orderBy), "", "")
if !assert.NoError(t, err) {
return
}
for paginator != nil {
var pagedItems []model.CatalogSource
pagedItems, paginator = paginator.Paginate(allSources)
for _, item := range pagedItems {
if _, ok := seenItems[item.Id]; ok {
t.Errorf("Duplicate item found: %s", item.Id)
}
seenItems[item.Id] = struct{}{}
}
totalSeen += len(pagedItems)
}
assert.Equal(t, len(allSources), totalSeen, "Total number of items seen should match the original slice")
}

View File

@ -16,6 +16,16 @@ import (
model "github.com/kubeflow/model-registry/catalog/pkg/openapi"
)
// AssertArtifactTypeQueryParamConstraints checks if the values respects the defined constraints
func AssertArtifactTypeQueryParamConstraints(obj model.ArtifactTypeQueryParam) error {
return nil
}
// AssertArtifactTypeQueryParamRequired checks if the required fields are not zero-ed
func AssertArtifactTypeQueryParamRequired(obj model.ArtifactTypeQueryParam) error {
return nil
}
// AssertBaseModelConstraints checks if the values respects the defined constraints
func AssertBaseModelConstraints(obj model.BaseModel) error {
return nil

View File

@ -1,6 +1,7 @@
api_model_catalog_service.go
client.go
configuration.go
model_artifact_type_query_param.go
model_base_model.go
model_base_resource_dates.go
model_base_resource_list.go

View File

@ -33,7 +33,7 @@ type ApiFindModelsRequest struct {
nextPageToken *string
}
// Filter models by source. If not provided, models from all sources are returned. If multiple sources are provided, models from any of the sources are returned.
// Filter models by source. This parameter is currently required and may only be specified once.
func (r ApiFindModelsRequest) Source(source string) ApiFindModelsRequest {
r.source = &source
return r
@ -107,10 +107,11 @@ func (a *ModelCatalogServiceAPIService) FindModelsExecute(r ApiFindModelsRequest
localVarHeaderParams := make(map[string]string)
localVarQueryParams := url.Values{}
localVarFormParams := url.Values{}
if r.source != nil {
parameterAddToHeaderOrQuery(localVarQueryParams, "source", r.source, "")
if r.source == nil {
return localVarReturnValue, nil, reportError("source is required and must be specified")
}
parameterAddToHeaderOrQuery(localVarQueryParams, "source", r.source, "")
if r.q != nil {
parameterAddToHeaderOrQuery(localVarQueryParams, "q", r.q, "")
}

View File

@ -0,0 +1,116 @@
/*
Model Catalog REST API
REST API for Model Registry to create and manage ML model metadata
API version: v1alpha1
*/
// Code generated by OpenAPI Generator (https://openapi-generator.tech); DO NOT EDIT.
package openapi
import (
"encoding/json"
"fmt"
)
// ArtifactTypeQueryParam Supported artifact types for querying.
type ArtifactTypeQueryParam string
// List of ArtifactTypeQueryParam
const (
ARTIFACTTYPEQUERYPARAM_MODEL_ARTIFACT ArtifactTypeQueryParam = "model-artifact"
ARTIFACTTYPEQUERYPARAM_DOC_ARTIFACT ArtifactTypeQueryParam = "doc-artifact"
ARTIFACTTYPEQUERYPARAM_DATASET_ARTIFACT ArtifactTypeQueryParam = "dataset-artifact"
ARTIFACTTYPEQUERYPARAM_METRIC ArtifactTypeQueryParam = "metric"
ARTIFACTTYPEQUERYPARAM_PARAMETER ArtifactTypeQueryParam = "parameter"
)
// All allowed values of ArtifactTypeQueryParam enum
var AllowedArtifactTypeQueryParamEnumValues = []ArtifactTypeQueryParam{
"model-artifact",
"doc-artifact",
"dataset-artifact",
"metric",
"parameter",
}
func (v *ArtifactTypeQueryParam) UnmarshalJSON(src []byte) error {
var value string
err := json.Unmarshal(src, &value)
if err != nil {
return err
}
enumTypeValue := ArtifactTypeQueryParam(value)
for _, existing := range AllowedArtifactTypeQueryParamEnumValues {
if existing == enumTypeValue {
*v = enumTypeValue
return nil
}
}
return fmt.Errorf("%+v is not a valid ArtifactTypeQueryParam", value)
}
// NewArtifactTypeQueryParamFromValue returns a pointer to a valid ArtifactTypeQueryParam
// for the value passed as argument, or an error if the value passed is not allowed by the enum
func NewArtifactTypeQueryParamFromValue(v string) (*ArtifactTypeQueryParam, error) {
ev := ArtifactTypeQueryParam(v)
if ev.IsValid() {
return &ev, nil
} else {
return nil, fmt.Errorf("invalid value '%v' for ArtifactTypeQueryParam: valid values are %v", v, AllowedArtifactTypeQueryParamEnumValues)
}
}
// IsValid return true if the value is valid for the enum, false otherwise
func (v ArtifactTypeQueryParam) IsValid() bool {
for _, existing := range AllowedArtifactTypeQueryParamEnumValues {
if existing == v {
return true
}
}
return false
}
// Ptr returns reference to ArtifactTypeQueryParam value
func (v ArtifactTypeQueryParam) Ptr() *ArtifactTypeQueryParam {
return &v
}
type NullableArtifactTypeQueryParam struct {
value *ArtifactTypeQueryParam
isSet bool
}
func (v NullableArtifactTypeQueryParam) Get() *ArtifactTypeQueryParam {
return v.value
}
func (v *NullableArtifactTypeQueryParam) Set(val *ArtifactTypeQueryParam) {
v.value = val
v.isSet = true
}
func (v NullableArtifactTypeQueryParam) IsSet() bool {
return v.isSet
}
func (v *NullableArtifactTypeQueryParam) Unset() {
v.value = nil
v.isSet = false
}
func NewNullableArtifactTypeQueryParam(val *ArtifactTypeQueryParam) *NullableArtifactTypeQueryParam {
return &NullableArtifactTypeQueryParam{value: val, isSet: true}
}
func (v NullableArtifactTypeQueryParam) MarshalJSON() ([]byte, error) {
return json.Marshal(v.value)
}
func (v *NullableArtifactTypeQueryParam) UnmarshalJSON(src []byte) error {
v.isSet = true
return json.Unmarshal(src, &v.value)
}

View File

@ -23,6 +23,8 @@ type CatalogSource struct {
Id string `json:"id"`
// The name of the catalog source.
Name string `json:"name"`
// Whether the catalog source is enabled.
Enabled *bool `json:"enabled,omitempty"`
}
// NewCatalogSource instantiates a new CatalogSource object
@ -33,6 +35,8 @@ func NewCatalogSource(id string, name string) *CatalogSource {
this := CatalogSource{}
this.Id = id
this.Name = name
var enabled bool = true
this.Enabled = &enabled
return &this
}
@ -41,6 +45,8 @@ func NewCatalogSource(id string, name string) *CatalogSource {
// but it doesn't guarantee that properties required by API are set
func NewCatalogSourceWithDefaults() *CatalogSource {
this := CatalogSource{}
var enabled bool = true
this.Enabled = &enabled
return &this
}
@ -92,6 +98,38 @@ func (o *CatalogSource) SetName(v string) {
o.Name = v
}
// GetEnabled returns the Enabled field value if set, zero value otherwise.
func (o *CatalogSource) GetEnabled() bool {
if o == nil || IsNil(o.Enabled) {
var ret bool
return ret
}
return *o.Enabled
}
// GetEnabledOk returns a tuple with the Enabled field value if set, nil otherwise
// and a boolean to check if the value has been set.
func (o *CatalogSource) GetEnabledOk() (*bool, bool) {
if o == nil || IsNil(o.Enabled) {
return nil, false
}
return o.Enabled, true
}
// HasEnabled returns a boolean if a field has been set.
func (o *CatalogSource) HasEnabled() bool {
if o != nil && !IsNil(o.Enabled) {
return true
}
return false
}
// SetEnabled gets a reference to the given bool and assigns it to the Enabled field.
func (o *CatalogSource) SetEnabled(v bool) {
o.Enabled = &v
}
func (o CatalogSource) MarshalJSON() ([]byte, error) {
toSerialize, err := o.ToMap()
if err != nil {
@ -104,6 +142,9 @@ func (o CatalogSource) ToMap() (map[string]interface{}, error) {
toSerialize := map[string]interface{}{}
toSerialize["id"] = o.Id
toSerialize["name"] = o.Name
if !IsNil(o.Enabled) {
toSerialize["enabled"] = o.Enabled
}
return toSerialize, nil
}

View File

@ -0,0 +1,37 @@
package openapi
type Sortable interface {
// SortValue returns the value of a requested field converted to a string.
SortValue(field OrderByField) string
}
func (s CatalogSource) SortValue(field OrderByField) string {
switch field {
case ORDERBYFIELD_ID:
return s.Id
case ORDERBYFIELD_NAME:
return s.Name
}
return ""
}
func (m CatalogModel) SortValue(field OrderByField) string {
switch field {
case ORDERBYFIELD_ID:
return m.Name // Name is ID for models
case ORDERBYFIELD_NAME:
return m.Name
case ORDERBYFIELD_LAST_UPDATE_TIME:
return unrefString(m.LastUpdateTimeSinceEpoch)
case ORDERBYFIELD_CREATE_TIME:
return unrefString(m.CreateTimeSinceEpoch)
}
return ""
}
func unrefString(v *string) string {
if v == nil {
return ""
}
return *v
}

View File

@ -6,4 +6,4 @@
/.python-version
__pycache__/
venv/
.port-forwards.pid
.hypothesis/

View File

@ -1,7 +1,8 @@
all: install tidy
IMG_VERSION ?= latest
IMG ?= ghcr.io/kubeflow/model-registry/server:${IMG_VERSION}
IMG ?= ghcr.io/kubeflow/model-registry/server
BUILD_IMAGE ?= true # whether to build the MR server image
.PHONY: install
install:
@ -16,7 +17,11 @@ clean:
.PHONY: deploy-latest-mr
deploy-latest-mr:
cd ../../ && IMG_VERSION=${IMG_VERSION} IMG=${IMG} make image/build ARGS="--load$(if ${DEV_BUILD}, --target dev-build)" && LOCAL=1 ./scripts/deploy_on_kind.sh
cd ../../ && \
$(if $(filter true,$(BUILD_IMAGE)),\
IMG_VERSION=${IMG_VERSION} IMG=${IMG} make image/build ARGS="--load$(if ${DEV_BUILD}, --target dev-build)" && \
) \
LOCAL=1 ./scripts/deploy_on_kind.sh
kubectl port-forward -n kubeflow services/model-registry-service 8080:8080 & echo $$! >> .port-forwards.pid
.PHONY: deploy-test-minio
@ -36,6 +41,16 @@ test-e2e: deploy-latest-mr deploy-local-registry deploy-test-minio
$(MAKE) test-e2e-cleanup
@exit $$STATUS
.PHONY: test-fuzz
test-fuzz: deploy-latest-mr deploy-local-registry deploy-test-minio
@echo "Starting test-fuzz"
poetry install --all-extras
@set -a; . ../../scripts/manifests/minio/.env; set +a; \
poetry run pytest --fuzz -v -s --hypothesis-show-statistics
@rm -f ../../scripts/manifests/minio/.env
$(MAKE) test-e2e-cleanup
@exit $$STATUS
.PHONY: test-e2e-run
test-e2e-run:
@echo "Ensuring all extras are installed..."
@ -47,6 +62,8 @@ test-e2e-run:
.PHONY: test-e2e-cleanup
test-e2e-cleanup:
@echo "Cleaning up database..."
cd ../../ && ./scripts/cleanup.sh
@echo "Cleaning up port-forward processes..."
@if [ -f .port-forwards.pid ]; then \
kill $$(cat .port-forwards.pid) || true; \

View File

@ -345,6 +345,12 @@ Then you can run tests:
make test test-e2e
```
Then you can run fuzz tests:
```bash
make test-fuzz
```
### Using Nox
Common tasks, such as building documentation and running tests, can be executed using [`nox`](https://github.com/wntrblm/nox) sessions.

View File

@ -60,6 +60,7 @@ def tests(session: Session) -> None:
"pytest-asyncio",
"uvloop",
"olot",
"schemathesis",
)
session.run(
"pytest",
@ -83,6 +84,7 @@ def e2e_tests(session: Session) -> None:
"boto3",
"olot",
"uvloop",
"schemathesis",
)
try:
session.run(
@ -99,6 +101,22 @@ def e2e_tests(session: Session) -> None:
session.notify("coverage", posargs=[])
@session(name="fuzz", python=python_versions)
def fuzz_tests(session: Session) -> None:
"""Run the fuzzing tests."""
session.install(
".",
"requests",
"pytest",
"uvloop",
"olot",
"schemathesis",
)
session.run(
"pytest",
"--fuzz",
"-rA",
)
@session(python=python_versions[0])
def coverage(session: Session) -> None:
"""Produce the coverage report."""

File diff suppressed because it is too large

View File

@ -1,6 +1,6 @@
[tool.poetry]
name = "model-registry"
version = "0.2.20"
version = "0.3.0"
description = "Client for Kubeflow Model Registry"
authors = ["Isabella Basso do Amaral <idoamara@redhat.com>"]
license = "Apache-2.0"
@ -26,7 +26,7 @@ nest-asyncio = "^1.6.0"
# necessary for modern type annotations using pydantic on 3.9
eval-type-backport = "^0.2.0"
huggingface-hub = { version = ">=0.20.1,<0.34.0", optional = true }
huggingface-hub = { version = ">=0.20.1,<0.35.0", optional = true }
olot = { version = "^0.1.6", optional = true }
boto3 = { version = "^1.37.34", optional = true }
@ -40,7 +40,7 @@ optional = true
[tool.poetry.group.docs.dependencies]
sphinx = "^7.2.6"
furo = ">=2023.9.10,<2025.0.0"
furo = ">=2023.9.10,<2026.0.0"
myst-parser = { extras = ["linkify"], version = ">=2,<4" }
sphinx-autobuild = ">=2021.3.14,<2025.0.0"
@ -55,11 +55,12 @@ ray = [
{version = "^2.43.0", python = ">=3.9, <3.13"}
]
uvloop = "^0.21.0"
pytest-asyncio = ">=0.23.7,<0.27.0"
pytest-asyncio = "^1.1.0"
requests = "^2.32.2"
black = ">=24.4.2,<26.0.0"
types-python-dateutil = "^2.9.0.20240906"
pytest-html = "^4.1.1"
schemathesis = ">=4.0.3"
[tool.coverage.run]
branch = true
@ -81,7 +82,10 @@ line-length = 119
[tool.pytest.ini_options]
asyncio_mode = "auto"
markers = ["e2e: end-to-end testing"]
markers = [
"e2e: end-to-end testing",
"fuzz: mark a test as a fuzzing (property-based or randomized) test"
]
[tool.ruff]
target-version = "py39"
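
The new `fuzz` marker is what the `make test-fuzz` target and the `fuzz` nox session select when they invoke `pytest --fuzz`. A minimal sketch of a test opting into the marker; the test body and name are invented for illustration, and the `--fuzz` flag itself is assumed to be wired up in the suite's `conftest.py`:

```python
import pytest


@pytest.mark.fuzz
def test_model_names_survive_roundtrip():
    # Illustrative property-style check only; the real fuzz suite is
    # generated by schemathesis against the running REST API.
    name = "demo-model"
    assert name == name.strip()
```

Marking the tests this way also makes it easy to exclude them from a normal run with `pytest -m "not fuzz"`.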

View File

@ -0,0 +1,5 @@
base-url = "${API_HOST}"
[generation]
# Don't shrink failing examples to save time
no-shrink = true

View File

@ -10,6 +10,7 @@ mr_openapi/models/artifact.py
mr_openapi/models/artifact_create.py
mr_openapi/models/artifact_list.py
mr_openapi/models/artifact_state.py
mr_openapi/models/artifact_type_query_param.py
mr_openapi/models/artifact_update.py
mr_openapi/models/base_model.py
mr_openapi/models/base_resource.py
@ -17,11 +18,25 @@ mr_openapi/models/base_resource_create.py
mr_openapi/models/base_resource_dates.py
mr_openapi/models/base_resource_list.py
mr_openapi/models/base_resource_update.py
mr_openapi/models/data_set.py
mr_openapi/models/data_set_create.py
mr_openapi/models/data_set_update.py
mr_openapi/models/doc_artifact.py
mr_openapi/models/doc_artifact_create.py
mr_openapi/models/doc_artifact_update.py
mr_openapi/models/error.py
mr_openapi/models/execution_state.py
mr_openapi/models/experiment.py
mr_openapi/models/experiment_create.py
mr_openapi/models/experiment_list.py
mr_openapi/models/experiment_run.py
mr_openapi/models/experiment_run_create.py
mr_openapi/models/experiment_run_list.py
mr_openapi/models/experiment_run_state.py
mr_openapi/models/experiment_run_status.py
mr_openapi/models/experiment_run_update.py
mr_openapi/models/experiment_state.py
mr_openapi/models/experiment_update.py
mr_openapi/models/inference_service.py
mr_openapi/models/inference_service_create.py
mr_openapi/models/inference_service_list.py
@ -34,6 +49,10 @@ mr_openapi/models/metadata_proto_value.py
mr_openapi/models/metadata_string_value.py
mr_openapi/models/metadata_struct_value.py
mr_openapi/models/metadata_value.py
mr_openapi/models/metric.py
mr_openapi/models/metric_create.py
mr_openapi/models/metric_list.py
mr_openapi/models/metric_update.py
mr_openapi/models/model_artifact.py
mr_openapi/models/model_artifact_create.py
mr_openapi/models/model_artifact_list.py
@ -44,6 +63,10 @@ mr_openapi/models/model_version_list.py
mr_openapi/models/model_version_state.py
mr_openapi/models/model_version_update.py
mr_openapi/models/order_by_field.py
mr_openapi/models/parameter.py
mr_openapi/models/parameter_create.py
mr_openapi/models/parameter_type.py
mr_openapi/models/parameter_update.py
mr_openapi/models/registered_model.py
mr_openapi/models/registered_model_create.py
mr_openapi/models/registered_model_list.py

View File

@ -1,6 +1,6 @@
"""Main package for the Kubeflow model registry."""
__version__ = "0.2.20"
__version__ = "0.3.0"
from ._client import ModelRegistry

View File

@ -22,6 +22,8 @@ from mr_openapi import (
)
from mr_openapi import (
ArtifactState,
DocArtifactCreate,
DocArtifactUpdate,
ModelArtifactCreate,
ModelArtifactUpdate,
)
@ -47,7 +49,7 @@ class Artifact(BaseResourceModel, ABC):
"""
name: str | None = None
uri: str
uri: str | None = None
state: ArtifactState = ArtifactState.UNKNOWN
@classmethod
@ -87,11 +89,23 @@ class DocArtifact(Artifact):
@override
def create(self, **kwargs) -> Any:
raise NotImplementedError
"""Create a new DocArtifactCreate object."""
return DocArtifactCreate(
customProperties=self._map_custom_properties(),
**self._props_as_dict(exclude=("id", "custom_properties")),
artifactType="doc-artifact",
**kwargs,
)
@override
def update(self, **kwargs) -> Any:
raise NotImplementedError
"""Create a new DocArtifactUpdate object."""
return DocArtifactUpdate(
customProperties=self._map_custom_properties(),
**self._props_as_dict(exclude=("id", "name", "custom_properties")),
artifactType="doc-artifact",
**kwargs,
)
@override
def as_basemodel(self) -> DocArtifactBaseModel:
@ -105,7 +119,6 @@ class DocArtifact(Artifact):
@override
def from_basemodel(cls, source: DocArtifactBaseModel) -> DocArtifact:
assert source.name
assert source.uri
assert source.state
return cls(
id=source.id,
@ -189,7 +202,6 @@ class ModelArtifact(Artifact):
def from_basemodel(cls, source: ModelArtifactBaseModel) -> ModelArtifact:
"""Create a new ModelArtifact object from a BaseModel object."""
assert source.name
assert source.uri
assert source.state
return cls(
id=source.id,
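
With this change `DocArtifact.create()` and `DocArtifact.update()` build real request payloads instead of raising `NotImplementedError`, and `uri` is no longer asserted when converting from the REST models. A rough sketch, assuming `DocArtifact` is importable from `model_registry.types` as in the existing client layout (the values are made up):

```python
from model_registry.types import DocArtifact  # assumed import path

card = DocArtifact(name="model-card", uri="https://example.com/card.md")
create_req = card.create()  # DocArtifactCreate payload (previously raised NotImplementedError)
update_req = card.update()  # DocArtifactUpdate payload (previously raised NotImplementedError)
```

Because `uri` is now optional on `Artifact`, a doc artifact without a physical location can also round-trip through `from_basemodel`.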

View File

@ -36,8 +36,9 @@ class ModelVersion(BaseResourceModel):
Attributes:
name: Name of this version.
author: Author of the model version.
description: Description of the object.
author: Author of this model version.
state: Status of this model version.
description: Description of this object.
external_id: Customizable ID. Has to be unique among instances of the same type.
artifacts: Artifacts associated with this version.
"""
@ -45,6 +46,7 @@ class ModelVersion(BaseResourceModel):
name: str
author: str | None = None
state: ModelVersionState = ModelVersionState.LIVE
registered_model_id: str | None = None
@override
def create(self, *, registered_model_id: str, **kwargs) -> ModelVersionCreate: # type: ignore[override]
@ -75,6 +77,7 @@ class ModelVersion(BaseResourceModel):
author=source.author,
description=source.description,
external_id=source.external_id,
registered_model_id=source.registered_model_id,
create_time_since_epoch=source.create_time_since_epoch,
last_update_time_since_epoch=source.last_update_time_since_epoch,
custom_properties=cls._unmap_custom_properties(source.custom_properties)
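
`registered_model_id` is now carried on the Python `ModelVersion` type and populated by `from_basemodel`, so a version keeps a reference to its parent model. A small illustrative sketch, again assuming the `model_registry.types` import path and made-up values:

```python
from model_registry.types import ModelVersion  # assumed import path

mv = ModelVersion(name="v1.0", author="alice", registered_model_id="42")
print(mv.registered_model_id)  # "42"; previously this association was not kept on the object
```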

View File

@ -75,6 +75,9 @@ Class | Method | HTTP request | Description
------------ | ------------- | ------------- | -------------
*ModelRegistryServiceApi* | [**create_artifact**](mr_openapi/docs/ModelRegistryServiceApi.md#create_artifact) | **POST** /api/model_registry/v1alpha3/artifacts | Create an Artifact
*ModelRegistryServiceApi* | [**create_environment_inference_service**](mr_openapi/docs/ModelRegistryServiceApi.md#create_environment_inference_service) | **POST** /api/model_registry/v1alpha3/serving_environments/{servingenvironmentId}/inference_services | Create a InferenceService in ServingEnvironment
*ModelRegistryServiceApi* | [**create_experiment**](mr_openapi/docs/ModelRegistryServiceApi.md#create_experiment) | **POST** /api/model_registry/v1alpha3/experiments | Create an Experiment
*ModelRegistryServiceApi* | [**create_experiment_experiment_run**](mr_openapi/docs/ModelRegistryServiceApi.md#create_experiment_experiment_run) | **POST** /api/model_registry/v1alpha3/experiments/{experimentId}/experiment_runs | Create an ExperimentRun in Experiment
*ModelRegistryServiceApi* | [**create_experiment_run**](mr_openapi/docs/ModelRegistryServiceApi.md#create_experiment_run) | **POST** /api/model_registry/v1alpha3/experiment_runs | Create an ExperimentRun
*ModelRegistryServiceApi* | [**create_inference_service**](mr_openapi/docs/ModelRegistryServiceApi.md#create_inference_service) | **POST** /api/model_registry/v1alpha3/inference_services | Create a InferenceService
*ModelRegistryServiceApi* | [**create_inference_service_serve**](mr_openapi/docs/ModelRegistryServiceApi.md#create_inference_service_serve) | **POST** /api/model_registry/v1alpha3/inference_services/{inferenceserviceId}/serves | Create a ServeModel action in a InferenceService
*ModelRegistryServiceApi* | [**create_model_artifact**](mr_openapi/docs/ModelRegistryServiceApi.md#create_model_artifact) | **POST** /api/model_registry/v1alpha3/model_artifacts | Create a ModelArtifact
@ -83,6 +86,8 @@ Class | Method | HTTP request | Description
*ModelRegistryServiceApi* | [**create_registered_model_version**](mr_openapi/docs/ModelRegistryServiceApi.md#create_registered_model_version) | **POST** /api/model_registry/v1alpha3/registered_models/{registeredmodelId}/versions | Create a ModelVersion in RegisteredModel
*ModelRegistryServiceApi* | [**create_serving_environment**](mr_openapi/docs/ModelRegistryServiceApi.md#create_serving_environment) | **POST** /api/model_registry/v1alpha3/serving_environments | Create a ServingEnvironment
*ModelRegistryServiceApi* | [**find_artifact**](mr_openapi/docs/ModelRegistryServiceApi.md#find_artifact) | **GET** /api/model_registry/v1alpha3/artifact | Get an Artifact that matches search parameters.
*ModelRegistryServiceApi* | [**find_experiment**](mr_openapi/docs/ModelRegistryServiceApi.md#find_experiment) | **GET** /api/model_registry/v1alpha3/experiment | Get an Experiment that matches search parameters.
*ModelRegistryServiceApi* | [**find_experiment_run**](mr_openapi/docs/ModelRegistryServiceApi.md#find_experiment_run) | **GET** /api/model_registry/v1alpha3/experiment_run | Get an ExperimentRun that matches search parameters.
*ModelRegistryServiceApi* | [**find_inference_service**](mr_openapi/docs/ModelRegistryServiceApi.md#find_inference_service) | **GET** /api/model_registry/v1alpha3/inference_service | Get an InferenceServices that matches search parameters.
*ModelRegistryServiceApi* | [**find_model_artifact**](mr_openapi/docs/ModelRegistryServiceApi.md#find_model_artifact) | **GET** /api/model_registry/v1alpha3/model_artifact | Get a ModelArtifact that matches search parameters.
*ModelRegistryServiceApi* | [**find_model_version**](mr_openapi/docs/ModelRegistryServiceApi.md#find_model_version) | **GET** /api/model_registry/v1alpha3/model_version | Get a ModelVersion that matches search parameters.
@ -91,6 +96,13 @@ Class | Method | HTTP request | Description
*ModelRegistryServiceApi* | [**get_artifact**](mr_openapi/docs/ModelRegistryServiceApi.md#get_artifact) | **GET** /api/model_registry/v1alpha3/artifacts/{id} | Get an Artifact
*ModelRegistryServiceApi* | [**get_artifacts**](mr_openapi/docs/ModelRegistryServiceApi.md#get_artifacts) | **GET** /api/model_registry/v1alpha3/artifacts | List All Artifacts
*ModelRegistryServiceApi* | [**get_environment_inference_services**](mr_openapi/docs/ModelRegistryServiceApi.md#get_environment_inference_services) | **GET** /api/model_registry/v1alpha3/serving_environments/{servingenvironmentId}/inference_services | List All ServingEnvironment&#39;s InferenceServices
*ModelRegistryServiceApi* | [**get_experiment**](mr_openapi/docs/ModelRegistryServiceApi.md#get_experiment) | **GET** /api/model_registry/v1alpha3/experiments/{experimentId} | Get an Experiment
*ModelRegistryServiceApi* | [**get_experiment_experiment_runs**](mr_openapi/docs/ModelRegistryServiceApi.md#get_experiment_experiment_runs) | **GET** /api/model_registry/v1alpha3/experiments/{experimentId}/experiment_runs | List All Experiment&#39;s ExperimentRuns
*ModelRegistryServiceApi* | [**get_experiment_run**](mr_openapi/docs/ModelRegistryServiceApi.md#get_experiment_run) | **GET** /api/model_registry/v1alpha3/experiment_runs/{experimentrunId} | Get an ExperimentRun
*ModelRegistryServiceApi* | [**get_experiment_run_artifacts**](mr_openapi/docs/ModelRegistryServiceApi.md#get_experiment_run_artifacts) | **GET** /api/model_registry/v1alpha3/experiment_runs/{experimentrunId}/artifacts | List all artifacts associated with the &#x60;ExperimentRun&#x60;
*ModelRegistryServiceApi* | [**get_experiment_run_metric_history**](mr_openapi/docs/ModelRegistryServiceApi.md#get_experiment_run_metric_history) | **GET** /api/model_registry/v1alpha3/experiment_runs/{experimentrunId}/metric_history | Get metric history for an ExperimentRun
*ModelRegistryServiceApi* | [**get_experiment_runs**](mr_openapi/docs/ModelRegistryServiceApi.md#get_experiment_runs) | **GET** /api/model_registry/v1alpha3/experiment_runs | List All ExperimentRuns
*ModelRegistryServiceApi* | [**get_experiments**](mr_openapi/docs/ModelRegistryServiceApi.md#get_experiments) | **GET** /api/model_registry/v1alpha3/experiments | List All Experiments
*ModelRegistryServiceApi* | [**get_inference_service**](mr_openapi/docs/ModelRegistryServiceApi.md#get_inference_service) | **GET** /api/model_registry/v1alpha3/inference_services/{inferenceserviceId} | Get a InferenceService
*ModelRegistryServiceApi* | [**get_inference_service_model**](mr_openapi/docs/ModelRegistryServiceApi.md#get_inference_service_model) | **GET** /api/model_registry/v1alpha3/inference_services/{inferenceserviceId}/model | Get InferenceService&#39;s RegisteredModel
*ModelRegistryServiceApi* | [**get_inference_service_serves**](mr_openapi/docs/ModelRegistryServiceApi.md#get_inference_service_serves) | **GET** /api/model_registry/v1alpha3/inference_services/{inferenceserviceId}/serves | List All InferenceService&#39;s ServeModel actions
@ -107,11 +119,14 @@ Class | Method | HTTP request | Description
*ModelRegistryServiceApi* | [**get_serving_environment**](mr_openapi/docs/ModelRegistryServiceApi.md#get_serving_environment) | **GET** /api/model_registry/v1alpha3/serving_environments/{servingenvironmentId} | Get a ServingEnvironment
*ModelRegistryServiceApi* | [**get_serving_environments**](mr_openapi/docs/ModelRegistryServiceApi.md#get_serving_environments) | **GET** /api/model_registry/v1alpha3/serving_environments | List All ServingEnvironments
*ModelRegistryServiceApi* | [**update_artifact**](mr_openapi/docs/ModelRegistryServiceApi.md#update_artifact) | **PATCH** /api/model_registry/v1alpha3/artifacts/{id} | Update an Artifact
*ModelRegistryServiceApi* | [**update_experiment**](mr_openapi/docs/ModelRegistryServiceApi.md#update_experiment) | **PATCH** /api/model_registry/v1alpha3/experiments/{experimentId} | Update an Experiment
*ModelRegistryServiceApi* | [**update_experiment_run**](mr_openapi/docs/ModelRegistryServiceApi.md#update_experiment_run) | **PATCH** /api/model_registry/v1alpha3/experiment_runs/{experimentrunId} | Update an ExperimentRun
*ModelRegistryServiceApi* | [**update_inference_service**](mr_openapi/docs/ModelRegistryServiceApi.md#update_inference_service) | **PATCH** /api/model_registry/v1alpha3/inference_services/{inferenceserviceId} | Update a InferenceService
*ModelRegistryServiceApi* | [**update_model_artifact**](mr_openapi/docs/ModelRegistryServiceApi.md#update_model_artifact) | **PATCH** /api/model_registry/v1alpha3/model_artifacts/{modelartifactId} | Update a ModelArtifact
*ModelRegistryServiceApi* | [**update_model_version**](mr_openapi/docs/ModelRegistryServiceApi.md#update_model_version) | **PATCH** /api/model_registry/v1alpha3/model_versions/{modelversionId} | Update a ModelVersion
*ModelRegistryServiceApi* | [**update_registered_model**](mr_openapi/docs/ModelRegistryServiceApi.md#update_registered_model) | **PATCH** /api/model_registry/v1alpha3/registered_models/{registeredmodelId} | Update a RegisteredModel
*ModelRegistryServiceApi* | [**update_serving_environment**](mr_openapi/docs/ModelRegistryServiceApi.md#update_serving_environment) | **PATCH** /api/model_registry/v1alpha3/serving_environments/{servingenvironmentId} | Update a ServingEnvironment
*ModelRegistryServiceApi* | [**upsert_experiment_run_artifact**](mr_openapi/docs/ModelRegistryServiceApi.md#upsert_experiment_run_artifact) | **POST** /api/model_registry/v1alpha3/experiment_runs/{experimentrunId}/artifacts | Upsert an Artifact in an ExperimentRun
*ModelRegistryServiceApi* | [**upsert_model_version_artifact**](mr_openapi/docs/ModelRegistryServiceApi.md#upsert_model_version_artifact) | **POST** /api/model_registry/v1alpha3/model_versions/{modelversionId}/artifacts | Upsert an Artifact in a ModelVersion
@ -121,6 +136,7 @@ Class | Method | HTTP request | Description
- [ArtifactCreate](mr_openapi/docs/ArtifactCreate.md)
- [ArtifactList](mr_openapi/docs/ArtifactList.md)
- [ArtifactState](mr_openapi/docs/ArtifactState.md)
- [ArtifactTypeQueryParam](mr_openapi/docs/ArtifactTypeQueryParam.md)
- [ArtifactUpdate](mr_openapi/docs/ArtifactUpdate.md)
- [BaseModel](mr_openapi/docs/BaseModel.md)
- [BaseResource](mr_openapi/docs/BaseResource.md)
@ -128,11 +144,25 @@ Class | Method | HTTP request | Description
- [BaseResourceDates](mr_openapi/docs/BaseResourceDates.md)
- [BaseResourceList](mr_openapi/docs/BaseResourceList.md)
- [BaseResourceUpdate](mr_openapi/docs/BaseResourceUpdate.md)
- [DataSet](mr_openapi/docs/DataSet.md)
- [DataSetCreate](mr_openapi/docs/DataSetCreate.md)
- [DataSetUpdate](mr_openapi/docs/DataSetUpdate.md)
- [DocArtifact](mr_openapi/docs/DocArtifact.md)
- [DocArtifactCreate](mr_openapi/docs/DocArtifactCreate.md)
- [DocArtifactUpdate](mr_openapi/docs/DocArtifactUpdate.md)
- [Error](mr_openapi/docs/Error.md)
- [ExecutionState](mr_openapi/docs/ExecutionState.md)
- [Experiment](mr_openapi/docs/Experiment.md)
- [ExperimentCreate](mr_openapi/docs/ExperimentCreate.md)
- [ExperimentList](mr_openapi/docs/ExperimentList.md)
- [ExperimentRun](mr_openapi/docs/ExperimentRun.md)
- [ExperimentRunCreate](mr_openapi/docs/ExperimentRunCreate.md)
- [ExperimentRunList](mr_openapi/docs/ExperimentRunList.md)
- [ExperimentRunState](mr_openapi/docs/ExperimentRunState.md)
- [ExperimentRunStatus](mr_openapi/docs/ExperimentRunStatus.md)
- [ExperimentRunUpdate](mr_openapi/docs/ExperimentRunUpdate.md)
- [ExperimentState](mr_openapi/docs/ExperimentState.md)
- [ExperimentUpdate](mr_openapi/docs/ExperimentUpdate.md)
- [InferenceService](mr_openapi/docs/InferenceService.md)
- [InferenceServiceCreate](mr_openapi/docs/InferenceServiceCreate.md)
- [InferenceServiceList](mr_openapi/docs/InferenceServiceList.md)
@ -145,6 +175,10 @@ Class | Method | HTTP request | Description
- [MetadataStringValue](mr_openapi/docs/MetadataStringValue.md)
- [MetadataStructValue](mr_openapi/docs/MetadataStructValue.md)
- [MetadataValue](mr_openapi/docs/MetadataValue.md)
- [Metric](mr_openapi/docs/Metric.md)
- [MetricCreate](mr_openapi/docs/MetricCreate.md)
- [MetricList](mr_openapi/docs/MetricList.md)
- [MetricUpdate](mr_openapi/docs/MetricUpdate.md)
- [ModelArtifact](mr_openapi/docs/ModelArtifact.md)
- [ModelArtifactCreate](mr_openapi/docs/ModelArtifactCreate.md)
- [ModelArtifactList](mr_openapi/docs/ModelArtifactList.md)
@ -155,6 +189,10 @@ Class | Method | HTTP request | Description
- [ModelVersionState](mr_openapi/docs/ModelVersionState.md)
- [ModelVersionUpdate](mr_openapi/docs/ModelVersionUpdate.md)
- [OrderByField](mr_openapi/docs/OrderByField.md)
- [Parameter](mr_openapi/docs/Parameter.md)
- [ParameterCreate](mr_openapi/docs/ParameterCreate.md)
- [ParameterType](mr_openapi/docs/ParameterType.md)
- [ParameterUpdate](mr_openapi/docs/ParameterUpdate.md)
- [RegisteredModel](mr_openapi/docs/RegisteredModel.md)
- [RegisteredModelCreate](mr_openapi/docs/RegisteredModelCreate.md)
- [RegisteredModelList](mr_openapi/docs/RegisteredModelList.md)
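
To give a flavour of the new experiment-tracking surface, a hedged sketch of calling one of the added endpoints through the generated asyncio client. The `Configuration`/`ApiClient` wiring and the `host` value are assumptions based on the usual openapi-generator output, not something this diff adds:

```python
import asyncio

import mr_openapi


async def main() -> None:
    # Assumed wiring: standard generated asyncio client against a local registry.
    config = mr_openapi.Configuration(host="http://localhost:8080")
    api = mr_openapi.ModelRegistryServiceApi(mr_openapi.ApiClient(config))
    experiment = await api.create_experiment(mr_openapi.ExperimentCreate(name="demo-experiment"))
    print(experiment.id)  # connection cleanup omitted for brevity


asyncio.run(main())
```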

View File

@ -35,6 +35,7 @@ from mr_openapi.models.artifact import Artifact
from mr_openapi.models.artifact_create import ArtifactCreate
from mr_openapi.models.artifact_list import ArtifactList
from mr_openapi.models.artifact_state import ArtifactState
from mr_openapi.models.artifact_type_query_param import ArtifactTypeQueryParam
from mr_openapi.models.artifact_update import ArtifactUpdate
from mr_openapi.models.base_model import BaseModel
from mr_openapi.models.base_resource import BaseResource
@ -42,11 +43,25 @@ from mr_openapi.models.base_resource_create import BaseResourceCreate
from mr_openapi.models.base_resource_dates import BaseResourceDates
from mr_openapi.models.base_resource_list import BaseResourceList
from mr_openapi.models.base_resource_update import BaseResourceUpdate
from mr_openapi.models.data_set import DataSet
from mr_openapi.models.data_set_create import DataSetCreate
from mr_openapi.models.data_set_update import DataSetUpdate
from mr_openapi.models.doc_artifact import DocArtifact
from mr_openapi.models.doc_artifact_create import DocArtifactCreate
from mr_openapi.models.doc_artifact_update import DocArtifactUpdate
from mr_openapi.models.error import Error
from mr_openapi.models.execution_state import ExecutionState
from mr_openapi.models.experiment import Experiment
from mr_openapi.models.experiment_create import ExperimentCreate
from mr_openapi.models.experiment_list import ExperimentList
from mr_openapi.models.experiment_run import ExperimentRun
from mr_openapi.models.experiment_run_create import ExperimentRunCreate
from mr_openapi.models.experiment_run_list import ExperimentRunList
from mr_openapi.models.experiment_run_state import ExperimentRunState
from mr_openapi.models.experiment_run_status import ExperimentRunStatus
from mr_openapi.models.experiment_run_update import ExperimentRunUpdate
from mr_openapi.models.experiment_state import ExperimentState
from mr_openapi.models.experiment_update import ExperimentUpdate
from mr_openapi.models.inference_service import InferenceService
from mr_openapi.models.inference_service_create import InferenceServiceCreate
from mr_openapi.models.inference_service_list import InferenceServiceList
@ -59,6 +74,10 @@ from mr_openapi.models.metadata_proto_value import MetadataProtoValue
from mr_openapi.models.metadata_string_value import MetadataStringValue
from mr_openapi.models.metadata_struct_value import MetadataStructValue
from mr_openapi.models.metadata_value import MetadataValue
from mr_openapi.models.metric import Metric
from mr_openapi.models.metric_create import MetricCreate
from mr_openapi.models.metric_list import MetricList
from mr_openapi.models.metric_update import MetricUpdate
from mr_openapi.models.model_artifact import ModelArtifact
from mr_openapi.models.model_artifact_create import ModelArtifactCreate
from mr_openapi.models.model_artifact_list import ModelArtifactList
@ -69,6 +88,10 @@ from mr_openapi.models.model_version_list import ModelVersionList
from mr_openapi.models.model_version_state import ModelVersionState
from mr_openapi.models.model_version_update import ModelVersionUpdate
from mr_openapi.models.order_by_field import OrderByField
from mr_openapi.models.parameter import Parameter
from mr_openapi.models.parameter_create import ParameterCreate
from mr_openapi.models.parameter_type import ParameterType
from mr_openapi.models.parameter_update import ParameterUpdate
from mr_openapi.models.registered_model import RegisteredModel
from mr_openapi.models.registered_model_create import RegisteredModelCreate
from mr_openapi.models.registered_model_list import RegisteredModelList

File diff suppressed because it is too large

View File

@ -18,6 +18,7 @@ from mr_openapi.models.artifact import Artifact
from mr_openapi.models.artifact_create import ArtifactCreate
from mr_openapi.models.artifact_list import ArtifactList
from mr_openapi.models.artifact_state import ArtifactState
from mr_openapi.models.artifact_type_query_param import ArtifactTypeQueryParam
from mr_openapi.models.artifact_update import ArtifactUpdate
from mr_openapi.models.base_model import BaseModel
from mr_openapi.models.base_resource import BaseResource
@ -25,11 +26,25 @@ from mr_openapi.models.base_resource_create import BaseResourceCreate
from mr_openapi.models.base_resource_dates import BaseResourceDates
from mr_openapi.models.base_resource_list import BaseResourceList
from mr_openapi.models.base_resource_update import BaseResourceUpdate
from mr_openapi.models.data_set import DataSet
from mr_openapi.models.data_set_create import DataSetCreate
from mr_openapi.models.data_set_update import DataSetUpdate
from mr_openapi.models.doc_artifact import DocArtifact
from mr_openapi.models.doc_artifact_create import DocArtifactCreate
from mr_openapi.models.doc_artifact_update import DocArtifactUpdate
from mr_openapi.models.error import Error
from mr_openapi.models.execution_state import ExecutionState
from mr_openapi.models.experiment import Experiment
from mr_openapi.models.experiment_create import ExperimentCreate
from mr_openapi.models.experiment_list import ExperimentList
from mr_openapi.models.experiment_run import ExperimentRun
from mr_openapi.models.experiment_run_create import ExperimentRunCreate
from mr_openapi.models.experiment_run_list import ExperimentRunList
from mr_openapi.models.experiment_run_state import ExperimentRunState
from mr_openapi.models.experiment_run_status import ExperimentRunStatus
from mr_openapi.models.experiment_run_update import ExperimentRunUpdate
from mr_openapi.models.experiment_state import ExperimentState
from mr_openapi.models.experiment_update import ExperimentUpdate
from mr_openapi.models.inference_service import InferenceService
from mr_openapi.models.inference_service_create import InferenceServiceCreate
from mr_openapi.models.inference_service_list import InferenceServiceList
@ -42,6 +57,10 @@ from mr_openapi.models.metadata_proto_value import MetadataProtoValue
from mr_openapi.models.metadata_string_value import MetadataStringValue
from mr_openapi.models.metadata_struct_value import MetadataStructValue
from mr_openapi.models.metadata_value import MetadataValue
from mr_openapi.models.metric import Metric
from mr_openapi.models.metric_create import MetricCreate
from mr_openapi.models.metric_list import MetricList
from mr_openapi.models.metric_update import MetricUpdate
from mr_openapi.models.model_artifact import ModelArtifact
from mr_openapi.models.model_artifact_create import ModelArtifactCreate
from mr_openapi.models.model_artifact_list import ModelArtifactList
@ -52,6 +71,10 @@ from mr_openapi.models.model_version_list import ModelVersionList
from mr_openapi.models.model_version_state import ModelVersionState
from mr_openapi.models.model_version_update import ModelVersionUpdate
from mr_openapi.models.order_by_field import OrderByField
from mr_openapi.models.parameter import Parameter
from mr_openapi.models.parameter_create import ParameterCreate
from mr_openapi.models.parameter_type import ParameterType
from mr_openapi.models.parameter_update import ParameterUpdate
from mr_openapi.models.registered_model import RegisteredModel
from mr_openapi.models.registered_model_create import RegisteredModelCreate
from mr_openapi.models.registered_model_list import RegisteredModelList

View File

@ -22,10 +22,13 @@ from pydantic import (
)
from typing_extensions import Self
from mr_openapi.models.data_set import DataSet
from mr_openapi.models.doc_artifact import DocArtifact
from mr_openapi.models.metric import Metric
from mr_openapi.models.model_artifact import ModelArtifact
from mr_openapi.models.parameter import Parameter
ARTIFACT_ONE_OF_SCHEMAS = ["DocArtifact", "ModelArtifact"]
ARTIFACT_ONE_OF_SCHEMAS = ["DataSet", "DocArtifact", "Metric", "ModelArtifact", "Parameter"]
class Artifact(BaseModel):
@ -35,8 +38,14 @@ class Artifact(BaseModel):
oneof_schema_1_validator: ModelArtifact | None = None
# data type: DocArtifact
oneof_schema_2_validator: DocArtifact | None = None
actual_instance: DocArtifact | ModelArtifact | None = None
one_of_schemas: set[str] = {"DocArtifact", "ModelArtifact"}
# data type: DataSet
oneof_schema_3_validator: DataSet | None = None
# data type: Metric
oneof_schema_4_validator: Metric | None = None
# data type: Parameter
oneof_schema_5_validator: Parameter | None = None
actual_instance: DataSet | DocArtifact | Metric | ModelArtifact | Parameter | None = None
one_of_schemas: set[str] = {"DataSet", "DocArtifact", "Metric", "ModelArtifact", "Parameter"}
model_config = ConfigDict(
validate_assignment=True,
@ -72,16 +81,31 @@ class Artifact(BaseModel):
error_messages.append(f"Error! Input type `{type(v)}` is not `DocArtifact`")
else:
match += 1
# validate data type: DataSet
if not isinstance(v, DataSet):
error_messages.append(f"Error! Input type `{type(v)}` is not `DataSet`")
else:
match += 1
# validate data type: Metric
if not isinstance(v, Metric):
error_messages.append(f"Error! Input type `{type(v)}` is not `Metric`")
else:
match += 1
# validate data type: Parameter
if not isinstance(v, Parameter):
error_messages.append(f"Error! Input type `{type(v)}` is not `Parameter`")
else:
match += 1
if match > 1:
# more than 1 match
raise ValueError(
"Multiple matches found when setting `actual_instance` in Artifact with oneOf schemas: DocArtifact, ModelArtifact. Details: "
"Multiple matches found when setting `actual_instance` in Artifact with oneOf schemas: DataSet, DocArtifact, Metric, ModelArtifact, Parameter. Details: "
+ ", ".join(error_messages)
)
if match == 0:
# no match
raise ValueError(
"No match found when setting `actual_instance` in Artifact with oneOf schemas: DocArtifact, ModelArtifact. Details: "
"No match found when setting `actual_instance` in Artifact with oneOf schemas: DataSet, DocArtifact, Metric, ModelArtifact, Parameter. Details: "
+ ", ".join(error_messages)
)
return v
@ -103,26 +127,56 @@ class Artifact(BaseModel):
msg = "Failed to lookup data type from the field `artifactType` in the input."
raise ValueError(msg)
# check if data type is `DataSet`
if _data_type == "dataset-artifact":
instance.actual_instance = DataSet.from_json(json_str)
return instance
# check if data type is `DocArtifact`
if _data_type == "doc-artifact":
instance.actual_instance = DocArtifact.from_json(json_str)
return instance
# check if data type is `Metric`
if _data_type == "metric":
instance.actual_instance = Metric.from_json(json_str)
return instance
# check if data type is `ModelArtifact`
if _data_type == "model-artifact":
instance.actual_instance = ModelArtifact.from_json(json_str)
return instance
# check if data type is `Parameter`
if _data_type == "parameter":
instance.actual_instance = Parameter.from_json(json_str)
return instance
# check if data type is `DataSet`
if _data_type == "DataSet":
instance.actual_instance = DataSet.from_json(json_str)
return instance
# check if data type is `DocArtifact`
if _data_type == "DocArtifact":
instance.actual_instance = DocArtifact.from_json(json_str)
return instance
# check if data type is `Metric`
if _data_type == "Metric":
instance.actual_instance = Metric.from_json(json_str)
return instance
# check if data type is `ModelArtifact`
if _data_type == "ModelArtifact":
instance.actual_instance = ModelArtifact.from_json(json_str)
return instance
# check if data type is `Parameter`
if _data_type == "Parameter":
instance.actual_instance = Parameter.from_json(json_str)
return instance
# deserialize data into ModelArtifact
try:
instance.actual_instance = ModelArtifact.from_json(json_str)
@ -135,17 +189,35 @@ class Artifact(BaseModel):
match += 1
except (ValidationError, ValueError) as e:
error_messages.append(str(e))
# deserialize data into DataSet
try:
instance.actual_instance = DataSet.from_json(json_str)
match += 1
except (ValidationError, ValueError) as e:
error_messages.append(str(e))
# deserialize data into Metric
try:
instance.actual_instance = Metric.from_json(json_str)
match += 1
except (ValidationError, ValueError) as e:
error_messages.append(str(e))
# deserialize data into Parameter
try:
instance.actual_instance = Parameter.from_json(json_str)
match += 1
except (ValidationError, ValueError) as e:
error_messages.append(str(e))
if match > 1:
# more than 1 match
raise ValueError(
"Multiple matches found when deserializing the JSON string into Artifact with oneOf schemas: DocArtifact, ModelArtifact. Details: "
"Multiple matches found when deserializing the JSON string into Artifact with oneOf schemas: DataSet, DocArtifact, Metric, ModelArtifact, Parameter. Details: "
+ ", ".join(error_messages)
)
if match == 0:
# no match
raise ValueError(
"No match found when deserializing the JSON string into Artifact with oneOf schemas: DocArtifact, ModelArtifact. Details: "
"No match found when deserializing the JSON string into Artifact with oneOf schemas: DataSet, DocArtifact, Metric, ModelArtifact, Parameter. Details: "
+ ", ".join(error_messages)
)
return instance
@ -159,7 +231,7 @@ class Artifact(BaseModel):
return self.actual_instance.to_json()
return json.dumps(self.actual_instance)
def to_dict(self) -> dict[str, Any] | DocArtifact | ModelArtifact | None:
def to_dict(self) -> dict[str, Any] | DataSet | DocArtifact | Metric | ModelArtifact | Parameter | None:
"""Returns the dict representation of the actual instance."""
if self.actual_instance is None:
return None
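
As a quick illustration of the widened `oneOf` handling, `Artifact.from_json` now dispatches `dataset-artifact`, `metric` and `parameter` payloads to the new model classes. The payload below is invented:

```python
import json

from mr_openapi.models.artifact import Artifact

payload = json.dumps({
    "artifactType": "dataset-artifact",  # also: doc-artifact, model-artifact, metric, parameter
    "name": "training-split-v1",
    "uri": "s3://example-bucket/train.parquet",
})

wrapper = Artifact.from_json(payload)
print(type(wrapper.actual_instance).__name__)  # DataSet
```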

View File

@ -22,10 +22,19 @@ from pydantic import (
)
from typing_extensions import Self
from mr_openapi.models.data_set_create import DataSetCreate
from mr_openapi.models.doc_artifact_create import DocArtifactCreate
from mr_openapi.models.metric_create import MetricCreate
from mr_openapi.models.model_artifact_create import ModelArtifactCreate
from mr_openapi.models.parameter_create import ParameterCreate
ARTIFACTCREATE_ONE_OF_SCHEMAS = ["DocArtifactCreate", "ModelArtifactCreate"]
ARTIFACTCREATE_ONE_OF_SCHEMAS = [
"DataSetCreate",
"DocArtifactCreate",
"MetricCreate",
"ModelArtifactCreate",
"ParameterCreate",
]
class ArtifactCreate(BaseModel):
@ -35,8 +44,22 @@ class ArtifactCreate(BaseModel):
oneof_schema_1_validator: ModelArtifactCreate | None = None
# data type: DocArtifactCreate
oneof_schema_2_validator: DocArtifactCreate | None = None
actual_instance: DocArtifactCreate | ModelArtifactCreate | None = None
one_of_schemas: set[str] = {"DocArtifactCreate", "ModelArtifactCreate"}
# data type: DataSetCreate
oneof_schema_3_validator: DataSetCreate | None = None
# data type: MetricCreate
oneof_schema_4_validator: MetricCreate | None = None
# data type: ParameterCreate
oneof_schema_5_validator: ParameterCreate | None = None
actual_instance: (
DataSetCreate | DocArtifactCreate | MetricCreate | ModelArtifactCreate | ParameterCreate | None
) = None
one_of_schemas: set[str] = {
"DataSetCreate",
"DocArtifactCreate",
"MetricCreate",
"ModelArtifactCreate",
"ParameterCreate",
}
model_config = ConfigDict(
validate_assignment=True,
@ -72,16 +95,31 @@ class ArtifactCreate(BaseModel):
error_messages.append(f"Error! Input type `{type(v)}` is not `DocArtifactCreate`")
else:
match += 1
# validate data type: DataSetCreate
if not isinstance(v, DataSetCreate):
error_messages.append(f"Error! Input type `{type(v)}` is not `DataSetCreate`")
else:
match += 1
# validate data type: MetricCreate
if not isinstance(v, MetricCreate):
error_messages.append(f"Error! Input type `{type(v)}` is not `MetricCreate`")
else:
match += 1
# validate data type: ParameterCreate
if not isinstance(v, ParameterCreate):
error_messages.append(f"Error! Input type `{type(v)}` is not `ParameterCreate`")
else:
match += 1
if match > 1:
# more than 1 match
raise ValueError(
"Multiple matches found when setting `actual_instance` in ArtifactCreate with oneOf schemas: DocArtifactCreate, ModelArtifactCreate. Details: "
"Multiple matches found when setting `actual_instance` in ArtifactCreate with oneOf schemas: DataSetCreate, DocArtifactCreate, MetricCreate, ModelArtifactCreate, ParameterCreate. Details: "
+ ", ".join(error_messages)
)
if match == 0:
# no match
raise ValueError(
"No match found when setting `actual_instance` in ArtifactCreate with oneOf schemas: DocArtifactCreate, ModelArtifactCreate. Details: "
"No match found when setting `actual_instance` in ArtifactCreate with oneOf schemas: DataSetCreate, DocArtifactCreate, MetricCreate, ModelArtifactCreate, ParameterCreate. Details: "
+ ", ".join(error_messages)
)
return v
@ -103,26 +141,56 @@ class ArtifactCreate(BaseModel):
msg = "Failed to lookup data type from the field `artifactType` in the input."
raise ValueError(msg)
# check if data type is `DataSetCreate`
if _data_type == "dataset-artifact":
instance.actual_instance = DataSetCreate.from_json(json_str)
return instance
# check if data type is `DocArtifactCreate`
if _data_type == "doc-artifact":
instance.actual_instance = DocArtifactCreate.from_json(json_str)
return instance
# check if data type is `MetricCreate`
if _data_type == "metric":
instance.actual_instance = MetricCreate.from_json(json_str)
return instance
# check if data type is `ModelArtifactCreate`
if _data_type == "model-artifact":
instance.actual_instance = ModelArtifactCreate.from_json(json_str)
return instance
# check if data type is `ParameterCreate`
if _data_type == "parameter":
instance.actual_instance = ParameterCreate.from_json(json_str)
return instance
# check if data type is `DataSetCreate`
if _data_type == "DataSetCreate":
instance.actual_instance = DataSetCreate.from_json(json_str)
return instance
# check if data type is `DocArtifactCreate`
if _data_type == "DocArtifactCreate":
instance.actual_instance = DocArtifactCreate.from_json(json_str)
return instance
# check if data type is `MetricCreate`
if _data_type == "MetricCreate":
instance.actual_instance = MetricCreate.from_json(json_str)
return instance
# check if data type is `ModelArtifactCreate`
if _data_type == "ModelArtifactCreate":
instance.actual_instance = ModelArtifactCreate.from_json(json_str)
return instance
# check if data type is `ParameterCreate`
if _data_type == "ParameterCreate":
instance.actual_instance = ParameterCreate.from_json(json_str)
return instance
# deserialize data into ModelArtifactCreate
try:
instance.actual_instance = ModelArtifactCreate.from_json(json_str)
@ -135,17 +203,35 @@ class ArtifactCreate(BaseModel):
match += 1
except (ValidationError, ValueError) as e:
error_messages.append(str(e))
# deserialize data into DataSetCreate
try:
instance.actual_instance = DataSetCreate.from_json(json_str)
match += 1
except (ValidationError, ValueError) as e:
error_messages.append(str(e))
# deserialize data into MetricCreate
try:
instance.actual_instance = MetricCreate.from_json(json_str)
match += 1
except (ValidationError, ValueError) as e:
error_messages.append(str(e))
# deserialize data into ParameterCreate
try:
instance.actual_instance = ParameterCreate.from_json(json_str)
match += 1
except (ValidationError, ValueError) as e:
error_messages.append(str(e))
if match > 1:
# more than 1 match
raise ValueError(
"Multiple matches found when deserializing the JSON string into ArtifactCreate with oneOf schemas: DocArtifactCreate, ModelArtifactCreate. Details: "
"Multiple matches found when deserializing the JSON string into ArtifactCreate with oneOf schemas: DataSetCreate, DocArtifactCreate, MetricCreate, ModelArtifactCreate, ParameterCreate. Details: "
+ ", ".join(error_messages)
)
if match == 0:
# no match
raise ValueError(
"No match found when deserializing the JSON string into ArtifactCreate with oneOf schemas: DocArtifactCreate, ModelArtifactCreate. Details: "
"No match found when deserializing the JSON string into ArtifactCreate with oneOf schemas: DataSetCreate, DocArtifactCreate, MetricCreate, ModelArtifactCreate, ParameterCreate. Details: "
+ ", ".join(error_messages)
)
return instance
@ -159,7 +245,17 @@ class ArtifactCreate(BaseModel):
return self.actual_instance.to_json()
return json.dumps(self.actual_instance)
def to_dict(self) -> dict[str, Any] | DocArtifactCreate | ModelArtifactCreate | None:
def to_dict(
self,
) -> (
dict[str, Any]
| DataSetCreate
| DocArtifactCreate
| MetricCreate
| ModelArtifactCreate
| ParameterCreate
| None
):
"""Returns the dict representation of the actual instance."""
if self.actual_instance is None:
return None

View File

@ -0,0 +1,34 @@
"""Model Registry REST API.
REST API for Model Registry to create and manage ML model metadata
The version of the OpenAPI document: v1alpha3
Generated by OpenAPI Generator (https://openapi-generator.tech)
Do not edit the class manually.
""" # noqa: E501
from __future__ import annotations
import json
from enum import Enum
from typing_extensions import Self
class ArtifactTypeQueryParam(str, Enum):
"""Supported artifact types for querying."""
"""
allowed enum values
"""
MODEL_MINUS_ARTIFACT = "model-artifact"
DOC_MINUS_ARTIFACT = "doc-artifact"
DATASET_MINUS_ARTIFACT = "dataset-artifact"
METRIC = "metric"
PARAMETER = "parameter"
@classmethod
def from_json(cls, json_str: str) -> Self:
"""Create an instance of ArtifactTypeQueryParam from a JSON string."""
return cls(json.loads(json_str))
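
A tiny usage sketch of the new enum (the string value is illustrative):

```python
from mr_openapi.models.artifact_type_query_param import ArtifactTypeQueryParam

param = ArtifactTypeQueryParam.from_json('"metric"')
assert param is ArtifactTypeQueryParam.METRIC
assert param.value == "metric"  # wire value used when filtering artifacts by type
```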

View File

@ -22,10 +22,19 @@ from pydantic import (
)
from typing_extensions import Self
from mr_openapi.models.data_set_update import DataSetUpdate
from mr_openapi.models.doc_artifact_update import DocArtifactUpdate
from mr_openapi.models.metric_update import MetricUpdate
from mr_openapi.models.model_artifact_update import ModelArtifactUpdate
from mr_openapi.models.parameter_update import ParameterUpdate
ARTIFACTUPDATE_ONE_OF_SCHEMAS = ["DocArtifactUpdate", "ModelArtifactUpdate"]
ARTIFACTUPDATE_ONE_OF_SCHEMAS = [
"DataSetUpdate",
"DocArtifactUpdate",
"MetricUpdate",
"ModelArtifactUpdate",
"ParameterUpdate",
]
class ArtifactUpdate(BaseModel):
@ -35,8 +44,22 @@ class ArtifactUpdate(BaseModel):
oneof_schema_1_validator: ModelArtifactUpdate | None = None
# data type: DocArtifactUpdate
oneof_schema_2_validator: DocArtifactUpdate | None = None
actual_instance: DocArtifactUpdate | ModelArtifactUpdate | None = None
one_of_schemas: set[str] = {"DocArtifactUpdate", "ModelArtifactUpdate"}
# data type: DataSetUpdate
oneof_schema_3_validator: DataSetUpdate | None = None
# data type: MetricUpdate
oneof_schema_4_validator: MetricUpdate | None = None
# data type: ParameterUpdate
oneof_schema_5_validator: ParameterUpdate | None = None
actual_instance: (
DataSetUpdate | DocArtifactUpdate | MetricUpdate | ModelArtifactUpdate | ParameterUpdate | None
) = None
one_of_schemas: set[str] = {
"DataSetUpdate",
"DocArtifactUpdate",
"MetricUpdate",
"ModelArtifactUpdate",
"ParameterUpdate",
}
model_config = ConfigDict(
validate_assignment=True,
@ -72,16 +95,31 @@ class ArtifactUpdate(BaseModel):
error_messages.append(f"Error! Input type `{type(v)}` is not `DocArtifactUpdate`")
else:
match += 1
# validate data type: DataSetUpdate
if not isinstance(v, DataSetUpdate):
error_messages.append(f"Error! Input type `{type(v)}` is not `DataSetUpdate`")
else:
match += 1
# validate data type: MetricUpdate
if not isinstance(v, MetricUpdate):
error_messages.append(f"Error! Input type `{type(v)}` is not `MetricUpdate`")
else:
match += 1
# validate data type: ParameterUpdate
if not isinstance(v, ParameterUpdate):
error_messages.append(f"Error! Input type `{type(v)}` is not `ParameterUpdate`")
else:
match += 1
if match > 1:
# more than 1 match
raise ValueError(
"Multiple matches found when setting `actual_instance` in ArtifactUpdate with oneOf schemas: DocArtifactUpdate, ModelArtifactUpdate. Details: "
"Multiple matches found when setting `actual_instance` in ArtifactUpdate with oneOf schemas: DataSetUpdate, DocArtifactUpdate, MetricUpdate, ModelArtifactUpdate, ParameterUpdate. Details: "
+ ", ".join(error_messages)
)
if match == 0:
# no match
raise ValueError(
"No match found when setting `actual_instance` in ArtifactUpdate with oneOf schemas: DocArtifactUpdate, ModelArtifactUpdate. Details: "
"No match found when setting `actual_instance` in ArtifactUpdate with oneOf schemas: DataSetUpdate, DocArtifactUpdate, MetricUpdate, ModelArtifactUpdate, ParameterUpdate. Details: "
+ ", ".join(error_messages)
)
return v
@ -103,26 +141,56 @@ class ArtifactUpdate(BaseModel):
msg = "Failed to lookup data type from the field `artifactType` in the input."
raise ValueError(msg)
# check if data type is `DataSetUpdate`
if _data_type == "dataset-artifact":
instance.actual_instance = DataSetUpdate.from_json(json_str)
return instance
# check if data type is `DocArtifactUpdate`
if _data_type == "doc-artifact":
instance.actual_instance = DocArtifactUpdate.from_json(json_str)
return instance
# check if data type is `MetricUpdate`
if _data_type == "metric":
instance.actual_instance = MetricUpdate.from_json(json_str)
return instance
# check if data type is `ModelArtifactUpdate`
if _data_type == "model-artifact":
instance.actual_instance = ModelArtifactUpdate.from_json(json_str)
return instance
# check if data type is `ParameterUpdate`
if _data_type == "parameter":
instance.actual_instance = ParameterUpdate.from_json(json_str)
return instance
# check if data type is `DataSetUpdate`
if _data_type == "DataSetUpdate":
instance.actual_instance = DataSetUpdate.from_json(json_str)
return instance
# check if data type is `DocArtifactUpdate`
if _data_type == "DocArtifactUpdate":
instance.actual_instance = DocArtifactUpdate.from_json(json_str)
return instance
# check if data type is `MetricUpdate`
if _data_type == "MetricUpdate":
instance.actual_instance = MetricUpdate.from_json(json_str)
return instance
# check if data type is `ModelArtifactUpdate`
if _data_type == "ModelArtifactUpdate":
instance.actual_instance = ModelArtifactUpdate.from_json(json_str)
return instance
# check if data type is `ParameterUpdate`
if _data_type == "ParameterUpdate":
instance.actual_instance = ParameterUpdate.from_json(json_str)
return instance
# deserialize data into ModelArtifactUpdate
try:
instance.actual_instance = ModelArtifactUpdate.from_json(json_str)
@ -135,17 +203,35 @@ class ArtifactUpdate(BaseModel):
match += 1
except (ValidationError, ValueError) as e:
error_messages.append(str(e))
# deserialize data into DataSetUpdate
try:
instance.actual_instance = DataSetUpdate.from_json(json_str)
match += 1
except (ValidationError, ValueError) as e:
error_messages.append(str(e))
# deserialize data into MetricUpdate
try:
instance.actual_instance = MetricUpdate.from_json(json_str)
match += 1
except (ValidationError, ValueError) as e:
error_messages.append(str(e))
# deserialize data into ParameterUpdate
try:
instance.actual_instance = ParameterUpdate.from_json(json_str)
match += 1
except (ValidationError, ValueError) as e:
error_messages.append(str(e))
if match > 1:
# more than 1 match
raise ValueError(
"Multiple matches found when deserializing the JSON string into ArtifactUpdate with oneOf schemas: DocArtifactUpdate, ModelArtifactUpdate. Details: "
"Multiple matches found when deserializing the JSON string into ArtifactUpdate with oneOf schemas: DataSetUpdate, DocArtifactUpdate, MetricUpdate, ModelArtifactUpdate, ParameterUpdate. Details: "
+ ", ".join(error_messages)
)
if match == 0:
# no match
raise ValueError(
"No match found when deserializing the JSON string into ArtifactUpdate with oneOf schemas: DocArtifactUpdate, ModelArtifactUpdate. Details: "
"No match found when deserializing the JSON string into ArtifactUpdate with oneOf schemas: DataSetUpdate, DocArtifactUpdate, MetricUpdate, ModelArtifactUpdate, ParameterUpdate. Details: "
+ ", ".join(error_messages)
)
return instance
@ -159,7 +245,17 @@ class ArtifactUpdate(BaseModel):
return self.actual_instance.to_json()
return json.dumps(self.actual_instance)
def to_dict(self) -> dict[str, Any] | DocArtifactUpdate | ModelArtifactUpdate | None:
def to_dict(
self,
) -> (
dict[str, Any]
| DataSetUpdate
| DocArtifactUpdate
| MetricUpdate
| ModelArtifactUpdate
| ParameterUpdate
| None
):
"""Returns the dict representation of the actual instance."""
if self.actual_instance is None:
return None

View File

@ -42,7 +42,7 @@ class BaseResource(BaseModel):
description: StrictStr | None = Field(default=None, description="An optional description about the resource.")
external_id: StrictStr | None = Field(
default=None,
description="The external id that come from the clients' system. This field is optional. If set, it must be unique among all resources within a database instance.",
description="The external id that come from the clients system. This field is optional. If set, it must be unique among all resources within a database instance.",
alias="externalId",
)
name: StrictStr | None = Field(

View File

@ -0,0 +1,173 @@
"""Model Registry REST API.
REST API for Model Registry to create and manage ML model metadata
The version of the OpenAPI document: v1alpha3
Generated by OpenAPI Generator (https://openapi-generator.tech)
Do not edit the class manually.
""" # noqa: E501
from __future__ import annotations
import json
import pprint
import re # noqa: F401
from typing import Any, ClassVar
from pydantic import BaseModel, ConfigDict, Field, StrictStr
from typing_extensions import Self
from mr_openapi.models.artifact_state import ArtifactState
from mr_openapi.models.metadata_value import MetadataValue
class DataSet(BaseModel):
"""A dataset artifact representing training or test data.""" # noqa: E501
custom_properties: dict[str, MetadataValue] | None = Field(
default=None,
description="User provided custom properties which are not defined by its type.",
alias="customProperties",
)
description: StrictStr | None = Field(default=None, description="An optional description about the resource.")
external_id: StrictStr | None = Field(
default=None,
description="The external id that come from the clients system. This field is optional. If set, it must be unique among all resources within a database instance.",
alias="externalId",
)
name: StrictStr | None = Field(
default=None,
description="The client provided name of the artifact. This field is optional. If set, it must be unique among all the artifacts of the same artifact type within a database instance and cannot be changed once set.",
)
id: StrictStr | None = Field(default=None, description="The unique server generated id of the resource.")
create_time_since_epoch: StrictStr | None = Field(
default=None,
description="Output only. Create time of the resource in millisecond since epoch.",
alias="createTimeSinceEpoch",
)
last_update_time_since_epoch: StrictStr | None = Field(
default=None,
description="Output only. Last update time of the resource since epoch in millisecond since epoch.",
alias="lastUpdateTimeSinceEpoch",
)
artifact_type: StrictStr | None = Field(default="dataset-artifact", alias="artifactType")
digest: StrictStr | None = Field(default=None, description="A unique hash or identifier for the dataset content.")
source_type: StrictStr | None = Field(
default=None,
description='The type of data source (e.g., "s3", "hdfs", "local", "database").',
alias="sourceType",
)
source: StrictStr | None = Field(
default=None, description="The location or connection string for the dataset source."
)
var_schema: StrictStr | None = Field(
default=None, description="JSON schema or description of the dataset structure.", alias="schema"
)
profile: StrictStr | None = Field(default=None, description="Statistical profile or summary of the dataset.")
uri: StrictStr | None = Field(
default=None,
description="The uniform resource identifier of the physical dataset. May be empty if there is no physical dataset.",
)
state: ArtifactState | None = None
__properties: ClassVar[list[str]] = [
"customProperties",
"description",
"externalId",
"name",
"id",
"createTimeSinceEpoch",
"lastUpdateTimeSinceEpoch",
"artifactType",
"digest",
"sourceType",
"source",
"schema",
"profile",
"uri",
"state",
]
model_config = ConfigDict(
populate_by_name=True,
validate_assignment=True,
protected_namespaces=(),
)
def to_str(self) -> str:
"""Returns the string representation of the model using alias."""
return pprint.pformat(self.model_dump(by_alias=True))
def to_json(self) -> str:
"""Returns the JSON representation of the model using alias."""
# TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
return json.dumps(self.to_dict())
@classmethod
def from_json(cls, json_str: str) -> Self | None:
"""Create an instance of DataSet from a JSON string."""
return cls.from_dict(json.loads(json_str))
def to_dict(self) -> dict[str, Any]:
"""Return the dictionary representation of the model using alias.
This has the following differences from calling pydantic's
`self.model_dump(by_alias=True)`:
* `None` is only added to the output dict for nullable fields that
were set at model initialization. Other fields with value `None`
are ignored.
* OpenAPI `readOnly` fields are excluded.
* OpenAPI `readOnly` fields are excluded.
"""
excluded_fields: set[str] = {
"create_time_since_epoch",
"last_update_time_since_epoch",
}
_dict = self.model_dump(
by_alias=True,
exclude=excluded_fields,
exclude_none=True,
)
# override the default output from pydantic by calling `to_dict()` of each value in custom_properties (dict)
_field_dict = {}
if self.custom_properties:
for _key in self.custom_properties:
if self.custom_properties[_key]:
_field_dict[_key] = self.custom_properties[_key].to_dict()
_dict["customProperties"] = _field_dict
return _dict
@classmethod
def from_dict(cls, obj: dict[str, Any] | None) -> Self | None:
"""Create an instance of DataSet from a dict."""
if obj is None:
return None
if not isinstance(obj, dict):
return cls.model_validate(obj)
return cls.model_validate(
{
"customProperties": (
{_k: MetadataValue.from_dict(_v) for _k, _v in obj["customProperties"].items()}
if obj.get("customProperties") is not None
else None
),
"description": obj.get("description"),
"externalId": obj.get("externalId"),
"name": obj.get("name"),
"id": obj.get("id"),
"createTimeSinceEpoch": obj.get("createTimeSinceEpoch"),
"lastUpdateTimeSinceEpoch": obj.get("lastUpdateTimeSinceEpoch"),
"artifactType": obj.get("artifactType") if obj.get("artifactType") is not None else "dataset-artifact",
"digest": obj.get("digest"),
"sourceType": obj.get("sourceType"),
"source": obj.get("source"),
"schema": obj.get("schema"),
"profile": obj.get("profile"),
"uri": obj.get("uri"),
"state": obj.get("state"),
}
)
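For reference, a minimal round-trip sketch for the new DataSet model (the mr_openapi.models.data_set module path is assumed from the import convention above; payload values are illustrative only):

from mr_openapi.models.data_set import DataSet

payload = {
    "name": "train-split",
    "artifactType": "dataset-artifact",
    "digest": "sha256:0f1e2d",
    "sourceType": "s3",
    "source": "s3://bucket/datasets/train.parquet",
    "schema": '{"type": "object"}',
    "createTimeSinceEpoch": "1724050000000",
}

ds = DataSet.from_dict(payload)
print(ds.var_schema)  # the "schema" key is exposed as var_schema to avoid the reserved attribute name
print(ds.to_dict())   # the read-only timestamp fields are excluded again on output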

View File

@@ -0,0 +1,151 @@
"""Model Registry REST API.
REST API for Model Registry to create and manage ML model metadata
The version of the OpenAPI document: v1alpha3
Generated by OpenAPI Generator (https://openapi-generator.tech)
Do not edit the class manually.
""" # noqa: E501
from __future__ import annotations
import json
import pprint
import re # noqa: F401
from typing import Any, ClassVar
from pydantic import BaseModel, ConfigDict, Field, StrictStr
from typing_extensions import Self
from mr_openapi.models.artifact_state import ArtifactState
from mr_openapi.models.metadata_value import MetadataValue
class DataSetCreate(BaseModel):
"""A dataset artifact to be created.""" # noqa: E501
custom_properties: dict[str, MetadataValue] | None = Field(
default=None,
description="User provided custom properties which are not defined by its type.",
alias="customProperties",
)
description: StrictStr | None = Field(default=None, description="An optional description about the resource.")
external_id: StrictStr | None = Field(
default=None,
description="The external id that come from the clients system. This field is optional. If set, it must be unique among all resources within a database instance.",
alias="externalId",
)
name: StrictStr | None = Field(
default=None,
description="The client provided name of the artifact. This field is optional. If set, it must be unique among all the artifacts of the same artifact type within a database instance and cannot be changed once set.",
)
artifact_type: StrictStr | None = Field(default="dataset-artifact", alias="artifactType")
digest: StrictStr | None = Field(default=None, description="A unique hash or identifier for the dataset content.")
source_type: StrictStr | None = Field(
default=None,
description='The type of data source (e.g., "s3", "hdfs", "local", "database").',
alias="sourceType",
)
source: StrictStr | None = Field(
default=None, description="The location or connection string for the dataset source."
)
var_schema: StrictStr | None = Field(
default=None, description="JSON schema or description of the dataset structure.", alias="schema"
)
profile: StrictStr | None = Field(default=None, description="Statistical profile or summary of the dataset.")
uri: StrictStr | None = Field(
default=None,
description="The uniform resource identifier of the physical dataset. May be empty if there is no physical dataset.",
)
state: ArtifactState | None = None
__properties: ClassVar[list[str]] = [
"customProperties",
"description",
"externalId",
"name",
"artifactType",
"digest",
"sourceType",
"source",
"schema",
"profile",
"uri",
"state",
]
model_config = ConfigDict(
populate_by_name=True,
validate_assignment=True,
protected_namespaces=(),
)
def to_str(self) -> str:
"""Returns the string representation of the model using alias."""
return pprint.pformat(self.model_dump(by_alias=True))
def to_json(self) -> str:
"""Returns the JSON representation of the model using alias."""
# TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
return json.dumps(self.to_dict())
@classmethod
def from_json(cls, json_str: str) -> Self | None:
"""Create an instance of DataSetCreate from a JSON string."""
return cls.from_dict(json.loads(json_str))
def to_dict(self) -> dict[str, Any]:
"""Return the dictionary representation of the model using alias.
This has the following differences from calling pydantic's
`self.model_dump(by_alias=True)`:
* `None` is only added to the output dict for nullable fields that
were set at model initialization. Other fields with value `None`
are ignored.
"""
excluded_fields: set[str] = set()
_dict = self.model_dump(
by_alias=True,
exclude=excluded_fields,
exclude_none=True,
)
# override the default output from pydantic by calling `to_dict()` of each value in custom_properties (dict)
_field_dict = {}
if self.custom_properties:
for _key in self.custom_properties:
if self.custom_properties[_key]:
_field_dict[_key] = self.custom_properties[_key].to_dict()
_dict["customProperties"] = _field_dict
return _dict
@classmethod
def from_dict(cls, obj: dict[str, Any] | None) -> Self | None:
"""Create an instance of DataSetCreate from a dict."""
if obj is None:
return None
if not isinstance(obj, dict):
return cls.model_validate(obj)
return cls.model_validate(
{
"customProperties": (
{_k: MetadataValue.from_dict(_v) for _k, _v in obj["customProperties"].items()}
if obj.get("customProperties") is not None
else None
),
"description": obj.get("description"),
"externalId": obj.get("externalId"),
"name": obj.get("name"),
"artifactType": obj.get("artifactType") if obj.get("artifactType") is not None else "dataset-artifact",
"digest": obj.get("digest"),
"sourceType": obj.get("sourceType"),
"source": obj.get("source"),
"schema": obj.get("schema"),
"profile": obj.get("profile"),
"uri": obj.get("uri"),
"state": obj.get("state"),
}
)
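A short sketch of building a create payload; because populate_by_name=True, either the python field names or the camelCase aliases can be used when constructing (module path assumed, as above):

from mr_openapi.models.data_set_create import DataSetCreate

create = DataSetCreate(
    name="train-split",
    source_type="s3",  # the alias "sourceType" would also be accepted
    source="s3://bucket/datasets/train.parquet",
    digest="sha256:0f1e2d",
)
body = create.to_json()  # serialized with camelCase aliases; None-valued fields are dropped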

View File

@@ -0,0 +1,145 @@
"""Model Registry REST API.
REST API for Model Registry to create and manage ML model metadata
The version of the OpenAPI document: v1alpha3
Generated by OpenAPI Generator (https://openapi-generator.tech)
Do not edit the class manually.
""" # noqa: E501
from __future__ import annotations
import json
import pprint
import re # noqa: F401
from typing import Any, ClassVar
from pydantic import BaseModel, ConfigDict, Field, StrictStr
from typing_extensions import Self
from mr_openapi.models.artifact_state import ArtifactState
from mr_openapi.models.metadata_value import MetadataValue
class DataSetUpdate(BaseModel):
"""A dataset artifact to be updated.""" # noqa: E501
custom_properties: dict[str, MetadataValue] | None = Field(
default=None,
description="User provided custom properties which are not defined by its type.",
alias="customProperties",
)
description: StrictStr | None = Field(default=None, description="An optional description about the resource.")
external_id: StrictStr | None = Field(
default=None,
description="The external id that come from the clients system. This field is optional. If set, it must be unique among all resources within a database instance.",
alias="externalId",
)
artifact_type: StrictStr | None = Field(default="dataset-artifact", alias="artifactType")
digest: StrictStr | None = Field(default=None, description="A unique hash or identifier for the dataset content.")
source_type: StrictStr | None = Field(
default=None,
description='The type of data source (e.g., "s3", "hdfs", "local", "database").',
alias="sourceType",
)
source: StrictStr | None = Field(
default=None, description="The location or connection string for the dataset source."
)
var_schema: StrictStr | None = Field(
default=None, description="JSON schema or description of the dataset structure.", alias="schema"
)
profile: StrictStr | None = Field(default=None, description="Statistical profile or summary of the dataset.")
uri: StrictStr | None = Field(
default=None,
description="The uniform resource identifier of the physical dataset. May be empty if there is no physical dataset.",
)
state: ArtifactState | None = None
__properties: ClassVar[list[str]] = [
"customProperties",
"description",
"externalId",
"artifactType",
"digest",
"sourceType",
"source",
"schema",
"profile",
"uri",
"state",
]
model_config = ConfigDict(
populate_by_name=True,
validate_assignment=True,
protected_namespaces=(),
)
def to_str(self) -> str:
"""Returns the string representation of the model using alias."""
return pprint.pformat(self.model_dump(by_alias=True))
def to_json(self) -> str:
"""Returns the JSON representation of the model using alias."""
# TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
return json.dumps(self.to_dict())
@classmethod
def from_json(cls, json_str: str) -> Self | None:
"""Create an instance of DataSetUpdate from a JSON string."""
return cls.from_dict(json.loads(json_str))
def to_dict(self) -> dict[str, Any]:
"""Return the dictionary representation of the model using alias.
This has the following differences from calling pydantic's
`self.model_dump(by_alias=True)`:
* `None` is only added to the output dict for nullable fields that
were set at model initialization. Other fields with value `None`
are ignored.
"""
excluded_fields: set[str] = set()
_dict = self.model_dump(
by_alias=True,
exclude=excluded_fields,
exclude_none=True,
)
# override the default output from pydantic by calling `to_dict()` of each value in custom_properties (dict)
_field_dict = {}
if self.custom_properties:
for _key in self.custom_properties:
if self.custom_properties[_key]:
_field_dict[_key] = self.custom_properties[_key].to_dict()
_dict["customProperties"] = _field_dict
return _dict
@classmethod
def from_dict(cls, obj: dict[str, Any] | None) -> Self | None:
"""Create an instance of DataSetUpdate from a dict."""
if obj is None:
return None
if not isinstance(obj, dict):
return cls.model_validate(obj)
return cls.model_validate(
{
"customProperties": (
{_k: MetadataValue.from_dict(_v) for _k, _v in obj["customProperties"].items()}
if obj.get("customProperties") is not None
else None
),
"description": obj.get("description"),
"externalId": obj.get("externalId"),
"artifactType": obj.get("artifactType") if obj.get("artifactType") is not None else "dataset-artifact",
"digest": obj.get("digest"),
"sourceType": obj.get("sourceType"),
"source": obj.get("source"),
"schema": obj.get("schema"),
"profile": obj.get("profile"),
"uri": obj.get("uri"),
"state": obj.get("state"),
}
)
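Because to_dict() serializes with exclude_none=True, a DataSetUpdate naturally yields a partial body containing only the fields that were set (plus the defaulted artifactType). A hedged sketch, module path assumed:

from mr_openapi.models.data_set_update import DataSetUpdate

patch = DataSetUpdate(description="refreshed nightly export")
print(patch.to_dict())  # {'description': 'refreshed nightly export', 'artifactType': 'dataset-artifact'}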

View File

@@ -0,0 +1,143 @@
"""Model Registry REST API.
REST API for Model Registry to create and manage ML model metadata
The version of the OpenAPI document: v1alpha3
Generated by OpenAPI Generator (https://openapi-generator.tech)
Do not edit the class manually.
""" # noqa: E501
from __future__ import annotations
import json
import pprint
import re # noqa: F401
from typing import Any, ClassVar
from pydantic import BaseModel, ConfigDict, Field, StrictStr
from typing_extensions import Self
from mr_openapi.models.experiment_state import ExperimentState
from mr_openapi.models.metadata_value import MetadataValue
class Experiment(BaseModel):
"""An experiment in model registry. An experiment has ExperimentRun children.""" # noqa: E501
custom_properties: dict[str, MetadataValue] | None = Field(
default=None,
description="User provided custom properties which are not defined by its type.",
alias="customProperties",
)
description: StrictStr | None = Field(default=None, description="An optional description about the resource.")
external_id: StrictStr | None = Field(
default=None,
description="The external id that come from the clients system. This field is optional. If set, it must be unique among all resources within a database instance.",
alias="externalId",
)
name: StrictStr = Field(
description="The client provided name of the experiment. It must be unique among all the Experiments of the same type within a Model Registry instance and cannot be changed once set."
)
id: StrictStr | None = Field(default=None, description="The unique server generated id of the resource.")
create_time_since_epoch: StrictStr | None = Field(
default=None,
description="Output only. Create time of the resource in millisecond since epoch.",
alias="createTimeSinceEpoch",
)
last_update_time_since_epoch: StrictStr | None = Field(
default=None,
description="Output only. Last update time of the resource since epoch in millisecond since epoch.",
alias="lastUpdateTimeSinceEpoch",
)
owner: StrictStr | None = None
state: ExperimentState | None = None
__properties: ClassVar[list[str]] = [
"customProperties",
"description",
"externalId",
"name",
"id",
"createTimeSinceEpoch",
"lastUpdateTimeSinceEpoch",
"owner",
"state",
]
model_config = ConfigDict(
populate_by_name=True,
validate_assignment=True,
protected_namespaces=(),
)
def to_str(self) -> str:
"""Returns the string representation of the model using alias."""
return pprint.pformat(self.model_dump(by_alias=True))
def to_json(self) -> str:
"""Returns the JSON representation of the model using alias."""
# TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
return json.dumps(self.to_dict())
@classmethod
def from_json(cls, json_str: str) -> Self | None:
"""Create an instance of Experiment from a JSON string."""
return cls.from_dict(json.loads(json_str))
def to_dict(self) -> dict[str, Any]:
"""Return the dictionary representation of the model using alias.
This has the following differences from calling pydantic's
`self.model_dump(by_alias=True)`:
* `None` is only added to the output dict for nullable fields that
were set at model initialization. Other fields with value `None`
are ignored.
* OpenAPI `readOnly` fields are excluded.
* OpenAPI `readOnly` fields are excluded.
"""
excluded_fields: set[str] = {
"create_time_since_epoch",
"last_update_time_since_epoch",
}
_dict = self.model_dump(
by_alias=True,
exclude=excluded_fields,
exclude_none=True,
)
# override the default output from pydantic by calling `to_dict()` of each value in custom_properties (dict)
_field_dict = {}
if self.custom_properties:
for _key in self.custom_properties:
if self.custom_properties[_key]:
_field_dict[_key] = self.custom_properties[_key].to_dict()
_dict["customProperties"] = _field_dict
return _dict
@classmethod
def from_dict(cls, obj: dict[str, Any] | None) -> Self | None:
"""Create an instance of Experiment from a dict."""
if obj is None:
return None
if not isinstance(obj, dict):
return cls.model_validate(obj)
return cls.model_validate(
{
"customProperties": (
{_k: MetadataValue.from_dict(_v) for _k, _v in obj["customProperties"].items()}
if obj.get("customProperties") is not None
else None
),
"description": obj.get("description"),
"externalId": obj.get("externalId"),
"name": obj.get("name"),
"id": obj.get("id"),
"createTimeSinceEpoch": obj.get("createTimeSinceEpoch"),
"lastUpdateTimeSinceEpoch": obj.get("lastUpdateTimeSinceEpoch"),
"owner": obj.get("owner"),
"state": obj.get("state"),
}
)

View File

@@ -0,0 +1,114 @@
"""Model Registry REST API.
REST API for Model Registry to create and manage ML model metadata
The version of the OpenAPI document: v1alpha3
Generated by OpenAPI Generator (https://openapi-generator.tech)
Do not edit the class manually.
""" # noqa: E501
from __future__ import annotations
import json
import pprint
import re # noqa: F401
from typing import Any, ClassVar
from pydantic import BaseModel, ConfigDict, Field, StrictStr
from typing_extensions import Self
from mr_openapi.models.experiment_state import ExperimentState
from mr_openapi.models.metadata_value import MetadataValue
class ExperimentCreate(BaseModel):
"""An experiment in model registry. An experiment has ExperimentRun children.""" # noqa: E501
custom_properties: dict[str, MetadataValue] | None = Field(
default=None,
description="User provided custom properties which are not defined by its type.",
alias="customProperties",
)
description: StrictStr | None = Field(default=None, description="An optional description about the resource.")
external_id: StrictStr | None = Field(
default=None,
description="The external id that come from the clients system. This field is optional. If set, it must be unique among all resources within a database instance.",
alias="externalId",
)
name: StrictStr = Field(
description="The client provided name of the experiment. It must be unique among all the Experiments of the same type within a Model Registry instance and cannot be changed once set."
)
owner: StrictStr | None = None
state: ExperimentState | None = None
__properties: ClassVar[list[str]] = ["customProperties", "description", "externalId", "name", "owner", "state"]
model_config = ConfigDict(
populate_by_name=True,
validate_assignment=True,
protected_namespaces=(),
)
def to_str(self) -> str:
"""Returns the string representation of the model using alias."""
return pprint.pformat(self.model_dump(by_alias=True))
def to_json(self) -> str:
"""Returns the JSON representation of the model using alias."""
# TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
return json.dumps(self.to_dict())
@classmethod
def from_json(cls, json_str: str) -> Self | None:
"""Create an instance of ExperimentCreate from a JSON string."""
return cls.from_dict(json.loads(json_str))
def to_dict(self) -> dict[str, Any]:
"""Return the dictionary representation of the model using alias.
This has the following differences from calling pydantic's
`self.model_dump(by_alias=True)`:
* `None` is only added to the output dict for nullable fields that
were set at model initialization. Other fields with value `None`
are ignored.
"""
excluded_fields: set[str] = set()
_dict = self.model_dump(
by_alias=True,
exclude=excluded_fields,
exclude_none=True,
)
# override the default output from pydantic by calling `to_dict()` of each value in custom_properties (dict)
_field_dict = {}
if self.custom_properties:
for _key in self.custom_properties:
if self.custom_properties[_key]:
_field_dict[_key] = self.custom_properties[_key].to_dict()
_dict["customProperties"] = _field_dict
return _dict
@classmethod
def from_dict(cls, obj: dict[str, Any] | None) -> Self | None:
"""Create an instance of ExperimentCreate from a dict."""
if obj is None:
return None
if not isinstance(obj, dict):
return cls.model_validate(obj)
return cls.model_validate(
{
"customProperties": (
{_k: MetadataValue.from_dict(_v) for _k, _v in obj["customProperties"].items()}
if obj.get("customProperties") is not None
else None
),
"description": obj.get("description"),
"externalId": obj.get("externalId"),
"name": obj.get("name"),
"owner": obj.get("owner"),
"state": obj.get("state"),
}
)
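Unlike the artifact models, name is required here (a StrictStr with no default), so pydantic rejects a create request without it. A minimal sketch, module path assumed:

from pydantic import ValidationError

from mr_openapi.models.experiment_create import ExperimentCreate

exp = ExperimentCreate(name="churn-prediction", owner="data-team")
print(exp.to_json())

try:
    ExperimentCreate(owner="data-team")  # missing the required "name"
except ValidationError as err:
    print(err)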

View File

@@ -0,0 +1,99 @@
"""Model Registry REST API.
REST API for Model Registry to create and manage ML model metadata
The version of the OpenAPI document: v1alpha3
Generated by OpenAPI Generator (https://openapi-generator.tech)
Do not edit the class manually.
""" # noqa: E501
from __future__ import annotations
import json
import pprint
import re # noqa: F401
from typing import Any, ClassVar
from pydantic import BaseModel, ConfigDict, Field, StrictInt, StrictStr
from typing_extensions import Self
from mr_openapi.models.experiment import Experiment
class ExperimentList(BaseModel):
"""List of Experiments.""" # noqa: E501
next_page_token: StrictStr = Field(
description="Token to use to retrieve next page of results.", alias="nextPageToken"
)
page_size: StrictInt = Field(description="Maximum number of resources to return in the result.", alias="pageSize")
size: StrictInt = Field(description="Number of items in result list.")
items: list[Experiment]
__properties: ClassVar[list[str]] = ["nextPageToken", "pageSize", "size", "items"]
model_config = ConfigDict(
populate_by_name=True,
validate_assignment=True,
protected_namespaces=(),
)
def to_str(self) -> str:
"""Returns the string representation of the model using alias."""
return pprint.pformat(self.model_dump(by_alias=True))
def to_json(self) -> str:
"""Returns the JSON representation of the model using alias."""
# TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
return json.dumps(self.to_dict())
@classmethod
def from_json(cls, json_str: str) -> Self | None:
"""Create an instance of ExperimentList from a JSON string."""
return cls.from_dict(json.loads(json_str))
def to_dict(self) -> dict[str, Any]:
"""Return the dictionary representation of the model using alias.
This has the following differences from calling pydantic's
`self.model_dump(by_alias=True)`:
* `None` is only added to the output dict for nullable fields that
were set at model initialization. Other fields with value `None`
are ignored.
"""
excluded_fields: set[str] = set()
_dict = self.model_dump(
by_alias=True,
exclude=excluded_fields,
exclude_none=True,
)
# override the default output from pydantic by calling `to_dict()` of each item in items (list)
_items = []
if self.items:
for _item in self.items:
if _item:
_items.append(_item.to_dict())
_dict["items"] = _items
return _dict
@classmethod
def from_dict(cls, obj: dict[str, Any] | None) -> Self | None:
"""Create an instance of ExperimentList from a dict."""
if obj is None:
return None
if not isinstance(obj, dict):
return cls.model_validate(obj)
return cls.model_validate(
{
"nextPageToken": obj.get("nextPageToken"),
"pageSize": obj.get("pageSize"),
"size": obj.get("size"),
"items": (
[Experiment.from_dict(_item) for _item in obj["items"]] if obj.get("items") is not None else None
),
}
)
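A brief sketch of consuming a paged list response with this model (module path assumed; the payload values are illustrative only):

from mr_openapi.models.experiment_list import ExperimentList

page = ExperimentList.from_json(
    '{"nextPageToken": "", "pageSize": 10, "size": 1,'
    ' "items": [{"name": "churn-prediction", "id": "42"}]}'
)
for exp in page.items:
    print(exp.name, exp.id)
if page.next_page_token:
    # a non-empty token is sent back on the next list request to continue paging
    pass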

Some files were not shown because too many files have changed in this diff.