Merge pull request #22332 from aevesdocker/dmr-ex-fix

dmr-ex-fix
Usha Mandya 2025-03-31 15:46:12 +01:00 committed by GitHub
commit 1fc33445a2
1 changed file with 6 additions and 6 deletions


@@ -80,7 +80,7 @@ Output:
```text
Downloaded: 257.71 MB
-Model ai/smo11m2 pulled successfully
+Model ai/smollm2 pulled successfully
```

### List available models
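
The `pulled successfully` output in the hunk above is produced by pulling the model. With the corrected model name, the pull command would be along these lines (a minimal sketch; the exact command line on the page sits outside this diff's context window):

```console
$ docker model pull ai/smollm2
```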
@@ -105,7 +105,7 @@ Run a model and interact with it using a submitted prompt or in chat mode.
#### One-time prompt

```console
-$ docker model run ai/smo11m2 "Hi"
+$ docker model run ai/smollm2 "Hi"
```

Output:
@@ -117,7 +117,7 @@ Hello! How can I assist you today?
#### Interactive chat

```console
-docker model run ai/smo11m2
+docker model run ai/smollm2
```

Output:
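
Before starting an interactive session, you can confirm the model is present locally with the list command referenced in the `List available models` section earlier on the page (sketch only; the output columns are not shown in this diff):

```console
$ docker model list
```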
@@ -216,7 +216,7 @@ Examples of calling an OpenAI endpoint (`chat/completions`) from within another
curl http://model-runner.docker.internal/engines/llama.cpp/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
-        "model": "ai/smo11m2",
+        "model": "ai/smollm2",
        "messages": [
            {
                "role": "system",
@@ -242,7 +242,7 @@ curl --unix-socket $HOME/.docker/run/docker.sock \
    localhost/exp/vDD4.40/engines/llama.cpp/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
-        "model": "ai/smo11m2",
+        "model": "ai/smollm2",
        "messages": [
            {
                "role": "system",
@@ -269,7 +269,7 @@ Afterwards, interact with it as previously documented using `localhost` and the
curl http://localhost:12434/engines/llama.cpp/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
-        "model": "ai/smo11m2",
+        "model": "ai/smollm2",
        "messages": [
            {
                "role": "system",