# Deploy OpenAI open-source models

This guide demonstrates how to deploy and perform inference using OCI Data Science Service. In this example, we will use a model downloaded from Hugging Face, specifically [openai/gpt-oss-120b](https://huggingface.co/openai/gpt-oss-120b) from OpenAI.
## Required IAM Policies
Add these [policies](https://github.com/oracle-samples/oci-data-science-ai-samples).
## Setup

Create a data science notebook session with at least 400 GB of storage. We will use the notebook session to:

1. Download the model weights
2. Create a Model Catalog entry
3. Deploy the model

To prepare the inference container, we will use a local laptop, since this step requires Docker commands. The notebook session does not come with Docker tooling.

```shell
# Install required python packages
pip install oracle-ads
pip install oci
pip install huggingface_hub
```

## Prepare Inference container
vLLM is an easy-to-use library for LLM inference and serving. You can get the container image from [DockerHub](https://hub.docker.com/r/vllm/vllm-openai/tags).

Currently, OCI Data Science Model Deployment only supports container images residing in the OCI Registry. Before we can push the pulled vLLM container, make sure you have created a repository in your tenancy:

- Go to your tenancy Container Registry
- Click on the Create repository button
- Select Private under Access types
- Set a name for Repository name. We are using "vllm-odsc" in the example.
- Click on the Create button
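As an alternative to the Console steps above, the repository can also be created with the OCI CLI. This is a sketch, assuming the OCI CLI is installed and configured; the compartment OCID is a placeholder you must supply:

```shell
# Create a private container repository named "vllm-odsc" (as used in this guide)
oci artifacts container repository create \
  --compartment-id <compartment-ocid> \
  --display-name vllm-odsc \
  --is-public false
```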
You may need to `docker login` to the Oracle Cloud Container Registry (OCIR) first, if you haven't done so before, in order to push the image. To log in, use your API Auth Token, which can be created under your Oracle Cloud Account -> Auth Token. You need to log in only once. Replace `<region>` with the OCI region you are using.

If your tenancy is federated with Oracle Identity Cloud Service, use the format `<tenancy-namespace>/oracleidentitycloudservice/<username>`. You can then push the container image to the OCI Registry:
```shell
docker login -u '<tenancy-namespace>/<username>' <region>.ocir.io
docker tag vllm/vllm-openai:gptoss <region>.ocir.io/<tenancy>/vllm-odsc/vllm-openai:gptoss
docker push <region>.ocir.io/<tenancy>/vllm-odsc/vllm-openai:gptoss
```

## API Endpoint Usage

The `/v1/completions` endpoint is for interacting with non-chat base models or the instruction-trained chat model. It provides the completion for a single prompt and takes a single string as input, whereas the `/v1/chat/completions` endpoint provides the responses for a given dialog and requires the input in a specific format corresponding to the message history. This guide uses the `/v1/chat/completions` endpoint.
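The difference between the two payload shapes can be sketched as follows. This is a minimal illustration of the OpenAI-compatible request bodies that vLLM serves; the prompt text and `max_tokens` value are arbitrary example values:

```python
import json

# /v1/completions takes a single prompt string
completion_payload = {
    "model": "openai/gpt-oss-120b",
    "prompt": "List three uses of object storage.",
    "max_tokens": 200,
}

# /v1/chat/completions takes a structured message history
chat_payload = {
    "model": "openai/gpt-oss-120b",
    "messages": [
        {"role": "user", "content": "List three uses of object storage."},
    ],
    "max_tokens": 200,
}

print(json.dumps(chat_payload, indent=2))
```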

## Deployment

The following steps are to be performed in the OCI Notebook Session:
## Prepare The Model Artifacts

To prepare the model artifacts for LLM model deployment:

- Download the model files from HuggingFace to a local directory.
- Upload the model folder to a [versioned bucket](https://docs.oracle.com/en-us/iaas/Content/Object/Tasks/usingversioning.htm) in Oracle Object Storage. If you don't have an Object Storage bucket, create one using the OCI SDK or the Console. Make a note of the `namespace`, `compartment`, and `bucketname`. Configure the policies to allow the Data Science service to read and write the model artifact to the Object Storage bucket in your tenancy. An administrator must configure the policies in IAM in the Console.
- Create a model catalog entry for the model using the Object Storage path.
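The upload step in the list above can be done with the OCI CLI. This is a sketch, assuming the model was downloaded to a local `gpt-oss-120b` folder; the namespace, bucket name, and object prefix are placeholders you must replace with your own values:

```shell
# Bulk-upload the downloaded model folder to the versioned bucket
oci os object bulk-upload \
  --namespace <namespace> \
  --bucket-name <bucketname> \
  --src-dir gpt-oss-120b \
  --object-prefix gpt-oss-120b/
```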
### Model Download from HuggingFace Model Hub
[This documentation](https://huggingface.co/docs/huggingface_hub/en/guides/cli#download-an-entire-repository) provides more information on using `huggingface-cli` to download an entire repository at a given revision. Models in the HuggingFace hub are stored in their own repository.
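For example, to download the model used in this guide into a local folder (the folder name is an arbitrary choice; the command requires network access and the `huggingface_hub` package installed earlier):

```shell
# Download the entire model repository from the HuggingFace Hub
huggingface-cli download openai/gpt-oss-120b --local-dir gpt-oss-120b
```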

```python
model = (
    # ... (full model definition elided in this excerpt)
)
model.create(model_by_reference=True)
```