
Commit 0d2941c

[Inference Snippets] Reorder clients: openai > huggingface > requests/fetch (#1789)
Quickly discussed with @hanouticelina. The order in which the inference snippet clients were defined was a bit arbitrary. Let's always show the "openai" client first when possible (i.e. for `chat completion`), then `huggingface_hub`/`huggingface.js`, and then the raw `fetch`/`requests`. The Hub already showcases `openai` first by default; this update aligns the snippets in the docs as well.
1 parent 9d8cb9a commit 0d2941c

File tree

1 file changed: +2 −2 lines changed

packages/inference/src/snippets/getInferenceSnippets.ts

Lines changed: 2 additions & 2 deletions
```diff
@@ -24,8 +24,8 @@ export type InferenceSnippetOptions = {
 	inputs?: Record<string, unknown>; // overrides the default snippet's inputs
 } & Record<string, unknown>;
 
-const PYTHON_CLIENTS = ["huggingface_hub", "fal_client", "requests", "openai"] as const;
-const JS_CLIENTS = ["fetch", "huggingface.js", "openai"] as const;
+const PYTHON_CLIENTS = ["openai", "huggingface_hub", "fal_client", "requests"] as const;
+const JS_CLIENTS = ["openai", "huggingface.js", "fetch"] as const;
 const SH_CLIENTS = ["curl"] as const;
 
 type Client = (typeof SH_CLIENTS)[number] | (typeof PYTHON_CLIENTS)[number] | (typeof JS_CLIENTS)[number];
```
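To make the effect of the reordering concrete, here is a minimal TypeScript sketch of how an ordered client list can drive the order in which snippets are produced. The constants mirror the ones in the diff, but `supportsClient`, `getPythonSnippets`, the `InferenceSnippet` shape, and the `"conversational"` task check are illustrative assumptions, not the actual implementation in `getInferenceSnippets.ts`.

```ts
// Illustrative sketch only: the client arrays mirror the diff above, but the
// helpers below are hypothetical, not the real exports of getInferenceSnippets.ts.

const PYTHON_CLIENTS = ["openai", "huggingface_hub", "fal_client", "requests"] as const;
const JS_CLIENTS = ["openai", "huggingface.js", "fetch"] as const;
const SH_CLIENTS = ["curl"] as const;

type Client =
	| (typeof SH_CLIENTS)[number]
	| (typeof PYTHON_CLIENTS)[number]
	| (typeof JS_CLIENTS)[number];

interface InferenceSnippet {
	client: Client;
	language: "python" | "js" | "sh";
	content: string;
}

// Assumption: the OpenAI-compatible client only applies to chat completion
// ("conversational") tasks; other clients are always shown.
function supportsClient(client: Client, task: string): boolean {
	return client !== "openai" || task === "conversational";
}

function getPythonSnippets(model: string, task: string): InferenceSnippet[] {
	// Because PYTHON_CLIENTS now starts with "openai", iterating in array
	// order yields openai > huggingface_hub > fal_client > requests.
	return PYTHON_CLIENTS.filter((client) => supportsClient(client, task)).map(
		(client): InferenceSnippet => ({
			client,
			language: "python",
			content: `# ${client} snippet for ${model} (placeholder)`,
		})
	);
}

console.log(
	getPythonSnippets("meta-llama/Llama-3.1-8B-Instruct", "conversational").map((s) => s.client)
);
// -> [ 'openai', 'huggingface_hub', 'fal_client', 'requests' ]
```

The design point is simply that the display order is derived from the array order, so swapping the array entries (as this commit does) is enough to surface `openai` first wherever it applies.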

0 commit comments
