## Rationale & Overview

Check out [our v1.0 release post](https://huggingface.co/blog/swift-transformers) and our [original announcement](https://huggingface.co/blog/swift-coreml-llm) for more context on why we built this library.

## Examples

The most commonly used modules from `swift-transformers` are `Tokenizers` and `Hub`, which provide fast tokenization and model downloads from the Hugging Face Hub. `Tokenizers` follows the abstractions in [`tokenizers`](https://github.com/huggingface/tokenizers), with support for chat templates and tool calling.

### Tokenizing text + chat templating

Tokenizing text should feel very familiar to those who have used the Python `transformers` library:

```swift
import Tokenizers

let tokenizer = try await AutoTokenizer.from(pretrained: "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B")
let messages = [["role": "user", "content": "Describe the Swift programming language."]]
let encoded = try tokenizer.applyChatTemplate(messages: messages)
let decoded = tokenizer.decode(tokens: encoded)
```
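
If you just need raw token IDs without a chat template, the same tokenizer object can encode and decode plain strings. A minimal sketch, assuming the `encode(text:)` and `decode(tokens:)` methods from the `Tokenizers` module:

```swift
// Plain tokenization round trip (assumed encode(text:)/decode(tokens:) API).
let ids = tokenizer.encode(text: "Swift is a general-purpose programming language.")
let roundTrip = tokenizer.decode(tokens: ids)
```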

### Tool calling

`swift-transformers` natively supports formatting inputs for tool calling, allowing for complex interactions with language models:

```swift
let tokenizer = try await AutoTokenizer.from(pretrained: "mlx-community/Qwen2.5-7B-Instruct-4bit")

// Tool definitions are JSON-schema-like dictionaries; the explicit [String: Any]
// annotation keeps the heterogeneous literal compiling.
let weatherTool: [String: Any] = [
    "type": "function",
    "function": [
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": [
            "type": "object",
            "properties": ["location": ["type": "string", "description": "City and state"]],
            "required": ["location"]
        ]
    ]
]

let tokens = try tokenizer.applyChatTemplate(
    messages: [["role": "user", "content": "What's the weather in Paris?"]],
    tools: [weatherTool]
)
```
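
The returned `tokens` already contain the tool definitions rendered through the model's chat template. As a small illustrative sketch, you can reuse `decode(tokens:)` from the snippet above to inspect the exact prompt the model will receive:

```swift
// Decode the templated token IDs to see the rendered prompt, tool schema included.
let renderedPrompt = tokenizer.decode(tokens: tokens)
print(renderedPrompt)
```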

### Hub downloads

Downloading models to a user device _fast_ and _reliably_ is a core requirement of on-device ML. `swift-transformers` provides a simple API to download models, tokenizers, and other config files from the Hugging Face Hub, with progress reporting, flaky connection handling, and more:

```swift
let repo = Hub.Repo(id: "mlx-community/Qwen2.5-0.5B-Instruct-2bit-mlx")
// … (the download call is elided in this excerpt; see the fuller sketch below)
print("Files downloaded to: \(modelDirectory.path)")
```
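
For reference, here is a fuller sketch of the download flow. It is hedged: it assumes the `HubApi.snapshot(from:matching:progressHandler:)` entry point, and the glob patterns are placeholders you would adapt to your repo:

```swift
import Hub

// Sketch: download a repo's config and weights with progress reporting.
// Assumes HubApi.snapshot(from:matching:progressHandler:); adjust the globs as needed.
let repo = Hub.Repo(id: "mlx-community/Qwen2.5-0.5B-Instruct-2bit-mlx")
let modelDirectory = try await HubApi().snapshot(
    from: repo,
    matching: ["config.json", "*.safetensors"]
) { progress in
    print("Download progress: \(Int(progress.fractionCompleted * 100))%")
}
print("Files downloaded to: \(modelDirectory.path)")
```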

### CoreML Integration

The `Models` and `Generation` modules provide handy utilities when working with language models in CoreML: `Models` offers a language-model abstraction over a Core ML package, and `Generation` handles tokenization and sampling (greedy search, top-k, and top-p) for you. Check out our example converting and running Mistral 7B using CoreML [here](https://github.com/huggingface/swift-transformers/tree/main/Examples).
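
A minimal, hypothetical sketch of that flow, assuming a `LanguageModel.loadCompiled(url:computeUnits:)` loader, `GenerationConfig`, and a `generate(config:prompt:)` helper along the lines of the Examples folder (check the Examples for the exact current API):

```swift
import CoreML
import Generation
import Models

// Hypothetical sketch: load a compiled Core ML language model and generate text.
// Assumes LanguageModel.loadCompiled(url:computeUnits:), GenerationConfig, and generate(config:prompt:).
let modelURL = URL(fileURLWithPath: "path/to/Model.mlmodelc")
let model = try LanguageModel.loadCompiled(url: modelURL, computeUnits: .cpuAndGPU)

let config = GenerationConfig(maxNewTokens: 100)
let output = try await model.generate(config: config, prompt: "Describe the Swift programming language.")
print(output)
```
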
The [modernization of Core ML](https://github.com/huggingface/swift-transformers/pull/257) and corresponding examples were primarily contributed by @joshnewnham, @1duo, @alejandro-isaza, @aseemw. Thank you 🙏