Commit ab449fa

Document Sync by Tina
1 parent 13ee8a7 commit ab449fa

2 files changed, +29 −0 lines
Lines changed: 4 additions & 0 deletions
@@ -0,0 +1,4 @@
{
  "label": "Developer Guide",
  "position": 6
}
Lines changed: 25 additions & 0 deletions
@@ -0,0 +1,25 @@
---
sidebar_position: 0
---

# Supporting New Hardware

ServerlessLLM actively expands support for new hardware configurations to meet diverse deployment needs.

## Support Standards

Hardware is considered supported by ServerlessLLM if:

1. Any of the inference backends used (e.g., Transformers, vLLM) can run model inference on the hardware.
2. ServerlessLLM Store can successfully load model checkpoints on the hardware.
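As a quick local hint for the first criterion, you can probe which GPU compiler toolchain is on your `PATH` before consulting each backend's documentation. This is a heuristic sketch only; the `detect_gpu_toolkit` helper is hypothetical and not part of ServerlessLLM, and a missing compiler does not by itself rule out runtime support:

```python
import shutil


def detect_gpu_toolkit():
    """Best-effort probe for a GPU compiler toolchain on PATH.

    Returns "cuda" if nvcc is found, "rocm" if hipcc is found,
    otherwise None. This is only a local hint; actual backend
    compatibility must be confirmed against each backend's docs.
    """
    if shutil.which("nvcc"):
        return "cuda"
    if shutil.which("hipcc"):
        return "rocm"
    return None


print("detected toolkit:", detect_gpu_toolkit())
```

Running this on a machine with the ROCm toolchain installed would print `rocm`, pointing you at the ROCm-specific sections of the backend documentation.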

## Steps to Support New Hardware

1. **Check inference backend compatibility**: Refer to the documentation of each inference backend (e.g., vLLM, Transformers) for its hardware support.
2. **Configure ServerlessLLM Store**:
   - If the hardware provides CUDA-compatible APIs (e.g., ROCm), adjust the build script (`CMakeLists.txt`) by adding the necessary compiler flags.
   - For hardware without CUDA-compatible APIs, a custom checkpoint loading function may need to be implemented.
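For the `CMakeLists.txt` adjustment, a ROCm-oriented change might look like the following. This is an illustrative sketch: the `USE_ROCM` option name and the specific flags are assumptions for demonstration, not ServerlessLLM's actual build configuration (requires CMake ≥ 3.21 for HIP language support):

```cmake
# Hypothetical build-script adjustment for a ROCm/HIP target.
# Option name and definitions are illustrative only.
option(USE_ROCM "Build the checkpoint loader against ROCm/HIP" OFF)

if(USE_ROCM)
  # Enable the HIP compiler toolchain (CMake >= 3.21).
  enable_language(HIP)
  # Let the C++ sources select ROCm code paths at compile time.
  add_compile_definitions(USE_ROCM)
endif()
```

Consult the hardware vendor's toolchain documentation for the exact compiler flags and architecture targets your device requires.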

## Verifying Hardware Support in ServerlessLLM Store

Hardware support is verified when the [Quick Start Guide](https://serverlessllm.github.io/docs/stable/getting_started/quickstart/) examples complete successfully, demonstrating checkpoint loading and inference without errors.

If the hardware is not publicly available (i.e., it cannot be tested by the ServerlessLLM team), a screenshot or output log of a successful run of the Quick Start Guide examples is required to verify support.

If you encounter any issues or have questions, please reach out to the ServerlessLLM team by raising an issue on the [GitHub repository](https://github.com/ServerlessLLM/ServerlessLLM/issues).
