Thank you for your interest in contributing to vLLM! Our community is open to everyone and welcomes all kinds of contributions, no matter how small or large. There are several ways you can contribute to the project:
- Identify and report any issues or bugs.
- Request or add support for a new model.
- Suggest or implement new features.
- Improve documentation or contribute a how-to guide.
However, remember that contributions aren't just about code.
We also believe in the power of community support; thus, answering queries, offering PR reviews, and assisting others are also highly regarded and beneficial contributions.
Finally, one of the most impactful ways to support us is by raising awareness about vLLM. Talk about it in your blog posts and highlight how it's driving your incredible projects. Express your support on social media if you're using vLLM, or simply offer your appreciation by starring our repository!
## Developing
Depending on the kind of development you'd like to do (e.g. Python, CUDA), you can choose to build vLLM with or without compilation. Check out the [building from source](https://docs.vllm.ai/en/latest/getting_started/installation.html#build-from-source) documentation for details.
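For the common case of a full build, the editable install that this guide previously documented is shown below; exact flags and build time vary by platform and hardware, so treat it as a sketch and defer to the linked documentation:

```shell
# Editable (development) install from the repository root.
# This compiles vLLM's native kernels and may take several minutes.
pip install -e .
```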
## Testing
```bash
pip install -r requirements-dev.txt
mypy
# Unit tests
pytest tests/
```
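While iterating on a change, it is often faster to run only part of the suite. pytest's standard file-path selection and `-k` name filter work here; the expression below is an illustrative placeholder, not a vLLM-specific target:

```shell
# Run only the tests whose names match an expression (standard pytest flag);
# replace "expression" with a substring of the tests you care about.
pytest tests/ -k "expression"
```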
**Note:** Currently, the repository does not pass the ``mypy`` tests.
## Contribution Guidelines
### Issues
If you encounter a bug or have a feature request, please [search existing issues](https://github.com/vllm-project/vllm/issues?q=is%3Aissue) first to see if it has already been reported. If not, please [file a new issue](https://github.com/vllm-project/vllm/issues/new/choose), providing as much relevant information as possible.
> [!IMPORTANT]
> If you discover a security vulnerability, please follow the instructions [here](/SECURITY.md#reporting-a-vulnerability).
### Pull Requests & Code Reviews

Please check the PR checklist in the [PR template](.github/PULL_REQUEST_TEMPLATE

### Thank You
Finally, thank you for taking the time to read these guidelines and for your interest in contributing to vLLM.
All of your contributions help make vLLM a great tool and community for everyone!

---

# SECURITY.md

## Reporting a Vulnerability
If you believe you have found a security vulnerability in vLLM, we encourage you to let us know right away. We will investigate all legitimate reports and do our best to quickly fix the problem.
Please report security issues privately using [the vulnerability submission form](https://github.com/vllm-project/vllm/security/advisories/new).
---
Please see [PyTorch's Security Policy](https://github.com/pytorch/pytorch/blob/main/SECURITY.md) for more information and recommendations on how to securely interact with models.