This is the official PyTorch implementation for the CodeT5 EMNLP 2021 paper from Salesforce Research.
## Updates
**Oct 25, 2021**
We release a CodeT5-base fine-tuned checkpoint ([Salesforce/codet5-base-multi-sum](https://huggingface.co/Salesforce/codet5-base-multi-sum)) for multi-lingual code summarization. Below is how to use this model:
```python
from transformers import RobertaTokenizer, T5ForConditionalGeneration

tokenizer = RobertaTokenizer.from_pretrained("Salesforce/codet5-base-multi-sum")
model = T5ForConditionalGeneration.from_pretrained("Salesforce/codet5-base-multi-sum")

# a Python function to summarize
text = """def svg_to_image(string, size=None):
    if isinstance(string, unicode):
        string = string.encode('utf-8')
    renderer = QtSvg.QSvgRenderer(QtCore.QByteArray(string))
    if not renderer.isValid():
        raise ValueError('Invalid SVG data.')
    if size is None:
        size = renderer.defaultSize()
    image = QtGui.QImage(size, QtGui.QImage.Format_ARGB32)
    painter = QtGui.QPainter(image)
    renderer.render(painter)
    return image"""

input_ids = tokenizer(text, return_tensors="pt").input_ids
generated_ids = model.generate(input_ids, max_length=20)
print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))
# this prints: "Convert a SVG string to a QImage."
```
It significantly outperforms previous methods on code summarization in the [CodeXGLUE benchmark](https://github.com/microsoft/CodeXGLUE/tree/main/Code-Text/code-to-text):
| Model | Ruby | Javascript | Go | Python | Java | PHP | Overall |
We add a [model card](https://github.com/salesforce/CodeT5/blob/main/CodeT5_model_card.pdf) for CodeT5! Please reach out if you have any questions about it.
**Sep 24, 2021**
CodeT5 is now on [Hugging Face](https://huggingface.co/)!
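
As a minimal sketch of how the pre-trained model can be loaded with the `transformers` library (assuming the `Salesforce/codet5-base` checkpoint name on the Hugging Face Hub):

```python
from transformers import RobertaTokenizer, T5ForConditionalGeneration

# assumed checkpoint name on the Hugging Face Hub
tokenizer = RobertaTokenizer.from_pretrained("Salesforce/codet5-base")
model = T5ForConditionalGeneration.from_pretrained("Salesforce/codet5-base")

# masked span prediction: the model fills in the <extra_id_0> sentinel token
text = "def greet(user): print(f'hello <extra_id_0>!')"
input_ids = tokenizer(text, return_tensors="pt").input_ids

generated_ids = model.generate(input_ids, max_length=8)
print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))
```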