Commit 22c19a7: rope delta order
Parent: 7daacb4

1 file changed, 2 additions and 2 deletions


src/transformers/models/glm4v/modeling_glm4v.py

@@ -1424,6 +1424,8 @@ def forward(
         **kwargs: Unpack[TransformersKwargs],
     ) -> Union[tuple, Glm4vCausalLMOutputWithPast]:
         r"""
+        rope_deltas (`torch.LongTensor` of shape `(batch_size, )`, *optional*):
+            The rope index difference between sequence length and multimodal rope.
         labels (`torch.LongTensor` of shape `(batch_size, sequence_length)`, *optional*):
             Labels for computing the masked language modeling loss. Indices should either be in `[0, ...,
             config.vocab_size]` or -100 (see `input_ids` docstring). Tokens with indices set to `-100` are ignored
@@ -1432,8 +1434,6 @@ def forward(
             The temporal, height and width of feature shape of each image in LLM.
         video_grid_thw (`torch.LongTensor` of shape `(num_videos, 3)`, *optional*):
             The temporal, height and width of feature shape of each video in LLM.
-        rope_deltas (`torch.LongTensor` of shape `(batch_size, )`, *optional*):
-            The rope index difference between sequence length and multimodal rope.
 
         Example:
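For context, the `rope_deltas` documented above records how far the largest multimodal RoPE position index runs ahead of (or behind) the plain token count, so later decoding steps can resume position ids at the right offset. Below is a minimal conceptual sketch, not the library's implementation: it assumes unpadded inputs and a `(3, batch_size, seq_len)` layout of temporal/height/width position ids, and the helper name `rope_delta` is purely illustrative.

import torch

def rope_delta(position_ids: torch.LongTensor) -> torch.LongTensor:
    # position_ids: (3, batch_size, seq_len) temporal/height/width rope indices
    # (illustrative layout assumption; not taken from modeling_glm4v.py)
    seq_len = position_ids.shape[-1]
    # highest rope index actually used per sample, minus the number of tokens
    return position_ids.amax(dim=(0, 2)) + 1 - seq_len  # shape: (batch_size,)

# Text-only positions 0..5 give a delta of 0; image/video tokens that share
# position indices would make the delta negative, and extra spatial extent positive.
pos = torch.arange(6).repeat(3, 1).unsqueeze(1)  # (3, 1, 6)
print(rope_delta(pos))                           # tensor([0])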
