
Commit 4af8cd2

[Term Entry] PyTorch Tensor Operations: .remainder()
* Create remainder.md with initial content structure: add the remainder.md file with the term entry template.
* Add the `.remainder()` term under PyTorch Tensor Operations, filling in the previously created remainder.md template at the following path: docs/content/pytorch/concepts/tensors/terms/remainder/remainder.md. Implements #7855.
* Update remainder.md
* Minor changes
1 parent ed00ec3 commit 4af8cd2

File tree

  • content/pytorch/concepts/tensors/terms/remainder

1 file changed: 108 additions & 0 deletions
---
Title: '.remainder()'
Description: 'Computes the element-wise remainder of tensor division, where the result’s sign matches the divisor.'
Subjects:
  - 'Computer Science'
  - 'Data Science'
Tags:
  - 'AI'
  - 'Deep Learning'
  - 'Functions'
CatalogContent:
  - 'intro-to-py-torch-and-neural-networks'
  - 'py-torch-for-classification'
---

In PyTorch, the **`.remainder()`** function computes the element-wise remainder of division between two tensors or between a tensor and a scalar. The result always has the same sign as the divisor, unlike `.fmod()`, which matches the sign of the dividend. This operation works with both integer and floating-point tensors.
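
As a quick illustration of this sign difference (with values chosen purely for demonstration), the following sketch contrasts the two functions on the same inputs:

```py
import torch

x = torch.tensor([-7, 7])

# .remainder(): the result's sign follows the divisor (3), so -7 maps to 2
print(torch.remainder(x, 3))  # tensor([2, 1])

# .fmod(): the result's sign follows the dividend, so -7 maps to -1
print(torch.fmod(x, 3))  # tensor([-1, 1])
```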

## Syntax

```pseudo
torch.remainder(input, other, *, out=None)
```

**Parameters:**

- `input` (Tensor): The input tensor containing the dividend values.
- `other` (Tensor or Number): The divisor. It can be a scalar or another tensor of compatible shape.
- `out` (Tensor, optional): The output tensor to store the result.

**Return value:**

Returns a tensor containing the element-wise remainder of the division.

- If `other` is a scalar, the same divisor is applied to every element.
- If `other` is a tensor, the remainder is computed element-wise between corresponding elements.
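
The optional `out` parameter writes the result into a preallocated tensor instead of allocating a new one. A minimal sketch of this usage (tensor values chosen for illustration):

```py
import torch

x = torch.tensor([5, -5, 9])
result = torch.empty(3, dtype=torch.int64)  # preallocated output tensor

torch.remainder(x, 4, out=result)
print(result)  # tensor([1, 3, 1])
```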

## Example 1: Divide a 1D Tensor by an Integer

This example computes the remainder of each element in `x` when divided by 3, keeping the sign consistent with the divisor:

```py
import torch

x = torch.tensor([-3, -4, -1, -6, 4, 7, 8])
print(torch.remainder(x, 3))
```

The output of this code is:

```shell
tensor([0, 2, 2, 0, 1, 1, 2])
```

## Example 2: Divide a 2D Tensor by an Integer

In this example, each negative number wraps around within the range `[0, 4)` since the remainder must match the divisor’s sign:

```py
import torch

A = torch.tensor([[ 1,  2,  3],
                  [-1, -2, -3]])
print(torch.remainder(A, 4))
```

The output of this code is:

```shell
tensor([[1, 2, 3],
        [3, 2, 1]])
```

## Example 3: Divide a Tensor by Another Tensor

This example demonstrates element-wise remainder calculation between two tensors of the same shape:

```py
import torch

num = torch.tensor([ 3, -3,  3, -3], dtype=torch.int32)
den = torch.tensor([ 2,  2, -2, -2], dtype=torch.int32)
print(torch.remainder(num, den))
```

The output of this code is:

```shell
tensor([ 1,  1, -1, -1], dtype=torch.int32)
```

> **Note:** If `input` and `other` don’t share the same shape, PyTorch tries automatic size expansion (broadcasting). Dimensions match when they’re equal or one of them is 1 (aligned from the right). If no match is possible, a size-mismatch error is raised.
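
As a rough sketch of this broadcasting behavior (shapes and values chosen for demonstration), a `(2, 3)` tensor can be paired with a `(3,)` divisor, which is expanded across both rows:

```py
import torch

A = torch.tensor([[ 10,  20,  30],
                  [-10, -20, -30]])  # shape (2, 3)
d = torch.tensor([3, 7, 9])          # shape (3,), broadcast across both rows

print(torch.remainder(A, d))
# tensor([[1, 6, 3],
#         [2, 1, 6]])
```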

## Example 4: Divide a Floating-Point Tensor by a Number

This example calculates remainders for floating-point values, preserving the sign of the divisor:

```py
import torch

xf = torch.tensor([-7.5, 7.5, 5.0])
print(torch.remainder(xf, 4.0))
```

The output of this code is:

```shell
tensor([0.5000, 3.5000, 1.0000])
```
