Let's consider another example with [Nx.weighted_mean](https://hexdocs.pm/nx/Nx.html#weighted_mean/3). It supports both full-tensor and per-axis operations. The example below shows how to compute the _weighted mean aggregate_ of a matrix, using a 2D tensor of shape `{2,2}` labeled `m`:
```elixir
# the ~MAT sigil comes from Nx, e.g. via: import Nx, only: :sigils
m = ~MAT[
  1 2
  3 4
]
```
First, we'll compute the full-tensor aggregation. The calculation is developed below: we take the "array product" (a.k.a. the [Hadamard product](<https://en.wikipedia.org/wiki/Hadamard_product_(matrices)>), an element-wise product) of our tensor with the tensor of weights, then sum all the elements and divide by the sum of the weights.
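A minimal sketch of this computation, assuming the `m` from above and a weights tensor `w` (the same weights are defined again further down), with the intermediate values worked out in the comments:

```elixir
w = ~MAT[
  10 20
  30 40
]

# Element-wise product m * w: [[10, 40], [90, 160]]
# Sum of the products: 10 + 40 + 90 + 160 = 300
# Sum of the weights:  10 + 20 + 30 + 40  = 100
# Weighted mean:       300 / 100 = 3.0
Nx.weighted_mean(m, w)
```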
The weighted mean can be computed _per axis_. Let's compute it along the _first_ axis (`axes: [0]`): you calculate "by column", so you aggregate/reduce along the first axis:
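Reusing `m` and `w` from above, a sketch of the per-column reduction:

```elixir
# Column 0: (1 * 10 + 3 * 30) / (10 + 30) = 100 / 40 = 2.5
# Column 1: (2 * 20 + 4 * 40) / (20 + 40) = 200 / 60 ≈ 3.3333
Nx.weighted_mean(m, w, axes: [0])
```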
We calculate the weighted mean of a square matrix along the _second_ axis (`axes: [1]`): you calculate per row, so you aggregate/reduce along the second axis. Let's first define the tensor of weights `w`:
```elixir
w = ~MAT[
  10 20
  30 40
]
```
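With `w` in place, the per-row call would look like this (a sketch; expected values worked out in the comments):

```elixir
# Row 0: (1 * 10 + 2 * 20) / (10 + 20) = 50 / 30 ≈ 1.6667
# Row 1: (3 * 30 + 4 * 40) / (30 + 40) = 250 / 70 ≈ 3.5714
Nx.weighted_mean(m, w, axes: [1])
```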
For `Nx.argmin` (and `Nx.argmax`) you have the `:tie_break` option to decide what happens when the result occurs several times, i.e. when there are ties. It defaults to `tie_break: :low`, which returns the lowest index.
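A small sketch of the difference (the tensor `t` below is made up for illustration):

```elixir
t = Nx.tensor([1, 0, 0, 2])

# The minimum 0 occurs at indices 1 and 2.
Nx.argmin(t)                   # tie_break: :low is the default -> index 1
Nx.argmin(t, tie_break: :high) # -> index 2
```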