of. Thus, in this context, the terms **label**, **index**, and
**variable** are synonymous and hence used interchangeably.

## What is a tensor network?

We now turn our attention to defining a **tensor network**, a mathematical
object used to represent a multilinear map between tensors. This concept is
widely employed in fields like condensed matter physics
[^Orus2014][^Pfeifer2014], quantum simulation [^Markov2008][^Pan2022], and
even in solving combinatorial optimization problems [^Liu2023]. It's worth
noting that we use a generalized version of the conventional notation, most
commonly known as the [einsum](https://numpy.org/doc/stable/reference/generated/numpy.einsum.html)
notation that is widely used in high performance computing.
This approach allows us to represent a broader range of sum-product
multilinear operations between tensors, thus meeting the requirements of the
PGM field.
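
To make the generalized notation concrete, here is a minimal sketch using the
`@ein_str` string macro from OMEinsum (assuming a recent release of the
package). The first contraction is a conventional matrix multiplication; the
second is a "star" contraction in which the label `a` appears in three
tensors, something the strict two-occurrence Einstein convention does not
cover:

```julia
using OMEinsum

# Conventional einsum: matrix multiplication, C[i,k] = Σⱼ A[i,j] B[j,k].
A, B = randn(2, 3), randn(3, 4)
C = ein"ij,jk->ik"(A, B)

# Generalized notation: the label `a` is shared by three tensors,
# T[i,j,k] = Σₐ X[a,i] Y[a,j] Z[a,k].
X, Y, Z = randn(5, 2), randn(5, 3), randn(5, 4)
T = ein"ai,aj,ak->ijk"(X, Y, Z)
```
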
*Definition*[^Liu2023]: A tensor network is a multilinear map represented by the triple

As a final note, our definition of a tensor network allows for repeated
indices within the same tensor, which translates to self-loops in their
corresponding diagrams.
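
For instance, a repeated label on a single tensor makes the trace and the
diagonal expressible directly. A minimal sketch with OMEinsum (API as in
current releases of the package):

```julia
using OMEinsum, LinearAlgebra

A = randn(4, 4)

# Both legs of `A` carry the same label `i`: a self-loop in the diagram.
# Summing it out yields the trace; the result is a 0-dimensional array.
ein"ii->"(A)[] ≈ tr(A)    # true

# Keeping the repeated label in the output extracts the diagonal instead.
ein"ii->i"(A) ≈ diag(A)   # true
```
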
## Tensor network contraction orders

The performance of a tensor network contraction depends on the order in which
the tensors are contracted. The order of contraction is usually specified by
binary trees, where the leaves are the input tensors and the internal nodes
represent the order of contraction. The root of the tree is the output tensor.
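
With OMEinsum, such a binary contraction tree can be written down explicitly
by nesting parentheses inside the `ein` string. A short sketch (the nested
form constructs a `NestedEinsum` in current versions of the package):

```julia
using OMEinsum

A, B, C = randn(2, 3), randn(3, 4), randn(4, 5)

# The parentheses encode the binary tree: contract A with B first,
# then the intermediate result with C.
r1 = ein"(ij,jk),kl->il"(A, B, C)

# The alternative tree contracts B with C first; both trees realize the
# same multilinear map, only the cost of evaluation differs.
r2 = ein"ij,(jk,kl)->il"(A, B, C)
```
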

Numerous approaches have been proposed to determine efficient contraction
orderings, which include:

- Greedy algorithms
- Breadth-first search and Dynamic programming [^Pfeifer2014]
- Graph bipartitioning [^Gray2021]
- Local search [^Kalachev2021]

Some of them have already been included in the [OMEinsum](https://github.com/under-Peter/OMEinsum.jl) package. Please check [Performance Tips](@ref) for more details.
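
As a rough sketch of how this looks in practice (the function and optimizer
names follow recent OMEinsum releases and may vary across versions):
`optimize_code` takes a contraction pattern, a dictionary of index sizes, and
an optimizer such as `GreedyMethod` or `TreeSA` (a local-search strategy),
and returns an optimized contraction tree:

```julia
using OMEinsum

code = ein"ij,jk,kl,lm->im"        # a flat, unoptimized contraction
sizes = uniformsize(code, 100)     # assume every index has dimension 100

# Find an efficient contraction order; TreeSA() would select the
# local-search strategy referenced above.
optcode = optimize_code(code, sizes, GreedyMethod())

# Inspect the time and space complexity of the optimized order.
contraction_complexity(optcode, sizes)
```
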

[^Pan2022]:
    Pan F, Chen K, Zhang P. Solving the sampling problem of the sycamore quantum circuits[J]. Physical Review Letters, 2022, 129(9): 090502.
[^Liu2023]:
    Liu J G, Gao X, Cain M, et al. Computing solution space properties of combinatorial optimization problems via generic tensor networks[J]. SIAM Journal on Scientific Computing, 2023, 45(3): A1239-A1270.