`docs/src/performancetips.md` (+1 −1)

@@ -23,7 +23,7 @@ Other optimizers include
One can type `?TreeSA` in a Julia REPL for more information about how to configure the hyper-parameters of the `TreeSA` method.
The `simplifier` keyword argument is less important; it is a preprocessing routine that improves the search speed of the `optimizer`.
-The returned instance `problem` contains a field `code` that specifies the tensor network contraction order. For an independence problem, its contraction time and space complexity is ``2^{{\rm tw}(G)}``, where ``{\rm tw}(G)`` is the [tree-width](https://en.wikipedia.org/wiki/Treewidth) of ``G``.
+The returned instance `problem` contains a field `code` that specifies the tensor network contraction order. For an independent set problem, its contraction time and space complexity is ``2^{{\rm tw}(G)}``, where ``{\rm tw}(G)`` is the [tree-width](https://en.wikipedia.org/wiki/Treewidth) of ``G``.
One can check the time, space and read-write complexity with the following function.
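To give a feel for what such a complexity report means, here is a hand-rolled sketch under a simplified cost model (our own Python illustration, not the package's `timespacereadwrite_complexity`): for a pairwise contraction of tensors with a uniform bond dimension, the `log2` time cost is the number of distinct indices involved and the `log2` space cost is the number of indices of the result.

```python
from math import log2

def pairwise_contraction_cost(indices_a, indices_b, kept, dim=2):
    """Estimate log2 time/space cost of contracting two tensors.

    Simplified model: every index has the same dimension `dim`; time counts
    one multiply-add per point of the combined index space, space counts the
    elements of the resulting tensor over the `kept` indices.
    """
    union = set(indices_a) | set(indices_b)
    time_log2 = log2(dim ** len(union))
    space_log2 = log2(dim ** len(kept))
    return time_log2, space_log2

# Contracting A[i,j,k] with B[k,l] into C[i,j,l], all bond dimensions 2:
t, s = pairwise_contraction_cost("ijk", "kl", "ijl")
print(t, s)  # 4.0 3.0
```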
`examples/IndependentSet.jl` (+29 −29)

@@ -16,40 +16,40 @@ locations = [[rot15(0.0, 1.0, i) for i=0:4]..., [rot15(0.0, 0.6, i) for i=0:4]..
show_graph(graph; locs=locations)
# ## Tensor network representation
-# Type [`IndependentSet`](@ref) can be used for constructing the tensor network with optimized contraction order for solving an independent set problem.
-# We map a vertex ``i\in V`` to a label ``s_i \in \{0, 1\}`` of dimension 2,
-# where we use 0 (1) to denote a vertex is absent (present) in the set.
+# Let ``G=(V,E)`` be the target graph that we want to solve.
+# The tensor network representation maps a vertex ``i\in V`` to a label ``s_i \in \{0, 1\}`` of dimension ``2``, where we use ``0`` (``1``) to denote that a vertex is absent (present) in the set.
# For each label ``s_i``, we define a parametrized rank-one vertex tensor ``W(x_i)`` as
-# ```math
-# W(x_i)_{s_i} = \left(\begin{matrix}
-# 1 \\
-# x_i
-# \end{matrix}\right)_{s_i}
-# ```
-# We use subscripts to index tensor elements, e.g. ``W(x_i)_0=1`` is the first element associated with ``s_i=0`` and ``W(x_i)_1=x_i`` is the second element associated with ``s_i=1``.
+# \begin{equation}
+# W(x_i) = \left(\begin{matrix}
+# 1 \\
+# x_i
+# \end{matrix}\right).
+# \end{equation}
+# We use subscripts to index tensor elements, e.g. ``W(x_i)_0=1`` is the first element associated with ``s_i=0`` and ``W(x_i)_1=x_i`` is the second element associated with ``s_i=1``.
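As a quick numerical illustration (a Python/NumPy sketch of our own, not the package's Julia API), the vertex tensor is simply the length-2 vector `[1, x]`:

```python
import numpy as np

def vertex_tensor(x):
    # W(x)[0] = 1 encodes "vertex absent"; W(x)[1] = x encodes "vertex present".
    return np.array([1.0, x])

W = vertex_tensor(2.0)
print(W[0], W[1])  # 1.0 2.0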
# Similarly, on each edge ``(u, v)``, we define a matrix ``B`` indexed by ``s_u`` and ``s_v`` as
-# ```math
-# B_{s_i s_j} = \left(\begin{matrix}
-# 1 & 1\\
-# 1 & 0
-# \end{matrix}\right)_{s_i s_j}
-# ```
-# Let us construct the problem instance with an optimized tensor network contraction order as below.
+# \begin{equation}
+# \qquad \quad
+# B = \left(\begin{matrix}
+# 1 & 1\\
+# 1 & 0
+# \end{matrix}\right). \label{eq:edgetensor}
+# \end{equation}
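In the same illustrative NumPy sketch (our own names, not the package's API), one can verify that the edge tensor `B` encodes the independence constraint: its entry is zero exactly when both endpoints of an edge are present.

```python
import numpy as np

# B[s_u, s_v] = 0 exactly when s_u = s_v = 1 (both endpoints present),
# so configurations violating independence contribute zero weight.
B = np.array([[1.0, 1.0],
              [1.0, 0.0]])

for s_u in (0, 1):
    for s_v in (0, 1):
        allowed = not (s_u == 1 and s_v == 1)
        print(s_u, s_v, B[s_u, s_v], allowed)
```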
+
+# We can use [`IndependentSet`](@ref) to construct a tensor network corresponding to the independent set problem on our target graph.
problem = IndependentSet(graph; optimizer=TreeSA());
-# In the input arguments of [`IndependentSet`](@ref), the `optimizer` is for optimizing the contraction orders.
-# Here we use the local search based optimizer `TreeSA`.
-# The returned instance `problem` contains a field `code` that specifies the tensor network contraction order.
+# The keyword argument `optimizer` specifies the contraction order optimizer of the tensor network.
+# Here we use the local search based optimizer [`TreeSA`](@ref).
+# The return value `problem` contains a field `code` that specifies the tensor network and its contraction order.
# The optimal contraction time and space complexity of an independent set problem is ``2^{{\rm tw}(G)}``,
# where ``{\rm tw}(G)`` is the [tree-width](https://en.wikipedia.org/wiki/Treewidth) of ``G``.
# One can check the time, space and read-write complexity with the following function.
timespacereadwrite_complexity(problem)
-# The return values are `log2` of the number of iterations, the number of elements in the max tensor and the number of read-write operations to tensor elements.
-# For more information about the performance, please check the [Performance Tips](@ref).
-
+# The return values are `log2` of the number of iterations, the number of elements in the largest tensor encountered during contraction, and the number of tensor-element read-write operations.
+# For more information about how to improve the contraction order, please check the [Performance Tips](@ref).
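To make the whole construction concrete, here is an end-to-end sketch (again in Python/NumPy with names of our own, not the package's API): naively contracting all vertex vectors `[1, x]` and edge matrices `B` with `einsum` evaluates the independence polynomial, and setting `x = 1` counts the independent sets of the graph.

```python
import numpy as np

def independence_polynomial_value(n, edges, x):
    """Contract the independent-set tensor network naively with einsum.

    Fine for tiny graphs (n <= 10 here); real packages optimize the
    contraction order instead of contracting everything at once.
    """
    B = np.array([[1.0, 1.0], [1.0, 0.0]])  # edge (independence) tensor
    letters = "abcdefghij"[:n]
    subscripts = ",".join(letters)  # one index per vertex tensor
    subscripts += "," + ",".join(letters[u] + letters[v] for u, v in edges)
    operands = [np.array([1.0, x]) for _ in range(n)] + [B for _ in edges]
    return np.einsum(subscripts + "->", *operands)

# Triangle graph: the independent sets are {}, {0}, {1}, {2}.
print(independence_polynomial_value(3, [(0, 1), (1, 2), (0, 2)], 1.0))  # -> 4.0
```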
# There are two approaches to find one of the best solutions.
# The unbounded (default) version uses [`ConfigSampler`](@ref) to sample one of the best solutions directly.
# The bounded version uses the binary gradient back-propagation (see our paper) to compute the gradients.
-# It requires caching intermediate states, but is often faster on CPU because it can use [`TropicalGEMM`](https://github.com/TensorBFS/TropicalGEMM.jl).
+# It requires caching intermediate states, but is often faster (on CPU) because it can use [`TropicalGEMM`](https://github.com/TensorBFS/TropicalGEMM.jl) (see [Performance Tips](@ref)).
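For comparison with both approaches, here is a tiny brute-force reference (our own Python sketch, feasible only for small graphs) that finds one largest independent set by exhaustive enumeration:

```python
from itertools import combinations

def one_max_independent_set(n, edges):
    """Return one maximum independent set by brute force (O(2^n); reference only)."""
    edge_set = {frozenset(e) for e in edges}
    for size in range(n, -1, -1):  # try the largest sizes first
        for subset in combinations(range(n), size):
            # Independent iff no pair of chosen vertices forms an edge.
            if all(frozenset(p) not in edge_set for p in combinations(subset, 2)):
                return set(subset)

# 5-cycle: a maximum independent set has size 2, e.g. {0, 2}.
print(one_max_independent_set(5, [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]))
```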