
Commit 972a4c8

docs: add a section on Why TensorInference.jl on the welcome page
1 parent 3b6f51d commit 972a4c8

1 file changed: +32 -0 lines changed

docs/src/index.md

Lines changed: 32 additions & 0 deletions
@@ -24,6 +24,38 @@ Solutions to the most common probabilistic inference tasks, including:
- **Marginal Maximum a Posteriori (MMAP)**: Finds the most probable state of a
  subset of variables, averaging out the uncertainty over the remaining ones.

## Why TensorInference.jl

A major challenge in developing intelligent systems is reasoning under
uncertainty, a problem that arises in many real-world applications across
domains such as artificial intelligence, medical diagnosis, computer vision,
computational biology, and natural language processing. Reasoning under
uncertainty involves computing the probabilities of the variables of interest
while taking into account any evidence that has been acquired. This process,
which can be thought of as drawing global insights from local observations,
is known as *probabilistic inference*.

*Probabilistic graphical models* (PGMs) provide a unified framework to perform
probabilistic inference. These models use graphs to represent the joint
probability distribution of complex systems in a concise manner by exploiting
the conditional independence between variables in the model. Additionally,
they form the foundation for various algorithms that enable efficient
probabilistic inference.
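
As a brief aside (standard PGM notation, not specific to this package): in a
directed graphical model, the joint distribution factorizes over the graph,
and an inference query amounts to conditioning that joint on the observed
evidence.

```math
P(x_1, \dots, x_n) = \prod_{i=1}^{n} P\bigl(x_i \mid \mathrm{pa}(x_i)\bigr),
\qquad
P(q \mid e) = \frac{\sum_{h} P(q, h, e)}{\sum_{q', h} P(q', h, e)}
```

Here `pa(x_i)` denotes the parents of a variable in the graph, `q` the query
variables, `e` the observed evidence, and `h` the remaining hidden variables.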

However, even with the representational aid of PGMs, probabilistic inference
remains intractable on many real-world models, because it entails solving
complex combinatorial optimization problems in very high-dimensional spaces.
Tackling these challenges requires more efficient and scalable inference
algorithms.

To meet this need, we present `TensorInference.jl`, a Julia package for
probabilistic inference that combines the representational capabilities of
PGMs with the computational power of tensor networks. By harnessing the best
of both worlds, `TensorInference.jl` aims to enhance the performance of
probabilistic inference, thereby expanding the range of complex, real-world
models for which exact inference remains tractable.
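
To make the connection concrete (a generic identity, written here in standard
notation rather than the package's API): in a factorized model, the
probability of the observed evidence is a sum of products of local factors,
which is exactly the contraction of a tensor network, and the cost of carrying
it out is governed by the order in which the sums and products are performed.

```math
P(e) = \sum_{x \setminus e} \prod_{i} \phi_i\bigl(x_{S_i}\bigr)
```

Each factor `ϕ_i` acts on a small subset `S_i` of the variables, and the sum
runs over all unobserved variables. Marginals and most-probable configurations
come from the same contraction, with selected sums left out or replaced by
maximizations.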

## Outline
```@contents
Pages = [
