
Commit 6ecd258

[no ci] add early stage disclaimer to user guide
1 parent 21690fd commit 6ecd258

9 files changed (+19, −1 lines changed)


docsrc/source/user_guide/approximators.ipynb

Lines changed: 2 additions & 0 deletions
@@ -7,6 +7,8 @@
 "source": [
 "# Approximators\n",
 "\n",
+"*Disclaimer: This guide is in an early stage. We welcome contributions to the guide in form of issues and pull requests.*\n",
+"\n",
 "Neural approximators provide an approximation of a distribution or a value. To achieve this, they combine the things we have discussed in the previous chapters: simulated data, adapters, summary networks and inference networks. Approximators are at the heart of BayesFlow, as they organize the different components and provide the `fit()` function used for training."
 ]
 }

docsrc/source/user_guide/data_processing.ipynb

Lines changed: 2 additions & 0 deletions
@@ -7,6 +7,8 @@
 "source": [
 "# Data Processing: Adapters\n",
 "\n",
+"*Disclaimer: This guide is in an early stage. We welcome contributions to the guide in form of issues and pull requests.*\n",
+"\n",
 "To ensure that the training data generated by a simulator can be used for deep learning, we have to bring our data into the structure required by BayesFlow. The {py:class}`~bayesflow.adapters.Adapter` class provides multiple flexible functionalities, from standardization to renaming, and many more.\n",
 "\n",
 "## BayesFlow's Data Structure\n",

docsrc/source/user_guide/diagnostics.ipynb

Lines changed: 2 additions & 0 deletions
@@ -7,6 +7,8 @@
 "source": [
 "# Diagnostics and Visualizations\n",
 "\n",
+"*Disclaimer: This guide is in an early stage. We welcome contributions to the guide in form of issues and pull requests.*\n",
+"\n",
 "There are many factors that influence whether training succeeds and how well we can approximate a target. In this light, checking the results and diagnosing potential problems is an important part of the workflow.\n",
 "\n",
 "## Loss\n",

docsrc/source/user_guide/generative_models.ipynb

Lines changed: 3 additions & 1 deletion
@@ -5,7 +5,9 @@
 "id": "62abf622-68b3-454c-9424-384b41ad6a38",
 "metadata": {},
 "source": [
-"# Generative Models"
+"# Generative Models\n",
+"\n",
+"*Disclaimer: This guide is in an early stage. We welcome contributions to the guide in form of issues and pull requests.*"
 ]
 },
 {

docsrc/source/user_guide/inference_networks.ipynb

Lines changed: 2 additions & 0 deletions
@@ -7,6 +7,8 @@
 "source": [
 "# Inference Networks\n",
 "\n",
+"*Disclaimer: This guide is in an early stage. We welcome contributions to the guide in form of issues and pull requests.*\n",
+"\n",
 "Inference networks form the backbone of neural amortized Bayesian inference methods. They are generative models (usually _invertible_ ones, but they do not have to be), that can transform samples from a simple distribution (e.g., a unit Gaussian) to a complicated one (e.g., a posterior distribution).\n",
 "\n",
 "You can find the inference networks in the {py:mod}`~bayesflow.networks` module. You can identify them by the \"Bases: {py:class}`~bayesflow.networks.InferenceNetwork`\" label.\n",

docsrc/source/user_guide/introduction.md

Lines changed: 2 additions & 0 deletions
@@ -1,5 +1,7 @@
 # Introduction
 
+*Disclaimer: This guide is in an early stage. We welcome contributions to the guide in form of issues and pull requests.*
+
 Welcome to the User Guide! This guide is still in a very early stage, but we plan to evolve it into a comprehensive guide to using BayesFlow.
 
 ## Why (and When) Do We Need Amortized Bayesian Inference (ABI)?

docsrc/source/user_guide/saving_loading.ipynb

Lines changed: 2 additions & 0 deletions
@@ -7,6 +7,8 @@
 "source": [
 "# Saving & Loading Models\n",
 "\n",
+"*Disclaimer: This guide is in an early stage. We welcome contributions to the guide in form of issues and pull requests.*\n",
+"\n",
 "Saving and loading of models takes place via our backend, [Keras 3](https://keras.io/). Objects that can be saved have a `save` method, which allows saving to a `.keras` file.\n",
 "\n",
 "The [`keras.saving.load_model`](https://keras.io/api/models/model_saving_apis/model_saving_and_loading/#load_model-function) function can be used to load the stored models. There is a lot more to say about this topic. For now, refer to the respective [Keras guide](https://keras.io/guides/serialization_and_saving/).\n",

docsrc/source/user_guide/summary_networks.ipynb

Lines changed: 2 additions & 0 deletions
@@ -7,6 +7,8 @@
 "source": [
 "# Summary Networks\n",
 "\n",
+"*Disclaimer: This guide is in an early stage. We welcome contributions to the guide in form of issues and pull requests.*\n",
+"\n",
 "Learnable summary statistics provide several advantages for amortized Bayesian inference. By reducing the dimensionality of the data, they enable more efficient processing by the inference network. Additionally, they can encode knowledge about the data (e.g., that different observations are exchangeable) directly into the network architecture. Finally, the resulting _summary space_ can serve as a diagnostic tool. In some cases, we also require a summary network to convert observations of variable size into a fixed-size representation.\n",
 "\n",
 "You can find the summary networks in the {py:mod}`~bayesflow.networks` module. You can identify them by the \"Bases: {py:class}`~bayesflow.networks.SummaryNetwork`\" label."

docsrc/source/user_guide/workflows.ipynb

Lines changed: 2 additions & 0 deletions
@@ -7,6 +7,8 @@
 "source": [
 "# Workflows\n",
 "\n",
+"*Disclaimer: This guide is in an early stage. We welcome contributions to the guide in form of issues and pull requests.*\n",
+"\n",
 "Workflows are an abstraction on top of the approximator, that expose methods for training and inference in a more abstract, and therefore simplified, fashion.\n",
 "\n",
 "## BasicWorkflow\n",
