docs/src/index.md (2 additions, 0 deletions)
@@ -20,6 +20,8 @@ pkg> add OperatorLearning
## Usage
In general, the exported layers behave just like the ones `Flux.jl` provides, i.e. you can use essentially all the tools that come along with `Flux` to do training.
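For instance, a standard `Flux` training loop works unchanged. The snippet below is only a sketch: the `FourierLayer` constructor arguments and the channels × grid points × batch data layout are assumptions made for illustration, not the documented API, so check the package docstrings before copying it.

```julia
using Flux, OperatorLearning

# Placeholder model: the FourierLayer arguments (in channels, out channels,
# grid points, Fourier modes, activation) are assumed, not the documented signature.
model = Chain(FourierLayer(2, 64, 100, 16, gelu), Dense(64, 1))

# Dummy data with an assumed layout of channels × grid points × batch.
xtrain = rand(Float32, 2, 100, 32)
ytrain = rand(Float32, 1, 100, 32)

# Ordinary Flux tooling: a loss, an optimiser, and the implicit-parameter train!.
loss(x, y) = Flux.mse(model(x), y)
opt = ADAM(1f-3)

for epoch in 1:10
    Flux.train!(loss, Flux.params(model), [(xtrain, ytrain)], opt)
end
```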
### Fourier Neural Operator
The basic workflow is largely in line with the layer architectures that `Flux` provides, i.e. you construct individual layers, chain them if desired, and pass the inputs as arguments to the layers.
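A minimal sketch of that workflow follows. The `FourierLayer` arguments and the channels × grid points × batch input layout are again assumptions for illustration; the `FourierLayer` docstring has the actual constructor signature and expected shapes.

```julia
using Flux, OperatorLearning

# Construct an individual layer. The arguments (2 input channels, 64 output
# channels, 100 grid points, 16 Fourier modes, gelu activation) are placeholders.
fourier = FourierLayer(2, 64, 100, 16, gelu)

# Chain it with ordinary Flux layers if desired.
model = Chain(fourier, Dense(64, 1))

# Pass the input as an argument to the layer or the chain.
# Assumed layout: channels × grid points × batch.
x = rand(Float32, 2, 100, 8)

y_chain = model(x)    # forward pass through the whole chain
y_layer = fourier(x)  # or call a single layer directly
```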