Probably most users would like to use the exported function
```julia
ESS_mcmc([rng::AbstractRNG, ]prior, loglikelihood, N::Int[; burnin::Int = 0])
```
which returns a vector of `N` samples for approximating the posterior of
a model with a Gaussian prior that allows sampling from the `prior` and
evaluation of the log likelihood `loglikelihood`. The burn-in phase with
`burnin` samples is discarded.
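
For illustration, a call could look like the following sketch; the prior, the log likelihood, the observation, and the sample counts are assumptions chosen for this example (priors and log likelihoods are discussed in the sections below):
```julia
using EllipticalSliceSampling
using Distributions

# illustrative model: standard normal prior on the mean of a normal observation
# model with fixed unit noise, and a single observation of 1.0
prior = Normal()
loglikelihood(x) = logpdf(Normal(x, 1.0), 1.0)

# draw 1_000 samples from the posterior, discarding a burn-in phase of 100 samples
samples = ESS_mcmc(prior, loglikelihood, 1_000; burnin = 100)
```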

If you want to have more control over the sampling procedure (e.g., if you
only want to save a subset of samples or want to use another stopping
criterion), the function
```julia
ESS_mcmc_sampler([rng::AbstractRNG, ]prior, loglikelihood)
```
gives you access to an iterator from which you can generate an unlimited
number of samples.
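
As a rough sketch (assuming the returned iterator can be consumed with the standard iteration utilities in `Base`, and reusing the illustrative `prior` and `loglikelihood` from above), a fixed number of samples could be drawn lazily:
```julia
using EllipticalSliceSampling
using Distributions

prior = Normal()
loglikelihood(x) = logpdf(Normal(x, 1.0), 1.0)

# create the sampler and lazily draw 100 samples from it
sampler = ESS_mcmc_sampler(prior, loglikelihood)
samples = collect(Iterators.take(sampler, 100))
```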

### Prior

You may specify Gaussian priors with arbitrary means. EllipticalSliceSampling.jl
provides first-class support for the scalar and multivariate normal distributions
in [Distributions.jl](https://github.com/JuliaStats/Distributions.jl). For
instance, if the prior distribution is a standard normal distribution, you can
choose
```julia
prior = Normal()
```
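
Similarly, a multivariate Gaussian prior can be specified with the multivariate normal distribution from Distributions.jl; the mean vector and covariance matrix below are arbitrary values chosen for illustration:
```julia
using Distributions

# multivariate Gaussian prior with a non-zero mean and a diagonal covariance matrix
prior = MvNormal([1.0, -0.5], [0.25 0.0; 0.0 4.0])
```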

However, custom Gaussian priors are supported as well. For instance, if you want to
use a custom distribution type `GaussianPrior`, the following methods should be
implemented:
```julia
# state that the distribution is actually Gaussian
EllipticalSliceSampling.isnormal(::GaussianPrior) = true

# define how to sample from the distribution
# only one of the following methods is needed:
# - if the samples are immutable (e.g., numbers or static arrays) only
#   `rand(rng, dist)` should be implemented
# - otherwise only `rand!(rng, dist, sample)` is required
Base.rand(rng::AbstractRNG, dist::GaussianPrior) = ...
Random.rand!(rng::AbstractRNG, dist::GaussianPrior, sample) = ...

# specify the type of a sample from the distribution
Base.eltype(::Type{<:GaussianPrior}) = ...
```
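
As a minimal sketch (assuming the methods listed above are all that is required), a hypothetical scalar Gaussian prior with mean `μ` and standard deviation `σ` could be implemented as follows:
```julia
using Random
using EllipticalSliceSampling

# hypothetical custom scalar Gaussian prior (for illustration only)
struct GaussianPrior
    μ::Float64
    σ::Float64
end

# state that the distribution is actually Gaussian
EllipticalSliceSampling.isnormal(::GaussianPrior) = true

# samples are immutable numbers, so implementing `rand(rng, dist)` is sufficient
Base.rand(rng::AbstractRNG, dist::GaussianPrior) = dist.μ + dist.σ * randn(rng)

# a sample from the distribution is a `Float64`
Base.eltype(::Type{<:GaussianPrior}) = Float64
```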

### Log likelihood

In addition to the prior, you have to specify a Julia implementation of
the log likelihood function. Here the predefined log densities and log
likelihood functions in
[Distributions.jl](https://github.com/JuliaStats/Distributions.jl) might
be useful.
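
For example, a log likelihood could be composed from `logpdf` calls provided by Distributions.jl; the observation model and the data below are assumptions made for this sketch:
```julia
using Distributions

# hypothetical observations
data = [0.2, -0.5, 1.1]

# log likelihood of the mean parameter x under a normal observation model
# with fixed standard deviation 0.5
loglikelihood(x) = sum(logpdf(Normal(x, 0.5), y) for y in data)
```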

### Progress monitor

If you use a package such as [Juno](https://junolab.org/) or
[ConsoleProgressMonitor.jl](https://github.com/tkf/ConsoleProgressMonitor.jl) that supports
progress logs created by the