Commit d11aa12

Merge branch 'master' into feature-custom-meanfn
2 parents 04370ed + b4c4924

14 files changed: +711 −22 lines changed

14 files changed

+711
-22
lines changed

.travis.yml

Lines changed: 1 addition & 0 deletions
@@ -53,6 +53,7 @@ install:
   # Before proceeding to install anything else, we debug
   # by making sure that the conda install didn't break matplotlib.
   - python -c "import matplotlib.pyplot as plt; print('Using MPL backend:'); print(plt.get_backend())"
+  - pip install --upgrade pip
   - pip install -r requirements.txt
   # Before proceeding, we pause to list all installed conda and pip
   # packages for later debugging.

doc/source/apiref/derived_models.rst

Lines changed: 12 additions & 0 deletions
@@ -41,3 +41,15 @@ additional functionality or changing the behaviors of underlying models.
 
 .. autoclass:: MLEModel
     :members:
+
+:class:`RandomWalkModel` - Model for adding fixed random walk to parameters
+---------------------------------------------------------------------------
+
+.. autoclass:: RandomWalkModel
+    :members:
+
+:class:`GaussianRandomWalkModel` - Model for adding gaussian random walk to parameters
+--------------------------------------------------------------------------------------
+
+.. autoclass:: GaussianRandomWalkModel
+    :members:
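
As quick orientation for these two new API entries, here is a minimal sketch of wrapping a model with each class. The GaussianRandomWalkModel call mirrors the example added to doc/source/guide/timedep.rst in this commit; the RandomWalkModel call assumes an (underlying_model, step_distribution) signature, which is not shown in this diff.

    # Sketch only; the step_distribution argument name is an assumption.
    from scipy.special import expit, logit
    from qinfer import (
        BinomialModel, CoinModel, NormalDistribution,
        RandomWalkModel, GaussianRandomWalkModel
    )

    # Fixed, exactly-known walk: every update perturbs the coin bias by a
    # zero-mean Gaussian step with variance 0.05**2.
    fixed_walk = RandomWalkModel(
        BinomialModel(CoinModel()),
        step_distribution=NormalDistribution(0, 0.05 ** 2)
    )

    # Learned walk: the step standard deviation is appended as an extra model
    # parameter, and the logit/expit pair keeps the bias inside [0, 1].
    learned_walk = GaussianRandomWalkModel(
        BinomialModel(CoinModel()),
        model_transformation=(logit, expit)
    )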

doc/source/apiref/test_models.rst

Lines changed: 12 additions & 0 deletions
@@ -39,4 +39,16 @@ built on top of QInfer.
 
 .. autoclass:: NDieModel
     :members:
+
+Custom Models
+-------------
+
+Writing custom models is standard practice for QInfer users.
+See :ref:`CustomModels`.
+
+.. currentmodule:: qinfer.tests.base_test
+
+:meth:`test_model` - Method to run suite of tests on a model instance
+----------------------------------------------------------------------
 
+.. autofunction:: test_model
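
The call pattern documented here is the one exercised by the doc/source/guide/models.rst changes in this commit. Below is a self-contained sketch; the use of SimplePrecessionModel and its scalar expparams is an assumption for illustration, not part of this diff.

    # Sketch only: test_model(model, prior, expparams), as in the guide example.
    import numpy as np
    from qinfer import SimplePrecessionModel, UniformDistribution
    from qinfer.tests import test_model

    model = SimplePrecessionModel()
    prior = UniformDistribution([0, 1])         # samples valid model parameters
    expparams = np.linspace(1, 10, 5).astype(model.expparams_dtype)  # valid experiments
    test_model(model, prior, expparams)         # runs the consistency test suite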

doc/source/apiref/utils.rst

Lines changed: 3 additions & 0 deletions
@@ -33,3 +33,6 @@ Function Reference
 
 .. autofunction:: format_uncertainty
 
+.. autofunction:: to_simplex
+
+.. autofunction:: from_simplex
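
The relationship between the two new functions can be read off the doc/source/guide/timedep.rst example in this commit, where from_simplex maps simplex points to an unconstrained representation and to_simplex maps back. A short sketch of that round trip follows; the exactness of the inverse is assumed here rather than taken from this diff.

    # Sketch only: rows of probs are points on the probability simplex.
    import numpy as np
    from qinfer.utils import to_simplex, from_simplex

    probs = np.array([[0.1, 0.2, 0.2, 0.4, 0.1]] * 3)
    unconstrained = from_simplex(probs)       # off the simplex
    recovered = to_simplex(unconstrained)     # back onto the simplex
    print(np.allclose(recovered, probs))      # True if the pair are mutual inverses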

doc/source/guide/models.rst

Lines changed: 29 additions & 1 deletion
@@ -319,6 +319,8 @@ True
 >>> L.shape == (1, 100, 9)
 True
 
+.. _CustomModels:
+
 Implementing Custom Simulators and Models
 -----------------------------------------
 
@@ -414,6 +416,33 @@ True
 >>> D.shape == (2, 10000, 81)
 True
 
+Finally, we mention a useful tool for doing a set of
+tests on the custom model, which make sure its pieces are working as
+expected. These tests look at things like data types and index dimensions of
+various functions. They also plug the outputs of some methods into the inputs
+of other methods, and so forth. Although they can't check the statistical
+soundness of your model, if they all pass, you can be pretty confident
+you won't run into weird indexing bugs in the future.
+
+We just need to pass :func:`~qinfer.tests.test_model` an instance of the
+custom model, a prior that samples valid model parameters, and an array of valid
+``expparams``.
+
+>>> from qinfer.tests import test_model
+>>> from qinfer import UniformDistribution
+>>> prior = UniformDistribution([[0,1],[0,1]])
+>>> test_model(mcm, prior, expparams)
+
+.. code-block:: None
+    :emphasize-lines: 1,2,3,4,5
+
+    .......
+    ----------------------------------------------------------------------
+    Ran 7 tests in 0.013s
+
+    OK
+
+
 .. note::
 
     Creating ``expparams`` as an empty array and filling it by field name is a
@@ -456,4 +485,3 @@ which is discussed in more detail in :ref:`perf_testing_guide`. Roughly,
 this model causes the likeihood functions calculated by its underlying model
 to be subject to random noise, so that the robustness of an inference algorithm
 against such noise can be tested.
-

doc/source/guide/timedep.rst

Lines changed: 136 additions & 0 deletions
@@ -174,3 +174,139 @@ with a custom :meth:`~Simulatable.update_timestep` implementation.
     plt.ylabel(r'$\omega$')
 
     plt.show()
+
+Learning Walk Parameters
+------------------------
+
+In the above examples, the diffusion distribution was treated as exactly
+known by the model. We can also parameterize this distribution, adding its
+parameters to model to be learned as well. :class:`GaussianRandomWalkModel`
+is a built in model similar to :class:`RandomWalkModel`. It is more
+restrictive in the sense that it is limited to gaussian time-step updates,
+but more general in that it has the ability to automatically append a
+parameterization of the gaussian time-step distribution, either diagonal or
+dense, to the underlying model.
+
+For example suppose that we have a coin whose bias is taking a random walk
+in time with an unknown diffusion constant.
+To avoid exiting the allowable space of biases, :math:`[0,1]`,
+we transform to inverse-logit space before taking a gaussian step, and
+transform back to the probability interval after each step.
+
+.. plot::
+
+    import numpy as np
+    from scipy.special import expit, logit
+    from qinfer import (
+        CoinModel, BinomialModel, GaussianRandomWalkModel,
+        UniformDistribution, SMCUpdater
+    )
+
+    # Put a random walk on top of a binomial coin model
+    model = GaussianRandomWalkModel(
+        BinomialModel(CoinModel()),
+        model_transformation=(logit, expit)
+    )
+
+    # Generate some data with a true diffusion 0.05
+    true_sigma_p = 0.05
+    Nbin = 10
+    p = expit(logit(0.5) + np.cumsum(true_sigma_p * np.random.normal(size=300)))
+    data = np.random.binomial(Nbin, 1-p)
+
+    # Analyse the data
+    prior = UniformDistribution([[0.2,0.8],[0,0.1]])
+    u = SMCUpdater(model, 10000, prior)
+    ests, stds = np.empty((data.size+1, 2)), np.empty((data.size+1, 2))
+    ts = np.arange(ests.shape[0])
+    ests[0,:] = u.est_mean()
+    for idx in range(data.size):
+        expparam = np.array([Nbin]).astype(model.expparams_dtype)
+        u.update(np.array([data[idx]]), expparam)
+        ests[idx+1,:] = u.est_mean()
+        stds[idx+1,:] = np.sqrt(np.diag(u.est_covariance_mtx()))
+
+    plt.figure(figsize=(10,10))
+    plt.subplot(2,1,1)
+    u.plot_posterior_marginal(1)
+    plt.title('Diffusion parameter posterior')
+
+    plt.subplot(2,1,2)
+    plt.plot(ts, ests[:,0], label='estimated')
+    plt.fill_between(ts, ests[:,0]-stds[:,0], ests[:,0]+stds[:,0],
+        alpha=0.2, antialiased=True)
+    plt.plot(ts[1:], p, '--', label='actual')
+    plt.legend()
+    plt.title('Coin bias vs. time')
+    plt.show()
+
+As a second example, consider a 5-sided die for which the 3rd, 4th and 5th
+sides are taking a correlated gaussian random walk, and the other two sides
+are constant. We can attempt to learn the six parameters of the cholesky
+factorization of the random walk covariance matrix as we track the drift
+of the die probabilities.
+
+.. plot::
+
+    import numpy as np
+    from qinfer.utils import to_simplex, from_simplex, sample_multinomial
+    from qinfer import (
+        NDieModel, MultinomialModel, GaussianRandomWalkModel,
+        UniformDistribution, ConstrainedSumDistribution, SMCUpdater, ProductDistribution
+    )
+
+    # Put a random walk on top of a multinomial die model
+    randomwalk_idxs = [2,3,4] # only these sides of the die are taking a walk
+    model = GaussianRandomWalkModel(
+        MultinomialModel(NDieModel(5)),
+        model_transformation=(from_simplex, to_simplex),
+        diagonal=False,
+        random_walk_idxs = randomwalk_idxs
+    )
+
+    # Generate some data with some true covariance matrix
+    true_cov = 0.1 * np.random.random(size=(3,3))
+    true_cov = np.dot(true_cov, true_cov.T)
+    Nmult = 40
+    ps = from_simplex(np.array([[0.1,0.2,0.2,0.4,.1]] * 200))
+    ps[:, randomwalk_idxs] += np.random.multivariate_normal(np.zeros(3), true_cov, size=200).cumsum(axis=0)
+    ps = to_simplex(ps)
+    expparam = np.array([(0,Nmult)],dtype=model.expparams_dtype)
+    data = sample_multinomial(Nmult, ps.T).T
+
+    # Analyse the data
+    prior = ProductDistribution(
+        ConstrainedSumDistribution(UniformDistribution([[0,1]] * 5)),
+        UniformDistribution([[0,0.2]] * 6)
+    )
+    u = SMCUpdater(model, 10000, prior)
+    ests, stds = np.empty((data.shape[0]+1, model.n_modelparams)), np.empty((data.shape[0]+1, model.n_modelparams))
+    ts = np.arange(ests.shape[0])
+    ests[0,:] = u.est_mean()
+    for idx in range(data.shape[0]):
+        expparam = np.array([(0,Nmult)],dtype=model.expparams_dtype)
+        outcome = np.array([(data[idx],)], dtype=model.domain(expparam)[0].dtype)
+        u.update(outcome, expparam)
+        ests[idx+1,:] = u.est_mean()
+        stds[idx+1,:] = np.sqrt(np.diag(u.est_covariance_mtx()))
+
+    true_chol = np.linalg.cholesky(true_cov)
+    k = 1
+    plt.figure(figsize=(10,10))
+    for idx, coord in enumerate(zip(*model._srw_tri_idxs)):
+        i, j = coord
+        plt.subplot(3,3,i*3 + j + 1)
+        u.plot_posterior_marginal(5 + idx)
+        plt.axvline(true_chol[i,j],color='b')
+    plt.show()
+
+    plt.figure(figsize=(12,10))
+    color=iter(plt.cm.Vega10(range(5)))
+    for idx in range(5):
+        c=next(color)
+        plt.plot(ps[:, idx], '--', label='$p_{}$ actual'.format(idx), color=c)
+        plt.plot(ests[:,idx], label='$p_{}$ estimated'.format(idx), color=c)
+        plt.fill_between(range(len(ests)), ests[:,idx]-stds[:,idx], ests[:,idx]+stds[:,idx],alpha=0.2, color=c, antialiased=True)
+    plt.legend(loc='center left', bbox_to_anchor=(1, 0.5))
+    plt.show()
+
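
A hedged check of the "appends a parameterization" behaviour described in the new prose, reusing the two constructions above. The expected parameter counts are inferred from the priors used in these examples (two ranges for the coin, five plus six for the die); they are not stated explicitly in this diff.

    # Sketch only: n_modelparams should grow by the number of appended walk parameters.
    from scipy.special import expit, logit
    from qinfer.utils import to_simplex, from_simplex
    from qinfer import (
        BinomialModel, CoinModel, MultinomialModel, NDieModel,
        GaussianRandomWalkModel
    )

    coin_walk = GaussianRandomWalkModel(
        BinomialModel(CoinModel()),
        model_transformation=(logit, expit)
    )
    die_walk = GaussianRandomWalkModel(
        MultinomialModel(NDieModel(5)),
        model_transformation=(from_simplex, to_simplex),
        diagonal=False,
        random_walk_idxs=[2, 3, 4]
    )
    print(coin_walk.n_modelparams)  # expect 2: the bias plus one diffusion sigma
    print(die_walk.n_modelparams)   # expect 11: five die probabilities plus six
                                    # Cholesky-factor entries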

setup.py

Lines changed: 2 additions & 1 deletion
@@ -55,7 +55,8 @@ def write_version(filename=VERSION_TARGET):
     packages=[
         'qinfer',
         'qinfer._lib',
-        'qinfer.tomography'
+        'qinfer.tomography',
+        'qinfer.tests'
     ],
     keywords=['quantum', 'Bayesian', 'estimation'],
     description=
