ReversibleJumpUpdate, MixtureBond,
and Example 7: BinomialHypothesisTest

One of the most challenging situations when implementing an MCMC
algorithm arises when a parameter has a mixture distribution and
is most naturally thought of as living in one of two or more different
spaces. A simple example arises in variable subset selection in
regression: a slope parameter may be given a prior distribution that
is a mixture of a normal distribution and a point mass at zero. In
some cases it will be possible to update such a parameter using its
full conditional distribution, but in other cases this distribution is
not tractable, so that we want to use some method more like
Metropolis-Hastings. Reversible jump MCMC, as presented
in [8], demonstrates how to do this. The algorithm
attempts to jump between spaces, and the Metropolis-Hastings
acceptance probability accounts for the methods of proposing new
parameter values, so that the stationary distribution of the algorithm
is the desired posterior distribution.

The ReversibleJumpUpdate can be thought of as a
transition matrix between models, together with an array of functions
specifying how to propose a new value of the parameters
in the new model, given which model was the old one.
For this to work, it is necessary to adjust acceptance probabilities
in such a way that the algorithm's limiting distribution is
the desired posterior distribution.
The arguments to the constructor are

an array of parameters whose values might be changed by the
reversible jump update;

an integer $M$ that determines how many models we are mixing over;

an integer that indicates which of these models is the initial
state for the MCMC (this should be consistent with the initial values
of the parameters, and its possible values are
$0, 1, \ldots, M-1$);

a one-dimensional array of transition probabilities between the models.
This array will be interpreted as a square matrix whose $(i,j)$
coordinate is the probability that if the current model is model $i$, the
next model proposed will be model $j$;
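As a small illustration (plain Python, not YADAS code; the row-major reading of the flat array is an assumption, not something the text above states), a one-dimensional array of length $M \times M$ can be interpreted as a square matrix of proposal probabilities:

```python
# Illustrative only: a flat array of model-transition probabilities,
# read row-major, so entry i*M + j is the probability of proposing
# model j when the current model is model i.
M = 2
trans = [0.9, 0.1,
         0.4, 0.6]

def proposal_prob(i, j):
    """P(next proposed model is j | current model is i)."""
    return trans[i * M + j]
```

Each row should sum to one, since some model (possibly the current one) is always proposed.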

an array of JumpPerturbers. A JumpPerturber is much
like a Perturber. Both are interfaces, and JumpPerturber
extends Perturber by adding a density() method.
This method computes the probability density of the proposed new value of
the parameters.
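A rough Python analogue may clarify the idea (YADAS itself is Java, and this class, its names, and the Gaussian proposal are illustrative assumptions, not the actual JumpPerturber interface): a jump perturber pairs a proposal mechanism with the ability to evaluate the proposal's density, which is exactly what the acceptance ratio needs.

```python
import math
import random

class NormalJumpPerturber:
    """Illustrative stand-in for a JumpPerturber: proposes a new value
    by a Gaussian step and can report the density of that proposal."""

    def __init__(self, scale):
        self.scale = scale

    def perturb(self, x):
        # propose a new value centered at the current one
        return random.gauss(x, self.scale)

    def density(self, x_old, x_new):
        # probability density of proposing x_new starting from x_old
        z = (x_new - x_old) / self.scale
        return math.exp(-0.5 * z * z) / (self.scale * math.sqrt(2.0 * math.pi))
```

An ordinary Perturber only needs `perturb`; the `density` method is the extra ingredient that makes cross-model acceptance probabilities computable.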

a String indicating a directory where output specific to the
reversible jump update should go. This output includes acceptance
probabilities for the purpose of Rao-Blackwellization, but has not
been used enough to be standard.

To be precise, suppose the current model is model $i$ and the current
value of the parameter is $\theta$. To generate a new value of the
parameter, we first choose a new model according to the probabilities
in the transition matrix: suppose model $j$ is chosen; the probability
that this happens is (say) $P_{ij}$. The proposed new value $\theta'$
of the parameter is chosen from density
$g_{ij}(\theta' \mid \theta)$. Denote the unnormalized posterior
by $\pi$. The acceptance probability for the reversible jump move is
$$
\min\left\{ 1, \;
\frac{\pi(\theta')\, P_{ji}\, g_{ji}(\theta \mid \theta')}
     {\pi(\theta)\, P_{ij}\, g_{ij}(\theta' \mid \theta)} \right\}.
$$
This is ``just'' the Metropolis-Hastings rule, but specialized to
include the possibility of jumping across models.
(The JumpPerturbers evaluate the densities
$g_{ij}(\theta' \mid \theta)$.)
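One full step of this scheme can be sketched generically (plain Python, not YADAS code; `pi`, `P`, and `perturbers` are hypothetical stand-ins for the unnormalized posterior, the model-transition matrix, and the array of jump perturbers):

```python
import random

def rj_step(model, theta, pi, P, perturbers, rng=random):
    """One reversible-jump Metropolis-Hastings step (generic sketch).

    pi(model, theta)  -> unnormalized posterior density
    P[i][j]           -> probability of proposing model j from model i
    perturbers[i][j]  -> object with .perturb(theta) and .density(old, new)
    """
    i = model
    # draw the proposed model j with probability P[i][j]
    u, j, cum = rng.random(), 0, P[model][0]
    while u > cum:
        j += 1
        cum += P[model][j]
    theta_new = perturbers[i][j].perturb(theta)
    # acceptance ratio: pi(theta') P_ji g_ji(theta | theta')
    #                 / pi(theta)  P_ij g_ij(theta' | theta)
    num = pi(j, theta_new) * P[j][i] * perturbers[j][i].density(theta_new, theta)
    den = pi(i, theta) * P[i][j] * perturbers[i][j].density(theta, theta_new)
    if rng.random() < min(1.0, num / den):
        return j, theta_new      # accept the jump
    return i, theta              # reject; stay in model i
```

With a single model and a symmetric perturber this reduces to ordinary Metropolis-Hastings, which is a convenient sanity check.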

In the following example, we perform a Bayesian hypothesis
test for equality of two probabilities $p_1$ and $p_2$.
We have binomial samples $X_i \sim \mathrm{Binomial}(n_i, p_i)$
for $i = 1, 2$. We make it into a hypothesis testing problem
by putting a mixture prior on $(p_1, p_2)$: with probability
$\lambda$, $p_1 = p_2$, and their common value has a beta
prior distribution, while with probability $1 - \lambda$,
the $p_i$'s are exchangeable, each with a (possibly
different) beta prior distribution.
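Because beta priors are conjugate to the binomial likelihood, the posterior probability of the equal-probabilities model is available in closed form, which gives a useful check on the reversible jump output. A minimal sketch (plain Python, not part of YADAS; uniform Beta(1,1) priors and equal prior weight on the two models are assumptions for illustration):

```python
from math import lgamma, exp

def log_betabinom(x, n, a, b):
    """Log marginal likelihood of x successes in n binomial trials when
    p ~ Beta(a, b). Binomial coefficients are omitted: they are the same
    under both models and cancel in the posterior model probability."""
    return (lgamma(a + b) - lgamma(a) - lgamma(b)
            + lgamma(a + x) + lgamma(b + n - x) - lgamma(a + b + n))

def posterior_prob_equal(x, n, lam=0.5, a=1.0, b=1.0):
    """Posterior probability that p1 = p2 given counts x = (x1, x2)
    and trials n = (n1, n2), under the mixture prior described above."""
    # equal-probability model: pool the counts, one common p ~ Beta(a, b)
    log_m0 = log_betabinom(sum(x), sum(n), a, b)
    # unequal model: independent p_i ~ Beta(a, b)
    log_m1 = sum(log_betabinom(xi, ni, a, b) for xi, ni in zip(x, n))
    m0 = lam * exp(log_m0)
    m1 = (1.0 - lam) * exp(log_m1)
    return m0 / (m0 + m1)
```

Identical observed proportions should push this probability above the prior weight $\lambda$, while very different proportions should push it toward zero; the mixing proportion of the reversible jump chain should agree with this exact value.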