

FiniteUpdate, and Example 6: BetaBinomialExample

In this section we learn how to update parameters that take on a finite number of values. Clearly, it does not work to add a Gaussian step to such a parameter. Instead, we address this problem by sampling from the full conditional distribution of the discrete parameter. (This is as yet the only example where we have allowed YADAS to be polluted with sampling from full conditional distributions.) YADAS's infrastructure for computing ratios of posterior distributions serves it well in this situation. Suppose $f(x, \theta)$ is the unnormalized posterior distribution evaluated at a discrete-valued parameter $x$ and the remaining parameters $\theta$. If the current value of $x$ is $i$ and we are contemplating changing the value of this parameter to $j$, YADAS is set up to compute the ratio $r_j = f(j, \theta)/f(i, \theta)$. If we compute this ratio for all possible values $j$, a sample from the full conditional distribution of $x$ chooses $x=j$ with probability proportional to $r_j$.
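
To make the mechanism concrete, here is a small self-contained sketch (not YADAS code; posteriorRatio is a hypothetical stand-in for YADAS's ratio computation) of how a draw from the full conditional follows from the ratios $r_j$:

  import java.util.Random;

  public class FullConditionalSketch {

      // Hypothetical stand-in for YADAS's machinery: would return
      // f(j, theta) / f(i, theta), the unnormalized posterior ratio for
      // moving the discrete parameter from its current value i to j.
      static double posteriorRatio(int j) {
          return 1.0; // placeholder
      }

      // Draw a value for a discrete parameter with possible values
      // 0, 1, ..., n-1, choosing j with probability proportional to r_j.
      static int sampleFullConditional(int n, Random rng) {
          double[] r = new double[n];
          double total = 0.0;
          for (int j = 0; j < n; j++) {
              r[j] = posteriorRatio(j); // equals 1 when j is the current value
              total += r[j];
          }
          double u = rng.nextDouble() * total;
          double cum = 0.0;
          for (int j = 0; j < n; j++) {
              cum += r[j];
              if (u <= cum) {
                  return j;
              }
          }
          return n - 1; // guard against floating point round-off
      }
  }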

It is quite easy to use the FiniteUpdate class: one simply inserts a FiniteUpdate into the update array. The constructor of FiniteUpdate requires only the name of the parameter and an array of integers with the same number of elements as the parameter. The integers specify how many possible values each element of the parameter can take on. (We assume that the possible values of the $i$th element of the parameter are $\{0, 1, \ldots, n_i-1\}$, in which case the $i$th entry of the integer array should be $n_i$.)
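
As a minimal sketch (illustrative names only; whether the first argument is the parameter object or its name, and the exact constructor signature, should be checked against the YADAS source), adding a FiniteUpdate to the update array might look like this:

  // x is a discrete-valued parameter with three elements; element i may
  // take the values 0, 1, ..., numvals[i] - 1.
  int[] numvals = new int[] { 2, 2, 4 };

  MCMCUpdate[] updatearray = new MCMCUpdate[] {
      theta,                           // ordinary random walk Metropolis update
      new FiniteUpdate(x, numvals)     // full conditional update for x
  };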

We illustrate the use of FiniteUpdate with a simple example in which we sample from the beta-binomial distribution. This example was inspired by [1]. Click on the buttons to call up the code and the data file.

This is not a statistical problem in that there are no data: the situation is that a probability $y$ is sampled from a Beta$(a,b)$ distribution, and conditionally on $y$, $x$ is sampled from a binomial distribution with sample size $n$ and probability $y$. We estimate the joint distribution of $(y,x)$ using MCMC, alternating between the two parameters: $y$ is updated using random walk Metropolis, and $x$ is updated by sampling from its full conditional. As noted in the code, it has a few annoying features: the input file must contain a real variable $n$ to use as an argument to the binomial likelihood, and it must also contain an integer variable $ni = n + 1$ to tell FiniteUpdate the number of possible values of $x$. A sanity check is that the marginal distribution of $y$ turns out to be the beta distribution with the supplied parameters.
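
For reference, the joint density in this example is $f(y, x) \propto y^{a-1}(1-y)^{b-1} {n \choose x} y^{x} (1-y)^{n-x}$, so the full conditional that FiniteUpdate samples from is $P(x = j \mid y) \propto {n \choose j} y^{j} (1-y)^{n-j}$ for $j = 0, 1, \ldots, n$, and summing the joint density over $x$ shows that the marginal distribution of $y$ is indeed Beta$(a,b)$, which is the sanity check mentioned above.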

Note that the restriction that the possible values of the parameter run from zero up to some maximum is in fact not a restriction at all, because such parameters are ideally suited for use as subscripting variables in ArgumentMakers.
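
For instance (an illustrative use, not taken from this example), if $z_i \in \{0, 1, \ldots, K-1\}$ labels which of $K$ groups observation $i$ belongs to, an ArgumentMaker can supply $\mu_{z_i}$ as the mean for observation $i$; the integer values of $z_i$ are merely indices, so nothing is lost by requiring them to start at zero.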

In some cases, the number of possible values of a finite-valued parameter is large, so it may be inefficient to entertain all possible values when updating the parameter. In this case, IntegerMCMCParameter can be used; it converts a Gaussian-style Metropolis step into a discrete move. The constructor of an IntegerMCMCParameter looks exactly like that for an ordinary MCMCParameter, so the step size is the standard deviation of the Gaussian that is sampled before being converted to an integer. (The proposal is $\theta^\prime = \theta + \mbox{sgn}(Z) \{1 + \mbox{floor}(\vert sZ \vert)\}$, where $Z$ is standard normal and $s$ is the step size.) If you use this option, there must be a likelihood function that prevents $\theta$ from taking on disallowed values by returning negative infinity, so that such proposals are always rejected.
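
The proposal mechanism itself is easy to state in code; the following sketch is independent of YADAS (names are illustrative) and simply implements the formula above:

  import java.util.Random;

  public class IntegerProposalSketch {
      // theta' = theta + sgn(Z) * (1 + floor(|s*Z|)), with Z ~ N(0,1);
      // note that the proposal always moves by at least 1.
      static int propose(int theta, double s, Random rng) {
          double z = rng.nextGaussian();
          int jump = 1 + (int) Math.floor(Math.abs(s * z));
          return (z >= 0) ? theta + jump : theta - jump;
      }
  }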

Todd Graves 2008-09-24