OpenBUGS: missing value in Bernoulli distribution

I am trying to model the "time" of an observation as a random variable in OpenBUGS, called from R via R2OpenBUGS. If all the observation times are available (no NA), everything works, but if I set one of the times to NA, nothing happens. I tested the same code with WinBUGS and get a "NIL break (read)" trap error. So my question is: is there something really wrong in my code, or is my model simply too strange for BUGS?

My model looks like this:

model{
  for(i in 1:k){
    obs[i] ~ dbern(p)                # is the observation done at time 1 or 2?
    y[(i-1)*2 + obs[i] + 1] <- x[i]
  }
  for(i in 1:n){
    y[i] ~ dnorm(mu, tau)
  }
  mu ~ dnorm(0, 0.0001)
  tau ~ dgamma(0.001, 0.001)
  p ~ dunif(0, 1)
}

And the R code looks like this:

 library(R2OpenBUGS)

 k <- 5        # number of observations (rep(NA, 5) below implies k = 5)
 n <- 2 * k    # length of y in the model: two possible time slots per observation

 x <- obs <- rep(NA, k)
 for(i in 1:k) {
   obs[i] <- sample(c(0, 1), 1)   # observation time of the ith observation
   x[i] <- rnorm(1)               # observed value
 }
 obs[2] <- NA                     # one of the sampling times is missing

 INITS <- list(list(tau = 1, mu = 0, p = 0.5))
 DATA <- list(x = x, n = n, k = k, obs = obs)

 ob <- bugs(data = DATA, inits = INITS,
            parameters.to.save = c("tau", "mu", "p", "y"),
            model.file = "BUGSModel.R",
            n.chains = 1, n.iter = 50, n.burnin = 10, n.thin = 1, DIC = FALSE)
1 answer

If I understand your question correctly, you are asking whether this expression

 obs[i] ~ dbern(p) 

is so strange for Win/OpenBUGS that it cannot handle the missing value. No, I do not think so; BUGS can handle missing values in this kind of node, and it will even impute them from their posterior distribution.
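For example, here is a minimal sketch (my own illustration, not code from the question) of a Bernoulli node with a missing value that OpenBUGS will impute:

 model{
   for(i in 1:k){
     obs[i] ~ dbern(p)   # an NA in obs is treated as unknown and sampled
   }
   p ~ dunif(0, 1)
 }

 # R side: pass the NA straight in and monitor "obs" to see its imputed value
 DATA <- list(obs = c(1, NA, 0, 1, 1), k = 5)

If you monitor obs, the sampled values for obs[2] are simply draws from its posterior (here Bernoulli(p)).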

But I have a strong suspicion that

 y[(i-1)*2 + obs[i]+1] <- x[i] 

is the really strange part! It can cause the error, because you force an index to be computed from the observation obs[i], which is missing. This is genuinely unusual, and you should try to find another way to express it. As a first step, simplify the model by dropping this line, and I would expect the problem to disappear.
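For example, one possible rework (a sketch of mine, assuming you actually want time-specific parameters; mu[1], mu[2] and m[i] are names I made up, not from the original model) keeps the stochastic, possibly missing obs[i] out of any left-hand-side index and instead lets it select the mean on the right-hand side:

 model{
   for(i in 1:k){
     obs[i] ~ dbern(p)                            # a missing obs[i] is imputed
     m[i] <- mu[1]*(1 - obs[i]) + mu[2]*obs[i]    # mean for time 1 or time 2
     x[i] ~ dnorm(m[i], tau)
   }
   mu[1] ~ dnorm(0, 0.0001)
   mu[2] ~ dnorm(0, 0.0001)
   tau ~ dgamma(0.001, 0.001)
   p ~ dunif(0, 1)
 }

This way no node index ever depends on a missing value, and the y vector with its partly undefined entries disappears entirely.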
