
r - How to model a mixture of finite components from different parametric families with JAGS?

Reprinted · Author: 行者123 · Updated: 2023-12-05 04:13:49

Imagine an underlying process that draws numbers from a normal distribution with probability $\alpha$ and from a uniform distribution with probability $1-\alpha$. The observed sequence of numbers generated by this process therefore follows a distribution $f$ that is a mixture of the two components, with mixture weights $\alpha$ and $1-\alpha$. How would you model such a mixture with JAGS when the observed sequence is the only input, but the parametric families are known?

Example (in R):

set.seed(8361299)
N <- 100
alpha <- 0.3
mu <- 5
max <- 50
# Which component to choose from?
latent_class <- rbinom(N, 1, alpha)
Y <- ifelse(latent_class, runif(N, min=mu, max=max), rnorm(N, mean=mu))

The generated (observed) Y looks like this: [plot of the generated Y omitted]

Using JAGS, it should be possible to obtain both the mixture weight and the parameters of the known components?

Best Answer

Mixture models of the same parametric distribution are quite easy in JAGS/BUGS, but mixture models with differing parametric responses (like yours) are a little trickier. One method is to use the "ones trick", where we manually calculate the likelihood of the response (selecting one of the two distributions as specified by the latent part of the model) and fit this to a (fake) Bernoulli response of 1 for each data point. (Because each Ones[i] is fixed at 1, its Bernoulli likelihood contribution is exactly density[i], so the sampler sees the intended mixture likelihood.) For example:

# Your data generation:
set.seed(8361299)
N <- 100
alpha <- 0.3
mu <- 5
max <- 50
# Which component to choose from?
latent_class <- rbinom(N, 1, alpha)
Y <- ifelse(latent_class, runif(N, min=mu, max=max), rnorm(N, mean=mu))

# The model:
model <- "model{

for(i in 1:N){

    # Log density for the normal part:
    ld_norm[i] <- logdensity.norm(Y[i], mu, tau)

    # Log density for the uniform part:
    ld_unif[i] <- logdensity.unif(Y[i], lower, upper)

    # Select one of these two densities:
    density[i] <- exp(ld_norm[i]*norm_chosen[i] + ld_unif[i]*(1-norm_chosen[i]))

    # Generate a likelihood for the MCMC sampler:
    Ones[i] ~ dbern(density[i])

    # The latent class part as usual:
    norm_chosen[i] ~ dbern(prob)

}

# Priors:
lower ~ dnorm(0, 10^-6)
upper ~ dnorm(0, 10^-6)
prob ~ dbeta(1,1)
mu ~ dnorm(0, 10^-6)
tau ~ dgamma(0.01, 0.01)

# Specify monitors, data and initial values using runjags:
#monitor# lower, upper, prob, mu, tau
#data# N, Y, Ones
#inits# lower, upper
}"

# Run the model using runjags (or use rjags if you prefer!)
library('runjags')

lower <- min(Y)-10
upper <- max(Y)+10

Ones <- rep(1,N)

results <- run.jags(model, sample=20000, thin=1)
results

plot(results)

This seems to recover your parameters nicely (your alpha is 1 - prob), but watch out for autocorrelation (and convergence).

Matt


Edit: Since you asked about generalising to more than 2 distributions, here is equivalent (but more generalisable) code:

# The model:
model <- "model{
for(i in 1:N){
    # Log density for the normal part:
    ld_comp[i, 1] <- logdensity.norm(Y[i], mu, tau)
    # Log density for the uniform part:
    ld_comp[i, 2] <- logdensity.unif(Y[i], lower, upper)
    # Select one of these two densities and normalise with a Constant:
    density[i] <- exp(ld_comp[i, component_chosen[i]] - Constant)
    # Generate a likelihood for the MCMC sampler:
    Ones[i] ~ dbern(density[i])
    # The latent class part using dcat:
    component_chosen[i] ~ dcat(probs)
}
# Priors for 2 parameters using a dirichlet distribution:
probs ~ ddirch(c(1,1))
lower ~ dnorm(0, 10^-6)
upper ~ dnorm(0, 10^-6)
mu ~ dnorm(0, 10^-6)
tau ~ dgamma(0.01, 0.01)
# Specify monitors, data and initial values using runjags:
#monitor# lower, upper, probs, mu, tau
#data# N, Y, Ones, Constant
#inits# lower, upper, mu, tau
}"

library('runjags')

# Initial values to get the chains started:
lower <- min(Y)-10
upper <- max(Y)+10
mu <- 0
tau <- 0.01

Ones <- rep(1,N)

# The constant needs to be big enough to avoid any densities >1 but also small enough to calculate probabilities for observations of 1:
Constant <- 10

results <- run.jags(model, sample=10000, thin=1)
results
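The choice of Constant can be sanity-checked numerically. As a minimal sketch (this helper function and its bounds are my own illustration, not part of the original answer): exp(ld - Constant) must lie in [0, 1] for every observation, so Constant must exceed the largest log-density that any component can attain.

```r
# Illustrative helper (an assumption, not from the original answer):
# a Constant is safe if it exceeds the largest achievable log-density.
# The normal log-density peaks at 0.5*log(tau/(2*pi)); the uniform
# log-density is flat at -log(upper - lower) on its support.
safe_constant <- function(tau_max, lower, upper) {
  peak_norm <- 0.5 * log(tau_max / (2 * pi))  # normal peak log-density
  peak_unif <- -log(upper - lower)            # uniform log-density
  max(peak_norm, peak_unif) + 1               # add a margin on the log scale
}
safe_constant(tau_max = 100, lower = -10, upper = 60)
```

With plausible bounds for this example (tau no larger than 100, a uniform range of about 70), any Constant of roughly 3 or more keeps all scaled densities below 1, so the value 10 used above is comfortably safe.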

This code will work for however many distributions you need, but expect the autocorrelation to get exponentially worse with more distributions.
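As a sketch of that generalisation (the exponential third component, its lambda prior, and the use of JAGS's logdensity.exp are illustrative assumptions, not part of the original answer), adding a component only means one more row in ld_comp and a longer Dirichlet prior:

```
# Inside the for loop, one extra log density (hypothetical third component):
ld_comp[i, 3] <- logdensity.exp(Y[i], lambda)
# The categorical latent class line is unchanged:
component_chosen[i] ~ dcat(probs)

# Outside the loop, widen the Dirichlet prior and add the new parameter's prior:
probs ~ ddirch(c(1,1,1))
lambda ~ dgamma(0.01, 0.01)
```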

Regarding "r - How to model a mixture of finite components from different parametric families with JAGS?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/36609365/
