
python - CAR model from pymc2 to PyMC3


I'm still quite new to PyMC3, so this question may be naive, but I don't know how to translate this pymc2 code into PyMC3. In particular, it's not clear to me how to translate the R function.

import numpy as np
import pymc

beta = pymc.Normal('beta', mu=0, tau=1.0e-4)
s = pymc.Uniform('s', lower=0, upper=1.0e+4)
tau = pymc.Lambda('tau', lambda s=s: s**(-2))

### Intrinsic CAR
@pymc.stochastic
def R(tau=tau, value=np.zeros(N)):
    # Calculate mu based on average of neighbors
    mu = np.array([sum(W[i]*value[A[i]])/Wplus[i] for i in xrange(N)])

    # Scale precision to the number of neighbors
    taux = tau*Wplus
    return pymc.normal_like(value, mu, taux)

@pymc.deterministic
def M(beta=beta, R=R):
    return [np.exp(beta + R[i]) for i in xrange(N)]

obsvd = pymc.Poisson("obsvd", mu=M, value=Y, observed=True)
model = pymc.Model([s, beta, obsvd])

The code comes from https://github.com/Youki/statistical-modeling-for-data-analysis-with-python/blob/945c13549a872d869e33bc48082c42efc022a07b/Chapter11/Chapter11.rst and http://glau.ca/?p=340

Can you help me? Thanks.

Best Answer

In PyMC3 the CAR model can be implemented with Theano's scan function. There is example code in their documentation; the linked document contains two CAR implementations. Here is the first one [Source]:


import pymc3 as pm
import theano.tensor as tt
from theano import scan

floatX = "float32"

from pymc3.distributions import continuous
from pymc3.distributions import distribution

class CAR(distribution.Continuous):
    """
    Conditional Autoregressive (CAR) distribution

    Parameters
    ----------
    a : list of adjacency information
    w : list of weight information
    tau : precision at each location
    """
    def __init__(self, w, a, tau, *args, **kwargs):
        super(CAR, self).__init__(*args, **kwargs)
        self.a = a = tt.as_tensor_variable(a)
        self.w = w = tt.as_tensor_variable(w)
        self.tau = tau*tt.sum(w, axis=1)
        self.mode = 0.

    def get_mu(self, x):

        def weigth_mu(w, a):
            a1 = tt.cast(a, 'int32')
            return tt.sum(w*x[a1])/tt.sum(w)

        mu_w, _ = scan(fn=weigth_mu,
                       sequences=[self.w, self.a])

        return mu_w

    def logp(self, x):
        mu_w = self.get_mu(x)
        tau = self.tau
        return tt.sum(continuous.Normal.dist(mu=mu_w, tau=tau).logp(x))

with pm.Model() as model1:
    # Vague prior on intercept
    beta0 = pm.Normal('beta0', mu=0.0, tau=1.0e-5)
    # Vague prior on covariate effect
    beta1 = pm.Normal('beta1', mu=0.0, tau=1.0e-5)

    # Random effects (hierarchical) prior
    tau_h = pm.Gamma('tau_h', alpha=3.2761, beta=1.81)
    # Spatial clustering prior
    tau_c = pm.Gamma('tau_c', alpha=1.0, beta=1.0)

    # Regional random effects
    theta = pm.Normal('theta', mu=0.0, tau=tau_h, shape=N)
    mu_phi = CAR('mu_phi', w=wmat, a=amat, tau=tau_c, shape=N)

    # Zero-centre phi
    phi = pm.Deterministic('phi', mu_phi - tt.mean(mu_phi))

    # Mean model
    mu = pm.Deterministic('mu', tt.exp(logE + beta0 + beta1*aff + theta + phi))

    # Likelihood
    Yi = pm.Poisson('Yi', mu=mu, observed=O)

    # Marginal SD of heterogeneity effects
    sd_h = pm.Deterministic('sd_h', tt.std(theta))
    # Marginal SD of clustering (spatial) effects
    sd_c = pm.Deterministic('sd_c', tt.std(phi))
    # Proportion of spatial variance
    alpha = pm.Deterministic('alpha', sd_c/(sd_h + sd_c))

    trace1 = pm.sample(1000, tune=500, cores=4,
                       init='advi',
                       nuts_kwargs={"target_accept": 0.9,
                                    "max_treedepth": 15})


The M function from the question is written here as:

mu = pm.Deterministic('mu', tt.exp(logE + beta0 + beta1*aff + theta + phi))
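
Mapping that back to the variable names in the question, a minimal sketch of the whole model could look like the following. This is my own translation, not code from the linked answer; it assumes Y holds the observed counts from the pymc2 code, and that W, A and Wplus have been padded into arrays wmat and amat as above, so that tt.sum(w, axis=1) inside CAR reproduces Wplus:

with pm.Model() as model:
    beta = pm.Normal('beta', mu=0, tau=1.0e-4)
    s = pm.Uniform('s', lower=0, upper=1.0e+4)
    tau = pm.Deterministic('tau', s**-2)

    # R: the intrinsic CAR term, now an instance of the CAR class above;
    # the class scales the precision by the row sums of the weights (Wplus)
    R = CAR('R', w=wmat, a=amat, tau=tau, shape=N)

    # M = exp(beta + R), elementwise
    M = pm.Deterministic('M', tt.exp(beta + R))

    # Poisson likelihood on the observed counts Y
    obsvd = pm.Poisson('obsvd', mu=M, observed=Y)

    trace = pm.sample(1000, tune=500)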

Regarding python - CAR model from pymc2 to PyMC3, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/43328715/
