I want to write a model along the lines shown below. The main idea is that I have several conditions (or treatments), and every parameter is estimated independently for each condition, except for the kappa parameter, which is shared by all conditions.
with pm.Model() as model:
    trace_per_condition = []
    # define the kappa hyperparameter
    kappa = pm.Gamma('kappa', 1, 0.1)
    for condition in range(0, ncond):
        z_cond = z[condition]
        # define the mu hyperparameter
        mu = pm.Beta('mu', 1, 1)
        # define the prior
        theta = pm.Beta('theta', mu * kappa, (1 - mu) * kappa, shape=len(z_cond))
        # define the likelihood
        y = pm.Binomial('y', p=theta, n=trials, observed=z_cond)
        # Generate a MCMC chain
        start = pm.find_MAP()
        step1 = pm.Metropolis([theta, mu])
        step2 = pm.NUTS([kappa])
        trace = pm.sample(1000, [step1, step2], progressbar=False)
        trace_per_condition.append(trace)
When I run the model, I get the following messages.
/usr/local/lib/python2.7/dist-packages/Theano-0.6.0-py2.7.egg/theano/gradient.py:513: UserWarning: grad method was asked to compute the gradient with respect to a variable that is not part of the computational graph of the cost, or is used only by a non-differentiable operator: mu
  handle_disconnected(elem)
/usr/local/lib/python2.7/dist-packages/Theano-0.6.0-py2.7.egg/theano/gradient.py:533: UserWarning: grad method was asked to compute the gradient with respect to a variable that is not part of the computational graph of the cost, or is used only by a non-differentiable operator: <DisconnectedType>
  handle_disconnected(rval[i])
/usr/local/lib/python2.7/dist-packages/Theano-0.6.0-py2.7.egg/theano/gradient.py:513: UserWarning: grad method was asked to compute the gradient with respect to a variable that is not part of the computational graph of the cost, or is used only by a non-differentiable operator: theta
  handle_disconnected(elem)
Traceback (most recent call last):
  File "<stdin>", line 46, in <module>
  File "/usr/local/lib/python2.7/dist-packages/pymc-3.0-py2.7.egg/pymc/tuning/starting.py", line 80, in find_MAP
    start), fprime=grad_logp_o, disp=disp, *args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/scipy/optimize/optimize.py", line 777, in fmin_bfgs
    res = _minimize_bfgs(f, x0, args, fprime, callback=callback, **opts)
  File "/usr/lib/python2.7/dist-packages/scipy/optimize/optimize.py", line 832, in _minimize_bfgs
    gfk = myfprime(x0)
  File "/usr/lib/python2.7/dist-packages/scipy/optimize/optimize.py", line 281, in function_wrapper
    return function(*(wrapper_args + args))
  File "/usr/local/lib/python2.7/dist-packages/pymc-3.0-py2.7.egg/pymc/tuning/starting.py", line 75, in grad_logp_o
    return nan_to_num(-dlogp(point))
  File "/usr/local/lib/python2.7/dist-packages/pymc-3.0-py2.7.egg/pymc/blocking.py", line 119, in __call__
    return self.fa(self.fb(x))
  File "/usr/local/lib/python2.7/dist-packages/pymc-3.0-py2.7.egg/pymc/model.py", line 284, in __call__
    return self.f(**state)
  File "/usr/local/lib/python2.7/dist-packages/Theano-0.6.0-py2.7.egg/theano/compile/function_module.py", line 516, in __call__
    self[k] = arg
  File "/usr/local/lib/python2.7/dist-packages/Theano-0.6.0-py2.7.egg/theano/compile/function_module.py", line 452, in __setitem__
    self.value[item] = value
  File "/usr/local/lib/python2.7/dist-packages/Theano-0.6.0-py2.7.egg/theano/compile/function_module.py", line 413, in __setitem__
    "of the inputs of your function for duplicates." % str(item))
TypeError: Ambiguous name: mu - please check the names of the inputs of your function for duplicates.
Edit: Following chris-fonnesbeck's answer, I tried the following:
with pm.Model() as model:
    trace_per_condition = []
    # define the kappa hyperparameter
    kappa = pm.Gamma('kappa', 1, 0.1)
    for condition in range(0, ncond):
        z_cond = z[condition]
        # define the mu hyperparameter
        mu = pm.Beta('mu_%i' % condition, 1, 1)
        # define the prior
        theta = pm.Beta('theta_%i' % condition, mu * kappa, (1 - mu) * kappa, shape=len(z_cond))
        # define the likelihood
        y = pm.Binomial('y_%i' % condition, p=theta, n=trials, observed=z_cond)
        # Generate a MCMC chain
        start = pm.find_MAP()
        step1 = pm.Metropolis([theta, mu])
        step2 = pm.NUTS([kappa])
        trace = pm.sample(10000, [step1, step2], start=start, progressbar=False)
        trace_per_condition.append(trace)
I get this error:
/usr/local/lib/python2.7/dist-packages/Theano-0.6.0-py2.7.egg/theano/gradient.py:513: UserWarning: grad method was asked to compute the gradient with respect to a variable that is not part of the computational graph of the cost, or is used only by a non-differentiable operator: mu_1
  handle_disconnected(elem)
/usr/local/lib/python2.7/dist-packages/Theano-0.6.0-py2.7.egg/theano/gradient.py:533: UserWarning: grad method was asked to compute the gradient with respect to a variable that is not part of the computational graph of the cost, or is used only by a non-differentiable operator: <DisconnectedType>
  handle_disconnected(rval[i])
/usr/local/lib/python2.7/dist-packages/Theano-0.6.0-py2.7.egg/theano/gradient.py:513: UserWarning: grad method was asked to compute the gradient with respect to a variable that is not part of the computational graph of the cost, or is used only by a non-differentiable operator: theta_1
  handle_disconnected(elem)
Traceback (most recent call last):
  File "<stdin>", line 43, in <module>
  File "/usr/local/lib/python2.7/dist-packages/pymc-3.0-py2.7.egg/pymc/tuning/starting.py", line 80, in find_MAP
    start), fprime=grad_logp_o, disp=disp, *args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/scipy/optimize/optimize.py", line 777, in fmin_bfgs
    res = _minimize_bfgs(f, x0, args, fprime, callback=callback, **opts)
  File "/usr/lib/python2.7/dist-packages/scipy/optimize/optimize.py", line 837, in _minimize_bfgs
    old_fval = f(x0)
  File "/usr/lib/python2.7/dist-packages/scipy/optimize/optimize.py", line 281, in function_wrapper
    return function(*(wrapper_args + args))
  File "/usr/local/lib/python2.7/dist-packages/pymc-3.0-py2.7.egg/pymc/tuning/starting.py", line 72, in logp_o
    return nan_to_high(-logp(point))
  File "/usr/local/lib/python2.7/dist-packages/pymc-3.0-py2.7.egg/pymc/blocking.py", line 119, in __call__
    return self.fa(self.fb(x))
  File "/usr/local/lib/python2.7/dist-packages/pymc-3.0-py2.7.egg/pymc/model.py", line 283, in __call__
    return self.f(**state)
  File "/usr/local/lib/python2.7/dist-packages/Theano-0.6.0-py2.7.egg/theano/compile/function_module.py", line 482, in __call__
    raise TypeError("Too many parameter passed to theano function")
TypeError: Too many parameter passed to theano function
The UserWarnings are related to the optimization of the starting point; they go away if I don't use pm.find_MAP(). The remaining error persists.
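For reference, a minimal sketch of that variant (based on the edited model above, with the per-condition variable names): skipping pm.find_MAP() and letting pm.sample start from the model's default test point is what silences the gradient UserWarnings, though it is not a fix for the remaining TypeError.

    # inside the `with pm.Model() as model:` block above, in place of pm.find_MAP():
    step1 = pm.Metropolis([theta, mu])
    step2 = pm.NUTS([kappa])
    # with no explicit `start`, sampling begins at the model's default test point,
    # so the MAP optimisation that triggers the gradient UserWarnings never runs
    trace = pm.sample(10000, step=[step1, step2], progressbar=False)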
Best answer
One thing I noticed is that you are sampling every time you add a condition; I think you probably want to pull the sampling out of the loop.
Also, you don't need to define a separate mu, theta, and y variable for each condition. For example, if your data are in columnar format in data, you should be able to do something like this:
with pm.Model() as model:
    kappa = pm.Gamma('kappa', 1, 0.1)
    mu = pm.Beta('mu', 1, 1, shape=ncond)
    mu_c = mu[data.condition]
    theta = pm.Beta('theta', mu_c * kappa, (1 - mu_c) * kappa, shape=len(data))
    y = pm.Binomial('y', p=theta, n=data.trials, observed=data.z_cond)
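To make that concrete, here is a self-contained sketch of the vectorised model with sampling pulled out of the loop. It is an illustration of the approach rather than the answerer's exact code, and it assumes a hypothetical pandas DataFrame named data with columns condition, trials, and z_cond, ncond conditions, and the pymc-3.0 package imported as pm, as in the tracebacks above.

    import numpy as np
    import pandas as pd
    import pymc as pm  # the pymc-3.0 package, imported as pm as in the question

    # hypothetical long-format data: one row per batch of trials, with the
    # condition index, the number of trials, and the number of successes z_cond
    ncond = 3
    data = pd.DataFrame({
        'condition': np.repeat(np.arange(ncond), 4),
        'trials': np.repeat(20, ncond * 4),
        'z_cond': np.random.binomial(20, 0.4, ncond * 4),
    })

    with pm.Model() as model:
        # a single kappa shared by every condition
        kappa = pm.Gamma('kappa', 1, 0.1)
        # one mu per condition, broadcast to rows by indexing with the condition column
        mu = pm.Beta('mu', 1, 1, shape=ncond)
        mu_c = mu[data.condition.values]
        theta = pm.Beta('theta', mu_c * kappa, (1 - mu_c) * kappa, shape=len(data))
        y = pm.Binomial('y', p=theta, n=data.trials.values, observed=data.z_cond.values)

        # sample once for the whole model instead of once per condition
        step1 = pm.Metropolis([theta, mu])
        step2 = pm.NUTS([kappa])
        trace = pm.sample(10000, step=[step1, step2], progressbar=False)

Because mu is now a single vector-valued variable indexed by condition, one Metropolis step updates every condition's mu at once, while the shared kappa still gets its own NUTS step, which matches the constraint described in the question.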
Regarding python - How to define a model in PyMC3 where one parameter is constrained to the same value across multiple conditions, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/24945018/