
Bayesian Probabilistic Matrix Factorization (BPMF) with PyMC3: PositiveDefiniteError using `NUTS`


I have implemented the Bayesian Probabilistic Matrix Factorization (BPMF) algorithm in Python using pymc3. I have also implemented its predecessor, Probabilistic Matrix Factorization (PMF). See my previous question for a reference to the data used here.

I am having trouble drawing MCMC samples with the NUTS sampler. I initialize the model parameters with the MAP estimate from PMF, and the hyperparameters with Gaussian random draws scattered around 0. However, I get a PositiveDefiniteError when setting up the step object for the sampler. I have verified that the MAP estimate from PMF is reasonable, so I expect the problem has to do with how the hyperparameters are initialized. Here is the PMF model:

import pymc3 as pm
import numpy as np
import pandas as pd
import theano
import scipy as sp

data = pd.read_csv('jester-dense-subset-100x20.csv')
n, m = data.shape
test_size = m / 10
train_size = m - test_size

train = data.copy()
train.ix[:,train_size:] = np.nan # remove test set data
train[train.isnull()] = train.mean().mean() # mean value imputation
train = train.values

test = data.copy()
test.ix[:,:train_size] = np.nan # remove train set data
test = test.values

# Low precision reflects uncertainty; prevents overfitting
alpha_u = alpha_v = 1/np.var(train)
alpha = np.ones((n,m)) * 2 # fixed precision for likelihood function
dim = 10 # dimensionality

# Specify the model.
with pm.Model() as pmf:
    pmf_U = pm.MvNormal('U', mu=0, tau=alpha_u * np.eye(dim),
                        shape=(n, dim), testval=np.random.randn(n, dim)*.01)
    pmf_V = pm.MvNormal('V', mu=0, tau=alpha_v * np.eye(dim),
                        shape=(m, dim), testval=np.random.randn(m, dim)*.01)
    pmf_R = pm.Normal('R', mu=theano.tensor.dot(pmf_U, pmf_V.T),
                      tau=alpha, observed=train)

    # Find mode of posterior using optimization
    start = pm.find_MAP(fmin=sp.optimize.fmin_powell)

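As an aside, the claim above that the MAP estimate is reasonable can be checked with a quick RMSE computation. The following is a minimal sketch added for illustration, not code from the original post; it assumes the `start`, `train`, and `test` objects defined above:

import numpy as np

def rmse(actual, predicted):
    # Root mean squared error over the non-missing entries of `actual`.
    mask = ~np.isnan(actual)
    return np.sqrt(np.mean((actual[mask] - predicted[mask]) ** 2))

# Reconstruct the rating matrix from the MAP point estimates of U and V.
predicted = np.dot(start['U'], start['V'].T)
print('train RMSE: %.4f' % rmse(train, predicted))
print('test RMSE: %.4f' % rmse(test, predicted))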
And here is the BPMF model:
import logging

n, m = data.shape
dim = 10 # dimensionality
beta_0 = 1 # scaling factor for lambdas; unclear on its use
alpha = np.ones((n,m)) * 2 # fixed precision for likelihood function

logging.info('building the BPMF model')
std = .05 # how much noise to use for model initialization
with pm.Model() as bpmf:
    # Specify user feature matrix
    lambda_u = pm.Wishart(
        'lambda_u', n=dim, V=np.eye(dim), shape=(dim, dim),
        testval=np.random.randn(dim, dim) * std)
    mu_u = pm.Normal(
        'mu_u', mu=0, tau=beta_0 * lambda_u, shape=dim,
        testval=np.random.randn(dim) * std)
    U = pm.MvNormal(
        'U', mu=mu_u, tau=lambda_u, shape=(n, dim),
        testval=np.random.randn(n, dim) * std)

    # Specify item feature matrix
    lambda_v = pm.Wishart(
        'lambda_v', n=dim, V=np.eye(dim), shape=(dim, dim),
        testval=np.random.randn(dim, dim) * std)
    mu_v = pm.Normal(
        'mu_v', mu=0, tau=beta_0 * lambda_v, shape=dim,
        testval=np.random.randn(dim) * std)
    V = pm.MvNormal(
        'V', mu=mu_v, tau=lambda_v, shape=(m, dim),
        testval=np.random.randn(m, dim) * std)

    # Specify rating likelihood function
    R = pm.Normal(
        'R', mu=theano.tensor.dot(U, V.T), tau=alpha,
        observed=train)

# `start` is the start dictionary obtained from running find_MAP for PMF.
for key in bpmf.test_point:
    if key not in start:
        start[key] = bpmf.test_point[key]

with bpmf:
    step = pm.NUTS(scaling=start)

On the last line, I get the following error:
PositiveDefiniteError: Scaling is not positive definite. Simple check failed. Diagonal contains negatives. Check indexes [   0    2   ...  2206  2207  ]

As I understand it, I cannot use find_MAP with a model that has hyperpriors like BPMF. That is why I am trying to initialize with the MAP values from PMF, which uses point estimates for the parameters on U and V rather than parameterized hyperpriors.

Best Answer

Unfortunately, the Wishart distribution is non-functional. I recently added a warning about it here: https://github.com/pymc-devs/pymc3/commit/642f63973ec9f807fb6e55a0fc4b31bdfa1f261e

For more discussion of this tricky distribution, see here: https://github.com/pymc-devs/pymc3/issues/538

You can confirm that this is the source by holding the covariance matrices fixed. If that turns out to be the case, I would try the LKJ prior instead: https://github.com/pymc-devs/pymc3/blob/master/pymc3/examples/LKJ_correlation.py
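To make that check concrete, here is a minimal sketch (an illustration, not code from the original answer) that rebuilds the BPMF model with both Wishart priors replaced by fixed identity precision matrices, reusing `train`, `alpha`, `n`, `m`, `dim`, `beta_0`, and `std` from the question. If `pm.NUTS` builds its step object here without the PositiveDefiniteError, the Wishart priors are confirmed as the source:

import numpy as np
import pymc3 as pm
import theano.tensor as tt

with pm.Model() as bpmf_fixed:
    # Hold the feature precisions fixed instead of giving them Wishart priors.
    lambda_u = np.eye(dim)
    lambda_v = np.eye(dim)

    mu_u = pm.Normal('mu_u', mu=0, tau=beta_0, shape=dim,
                     testval=np.random.randn(dim) * std)
    U = pm.MvNormal('U', mu=mu_u, tau=lambda_u, shape=(n, dim),
                    testval=np.random.randn(n, dim) * std)

    mu_v = pm.Normal('mu_v', mu=0, tau=beta_0, shape=dim,
                     testval=np.random.randn(dim) * std)
    V = pm.MvNormal('V', mu=mu_v, tau=lambda_v, shape=(m, dim),
                    testval=np.random.randn(m, dim) * std)

    R = pm.Normal('R', mu=tt.dot(U, V.T), tau=alpha, observed=train)

    # With no Wishart variables, the test point should yield a valid scaling.
    step = pm.NUTS(scaling=bpmf_fixed.test_point)

If this check passes, the LKJ example linked above shows how to put a prior back on the correlation structure instead of on the full precision matrix.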

Regarding Bayesian Probabilistic Matrix Factorization (BPMF) with PyMC3: PositiveDefiniteError using `NUTS`, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/29736966/
