
Python, Numpy, and OLS


The code below works as expected, but it is not quite what I need. I want to change c[1] to c[1:] so that I regress on all of the x variables rather than just one. When I make that change (and add the appropriate x labels), I get the following error: ValueError: matrices are not aligned. Can someone explain why this happens and suggest how to modify the code? Thanks.

from numpy import *
from ols import *

a = [[.001,.05,-.003,.014,.035,-.01,.032,-.0013,.0224,.005],[-.011,.012,.0013,.014,-.0015,.019,-.032,.013,-.04,-.05608],
[.0021,.02,-.023,.0024,.025,-.081,.032,-.0513,.00014,-.00015],[.001,.02,-.003,.014,.035,-.001,.032,-.003,.0224,-.005],
[.0021,-.002,-.023,.0024,.025,.01,.032,-.0513,.00014,-.00015],[-.0311,.012,.0013,.014,-.0015,.019,-.032,.013,-.014,-.008],
[.001,.02,-.0203,.014,.035,-.001,.00032,-.0013,.0224,.05],[.0021,-.022,-.0213,.0024,.025,.081,.032,.05313,.00014,-.00015],
[-.01331,.012,.0013,.014,.01015,.019,-.032,.013,-.014,-.012208],[.01021,-.022,-.023,.0024,.025,.081,.032,.0513,.00014,-.020015]]


c = column_stack(a)  # transpose 'a': row i of c now holds variable i across all 10 observations
y = c[0]             # dependent variable
m = ols(y, c[1], y_varnm='y', x_varnm=['x1'])  # regress y on the first explanatory variable only
print m.summary()
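
A quick shape check suggests where the alignment error comes from. This is only a sketch with stand-in random data, and it assumes the (unidentified) ols class expects its x argument shaped as observations by regressors, which is what the working fix further down builds; c[1:] instead hands it the transpose of that.

import numpy as np

# Stand-in data with the same shape as 'a' above: 10 observations of 10 variables.
data = np.random.randn(10, 10)

c = np.column_stack(data)  # shape (10, 10): row i of c is variable i across the 10 observations
print(c[1].shape)          # (10,)   -> a single regressor as a flat vector: what the working call passes
print(c[1:].shape)         # (9, 10) -> 9 variables in rows, 10 observations in columns
print(c[1:].T.shape)       # (10, 9) -> observations in rows, regressors in columns:
                           #            the same shape that column_stack([c[i] for i in range(1, 10)]) produces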

Edit: I have come up with a partial solution, but there is still a problem. The code below works for 8 of the 9 explanatory variables.

c = column_stack(a)
y = c[0]
x = column_stack([c[i] for i in range(1, 9)])
m = ols(y, x, y_varnm='y', x_varnm=['x1','x2','x3','x4','x5','x6','x7','x8'])
print m.summary()

However, when I try to include the 9th x variable, I get the following error: RuntimeWarning: divide by zero encountered in double_scalars. Any idea why? Here is the code (note that len(a) = 10):

c = column_stack(a)
y = c[0]
x = column_stack([c[i] for i in range(1, len(a))])
m = ols(y, x, y_varnm='y', x_varnm=['x1','x2','x3','x4','x5','x6','x7','x8','x9'])
print m.summary()
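
One plausible explanation for the warning, and it is only a guess since the ols module is not identified here: if the class adds an intercept column on its own (the widely circulated cookbook recipe does), then 10 observations against 9 regressors plus a constant leave zero residual degrees of freedom, and the residual variance, adjusted R-squared, and F-statistic all end up dividing by zero. A rough sketch of the bookkeeping:

nobs = 10                # len(a): number of observations
nregressors = 9          # x1 ... x9
ncoef = nregressors + 1  # assumption: the ols class adds a column of ones for the intercept

df_resid = nobs - ncoef  # 10 - 10 = 0
print(df_resid)          # any statistic computed as something / df_resid now divides by zero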

Best Answer

I know nothing about the ols module you are using. However, if you try the following with scikits.statsmodels, it should work:

import numpy as np
import scikits.statsmodels.api as sm

a = np.array([[.001,.05,-.003,.014,.035,-.01,.032,-.0013,.0224,.005],[-.011,.012,.0013,.014,-.0015,.019,-.032,.013,-.04,-.05608],
[.0021,.02,-.023,.0024,.025,-.081,.032,-.0513,.00014,-.00015],[.001,.02,-.003,.014,.035,-.001,.032,-.003,.0224,-.005],
[.0021,-.002,-.023,.0024,.025,.01,.032,-.0513,.00014,-.00015],[-.0311,.012,.0013,.014,-.0015,.019,-.032,.013,-.014,-.008],
[.001,.02,-.0203,.014,.035,-.001,.00032,-.0013,.0224,.05],[.0021,-.022,-.0213,.0024,.025,.081,.032,.05313,.00014,-.00015],
[-.01331,.012,.0013,.014,.01015,.019,-.032,.013,-.014,-.012208],[.01021,-.022,-.023,.0024,.025,.081,.032,.0513,.00014,-.020015]])

y = a[:, 0]   # first column: dependent variable
x = a[:, 1:]  # remaining nine columns: explanatory variables x1 ... x9
results = sm.OLS(y, x).fit()
print results.summary()

Output:

     Summary of Regression Results
=======================================
| Dependent Variable:            ['y']|
| Model:                           OLS|
| Method:                Least Squares|
| # obs:                          10.0|
| Df residuals:                    1.0|
| Df model:                        8.0|
==============================================================================
|          coefficient      std. error     t-statistic           prob.       |
------------------------------------------------------------------------------
| x0            0.2557          0.6622          0.3862          0.7654       |
| x1           0.03054           1.453          0.0210          0.9866       |
| x2            -3.392           2.444         -1.3877          0.3975       |
| x3             1.445           1.474          0.9808          0.5062       |
| x4           0.03559          0.2610          0.1363          0.9137       |
| x5           -0.7412          0.8754         -0.8467          0.5527       |
| x6           0.02289          0.2466          0.0928          0.9411       |
| x7            0.5754           1.413          0.4074          0.7537       |
| x8           -0.4827          0.7569         -0.6378          0.6386       |
==============================================================================
| Models stats                                                Residual stats |
------------------------------------------------------------------------------
| R-squared:                    0.8832   Durbin-Watson:                2.578 |
| Adjusted R-squared:         -0.05163   Omnibus:                     0.5325 |
| F-statistic:                  0.9448   Prob(Omnibus):               0.7663 |
| Prob (F-statistic):           0.6663   JB:                          0.1630 |
| Log likelihood:                41.45   Prob(JB):                    0.9217 |
| AIC criterion:                -64.91   Skew:                        0.4037 |
| BIC criterion:                -62.18   Kurtosis:                     2.405 |
------------------------------------------------------------------------------
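
A couple of details worth noting about this approach. sm.OLS does not insert an intercept on its own, so the fit above is a regression through the origin, which is also why it keeps one residual degree of freedom despite the nine regressors; a column of ones (for example via sm.add_constant) would add an intercept, though with all nine regressors that would again exhaust the degrees of freedom. Individual statistics can also be read straight off the fitted results object, continuing from the code above:

# Continuing from the 'results' object returned by sm.OLS(y, x).fit() above.
print(results.params)    # estimated coefficients, in the column order of x
print(results.bse)       # their standard errors
print(results.rsquared)  # 0.8832 for the data in this question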

On Python, Numpy, and OLS, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/8643332/
