
python - Regression in Python


I'm trying to run a logistic regression with pandas and statsmodels. I don't know why I'm getting an error or how to fix it.

import pandas as pd
import statsmodels.api as sm
x = [1, 3, 5, 6, 8]
y = [0, 1, 0, 1, 1]
d = { "x": pd.Series(x), "y": pd.Series(y)}
df = pd.DataFrame(d)

model = "y ~ x"
glm = sm.Logit(model, df=df).fit()

The error:

Traceback (most recent call last):
File "regress.py", line 45, in <module>
glm = sm.Logit(model, df=df).fit()
TypeError: __init__() takes exactly 3 arguments (2 given)

Best Answer

You can't pass a formula to Logit. Do:

In [82]: import patsy

In [83]: f = 'y ~ x'

In [84]: y, X = patsy.dmatrices(f, df, return_type='dataframe')

In [85]: sm.Logit(y, X).fit().summary()
Optimization terminated successfully.
Current function value: 0.511631
Iterations 6
Out[85]:
<class 'statsmodels.iolib.summary.Summary'>
"""
                           Logit Regression Results
==============================================================================
Dep. Variable:                      y   No. Observations:                    5
Model:                          Logit   Df Residuals:                        3
Method:                           MLE   Df Model:                            1
Date:                Fri, 30 Aug 2013   Pseudo R-squ.:                  0.2398
Time:                        16:56:38   Log-Likelihood:                -2.5582
converged:                       True   LL-Null:                       -3.3651
                                        LLR p-value:                    0.2040
==============================================================================
                 coef    std err          z      P>|z|      [95.0% Conf. Int.]
------------------------------------------------------------------------------
Intercept     -2.0544      2.452     -0.838      0.402        -6.861     2.752
x              0.5672      0.528      1.073      0.283        -0.468     1.603
==============================================================================
"""

This is almost straight from the docs on how to do exactly what you're asking.

Edit: You can also use the formula API, as @user333700 suggested:

In [22]: print sm.formula.logit(model, data=df).fit().summary()
Optimization terminated successfully.
Current function value: 0.511631
Iterations 6
                           Logit Regression Results
==============================================================================
Dep. Variable:                      y   No. Observations:                    5
Model:                          Logit   Df Residuals:                        3
Method:                           MLE   Df Model:                            1
Date:                Fri, 30 Aug 2013   Pseudo R-squ.:                  0.2398
Time:                        18:14:26   Log-Likelihood:                -2.5582
converged:                       True   LL-Null:                       -3.3651
                                        LLR p-value:                    0.2040
==============================================================================
                 coef    std err          z      P>|z|      [95.0% Conf. Int.]
------------------------------------------------------------------------------
Intercept     -2.0544      2.452     -0.838      0.402        -6.861     2.752
x              0.5672      0.528      1.073      0.283        -0.468     1.603
==============================================================================
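For reference, here is a minimal, self-contained sketch that combines the question's data with both fixes, written for Python 3 and a recent statsmodels release; the smf alias and the single combined script are my additions, not part of the original answer:

import pandas as pd
import patsy
import statsmodels.api as sm
import statsmodels.formula.api as smf

# The question's data, built directly as a DataFrame
df = pd.DataFrame({"x": [1, 3, 5, 6, 8], "y": [0, 1, 0, 1, 1]})

# Option 1: build the design matrices yourself with patsy, then pass arrays to sm.Logit
y, X = patsy.dmatrices("y ~ x", df, return_type="dataframe")
print(sm.Logit(y, X).fit().summary())

# Option 2: let the formula API parse the formula and the DataFrame for you
print(smf.logit("y ~ x", data=df).fit().summary())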

Regarding python - Regression in Python, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/18540738/
