
python - Scipy minimize / Scipy curve_fit / lmfit

Reposted · Author: 行者123 · Updated: 2023-12-04 17:13:36

log(VA) = gamma - (1/eta) * log(alpha * L^(-eta) + beta * K^(-eta))
I am trying to estimate the function above by nonlinear least squares. I have used three different packages for this (scipy.optimize.minimize, scipy.optimize.curve_fit, and lmfit's Model), but each one returns different parameter estimates, and I don't understand why. I would be grateful if someone could provide a solution or suggest a different approach.
SCIPY-MINIMIZE

import numpy as np
from scipy.optimize import minimize, curve_fit
from lmfit import Model, Parameters

L = np.array([0.299, 0.295, 0.290, 0.284, 0.279, 0.273, 0.268, 0.262, 0.256, 0.250])
K = np.array([2.954, 3.056, 3.119, 3.163, 3.215, 3.274, 3.351, 3.410, 3.446, 3.416])
VA = np.array([0.919, 0.727, 0.928, 0.629, 0.656, 0.854, 0.955, 0.981, 0.908, 0.794])

def f(param):
    gamma = param[0]
    alpha = param[1]
    beta = param[2]
    eta = param[3]
    VA_est = gamma - (1/eta)*np.log(alpha*L**-eta + beta*K**-eta)

    return np.sum((np.log(VA) - VA_est)**2)

bnds = [(1, np.inf), (0, 1), (0, 1), (-1, np.inf)]
x0 = (1, 0.01, 0.98, 1)

result = minimize(f, x0, bounds=bnds)

print(result.fun)
print(result.message)
print(result.x[0],result.x[1],result.x[2],result.x[3])
SCIPY-MINIMIZE - OUT
0.30666062040617503
CONVERGENCE: NORM_OF_PROJECTED_GRADIENT_<=_PGTOL
1.0 0.5587147011643757 0.9371430857380681 5.873041615873815

SCIPY-CURVE_FIT
def f(X, gamma, alpha, beta, eta):
    L, K = X

    return gamma - (1/eta) * np.log(alpha*L**-eta + beta*K**-eta)

p0 = 1, 0.01, 0.98, 1

res, cov = curve_fit(f, (L, K), np.log(VA), p0, bounds=((1, 0, 0, -1), (np.inf, 1, 1, np.inf)))
gamma, alpha, beta, eta = res[0],res[1],res[2],res[3]
gamma, alpha, beta, eta
SCIPY-CURVE_FIT - OUT
(1.000000000062141,
0.26366547263939205,
0.9804436474926481,
13.449747863921704)
LMFIT-MODEL
def f(x, gamma, alpha, beta, eta):
    L = x[0]
    K = x[1]

    return gamma - (1/eta)*np.log(alpha*L**-eta + beta*K**-eta)

fmodel = Model(f)
params = Parameters()
params.add('gamma', value = 1, vary=True, min = 1)
params.add('alpha', value = 0.01, vary=True, max = 1, min = 0)
params.add('beta', value = 0.98, vary=True, max = 1, min = 0)
params.add('eta', value = 1, vary=True, min = -1)

result = fmodel.fit(np.log(VA), params, x=(L,K))
print(result.fit_report())
LMFIT-MODEL - OUT
[[Model]]
    Model(f)
[[Fit Statistics]]
    # fitting method   = leastsq
    # function evals   = 103
    # data points      = 10
    # variables        = 4
    chi-square         = 0.31749840
    reduced chi-square = 0.05291640
    Akaike info crit   = -26.4986758
    Bayesian info crit = -25.2883354
##  Warning: uncertainties could not be estimated:
    gamma:  at initial value
    gamma:  at boundary
    alpha:  at boundary
[[Variables]]
    gamma:  1.00000000 (init = 1)
    alpha:  1.3245e-13 (init = 0.01)
    beta:   0.20130064 (init = 0.98)
    eta:    447.960413 (init = 1)

Best Answer

Fitting algorithms always look for a local minimum of the underlying least-squares problem. Note that your problem is convex, but not strictly convex, so there is no unique global minimizer; however, every local minimizer is also a global minimizer. By evaluating the first function f at each of the solutions found, we can observe that they all attain (essentially) the same objective value. Consequently, each solution is a global minimizer.
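One way to check this is to re-evaluate the least-squares objective at each reported solution. A small sketch, with the parameter values copied from the three outputs above:

```python
import numpy as np

L = np.array([0.299, 0.295, 0.290, 0.284, 0.279, 0.273, 0.268, 0.262, 0.256, 0.250])
K = np.array([2.954, 3.056, 3.119, 3.163, 3.215, 3.274, 3.351, 3.410, 3.446, 3.416])
VA = np.array([0.919, 0.727, 0.928, 0.629, 0.656, 0.854, 0.955, 0.981, 0.908, 0.794])

def objective(gamma, alpha, beta, eta):
    # Sum of squared residuals, identical to f() in the minimize example above
    VA_est = gamma - (1/eta)*np.log(alpha*L**-eta + beta*K**-eta)
    return np.sum((np.log(VA) - VA_est)**2)

# Parameter sets reported by the three packages above
sols = {
    "minimize":  (1.0, 0.5587147011643757, 0.9371430857380681, 5.873041615873815),
    "curve_fit": (1.000000000062141, 0.26366547263939205, 0.9804436474926481, 13.449747863921704),
    "lmfit":     (1.0, 1.3245e-13, 0.20130064, 447.960413),
}
for name, p in sols.items():
    print(name, objective(*p))
```

The minimize and curve_fit solutions land on the same objective value (≈0.3067), and the lmfit solution reproduces its reported chi-square (≈0.3175), even though the three parameter vectors differ wildly.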
Why does each method find a different minimizer? The reason is simple: each one uses a different algorithm to solve the underlying nonlinear optimization problem. For example, scipy.optimize.minimize uses the 'L-BFGS-B' algorithm here, while scipy.optimize.curve_fit uses scipy.optimize.least_squares with the Trust Region Reflective algorithm ('trf'). In short, you can only expect different algorithms to return the same solution for a strictly convex problem.
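To see the algorithm dependence concretely, the same bounded problem can be run through scipy.optimize.least_squares directly, which is what curve_fit delegates to when bounds are given. A minimal sketch, reusing the data arrays and starting point from the question:

```python
import numpy as np
from scipy.optimize import least_squares

L = np.array([0.299, 0.295, 0.290, 0.284, 0.279, 0.273, 0.268, 0.262, 0.256, 0.250])
K = np.array([2.954, 3.056, 3.119, 3.163, 3.215, 3.274, 3.351, 3.410, 3.446, 3.416])
VA = np.array([0.919, 0.727, 0.928, 0.629, 0.656, 0.854, 0.955, 0.981, 0.908, 0.794])

def residuals(p):
    # Vector of residuals; least_squares minimizes 0.5 * sum(residuals**2)
    gamma, alpha, beta, eta = p
    return (gamma - (1/eta)*np.log(alpha*L**-eta + beta*K**-eta)) - np.log(VA)

res = least_squares(residuals, x0=(1, 0.01, 0.98, 1),
                    bounds=((1, 0, 0, -1), (np.inf, 1, 1, np.inf)),
                    method="trf")
print(res.x)         # close to the curve_fit result above
print(2 * res.cost)  # sum of squared residuals
```

Swapping method="trf" for another bounded solver (or switching the method argument of minimize) is the quickest way to observe that each algorithm can stop at a different one of the equally good minimizers.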

Regarding python - Scipy minimize / Scipy curve_fit / lmfit, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/69046347/
