python - Optimizing a function given computed partial derivatives


I need to optimize a function, i.e. find its minimum. The function is parameterized by a vector w. What I have done so far is compute the partial derivative of the function with respect to each parameter, and then run plain gradient descent. However, as far as I know there are more sophisticated optimization methods than gradient descent that don't require tuning or babysitting parameters. I would like to try one of them in numpy or scipy, but I don't know how. What I need is a method that accepts the computed partial derivative values as input and then optimizes the function. Does anything like that exist in numpy or scipy?

Best Answer

scipy.optimize.minimize gives you the option of supplying the Jacobian and Hessian of the objective function:

jac : bool or callable, optional

Jacobian (gradient) of objective function. Only for CG, BFGS, Newton-CG, L-BFGS-B, TNC, SLSQP, dogleg, trust-ncg. If jac is a Boolean and is True, fun is assumed to return the gradient along with the objective function. If False, the gradient will be estimated numerically. jac can also be a callable returning the gradient of the objective. In this case, it must accept the same arguments as fun.

hess, hessp : callable, optional

Hessian (matrix of second-order derivatives) of objective function or Hessian of objective function times an arbitrary vector p. Only for Newton-CG, dogleg, trust-ncg. Only one of hessp or hess needs to be given. If hess is provided, then hessp will be ignored. If neither hess nor hessp is provided, then the Hessian product will be approximated using finite differences on jac. hessp must compute the Hessian times an arbitrary vector.
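In practice that means passing your gradient function as jac. A minimal sketch (the least-squares objective, A, b, and the starting point w0 below are made up for illustration, not taken from the question):

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative objective f(w) = ||A w - b||^2; A, b, w0 are made up.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, -1.0])

def f(w):
    r = A @ w - b
    return r @ r

def grad_f(w):
    # Analytic gradient: 2 A^T (A w - b)
    return 2.0 * A.T @ (A @ w - b)

w0 = np.zeros(2)
# BFGS uses the supplied gradient; no step size to tune by hand.
res = minimize(f, w0, jac=grad_f, method='BFGS')
print(res.x, res.fun)
```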

As @ffriend mentioned in the comments, you can find some examples with and without gradients here
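Since the docs also mention the jac=True form (fun returning both the value and the gradient) and Hessian-aware methods, here is a self-contained sketch using SciPy's built-in Rosenbrock test helpers (rosen, rosen_der, rosen_hess from scipy.optimize); the starting point w0 is arbitrary:

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der, rosen_hess

# jac=True: the objective returns (value, gradient) in a single call,
# useful when the two share expensive intermediate computations.
def rosen_with_grad(w):
    return rosen(w), rosen_der(w)

w0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
res = minimize(rosen_with_grad, w0, jac=True, method='L-BFGS-B')
print(res.x)

# Methods such as Newton-CG can additionally exploit the Hessian:
res2 = minimize(rosen, w0, jac=rosen_der, hess=rosen_hess, method='Newton-CG')
print(res2.x)
```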

Regarding python - optimizing a function given computed partial derivatives, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/23762019/
