
scikit-learn - SVR estimator contains no support vectors after training


When I use scikit-learn's SVR to fit some of my own data, the trained estimator ends up containing no support vectors, so its predictions are always constant. To my surprise, the exact same code works perfectly with random training data. What is wrong with my code? Is there a problem with the data? (I have plenty of other data sets that show the same issue.)

Here is a minimal example, first with random training data. To make sure the data is actually learnable, I also included KernelRidge as a sanity check. With the random training data, everything works as expected:

import numpy as np
import pandas as pd

from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.multioutput import MultiOutputRegressor
from sklearn.kernel_ridge import KernelRidge
from sklearn.svm import SVR

# Random training data - works fine with both estimators
X = np.random.rand(50, 3) * 100
Y = [[np.sum(x), np.average(x)] for x in X]

# Kernel Ridge - works fine with both data
kernelRidgePipeline = Pipeline([('scale', StandardScaler()),
                                ('KernelRidge', KernelRidge(kernel='poly'))])
kernelRidgeGridsearch = GridSearchCV(kernelRidgePipeline, n_jobs=-1,
                                     param_grid={'KernelRidge__alpha': 10.0 ** -np.arange(1, 8),
                                                 'KernelRidge__degree': range(1, 5)})
kernelRidgeGridsearch.fit(X, Y)
print('Trained Kernel Ridge, cross-validation score: {:.2%}'.format(kernelRidgeGridsearch.best_score_))
#print(pd.DataFrame(kernelRidgeGridsearch.cv_results_))

# SVR - works fine with random data, empty support vectors after training with fixed data
SVRPipeline = Pipeline([('scale', StandardScaler()),
                        ('SVR', MultiOutputRegressor(SVR(kernel='rbf')))])
SVRGridsearch = GridSearchCV(SVRPipeline, n_jobs=-1,
                             param_grid={'SVR__estimator__C': np.logspace(-3, 3, 7),
                                         'SVR__estimator__gamma': np.logspace(-3, 3, 7)})
SVRGridsearch.fit(X, Y)
print('Trained SVR, cross-validation score: {:.2%}'.format(SVRGridsearch.best_score_))
#print(pd.DataFrame(SVRGridsearch.cv_results_))

best_svr_pipe = SVRGridsearch.best_estimator_
print([(estimator, estimator.support_vectors_) for estimator in best_svr_pipe.named_steps['SVR'].estimators_])

Executing the code above produces the following output:

Trained Kernel Ridge, cross-validation score: 100.00%
Trained SVR, cross-validation score: 100.00%
[(SVR(C=1000.0, gamma=0.001), array([[-0.63418505, -1.59107071, -0.24340053],
[-1.65336124, -0.57465634, -1.6026979 ],
[-1.18120827, 0.82189646, -1.78927989],
[-0.95929744, 1.56254011, 1.02792552],
...

Now, the problem appears when I use the following training data instead:

# Own data - does not work with SVR
X = [[0.0009804, 0.004533 , 0.01827 , 0.007706 , 0.03145 , 0.01904 ],
[0.05073 , 0.03821 , 0.03137 , 0.00321 , 0.04469 , 0.033 ],
[0.03696 , 0.00544 , 0.04304 , 0.03579 , 0.01125 , 0.04032 ],
[0.0515 , 0.01897 , 0.0 , 0.01897 , 0.0 , 0.0 ],
[0.01897 , 0.01897 , 0.01897 , 0.0515 , 0.0 , 0.01897 ],
[0.04704 , 0.02259 , 0.03783 , 0.008367 , 0.04813 , 0.05104 ],
[0.0 , 0.01897 , 0.0 , 0.0515 , 0.01897 , 0.01897 ],
[0.0 , 0.0 , 0.01897 , 0.01897 , 0.0 , 0.01897 ],
[0.0 , 0.01897 , 0.0515 , 0.01897 , 0.0515 , 0.0515 ],
[0.03163 , 0.02566 , 0.01027 , 0.02068 , 0.006748 , 0.02103 ],
[0.003292 , 0.03846 , 0.02204 , 0.01941 , 0.01632 , 0.002126 ],
[0.0515 , 0.0515 , 0.01897 , 0.01897 , 0.01897 , 0.0515 ],
[0.0452 , 0.02487 , 0.0425 , 0.007782 , 0.001749 , 0.01841 ],
[0.0515 , 0.0515 , 0.01897 , 0.0515 , 0.01897 , 0.0515 ],
[0.0515 , 0.0 , 0.0515 , 0.01897 , 0.0 , 0.0515 ],
[0.0 , 0.01897 , 0.0 , 0.0515 , 0.0 , 0.01897 ],
[0.0 , 0.0515 , 0.01897 , 0.0 , 0.0 , 0.01897 ],
[0.0515 , 0.0515 , 0.0515 , 0.01897 , 0.0515 , 0.01897 ],
[0.0515 , 0.01897 , 0.0515 , 0.0515 , 0.0 , 0.0515 ],
[0.01897 , 0.0515 , 0.01897 , 0.01897 , 0.0 , 0.0515 ],
[0.0515 , 0.01897 , 0.0 , 0.0 , 0.0 , 0.0515 ],
[0.04883 , 0.02794 , 0.01418 , 0.03165 , 0.01753 , 0.007313 ],
[0.01073 , 0.009494 , 0.03339 , 0.001327 , 0.01707 , 0.01588 ],
[0.04193 , 0.03918 , 0.007814 , 0.03498 , 0.002789 , 0.03957 ],
[0.04872 , 0.04928 , 0.01344 , 0.03339 , 0.02326 , 0.02606 ],
[0.00997 , 0.00993 , 0.03386 , 0.01935 , 0.006923 , 0.02288 ],
[0.01897 , 0.01897 , 0.01897 , 0.0515 , 0.0 , 0.0515 ],
[0.01897 , 0.0515 , 0.0515 , 0.0 , 0.01897 , 0.0 ],
[0.008615 , 0.001054 , 0.04226 , 0.007394 , 0.002071 , 0.01514 ],
[0.006528 , 0.04534 , 0.004602 , 0.01214 , 0.04099 , 0.02716 ],
[0.0515 , 0.0515 , 0.0 , 0.0515 , 0.0515 , 0.0515 ],
[0.04717 , 0.04847 , 0.02927 , 0.02849 , 0.04382 , 0.01184 ],
[0.02146 , 0.03994 , 0.005115 , 0.02845 , 0.03113 , 0.02515 ],
[0.003326 , 0.002409 , 0.04982 , 0.03079 , 0.02167 , 0.0116 ],
[0.0 , 0.01897 , 0.01897 , 0.0515 , 0.0515 , 0.01897 ],
[0.02106 , 0.01718 , 0.02647 , 0.01066 , 0.02419 , 0.002777 ],
[0.02533 , 0.008516 , 0.05118 , 0.04527 , 0.008341 , 0.0012 ],
[0.04721 , 0.001682 , 0.04941 , 0.0431 , 0.01283 , 0.03503 ],
[0.01897 , 0.0 , 0.0515 , 0.0 , 0.01897 , 0.0515 ],
[0.0 , 0.0515 , 0.0515 , 0.0515 , 0.0515 , 0.01897 ],
[0.0 , 0.0 , 0.0515 , 0.01897 , 0.0515 , 0.0515 ],
[0.0386 , 0.01649 , 0.02286 , 0.03572 , 0.005517 , 0.00382 ],
[0.02654 , 0.01036 , 0.04756 , 0.04297 , 0.03086 , 0.03606 ],
[0.01222 , 0.03092 , 0.01132 , 0.00487 , 0.0192 , 0.002185 ],
[0.04892 , 0.03272 , 0.03173 , 0.04939 , 0.007464 , 0.02107 ],
[0.0 , 0.0515 , 0.01897 , 0.0515 , 0.01897 , 0.0515 ],
[0.0515 , 0.01897 , 0.0515 , 0.0 , 0.01897 , 0.0 ],
[0.0 , 0.01897 , 0.01897 , 0.01897 , 0.0 , 0.0515 ],
[0.01897 , 0.01897 , 0.0 , 0.01897 , 0.01897 , 0.01897 ],
[0.049 , 0.02872 , 0.01126 , 0.03502 , 0.04904 , 0.04057 ]]
Y = [[0.008053 , 0.003143 , 0.006198 , 0.005975 , 0.008053 ],
[0.007296 , 0.002185 , 0.003862 , 0.003294 , 0.007296 ],
[0.006632 , 0.001999 , 0.005249 , 0.003463 , 0.006632 ],
[0.01035 , 0.004031 , 0.006534 , 0.005148 , 0.01035 ],
[0.007918 , 0.002983 , 0.005321 , 0.00498 , 0.007918 ],
[0.006595 , 0.001628 , 0.003932 , 0.002831 , 0.006595 ],
[0.007923 , 0.003134 , 0.005321 , 0.005976 , 0.007923 ],
[0.009137 , 0.003162 , 0.006538 , 0.006061 , 0.009137 ],
[0.005462 , 0.001916 , 0.004102 , 0.004758 , 0.005462 ],
[0.009059 , 0.002799 , 0.00489 , 0.004375 , 0.009059 ],
[0.007887 , 0.004124 , 0.005531 , 0.006745 , 0.007887 ],
[0.007924 , 0.001586 , 0.002859 , 0.002664 , 0.007924 ],
[0.008681 , 0.00287 , 0.005059 , 0.004109 , 0.008681 ],
[0.006705 , 0.001586 , 0.002859 , 0.002664 , 0.006705 ],
[0.007893 , 0.001608 , 0.005319 , 0.002746 , 0.007893 ],
[0.009136 , 0.003134 , 0.005321 , 0.005976 , 0.009136 ],
[0.01035 , 0.003072 , 0.004077 , 0.005832 , 0.01035 ],
[0.005462 , 0.002805 , 0.004077 , 0.003883 , 0.005462 ],
[0.006675 , 0.0016 , 0.004102 , 0.002717 , 0.006675 ],
[0.009137 , 0.001731 , 0.002859 , 0.003662 , 0.009137 ],
[0.01157 , 0.0016 , 0.004102 , 0.002717 , 0.01157 ],
[0.007688 , 0.00351 , 0.005578 , 0.004669 , 0.007688 ],
[0.008482 , 0.003219 , 0.006034 , 0.005567 , 0.008482 ],
[0.00888 , 0.001983 , 0.003592 , 0.003275 , 0.00888 ],
[0.007375 , 0.002475 , 0.003798 , 0.003609 , 0.007375 ],
[0.007945 , 0.002865 , 0.005647 , 0.005245 , 0.007945 ],
[0.007918 , 0.001765 , 0.004102 , 0.003762 , 0.007918 ],
[0.007893 , 0.004163 , 0.00529 , 0.006093 , 0.007893 ],
[0.00869 , 0.003287 , 0.006663 , 0.005751 , 0.00869 ],
[0.00827 , 0.002649 , 0.003866 , 0.005098 , 0.00827 ],
[0.006704 , 0.001586 , 0.002859 , 0.002664 , 0.006704 ],
[0.005992 , 0.00322 , 0.004561 , 0.004387 , 0.005992 ],
[0.007765 , 0.002648 , 0.00413 , 0.004514 , 0.007765 ],
[0.006768 , 0.00354 , 0.006768 , 0.006256 , 0.00605 ],
[0.005976 , 0.003134 , 0.005321 , 0.005976 , 0.005487 ],
[0.007775 , 0.003974 , 0.006419 , 0.005899 , 0.007775 ],
[0.007052 , 0.004071 , 0.007052 , 0.00587 , 0.006263 ],
[0.0061 , 0.002139 , 0.005705 , 0.003364 , 0.0061 ],
[0.007893 , 0.001781 , 0.00532 , 0.003819 , 0.007893 ],
[0.005832 , 0.003072 , 0.004077 , 0.005832 , 0.004244 ],
[0.005462 , 0.001944 , 0.00532 , 0.004843 , 0.005462 ],
[0.007806 , 0.003801 , 0.006381 , 0.005206 , 0.007806 ],
[0.005274 , 0.0022 , 0.005072 , 0.003956 , 0.005274 ],
[0.009198 , 0.004064 , 0.005813 , 0.006297 , 0.009198 ],
[0.00685 , 0.002716 , 0.004588 , 0.003867 , 0.00685 ],
[0.006705 , 0.001854 , 0.002859 , 0.004614 , 0.006705 ],
[0.007893 , 0.004031 , 0.006534 , 0.005148 , 0.007893 ],
[0.009137 , 0.001916 , 0.004102 , 0.004758 , 0.009137 ],
[0.009141 , 0.002983 , 0.005321 , 0.00498 , 0.009141 ],
[0.006505 , 0.001927 , 0.003962 , 0.003081 , 0.006505 ]]

With this training data, the program output is:

Trained Kernel Ridge, cross-validation score: 96.32%
Trained SVR, cross-validation score: -46.40%
[(SVR(C=0.001, gamma=0.001), array([], shape=(0, 6), dtype=float64)), (SVR(C=0.001, gamma=0.001), array([], shape=(0, 6), dtype=float64)), (SVR(C=0.001, gamma=0.001), array([], shape=(0, 6), dtype=float64)), (SVR(C=0.001, gamma=0.001), array([], shape=(0, 6), dtype=float64)), (SVR(C=0.001, gamma=0.001), array([], shape=(0, 6), dtype=float64))]

If I uncomment print(pd.DataFrame(SVRGridsearch.cv_results_)), I can also see that all hyperparameter combinations behave the same way.
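For reference, a minimal way to inspect the per-candidate scores (assuming SVRGridsearch has been fitted as above; the variable name cv_df is just illustrative):

# Show the mean test score for every hyperparameter combination
cv_df = pd.DataFrame(SVRGridsearch.cv_results_)
print(cv_df[['params', 'mean_test_score', 'std_test_score']]
      .sort_values('mean_test_score', ascending=False))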

Any help is appreciated!

Best answer

Edited answer:

The problem is that your Y is not scaled appropriately (see this similar post). SVR uses an epsilon-insensitive loss with epsilon=0.1 by default; since your target values only span roughly 0.001 to 0.01, every training point already lies inside the epsilon tube, so no support vectors are selected and the prediction collapses to a constant.

You can change the epsilon parameter to 0.0001 or lower, e.g.

('SVR', MultiOutputRegressor(SVR(kernel='rbf', epsilon=0.0001)))

or scale your Y up by a factor of at least 1000.
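A minimal sketch of these two options, reusing the names already defined in the question's code (svr_small_eps, Y_scaled and predictions are illustrative names; the factor of 1000 is just one value that brings the targets into a range comparable to the default epsilon of 0.1):

# Option 1: use an epsilon that is small relative to the target values
svr_small_eps = Pipeline([('scale', StandardScaler()),
                          ('SVR', MultiOutputRegressor(SVR(kernel='rbf', epsilon=0.0001)))])
svr_small_eps.fit(X, Y)

# Option 2: rescale the targets before fitting, then undo the scaling on predict
Y_scaled = np.asarray(Y) * 1000
SVRGridsearch.fit(X, Y_scaled)
predictions = SVRGridsearch.predict(X) / 1000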

Alternatively, you can have a StandardScaler applied to your Y by wrapping the regressor in a TransformedTargetRegressor, as described in this answer.
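A minimal sketch of that approach (svr_scaled_y is an illustrative name; TransformedTargetRegressor standard-scales Y before fitting and inverts the scaling on predict):

from sklearn.compose import TransformedTargetRegressor

svr_pipeline = Pipeline([('scale', StandardScaler()),
                         ('SVR', MultiOutputRegressor(SVR(kernel='rbf')))])
svr_scaled_y = TransformedTargetRegressor(regressor=svr_pipeline,
                                          transformer=StandardScaler())
svr_scaled_y.fit(X, Y)
print(svr_scaled_y.predict(X[:3]))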

Regarding "scikit-learn - SVR estimator contains no support vectors after training", the original question can be found on Stack Overflow: https://stackoverflow.com/questions/69109745/
