
r - Comparing glmnet model performance with boosting algorithms

Reposted. Author: 行者123. Updated: 2023-11-30 09:17:16

To help myself learn machine learning, I have been writing some examples — not to show that one method is better than another, but to illustrate how to use the various functions and which parameters need tuning. I started from this blog comparing BooST and xgboost, and then successfully added gbm to the example. Now I am trying to add glmnet, but the two coefficients of the returned model are always (close to) zero. Either I am doing something wrong, or glmnet is not the right algorithm for this data, and I am trying to figure out which it is. Here is my reproducible example:

# Uncomment the following 2 lines if you need to install BooST (requires devtools)
#library(devtools)
#install_github("gabrielrvsc/BooST")

library(BooST)
library(xgboost)
library(gbm)
library(glmnet)
library(ggplot2)  # needed for the plots below

# Data generating process
dgp = function(N, r2) {
  X = matrix(rnorm(N * 2, 0, 1), N, 2)
  X[, ncol(X)] = base::sample(c(0, 1), N, replace = TRUE)
  yaux = cos(pi * rowSums(X))
  vyaux = var(yaux)
  ve = vyaux * (1 - r2) / r2   # noise variance chosen to hit the target R^2
  e = rnorm(N, 0, sqrt(ve))
  y = yaux + e
  return(list(y = y, X = X))
}

# Real data
x1r = rep(seq(-4,4,length.out = 1000), 2)
x2r = c(rep(0,1000), rep(1,1000))
yr = cos(pi*(x1r+x2r))
real_function = data.frame(x1 = x1r, x2 = as.factor(x2r), y = yr)

# Train data (noisy)
set.seed(1)
data = dgp(N = 1000, r2 = 0.5)
y = data$y
x = data$X

# Test data (noisy)
set.seed(2)
dataout = dgp(N = 1000, r2 = 0.5)
yout = dataout$y
xout = dataout$X

# Set seed and train all 4 models
set.seed(1)
BooST_Model = BooST(x, y, v = 0.18, M = 300 , display = TRUE)
xgboost_Model = xgboost(x, label = y, nrounds = 300, params = list(eta = 0.14, max_depth = 2))
gbm_Model = gbm.fit(x, y, distribution = "gaussian", n.trees = 10000, shrinkage = .001, interaction.depth=5)
glmnet_Model = cv.glmnet(x, y, family = "gaussian", alpha=0)
coef(glmnet_Model)

coef(glmnet_Model) returns:

3 x 1 sparse Matrix of class "dgCMatrix"
                                                             1
(Intercept)  0.078072154632597062784427066617354284971952438
V1          -0.000000000000000000000000000000000000003033534
V2          -0.000000000000000000000000000000000000044661342

# Predict from test data
p_BooST = predict(BooST_Model, xout)
p_xgboost = predict(xgboost_Model, xout)
p_gbm = predict(gbm_Model, xout, n.trees=10000)
p_glmnet = predict(glmnet_Model, xout)

# Show RMSE
sqrt(mean((p_BooST - yout)^2))
sqrt(mean((p_xgboost - yout)^2))
sqrt(mean((p_gbm - yout)^2))
sqrt(mean((p_glmnet - yout)^2))

fitted = data.frame(x1 = x[, 1], x2 = as.factor(x[, 2]),
                    BooST = fitted(BooST_Model),
                    xgboost = predict(xgboost_Model, x),
                    gbm = predict(object = gbm_Model, newdata = x, n.trees = 10000),
                    glmnet = predict(glmnet_Model, newx = x, s = glmnet_Model$lambda.min)[, 1],
                    y = y)

# Plot noisy Y
ggplot() +
  geom_point(data = fitted, aes(x = x1, y = y, color = x2)) +
  geom_line(data = real_function, aes(x = x1, y = y, linetype = x2))

# Plot xgboost
ggplot() +
  geom_point(data = fitted, aes(x = x1, y = y), color = "gray") +
  geom_point(data = fitted, aes(x = x1, y = xgboost, color = x2)) +
  geom_line(data = real_function, aes(x = x1, y = y, linetype = x2))

# Plot BooST
ggplot() +
  geom_point(data = fitted, aes(x = x1, y = y), color = "gray") +
  geom_point(data = fitted, aes(x = x1, y = BooST, color = x2)) +
  geom_line(data = real_function, aes(x = x1, y = y, linetype = x2))

# Plot gbm
ggplot() +
  geom_point(data = fitted, aes(x = x1, y = y), color = "gray") +
  geom_point(data = fitted, aes(x = x1, y = gbm, color = x2)) +
  geom_line(data = real_function, aes(x = x1, y = y, linetype = x2))

# Plot glmnet
ggplot() +
  geom_point(data = fitted, aes(x = x1, y = y), color = "gray") +
  geom_point(data = fitted, aes(x = x1, y = glmnet, color = x2)) +
  geom_line(data = real_function, aes(x = x1, y = y, linetype = x2))

Accepted Answer

Keep in mind that glmnet fits a linear model, meaning the response can be written as a linear combination of the predictors:

 y = b0 + b1*x1 + b2*x2 + ...

In your dataset, however, the response is defined as

yaux = cos(pi*(rowSums(X)))

yr = cos(pi*(x1r+x2r))

which in both cases is clearly not a linear combination of the predictors. That is why the fitted coefficients come back (close to) zero: across the data, x1 and x2 have essentially no linear relationship with y.
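As a quick illustration (this sketch is my own addition, not part of the original answer), glmnet can handle this data if the nonlinearity is supplied as a feature: adding cos(pi*(x1 + x2)) as a third column makes the model linear in the expanded basis. The data below is generated the same way as the question's dgp() with r2 = 0.5:

```r
library(glmnet)

set.seed(1)
N <- 1000
x1 <- rnorm(N)
x2 <- sample(c(0, 1), N, replace = TRUE)
signal <- cos(pi * (x1 + x2))
y <- signal + rnorm(N, 0, sqrt(var(signal)))  # r2 = 0.5, as in the question

# Linear in the raw predictors: coefficients collapse toward zero
X_raw <- cbind(x1 = x1, x2 = x2)
fit_raw <- cv.glmnet(X_raw, y, family = "gaussian", alpha = 0)

# Linear in an expanded basis that contains the true nonlinearity
X_exp <- cbind(x1 = x1, x2 = x2, cospi12 = cos(pi * (x1 + x2)))
fit_exp <- cv.glmnet(X_exp, y, family = "gaussian", alpha = 0)

coef(fit_raw, s = "lambda.min")  # x1 and x2 stay near zero
coef(fit_exp, s = "lambda.min")  # cospi12 is close to 1
```

Of course, this only works because we know the true functional form; the boosting methods in the question find it from the raw predictors, which is exactly the comparison being made.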

Regarding "r - Comparing glmnet model performance with boosting algorithms", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/52223462/
