
python - How to define a cost function in nolearn, lasagne?


I am building a neural network in nolearn, a Theano-based library that uses lasagne.

I do not understand how to define my own cost function.

The output layer has only 3 neurons [0, 1, 2], and I want the network to be fairly sure when it outputs a 1 or a 2; otherwise, if it is not sure about 1 or 2, it should simply fall back to 0.

So I came up with a cost function (it will still need tuning) in which the cost of a 1 or a 2 is twice the cost of a 0, but I do not understand how to tell the network about it.

# optimization method:
from lasagne.updates import sgd
update=sgd,
update_learning_rate=0.0001

This is the update code, but how do I tell SGD to use my cost function instead of its own?
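For concreteness, a rough sketch of the kind of weighted cost I have in mind (the function name, the weights and the squared-error form here are only illustrative and would still need tuning; it assumes the targets are one-hot rows over the 3 classes):

import numpy as np
import theano.tensor as T

def weighted_cost(predictions, targets):
    # illustrative per-class weights: mistakes on classes 1 and 2 cost twice as much as on class 0
    weights = np.float32([1.0, 2.0, 2.0])
    # weighted squared error per output unit, summed to one cost value per sample
    return T.sum(weights * T.sqr(predictions - targets), axis=1)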

EDIT: the full network code is:

import time

import lasagne
from lasagne import layers
from lasagne.updates import nesterov_momentum
from nolearn.lasagne import NeuralNet, BatchIterator
# EarlyStopping is a custom on_epoch_finished callback defined elsewhere in this script

def nn_loss(data, x_period, columns, num_epochs, batchsize, l_rate=0.02):
    net1 = NeuralNet(
        layers=[('input', layers.InputLayer),
                ('hidden1', layers.DenseLayer),
                ('output', layers.DenseLayer),
                ],
        # layer parameters:
        batch_iterator_train=BatchIterator(batchsize),
        batch_iterator_test=BatchIterator(batchsize),

        input_shape=(None, int(x_period * columns)),
        hidden1_nonlinearity=lasagne.nonlinearities.rectify,
        hidden1_num_units=100,  # number of units in the 'hidden' layer
        output_nonlinearity=lasagne.nonlinearities.sigmoid,
        output_num_units=3,

        # optimization method:
        update=nesterov_momentum,
        update_learning_rate=5 * 10 ** (-3),
        update_momentum=0.9,
        on_epoch_finished=[
            EarlyStopping(patience=20),
        ],
        max_epochs=num_epochs,
        verbose=1,

        # Here are the important parameters for multi labels
        regression=True,
        # objective_loss_function=multilabel_objective,
        # custom_score=("validation score", lambda x, y: np.mean(np.abs(x - y)))
    )

    # Train the network
    start_time = time.time()
    net1.fit(data['X_train'], data['y_train'])

EDIT: error when using regression=True:
Got 99960 testing datasets.
# Neural Network with 18403 learnable parameters

## Layer information

# name size
--- ------- ------
0 input 180
1 hidden1 100
2 output 3

Traceback (most recent call last):
File "/Users/morgado/anaconda/lib/python3.4/site-packages/theano/compile/function_module.py", line 607, in __call__
outputs = self.fn()
ValueError: GpuElemwise. Input dimension mis-match. Input 1 (indices start at 0) has shape[1] == 1, but the output's size on that axis is 3.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "train_nolearn_simple.py", line 272, in <module>
main(**kwargs)
File "train_nolearn_simple.py", line 239, in main
nn_loss_fit = nn_loss(data, x_period, columns, num_epochs, batchsize)
File "train_nolearn_simple.py", line 217, in nn_loss
net1.fit(data['X_train'], data['y_train'])
File "/Users/morgado/anaconda/lib/python3.4/site-packages/nolearn/lasagne/base.py", line 416, in fit
self.train_loop(X, y)
File "/Users/morgado/anaconda/lib/python3.4/site-packages/nolearn/lasagne/base.py", line 462, in train_loop
self.train_iter_, Xb, yb)
File "/Users/morgado/anaconda/lib/python3.4/site-packages/nolearn/lasagne/base.py", line 516, in apply_batch_func
return func(Xb) if yb is None else func(Xb, yb)
File "/Users/morgado/anaconda/lib/python3.4/site-packages/theano/compile/function_module.py", line 618, in __call__
storage_map=self.fn.storage_map)
File "/Users/morgado/anaconda/lib/python3.4/site-packages/theano/gof/link.py", line 297, in raise_with_op
reraise(exc_type, exc_value, exc_trace)
File "/Users/morgado/anaconda/lib/python3.4/site-packages/six.py", line 658, in reraise
raise value.with_traceback(tb)
File "/Users/morgado/anaconda/lib/python3.4/site-packages/theano/compile/function_module.py", line 607, in __call__
outputs = self.fn()
ValueError: GpuElemwise. Input dimension mis-match. Input 1 (indices start at 0) has shape[1] == 1, but the output's size on that axis is 3.
Apply node that caused the error: GpuElemwise{Sub}[(0, 1)](GpuElemwise{Composite{scalar_sigmoid((i0 + i1))}}[(0, 0)].0, GpuFromHost.0)
Toposort index: 22
Inputs types: [CudaNdarrayType(float32, matrix), CudaNdarrayType(float32, matrix)]
Inputs shapes: [(200, 3), (200, 1)]
Inputs strides: [(3, 1), (1, 0)]
Inputs values: ['not shown', 'not shown']
Outputs clients: [[GpuCAReduce{pre=sqr,red=add}{1,1}(GpuElemwise{Sub}[(0, 1)].0), GpuElemwise{Mul}[(0, 0)](GpuElemwise{Sub}[(0, 1)].0, GpuElemwise{Composite{scalar_sigmoid((i0 + i1))}}[(0, 0)].0, GpuElemwise{sub,no_inplace}.0), GpuElemwise{mul,no_inplace}(CudaNdarrayConstant{[[ 2.]]}, GpuElemwise{Composite{(inv(i0) / i1)},no_inplace}.0, GpuElemwise{Sub}[(0, 1)].0, GpuElemwise{Composite{scalar_sigmoid((i0 + i1))}}[(0, 0)].0, GpuElemwise{sub,no_inplace}.0)]]

HINT: Re-running with most Theano optimization disabled could give you a back-trace of when this node was created. This can be done with by setting the Theano flag 'optimizer=fast_compile'. If that does not work, Theano optimizations can be disabled with 'optimizer=None'.
HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.

Best Answer

When you instantiate your neural network, you can pass it a custom loss function that you have defined beforehand:

import theano.tensor as T
import numpy as np
from nolearn.lasagne import NeuralNet
# I'm skipping other inputs for the sake of concision

def multilabel_objective(predictions, targets):
    epsilon = np.float32(1.0e-6)
    one = np.float32(1.0)
    pred = T.clip(predictions, epsilon, one - epsilon)
    return -T.sum(targets * T.log(pred) + (one - targets) * T.log(one - pred), axis=1)

net = NeuralNet(
    # your other parameters here (layers, update, max_epochs...)
    # here are the ones you're interested in:
    objective_loss_function=multilabel_objective,
    custom_score=("validation score", lambda x, y: np.mean(np.abs(x - y)))
)

As you can see, it is also possible to define a custom score (using the keyword custom_score).
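Regarding the dimension mismatch in the edit (input shapes (200, 3) vs (200, 1)): with regression=True, nolearn compares the network output elementwise against y, so the targets must have one column per output unit. A minimal sketch of one way to fix that, assuming y_train contains the integer class labels 0, 1 and 2:

import numpy as np

# hypothetical preprocessing: expand integer labels into one-hot rows of width 3
# so that y has shape (n_samples, 3), matching output_num_units=3
y_train_onehot = np.eye(3, dtype=np.float32)[np.asarray(data['y_train'], dtype=int).ravel()]
net1.fit(data['X_train'], y_train_onehot)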

Regarding "python - How to define a cost function in nolearn, lasagne?", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/32151251/
