
python - Is there a way in Keras to stop training immediately?


I am writing a custom early-stopping callback for my tf.keras training. For this I can set the variable self.model.stop_training = True in one of the callback methods, e.g. in on_epoch_end(). However, Keras only stops training once the current epoch has finished, even if I set this variable in the middle of an epoch, e.g. in on_batch_end().

So my question is: is there a way in Keras to stop training immediately, even in the middle of the current epoch?
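For context, a minimal sketch of the setup described above; the stopping condition is purely illustrative and not from the question, the point is only where the flag is set:

from tensorflow.keras.callbacks import Callback

class StopWithinEpoch(Callback):
    # Illustrative callback: requests the stop from on_batch_end,
    # i.e. from inside the current epoch, instead of on_epoch_end.
    def on_batch_end(self, batch, logs=None):
        logs = logs or {}
        # Hypothetical condition: the per-batch loss is already low enough.
        if logs.get('loss', float('inf')) < 0.05:
            self.model.stop_training = True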

Best Answer

In Keras you use EarlyStopping to stop training when a monitored quantity has stopped improving. From your question it is not entirely clear on which condition you want to stop. If you just want to monitor a value, as EarlyStopping does, but want to stop right after a batch in which that value did not improve, you can adapt the EarlyStopping class and implement its logic in on_batch_end instead of on_epoch_end:

import warnings

import numpy as np
from tensorflow.keras.callbacks import Callback


class EarlyBatchStopping(Callback):
    """Early stopping evaluated after every batch instead of every epoch."""

    def __init__(self,
                 monitor='val_loss',
                 min_delta=0,
                 patience=0,
                 verbose=0,
                 mode='auto',
                 baseline=None,
                 restore_best_weights=False):
        super(EarlyBatchStopping, self).__init__()

        # Note: per-batch logs usually contain only training metrics such as
        # 'loss' or 'acc'; validation metrics are computed once per epoch.
        self.monitor = monitor
        self.baseline = baseline
        self.patience = patience
        self.verbose = verbose
        self.min_delta = min_delta
        self.wait = 0
        self.stopped_batch = 0
        self.restore_best_weights = restore_best_weights
        self.best_weights = None

        if mode not in ['auto', 'min', 'max']:
            warnings.warn('EarlyBatchStopping mode %s is unknown, '
                          'fallback to auto mode.' % mode,
                          RuntimeWarning)
            mode = 'auto'

        if mode == 'min':
            self.monitor_op = np.less
        elif mode == 'max':
            self.monitor_op = np.greater
        else:
            # In 'auto' mode, infer the direction from the metric name.
            if 'acc' in self.monitor:
                self.monitor_op = np.greater
            else:
                self.monitor_op = np.less

        if self.monitor_op == np.greater:
            self.min_delta *= 1
        else:
            self.min_delta *= -1

    def on_train_begin(self, logs=None):
        # Allow instances to be re-used between multiple fit() calls.
        self.wait = 0
        self.stopped_batch = 0
        if self.baseline is not None:
            self.best = self.baseline
        else:
            self.best = np.inf if self.monitor_op == np.less else -np.inf

    def on_batch_end(self, batch, logs=None):
        current = self.get_monitor_value(logs)
        if current is None:
            return

        if self.monitor_op(current - self.min_delta, self.best):
            # The monitored value improved: reset the patience counter.
            self.best = current
            self.wait = 0
            if self.restore_best_weights:
                self.best_weights = self.model.get_weights()
        else:
            self.wait += 1
            if self.wait >= self.patience:
                # Request an immediate stop from inside the current epoch.
                self.stopped_batch = batch
                self.model.stop_training = True
                if self.restore_best_weights and self.best_weights is not None:
                    if self.verbose > 0:
                        print('Restoring model weights from the end of '
                              'the best batch')
                    self.model.set_weights(self.best_weights)

    def on_train_end(self, logs=None):
        if self.stopped_batch > 0 and self.verbose > 0:
            print('Batch %05d: early stopping' % (self.stopped_batch + 1))

    def get_monitor_value(self, logs):
        logs = logs or {}
        monitor_value = logs.get(self.monitor)
        if monitor_value is None:
            warnings.warn(
                'Early stopping conditioned on metric `%s` '
                'which is not available. Available metrics are: %s' %
                (self.monitor, ','.join(list(logs.keys()))), RuntimeWarning
            )
        return monitor_value
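
The callback is then passed to model.fit() like the built-in EarlyStopping. A usage sketch (the model and the training arrays are placeholders, not part of the original answer; 'loss' is monitored here because per-batch logs normally expose only training metrics):

early_stop = EarlyBatchStopping(monitor='loss', min_delta=1e-4,
                                patience=50, verbose=1,
                                restore_best_weights=True)

model.fit(x_train, y_train,
          epochs=20,
          batch_size=32,
          callbacks=[early_stop])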

If you have other logic, you can still use on_batch_end and set self.model.stop_training = True based on that logic, but I think you get the idea.
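For example, a minimal sketch of such "other logic" (the wall-clock budget is an illustrative assumption, not something from the answer above):

import time

from tensorflow.keras.callbacks import Callback

class TimeBudgetStopping(Callback):
    # Requests a stop from inside the current epoch once a
    # wall-clock budget (in seconds) is exceeded.
    def __init__(self, max_seconds):
        super().__init__()
        self.max_seconds = max_seconds

    def on_train_begin(self, logs=None):
        self.start_time = time.time()

    def on_batch_end(self, batch, logs=None):
        if time.time() - self.start_time > self.max_seconds:
            self.model.stop_training = True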

Regarding "python - Is there a way in Keras to stop training immediately?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/62168914/
