
tensorflow - How to retrieve the last global_step after saver.restore


We can save a checkpoint with:

saver = tf.train.Saver()
saver.save(sess, FLAGS.train_dir, global_step=step)

Then, later, I can restore all of the variables:

saver.restore(sess, FLAGS.train_dir)
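
As a side note, saver.restore expects the path of a specific checkpoint rather than a bare directory. A minimal sketch, assuming TF 1.x and a hypothetical checkpoint_dir that the checkpoints were saved under, would resolve the latest checkpoint first:

# checkpoint_dir is a hypothetical directory holding the saved checkpoints;
# tf.train.latest_checkpoint returns the full path of the newest checkpoint
# (or None if no checkpoint exists yet).
ckpt_path = tf.train.latest_checkpoint(checkpoint_dir)
if ckpt_path is not None:
    saver.restore(sess, ckpt_path)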

I want to get the 'global_step' that I passed when I called 'saver.save', so that I can continue training from the previous global_step.

Is there any way to get it? CheckpointState doesn't seem to contain that information:

message CheckpointState {
  // Path to the most-recent model checkpoint.
  string model_checkpoint_path = 1;

  // Paths to all not-yet-deleted model checkpoints, sorted from oldest to
  // newest.
  // Note that the value of model_checkpoint_path should be the last item in
  // this list.
  repeated string all_model_checkpoint_paths = 2;
}

As in Tensorflow get the global_step when restoring checkpoints, I could introduce a new TF variable, but it would be best if I could do this without adding a new variable. Is there any way?

Best Answer

I think a simple sess.run(global_step) should return the value.

## Create and save global_step
global_step = tf.Variable(0, trainable=False)
train_step = tf.train.AdamOptimizer(...).minimize(..., global_step=global_step, ...)
...
saver = tf.train.Saver()  # var_list is None: defaults to the list of all saveable objects.

## Restore global_step
sess.run(tf.global_variables_initializer())
...
ckpt = tf.train.get_checkpoint_state(FilePath_checkpoints)
if ckpt and ckpt.model_checkpoint_path:
    saver.restore(sess, ckpt.model_checkpoint_path)
    last_global_step = sess.run(global_step)
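
If you would rather not read the value back from the restored variable at all, the step can usually be recovered from the checkpoint filename itself, since saver.save appends "-<global_step>" to the save path. A minimal sketch, assuming the default naming convention (a model_checkpoint_path ending in something like "model.ckpt-1000"):

ckpt = tf.train.get_checkpoint_state(FilePath_checkpoints)
if ckpt and ckpt.model_checkpoint_path:
    # e.g. ".../model.ckpt-1000" -> 1000 (assumes the default "-<step>" suffix)
    last_global_step = int(ckpt.model_checkpoint_path.split('-')[-1])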

Regarding tensorflow - how to retrieve the last global_step after saver.restore, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/36619230/
