
python - Wide & Deep learning for large data error: GraphDef cannot be larger than 2GB


Inserting 1MM+ rows into the wide and deep learning model throws ValueError: GraphDef cannot be larger than 2GB:

Traceback (most recent call last):
  File "search_click.py", line 207, in <module>
    tf.app.run()
  File "/usr/lib/python2.7/site-packages/tensorflow/python/platform/app.py", line 30, in run
    sys.exit(main(sys.argv))
  File "search_click.py", line 204, in main
    train_and_eval()
  File "search_click.py", line 181, in train_and_eval
    m.fit(input_fn=lambda: input_fn(df_train), steps=FLAGS.train_steps)
  File "/usr/lib/python2.7/site-packages/tensorflow/contrib/learn/python/learn/estimators/estimator.py", line 182, in fit
    monitors=monitors)
  File "/usr/lib/python2.7/site-packages/tensorflow/contrib/learn/python/learn/estimators/estimator.py", line 458, in _train_model
    summary_writer=graph_actions.get_summary_writer(self._model_dir))
  File "/usr/lib/python2.7/site-packages/tensorflow/contrib/learn/python/learn/graph_actions.py", line 76, in get_summary_writer
    graph=ops.get_default_graph())
  File "/usr/lib/python2.7/site-packages/tensorflow/python/training/summary_io.py", line 113, in __init__
    self.add_graph(graph=graph, graph_def=graph_def)
  File "/usr/lib/python2.7/site-packages/tensorflow/python/training/summary_io.py", line 204, in add_graph
    true_graph_def = graph.as_graph_def(add_shapes=True)
  File "/usr/lib/python2.7/site-packages/tensorflow/python/framework/ops.py", line 2117, in as_graph_def
    raise ValueError("GraphDef cannot be larger than 2GB.")
ValueError: GraphDef cannot be larger than 2GB.

I defined input_fn the same way as in the example:

import tensorflow as tf

def input_fn(df):
  """Input builder function."""
  # Creates a dictionary mapping from each continuous feature column name (k)
  # to the values of that column stored in a constant Tensor.
  continuous_cols = {k: tf.constant(df[k].values) for k in CONTINUOUS_COLUMNS}
  # Creates a dictionary mapping from each categorical feature column name (k)
  # to the values of that column stored in a tf.SparseTensor.
  categorical_cols = {k: tf.SparseTensor(
      indices=[[i, 0] for i in range(df[k].size)],
      values=df[k].values,
      shape=[df[k].size, 1])
      for k in CATEGORICAL_COLUMNS}
  # Merges the two dictionaries into one.
  feature_cols = dict(continuous_cols)
  feature_cols.update(categorical_cols)
  # Converts the label column into a constant Tensor.
  label = tf.constant(df[LABEL_COLUMN].values)
  # Returns the feature columns and the label.
  return feature_cols, label

Is there an alternative to tf.constant / tf.SparseTensor that allows feeding the data in batches and avoids this memory error?

Best Answer

The wide and deep example loads the entire dataset into memory. If you have a large dataset, you may need tf.decode_csv for CSV-formatted input. If your input format is custom, you should create a custom data reader.
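
To make the tf.decode_csv suggestion concrete, here is a minimal sketch of a queue-based input_fn in the same TensorFlow 0.x/1.x style as the code above. The file names, CSV_COLUMNS, and RECORD_DEFAULTS are illustrative assumptions, not taken from the original post:

import tensorflow as tf

# Hypothetical schema -- replace with the real column names and defaults.
CSV_COLUMNS = ["query_len", "position", "category", "clicked"]
RECORD_DEFAULTS = [[0.0], [0.0], [""], [0]]

def input_fn(filenames, batch_size=128):
  """Streams CSV rows in batches instead of baking them into the graph."""
  # A queue of input files; the reader pulls lines from it on demand.
  filename_queue = tf.train.string_input_producer(filenames)
  reader = tf.TextLineReader(skip_header_lines=1)
  _, line = reader.read(filename_queue)
  # tf.decode_csv parses one CSV line into a list of column tensors.
  columns = tf.decode_csv(line, record_defaults=RECORD_DEFAULTS)
  # Batching happens through a queue at session-run time, so the GraphDef
  # stays small no matter how many rows the CSV files contain.
  batched = tf.train.shuffle_batch(
      columns, batch_size=batch_size,
      capacity=10 * batch_size, min_after_dequeue=2 * batch_size)
  features = dict(zip(CSV_COLUMNS, batched))
  label = features.pop("clicked")
  # Categorical string columns may still need to be wrapped in a
  # tf.SparseTensor, as in the question's input_fn, depending on the
  # feature column implementation.
  return features, label

The key difference from the input_fn in the question is that tf.constant embeds every row of the DataFrame as nodes in the GraphDef, while the reader ops embed only the pipeline; the rows themselves flow through queues when the session runs.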

Regarding python - Wide & Deep learning for large data error: GraphDef cannot be larger than 2GB, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/41439136/
