
python - Tensorflow: AttributeError: module 'tensorflow.python.ops.nn' has no attribute 'softmax_cross_entropy_with_logits_v2'

Reposted. Author: 行者123. Updated: 2023-12-04 13:00:56

When I run the code below, I get AttributeError: module 'tensorflow.python.ops.nn' has no attribute 'softmax_cross_entropy_with_logits_v2'. Can anyone help?

import tensorflow as tf
import matplotlib.pyplot as plt
import numpy as np

tf.reset_default_graph()


sentences = ["Bless the Lord oh my soul",
             "Oh my soul",
             "Worship His Holy name",
             "Sing like never before",
             "Oh my soul",
             "I'll worship Your Holy name"]

word_sequence = " ".join(sentences).split()
word_list = " ".join(sentences).split()
word_list = list(set(word_list))
word_dict = {w: i for i, w in enumerate(word_list)}

# Word2Vec parameters
batch_size = 20
embedding_size = 2  # 2-dim embeddings so they can be plotted
voc_size = len(word_list)

def random_batch(data, size):
    random_inputs = []
    random_labels = []
    random_index = np.random.choice(range(len(data)), size, replace=False)

    for i in random_index:
        random_inputs.append(np.eye(voc_size)[data[i][0]])  # target
        random_labels.append(np.eye(voc_size)[data[i][1]])  # context word

    return random_inputs, random_labels

# Make skip-grams with a window of size one
skip_grams = []
for i in range(1, len(word_sequence) - 1):
    target = word_dict[word_sequence[i]]
    context = [word_dict[word_sequence[i - 1]], word_dict[word_sequence[i + 1]]]

    for w in context:
        skip_grams.append([target, w])

# Model
inputs = tf.placeholder(tf.float32, shape=[None, voc_size])
labels = tf.placeholder(tf.float32, shape=[None, voc_size])

# W and WT are independent variables, not transposes of each other
W = tf.Variable(tf.random_uniform([voc_size, embedding_size], -1.0, 1.0))
WT = tf.Variable(tf.random_uniform([embedding_size, voc_size], -1.0, 1.0))

hidden_layer = tf.matmul(inputs, W)         # [batch_size, embedding_size]
output_layer = tf.matmul(hidden_layer, WT)  # [batch_size, voc_size]

cost = tf.reduce_sum(tf.nn.softmax_cross_entropy_with_logits_v2(logits=output_layer, labels=labels))
optimizer = tf.train.AdamOptimizer(0.001).minimize(cost)

with tf.Session() as sess:
    init = tf.global_variables_initializer()
    sess.run(init)

    for epoch in range(5000):
        batch_inputs, batch_labels = random_batch(skip_grams, batch_size)
        _, loss = sess.run([optimizer, cost], feed_dict={inputs: batch_inputs, labels: batch_labels})

        if (epoch + 1) % 1000 == 0:
            print('Epoch:', '%04d' % (epoch + 1), 'cost =', '{:.6f}'.format(loss))

    trained_embeddings = W.eval()

for i, label in enumerate(word_list):
    x, y = trained_embeddings[i]
    plt.scatter(x, y)
    plt.annotate(label, xy=(x, y), xytext=(5, 2), textcoords='offset points', ha='right', va='bottom')
plt.show()

On line 64, cost = tf.reduce_sum(tf.nn.softmax_cross_entropy_with_logits_v2(logits=output_layer, labels=labels)) raises: AttributeError: module 'tensorflow.python.ops.nn' has no attribute 'softmax_cross_entropy_with_logits_v2'

I searched Google for help but found nothing useful. Thanks for any help.

Best answer


Use

tf.nn.softmax_cross_entropy_with_logits

instead of

tf.nn.softmax_cross_entropy_with_logits_v2
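The _v2 op was only added in TensorFlow 1.5, which is why older installations raise this AttributeError. One version-robust sketch is to look the op up with getattr and fall back to the original name; the resolve_xent helper and the stub namespaces below are my own illustrations (the stubs stand in for tf.nn so the lookup logic can be shown without TensorFlow installed):

```python
import types

def resolve_xent(nn):
    # Prefer the newer _v2 name when the installed TensorFlow exposes it,
    # otherwise fall back to the original (deprecated) op. For this graph the
    # two behave the same, since the labels fed in are constant one-hot
    # vectors and no gradient needs to flow into them.
    return getattr(nn, 'softmax_cross_entropy_with_logits_v2',
                   getattr(nn, 'softmax_cross_entropy_with_logits', None))

# Stub namespaces emulating an old and a new tf.nn module:
old_nn = types.SimpleNamespace(softmax_cross_entropy_with_logits='v1 op')
new_nn = types.SimpleNamespace(softmax_cross_entropy_with_logits='v1 op',
                               softmax_cross_entropy_with_logits_v2='v2 op')

print(resolve_xent(old_nn))  # -> v1 op
print(resolve_xent(new_nn))  # -> v2 op
```

With TensorFlow installed this would be used as xent = resolve_xent(tf.nn), then cost = tf.reduce_sum(xent(logits=output_layer, labels=labels)).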

Regarding python - Tensorflow: AttributeError: module 'tensorflow.python.ops.nn' has no attribute 'softmax_cross_entropy_with_logits_v2', we found a similar question on Stack Overflow: https://stackoverflow.com/questions/57082918/
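For intuition: both names compute the same quantity when the labels are fixed one-hot vectors, as they are in this skip-gram graph. A minimal NumPy sketch of that softmax cross-entropy (softmax_xent is my own helper, not part of either TensorFlow API):

```python
import numpy as np

def softmax_xent(labels, logits):
    # Numerically stable log-softmax: shift by the row-wise max first.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_softmax = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    # Per-example cross entropy: -sum over classes of labels * log-softmax.
    return -(labels * log_softmax).sum(axis=-1)

logits = np.array([[2.0, 1.0, 0.1]])
labels = np.array([[1.0, 0.0, 0.0]])  # one-hot, as produced by np.eye(voc_size)
print(softmax_xent(labels, logits))  # ~ [0.417]
```

The difference between the two TensorFlow ops is only in gradient behavior (the _v2 variant also backpropagates into the labels), which does not matter when the labels come from a placeholder fed with constants.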
