
tensorflow - How to use a multilayer bidirectional LSTM in TensorFlow?


I would like to know how to use a multilayer bidirectional LSTM in TensorFlow.

I have already implemented a bidirectional LSTM, but I would like to compare that model with one that adds multiple layers.

How should I add code to this part?

x = tf.unstack(tf.transpose(x, perm=[1, 0, 2]))
#print(x[0].get_shape())

# Define lstm cells with tensorflow
# Forward direction cell
lstm_fw_cell = rnn.BasicLSTMCell(n_hidden, forget_bias=1.0)
# Backward direction cell
lstm_bw_cell = rnn.BasicLSTMCell(n_hidden, forget_bias=1.0)

# Get lstm cell output
try:
    outputs, _, _ = rnn.static_bidirectional_rnn(lstm_fw_cell, lstm_bw_cell, x,
                                                 dtype=tf.float32)
except Exception:  # Old TensorFlow version only returns outputs not states
    outputs = rnn.static_bidirectional_rnn(lstm_fw_cell, lstm_bw_cell, x,
                                           dtype=tf.float32)

# Linear activation, using rnn inner loop last output
outputs = tf.stack(outputs, axis=1)
outputs = tf.reshape(outputs, (batch_size*n_steps, n_hidden*2))
outputs = tf.matmul(outputs, weights['out']) + biases['out']
outputs = tf.reshape(outputs, (batch_size, n_steps, n_classes))

Best Answer

There are two different approaches you can take to build a multilayer BiLSTM model:

1) Feed the output of the previous BiLSTM layer as the input to the next BiLSTM layer. First, create arrays of forward and backward cells of length num_layers and initialize the first layer's input (see the sketch below), and then:
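A minimal sketch of that setup, reusing the list names cell_forw/cell_back and the variable output from the loop below. The choice of rnn.BasicLSTMCell with forget_bias=1.0 and the initialization output = x (a batch-major tensor of shape (batch_size, n_steps, n_input), i.e. before any unstacking) are assumptions, not part of the original answer:

# Hypothetical setup for the loop below: one forward and one backward
# LSTM cell per layer.
cell_forw = [rnn.BasicLSTMCell(n_hidden, forget_bias=1.0) for _ in range(num_layers)]
cell_back = [rnn.BasicLSTMCell(n_hidden, forget_bias=1.0) for _ in range(num_layers)]

# The loop feeds each layer the previous layer's output, so start from the
# batch-major input tensor of shape (batch_size, n_steps, n_input).
output = x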

for n in range(num_layers):
    cell_fw = cell_forw[n]
    cell_bw = cell_back[n]

    state_fw = cell_fw.zero_state(batch_size, tf.float32)
    state_bw = cell_bw.zero_state(batch_size, tf.float32)

    # Each layer gets its own variable scope so the layers do not share weights.
    (output_fw, output_bw), last_state = tf.nn.bidirectional_dynamic_rnn(
        cell_fw, cell_bw, output,
        initial_state_fw=state_fw,
        initial_state_bw=state_bw,
        scope='BLSTM_' + str(n),
        dtype=tf.float32)

    # Concatenate forward and backward outputs; this becomes the input
    # of the next layer.
    output = tf.concat([output_fw, output_bw], axis=2)

2) Another approach worth looking at is a stacked BiLSTM.
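In TensorFlow 1.x this is provided by tf.contrib.rnn.stack_bidirectional_dynamic_rnn, which wires the layers together for you. A minimal sketch under the same assumptions as above (the cell lists cell_forw/cell_back and a batch-major input tensor x; the scope name is arbitrary):

# Sketch: stack the bidirectional layers with the contrib helper.
# `outputs` already has forward and backward outputs concatenated,
# shape (batch_size, n_steps, 2*n_hidden).
outputs, state_fw, state_bw = tf.contrib.rnn.stack_bidirectional_dynamic_rnn(
    cells_fw=cell_forw,
    cells_bw=cell_back,
    inputs=x,
    dtype=tf.float32,
    scope='stacked_blstm')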

Regarding tensorflow - How to use a multilayer bidirectional LSTM in TensorFlow?, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/46189318/
