I want to build a 3-layer LSTM network. Here is the code:
num_layers = 3
time_steps = 10
num_units = 128
n_input = 1
learning_rate = 0.001
n_classes = 1
...
x = tf.placeholder("float", [None, time_steps, n_input], name="x")
y = tf.placeholder("float", [None, n_classes], name="y")
input = tf.unstack(x, time_steps, 1)
lstm_layer = rnn_cell.BasicLSTMCell(num_units, state_is_tuple=True)
network = rnn_cell.MultiRNNCell([lstm_layer for _ in range(num_layers)], state_is_tuple=True)
outputs, _ = rnn.static_rnn(network, inputs=input, dtype="float")
With num_layers=1 this works fine, but as soon as there is more than one layer I get an error on this line:
outputs, _ = rnn.static_rnn(network, inputs=input, dtype="float")
ValueError: Dimensions must be equal, but are 256 and 129 for 'rnn/rnn/multi_rnn_cell/cell_0/cell_0/basic_lstm_cell/MatMul_1' (op: 'MatMul') with input shapes: [?,256], [129,512].
Can anyone explain where the values 129 and 512 come from?
You should not reuse the same cell for the first and the deeper layers: their inputs have different sizes, so their kernel matrices must have different shapes. BasicLSTMCell multiplies the concatenation of the layer input and the hidden state by a single kernel of shape [input_size + num_units, 4 * num_units]. For the first layer that is [1 + 128, 4 * 128] = [129, 512]; a deeper layer receives the 128-dimensional output of the layer below, so its concatenated input has 128 + 128 = 256 columns, which cannot be multiplied by the shared [129, 512] kernel. That mismatch is exactly what the error reports. Try this:
# Extra function is for readability. No problem to inline it.
def make_cell(lstm_size):
    return tf.nn.rnn_cell.BasicLSTMCell(lstm_size, state_is_tuple=True)

network = rnn_cell.MultiRNNCell([make_cell(num_units) for _ in range(num_layers)],
                                state_is_tuple=True)
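For completeness, here is a minimal end-to-end sketch of the fixed graph using the question's hyperparameters. It assumes TF 1.x; the `tf.contrib.rnn` import is one guess at what the original `rnn` alias refers to, and the dense readout at the end is a hypothetical head standing in for whatever the elided `...` contained:

import tensorflow as tf
from tensorflow.contrib import rnn  # assumption: TF 1.x, where static_rnn is available here

num_layers = 3
time_steps = 10
num_units = 128
n_input = 1
n_classes = 1

x = tf.placeholder("float", [None, time_steps, n_input], name="x")
y = tf.placeholder("float", [None, n_classes], name="y")

# List of `time_steps` tensors, each of shape [batch, n_input]
inputs = tf.unstack(x, time_steps, 1)

def make_cell(lstm_size):
    # A fresh cell per layer, so each layer builds its own kernel:
    #   layer 0:      [n_input + num_units, 4 * num_units] = [129, 512]
    #   layers 1, 2:  [num_units + num_units, 4 * num_units] = [256, 512]
    return tf.nn.rnn_cell.BasicLSTMCell(lstm_size, state_is_tuple=True)

network = tf.nn.rnn_cell.MultiRNNCell(
    [make_cell(num_units) for _ in range(num_layers)], state_is_tuple=True)

outputs, _ = rnn.static_rnn(network, inputs=inputs, dtype="float")

# Hypothetical readout: map the last time step's output to n_classes values.
prediction = tf.layers.dense(outputs[-1], n_classes)

With separate cell objects, each layer owns its variables under cell_0, cell_1, ..., so the [?, 129] and [?, 256] matmuls each find a kernel of the matching shape and the ValueError disappears.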