I have the following network, which works fine:
output = LSTM(8)(output)
output = Dense(2)(output)
Now, for the same model, I tried to stack some LSTM layers like this:
output = LSTM(8)(output, return_sequences=True)
output = LSTM(8)(output)
output = Dense(2)(output)
But I got the following error:
TypeError Traceback (most recent call last)
<ipython-input-2-0d0ced2c7417> in <module>()
39
40 output = Concatenate(axis=2)([leftOutput,rightOutput])
---> 41 output = LSTM(8)(output, return_sequences=True)
42 output = LSTM(8)(output)
43 output = Dense(2)(output)
/usr/local/lib/python3.4/dist-packages/keras/layers/recurrent.py in __call__(self, inputs, initial_state, constants, **kwargs)
480
481 if initial_state is None and constants is None:
--> 482 return super(RNN, self).__call__(inputs, **kwargs)
483
484 # If any of `initial_state` or `constants` are specified and are Keras
/usr/local/lib/python3.4/dist-packages/keras/engine/topology.py in __call__(self, inputs, **kwargs)
601
602 # Actually call the layer, collecting output(s), mask(s), and shape(s).
--> 603 output = self.call(inputs, **kwargs)
604 output_mask = self.compute_mask(inputs, previous_mask)
605
TypeError: call() got an unexpected keyword argument 'return_sequences'
This is confusing, because return_sequences is a valid argument according to the Keras documentation: https://keras.io/layers/recurrent/#lstm
What am I doing wrong here? Thanks!
Best Answer
The problem is that return_sequences should be passed as an argument to the layer constructor, not to the layer call. Changing the code to:
output = LSTM(8, return_sequences=True)(output)
solves the problem.
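To illustrate the fix in context, here is a minimal sketch of a stacked-LSTM model with return_sequences passed to the constructor. The input shape (10 timesteps, 4 features) and the use of tensorflow.keras are assumptions for the sake of a self-contained example; substitute your own input layers where the question uses leftOutput/rightOutput.

```python
import numpy as np
from tensorflow.keras.layers import Input, LSTM, Dense
from tensorflow.keras.models import Model

# Hypothetical input: batches of 10 timesteps with 4 features each
inputs = Input(shape=(10, 4))

# return_sequences=True goes in the constructor, so the first LSTM
# emits the full sequence (shape (None, 10, 8)) for the next LSTM
x = LSTM(8, return_sequences=True)(inputs)
x = LSTM(8)(x)           # last LSTM returns only the final state, shape (None, 8)
outputs = Dense(2)(x)

model = Model(inputs, outputs)
preds = model.predict(np.zeros((3, 10, 4)), verbose=0)
print(preds.shape)       # one 2-dimensional output per sample
```

The first LSTM must return its full output sequence because the second LSTM, like any recurrent layer, expects a 3D input of shape (batch, timesteps, features); without return_sequences=True it would receive only a 2D final state.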
Regarding machine-learning - Keras: stacking multiple LSTM layers, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/48510318/