In the MNIST LSTM example, I don't understand what "hidden layer" means. Is it the imaginary layer formed when you unroll the RNN through time? And why is num_units = 128 in most cases?
Best Answer
num_units can be interpreted as the analogue of the hidden layer in a feed-forward neural network. Just as a feed-forward network's hidden layer has some number of nodes, an LSTM cell has num_units hidden units, and its hidden state has that size at every time step of the network.
(See the diagram in the linked Stack Overflow answer as well.)
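The shape relationship can be sketched with a single LSTM step in plain NumPy. This is a hypothetical minimal implementation, not TensorFlow's actual `BasicLSTMCell` code; the function name `lstm_step` and the weight layout are illustrative. The point is that num_units fixes the size of the hidden state `h` and cell state `c`, independent of the input size (28 here, one MNIST pixel row per time step):

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step; h and c each have num_units components."""
    # Concatenate input and previous hidden state, then compute all
    # four gate pre-activations in one matrix multiply.
    z = np.concatenate([x, h_prev], axis=1) @ W + b   # (batch, 4 * num_units)
    i, f, g, o = np.split(z, 4, axis=1)               # input, forget, candidate, output
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    c = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)  # new cell state
    h = sigmoid(o) * np.tanh(c)                        # new hidden state
    return h, c

batch, input_dim, num_units = 2, 28, 128  # 28 = one MNIST row per step
rng = np.random.default_rng(0)
x = rng.standard_normal((batch, input_dim))
h = np.zeros((batch, num_units))
c = np.zeros((batch, num_units))
# Single weight matrix covering all four gates, as many implementations do.
W = rng.standard_normal((input_dim + num_units, 4 * num_units)) * 0.1
b = np.zeros(4 * num_units)

h, c = lstm_step(x, h, c, W, b)
print(h.shape)  # (2, 128)
```

No matter how wide the input is, the output of the cell at each step is a vector of length num_units; 128 is simply a common default that trades off capacity against training cost.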
Regarding "tensorflow - what is num_units in tensorflow BasicLSTMCell?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/37901047/