Are my comments correct? Does my model have the 5 layers described below?
Model
# input - conv - conv - linear - linear (fc)
def model(data):  # input layer
    # conv layer 1
    conv = tf.nn.conv2d(data, layer1_weights, [1, 2, 2, 1], padding='SAME')
    hidden = tf.nn.relu(conv + layer1_biases)  # activation function
    # conv layer 2
    conv = tf.nn.conv2d(hidden, layer2_weights, [1, 2, 2, 1], padding='SAME')
    hidden = tf.nn.relu(conv + layer2_biases)  # activation function
    # not a layer (just a reshape)
    shape = hidden.get_shape().as_list()
    reshape = tf.reshape(hidden, [shape[0], shape[1] * shape[2] * shape[3]])
    # linear layer - not fc due to relu
    hidden = tf.nn.relu(tf.matmul(reshape, layer3_weights) + layer3_biases)
    # linear fully connected layer
    return tf.matmul(hidden, layer4_weights) + layer4_biases
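To see how the five stages line up, and why the reshape is not counted as a layer (it has no weights), here is a minimal pure-Python sketch that traces the tensor shapes through the model. The concrete sizes (batch 16, 28x28x1 input, depth-16 filters, 64 hidden units, 10 classes) are illustrative assumptions, not taken from the question:

```python
import math

def same_pad_out(size, stride):
    # With padding='SAME', the output spatial size is ceil(input / stride)
    return math.ceil(size / stride)

# Illustrative sizes (assumed, not from the question)
batch, h, w, ch = 16, 28, 28, 1
depth = 16

# conv layer 1, stride 2, padding='SAME': 28x28 -> 14x14
h, w, ch = same_pad_out(h, 2), same_pad_out(w, 2), depth
print([batch, h, w, ch])      # [16, 14, 14, 16]

# conv layer 2, stride 2, padding='SAME': 14x14 -> 7x7
h, w = same_pad_out(h, 2), same_pad_out(w, 2)
print([batch, h, w, ch])      # [16, 7, 7, 16]

# reshape (not a layer, no weights): flatten to [batch, h*w*ch]
flat = h * w * ch
print([batch, flat])          # [16, 784]

# linear layer 3 with relu: [batch, 784] -> [batch, 64]
hidden_units = 64
print([batch, hidden_units])  # [16, 64]

# linear layer 4 (output): [batch, 64] -> [batch, 10]
num_labels = 10
print([batch, num_labels])    # [16, 10]
```

Only the two conv layers and the two linear layers carry weights; the reshape merely reorders the same values, which is why it does not count as a layer.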
Best answer
# 1 linear layer - not fc due to relu
hidden = tf.nn.relu(tf.matmul(reshape, layer3_weights) + layer3_biases)
This is a fully connected layer that is then passed through a "RELU" activation function. The layer itself is this part:
tf.matmul(reshape, layer3_weights) + layer3_biases
and you send that layer through the relu activation function:
tf.nn.relu(tf.matmul(reshape, layer3_weights) + layer3_biases)
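The distinction the answer draws, that the affine transform `matmul(x, W) + b` is the layer and `relu` is just an activation applied on top of it, can be sketched in plain Python. All numeric values below are made up purely for illustration:

```python
def matmul(x, w):
    # x: [batch, in], w: [in, out] -> [batch, out]
    return [[sum(xi * wij for xi, wij in zip(row, col))
             for col in zip(*w)] for row in x]

def add_bias(z, b):
    return [[zi + bi for zi, bi in zip(row, b)] for row in z]

def relu(z):
    # Element-wise max(0, v)
    return [[max(0.0, v) for v in row] for row in z]

# Toy values (purely illustrative)
x = [[1.0, -2.0]]               # one sample, 2 features
w = [[0.5, -1.0], [0.25, 0.5]]  # 2 inputs -> 2 outputs
b = [0.0, 1.0]

# The layer itself: the affine part, matmul(x, w) + b
affine = add_bias(matmul(x, w), b)
print(affine)        # [[0.0, -1.0]]

# Sending the layer through the relu activation
print(relu(affine))  # [[0.0, 0.0]]
```

Calling a layer "fully connected" refers to the weight matrix connecting every input to every output; whether a relu follows it does not change that.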
Other than that, everything looks fine.
Regarding "python - Identifying the layers of a convolutional neural network", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/47536405/