I am trying to create my own custom activation function in Keras that returns 0 if x < 0 and 1 if x >= 0.
from keras.layers import Dense
from keras.models import Sequential
from keras.layers import Activation
import tensorflow as tf

def hard_lim(x):
    zero = tf.convert_to_tensor(0., x.dtype.base_dtype)
    one = tf.convert_to_tensor(1., x.dtype.base_dtype)
    sess = tf.Session()
    if sess.run(tf.greater_equal(x, zero)):
        return one
    else:
        return zero

model = Sequential()
model.add(Dense(4, input_dim=2, activation=Activation(hard_lim)))
model.add(Dense(2, activation=Activation(hard_lim)))
model.add(Dense(1, activation=Activation(hard_lim)))
It gives me this error:
InvalidArgumentError (see above for traceback): You must feed a value
for placeholder tensor '1_input' with dtype float and shape [?,2]
How can I fix it?
Best answer
Warning: the operation you want has no gradient, so no weights before it will be trainable. You will see error messages like "an operation has None for gradient" or "None type not supported".
As a workaround for your activation, I believe the 'relu' activation would be the closest and best option, with the advantage of being very popular and used in most models.
In Keras, you normally don't run a session yourself. For custom operations, you create a function using the backend. You can then use it in a Lambda layer:
import keras.backend as K

def hardlim(x):
    return K.cast(K.greater_equal(x, 0), K.floatx())
Then you can use activation=hardlim in your layers.
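To illustrate, here is a minimal sketch of the full model using this backend-based activation, written against TF 2.x / tf.keras (the `hardlim` name follows the answer above; the layer sizes mirror the question's model and are otherwise arbitrary):

```python
# Sketch assuming TensorFlow 2.x with tf.keras.
import numpy as np
import tensorflow as tf
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

def hardlim(x):
    # 0 where x < 0, 1 where x >= 0, computed as a graph op (no session needed)
    return K.cast(K.greater_equal(x, 0), K.floatx())

model = Sequential([
    tf.keras.Input(shape=(2,)),
    Dense(4, activation=hardlim),
    Dense(2, activation=hardlim),
    Dense(1, activation=hardlim),
])

# A forward pass works; every value out of hardlim is 0 or 1.
out = model.predict(np.zeros((3, 2), dtype=np.float32))
print(out.shape)  # (3, 1)
```

Note that the forward pass runs fine, but as warned above, training this model will fail (or leave weights frozen) because the step function has a zero/undefined gradient everywhere.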
For "python - creating a custom activation function in Keras", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/53547872/