A TensorFlow Implementation of the 1D SE Block

冬冬_79d4 · 2022-01-31 · 27 reads
import tensorflow as tf
from tensorflow.keras import backend as K

def SE_Block(input_tensor, ratio=16):
    input_shape = K.int_shape(input_tensor)
    # Squeeze: global average pooling over the sequence axis -> (batch, channels)
    squeeze = tf.keras.layers.GlobalAveragePooling1D()(input_tensor)
    # Excitation: bottleneck down to channels // ratio, then back up to channels
    excitation = tf.keras.layers.Dense(units=input_shape[-1] // ratio, kernel_initializer='he_normal', activation='relu')(squeeze)
    excitation = tf.keras.layers.Dense(units=input_shape[-1], activation='sigmoid')(excitation)
    # excitation = tf.reshape(excitation, [-1, 1, input_shape[-1]])  # optional explicit reshape; Multiply broadcasts without it
    # Scale: per-channel weights are broadcast over the sequence axis
    scale = tf.keras.layers.Multiply()([input_tensor, excitation])
    return scale

# X = tf.random.uniform((32, 352))
X = tf.keras.Input(shape=(32, 352))
Y = SE_Block(X, 16)
model = tf.keras.Model(inputs=[X], outputs=[Y])
model.summary()
print(Y.shape)

This is the SE block for the one-dimensional case.
The input has shape (batch size, sequence length, number of channels).
`ratio` is the reduction ratio, set to 16 here.
For example, with an input of 352 channels, the first fully connected layer reduces the channel dimension to 352 / 16 = 22,
and the second fully connected layer expands it back to 352.
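The shape arithmetic above can be checked without TensorFlow. The sketch below (my own NumPy illustration, with random placeholder weights in place of trained Dense layers) walks through the same squeeze / excitation / scale steps and confirms that the bottleneck width is 352 // 16 = 22 and that the output shape matches the input:

```python
import numpy as np

def se_block_numpy(x, ratio=16, rng=np.random.default_rng(0)):
    """Minimal NumPy sketch of the 1D SE block forward pass.

    x: array of shape (batch, seq_len, channels).
    The weights are random placeholders; only the shapes matter here.
    """
    b, t, c = x.shape
    reduced = c // ratio                          # e.g. 352 // 16 = 22
    w1 = rng.standard_normal((c, reduced))
    w2 = rng.standard_normal((reduced, c))
    squeeze = x.mean(axis=1)                      # global average pool -> (batch, channels)
    h = np.maximum(squeeze @ w1, 0.0)             # ReLU bottleneck -> (batch, channels // ratio)
    excite = 1.0 / (1.0 + np.exp(-(h @ w2)))      # sigmoid gate -> (batch, channels)
    return x * excite[:, None, :]                 # broadcast over the sequence axis

x = np.ones((4, 32, 352))
y = se_block_numpy(x)
print(y.shape)  # (4, 32, 352)
```

Because the sigmoid keeps every gate in (0, 1), each channel of the output is the input channel scaled by a learned attention weight, never amplified beyond its original magnitude.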
