
python - How to expose the layers of a sub-model in the parent model's summary in Keras?


I currently have the following model, named model1:

Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
input_3 (InputLayer)            (None, 101, 101, 1)  0
__________________________________________________________________________________________________
up_sampling2d_2 (UpSampling2D)  (None, 202, 202, 1)  0           input_3[0][0]
__________________________________________________________________________________________________
zero_padding2d_36 (ZeroPadding2 (None, 256, 256, 1)  0           up_sampling2d_2[0][0]
__________________________________________________________________________________________________
conv2d_3 (Conv2D)               (None, 256, 256, 3)  6           zero_padding2d_36[0][0]
__________________________________________________________________________________________________
u-resnet34 (Model)              (None, 256, 256, 1)  24453178    conv2d_3[0][0]
__________________________________________________________________________________________________
input_4 (InputLayer)            (None, 1, 1, 1)       0
__________________________________________________________________________________________________
cropping2d_2 (Cropping2D)       (None, 202, 202, 1)  0           u-resnet34[1][0]
__________________________________________________________________________________________________
lambda_3 (Lambda)               (None, 1, 1, 1)       0           input_4[0][0]
__________________________________________________________________________________________________
max_pooling2d_2 (MaxPooling2D)  (None, 101, 101, 1)  0           cropping2d_2[0][0]
__________________________________________________________________________________________________
lambda_4 (Lambda)               (None, 101, 101, 1)  0           lambda_3[0][0]
__________________________________________________________________________________________________
concatenate_10 (Concatenate)    (None, 101, 101, 2)  0           max_pooling2d_2[0][0]
                                                                 lambda_4[0][0]
__________________________________________________________________________________________________
conv2d_14 (Conv2D)              (None, 101, 101, 1)  3           concatenate_10[0][0]
==================================================================================================
Total params: 24,453,187
Trainable params: 24,437,821
Non-trainable params: 15,366
__________________________________________________________________________________________________

The u-resnet34 layer is itself another model with more layers inside it. I can print its summary and freeze whichever of its layers I want; when I freeze the u-resnet34 layers and print that summary, I can see its trainable parameter count drop accordingly.

However, even after freezing the sub-model's layers inside model1, model1's trainable parameter count does not decrease.

How can I freeze the u-resnet34 layers and have that reflected in model1's trainable parameters?


Edit: here is my code:

# https://github.com/qubvel/segmentation_models
from segmentation_models import Unet
from keras.models import Model
from keras.layers import Input, Cropping2D, Conv2D

inputs = Input((256, 256, 3))
# u-resnet34 sub-model: a U-Net with a ResNet34 backbone and ImageNet encoder weights
resnetmodel = Unet(backbone_name='resnet34', encoder_weights='imagenet', input_shape=(256, 256, 3), activation=None)
outputs = resnetmodel(inputs)
outputs = Cropping2D(cropping=((27, 27), (27, 27)))(outputs)
outputs = Conv2D(1, (1, 1), activation='sigmoid')(outputs)

model = Model(inputs=inputs, outputs=outputs)
model.compile(optimizer='adam', loss='binary_crossentropy')
model.summary()

Output:

Total params: 24,453,180
Trainable params: 24,437,814
Non-trainable params: 15,366

Then:

for layer in resnetmodel.layers:
    layer.trainable = False
resnetmodel.summary()

which outputs:

Total params: 24,453,178
Trainable params: 0
Non-trainable params: 24,453,178

And finally this:

model.summary()

outputs the following:

Total params: 48,890,992
Trainable params: 24,437,814
Non-trainable params: 24,453,178

Best Answer

Let's take ResNet50 as an example.

from keras.models import Model
from keras.layers import Input, Dense
from keras.applications.resnet50 import ResNet50

res = ResNet50()
res.summary()
#....
#Total params: 25,636,712
#Trainable params: 25,583,592
#Non-trainable params: 53,120

The ResNet model has a lot of parameters to train.

Let's use it as a layer in another model.

x = Input((224,224,3))
y = res(x)
y = Dense(10)(y)
model = Model(x, y)
model.summary()
#.....
#Total params: 25,646,722
#Trainable params: 25,593,602
#Non-trainable params: 53,120

Freeze the ResNet layers.

for layer in res.layers:
    layer.trainable = False
res.summary()
# ....
#Total params: 25,636,712
#Trainable params: 0
#Non-trainable params: 25,636,712

This is also reflected in the model that uses the ResNet.

model.summary()
#.....
#Total params: 25,646,722
#Trainable params: 10,010
#Non-trainable params: 25,636,712

So freezing layers of the inner model should be reflected in the outer model.
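
As a quick sanity check (just a sketch, not part of the original answer), you can count the outer model's trainable weights directly instead of reading summary(); the names below reuse the ResNet50 example above:

from keras import backend as K

# Sum the sizes of the weights Keras will actually train in the outer model.
trainable_count = int(sum(K.count_params(w) for w in model.trainable_weights))
print(trainable_count)  # with the resnet layers frozen, this should be 10010 (the Dense(10) head)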

Edit

If you compiled the model before freezing it, you need to compile it again afterwards; otherwise the compiled collection of trainable weights is stale. (That is also what the question's model1 summary shows: the cached trainable count of 24,437,814 plus the updated non-trainable count of 24,453,178 adds up to the reported 48,890,992 total.)
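
A minimal sketch of that recompile step, reusing the names from the ResNet50 example (the optimizer and loss here are placeholders, not something the original answer specifies):

for layer in res.layers:
    layer.trainable = False

# Re-compile so the training step (and the compiled collection of trainable
# weights) picks up the new trainable flags.
model.compile(optimizer='adam', loss='categorical_crossentropy')
model.summary()  # Trainable params: 10,010 / Non-trainable params: 25,636,712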

Regarding "python - How to expose the layers of a sub-model in the parent model's summary in Keras?", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/52336811/
