Getting Started with Keras and Building a Residual Network


    I found an old study note sitting in my drafts, so I'm posting it here in the hope that it helps anyone who needs it.

    Contents

    1. Getting started with Keras

    2. Residual networks (ResNet)

    2.1 The identity block

    2.2 The convolutional block

    Building a 50-layer ResNet

    Testing with your own data


    1. Getting started with Keras

    This post draws on a reference tutorial (the link was not included in the original notes).

    Outline of a Keras model:

    from keras.layers import Input, ZeroPadding2D, Conv2D, BatchNormalization, Activation, MaxPooling2D, Flatten, Dense
    from keras.models import Model

    def model(input_shape):
        """
        Model outline
        """
        # Define an input tensor (placeholder) with dimensions input_shape
        X_input = Input(input_shape)
        # Zero-padding: pad the border of X_input with zeros
        X = ZeroPadding2D((3, 3))(X_input)
        # Apply a CONV -> BN -> RELU block to X
        # First argument: number of output feature maps; second: kernel_size; third: strides
        X = Conv2D(32, (7, 7), strides=(1, 1), name='conv0')(X)
        X = BatchNormalization(axis=3, name='bn0')(X)
        X = Activation('relu')(X)
        # Max-pooling layer
        X = MaxPooling2D((2, 2), name='max_pool')(X)
        # Flatten the feature maps into a vector, then add a fully connected layer
        X = Flatten()(X)
        # Fully connected layer; first argument: number of output units; second: activation
        X = Dense(1, activation='sigmoid', name='fc')(X)
        # Create the model. This builds a model instance that we can train and evaluate.
        # As in TensorFlow, you only need to tell it the mapping from inputs to outputs.
        model = Model(inputs=X_input, outputs=X, name='HappyModel')
        return model

    Once the model is designed:

    1. Create a model instance.

    2. Compile the model. The arguments, in order, are: the optimizer, the loss function, and the metrics. (Compiling tells the model the concrete details of how to run; otherwise, once samples are fed in, the model would not know how to compute the loss or how to optimize.)

    3. Train the model: pass in the training samples and labels, the number of epochs, and the batch size.

    4. Evaluate the model:

    # Create a model instance
    model_test = model(X_train.shape[1:])
    # Compile the model
    model_test.compile("adam", "binary_crossentropy", metrics=['accuracy'])
    # Train the model
    # Note: this will take roughly 6-10 minutes.
    model_test.fit(X_train, Y_train, epochs=40, batch_size=50)
    # Evaluate the model
    preds = model_test.evaluate(X_test, Y_test, batch_size=32, verbose=1, sample_weight=None)
    print("Loss = " + str(preds[0]))
    print("Accuracy = " + str(preds[1]))
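
    After training you can also sanity-check a single prediction. This is a small sketch (not in the original notes) that assumes X_test is already loaded; the sigmoid output is the probability of the positive class:

    # Predict on one test image; the output is a probability in [0, 1]
    p = model_test.predict(X_test[:1])
    print("P(positive class) = " + str(float(p[0][0])))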

    Other utilities:

    1. model.summary(): prints the details of every layer; the output looks like this:

    _________________________________________________________________
    Layer (type) Output Shape Param #
    =================================================================
    input_2 (InputLayer) (None, 64, 64, 3) 0
    _________________________________________________________________
    zero_padding2d_2 (ZeroPaddin (None, 70, 70, 3) 0
    _________________________________________________________________
    conv0 (Conv2D) (None, 64, 64, 32) 4736
    _________________________________________________________________
    bn0 (BatchNormalization) (None, 64, 64, 32) 128
    _________________________________________________________________
    activation_2 (Activation) (None, 64, 64, 32) 0
    _________________________________________________________________
    max_pool (MaxPooling2D) (None, 32, 32, 32) 0
    _________________________________________________________________
    flatten_2 (Flatten) (None, 32768) 0
    _________________________________________________________________
    fc (Dense) (None, 1) 32769
    =================================================================
    Total params: 37,633
    Trainable params: 37,569
    Non-trainable params: 64
    _________________________________________________________________
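
    As a quick check of where these counts come from: conv0 has 7×7×3×32 weights plus 32 biases, i.e. 7·7·3·32 + 32 = 4,736 parameters; bn0 has 4 parameters per channel (gamma, beta, moving mean, moving variance), i.e. 4×32 = 128, of which the 64 moving statistics are non-trainable; and fc has 32768 weights plus 1 bias, i.e. 32,769.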

    2. plot_model(): draws the model graph.

    %matplotlib inline
    plot_model(model_test, to_file='happy_model.png')  # requires pydot and graphviz
    SVG(model_to_dot(model_test).create(prog='dot', format='svg'))


    2. Residual networks (ResNet)

    Once a neural network gets deep, it becomes much harder to train and problems such as vanishing gradients appear. Residual networks address this difficulty of training deep networks.

    2.1 The identity block

    Basic structure: the curve at the top of the figure is the shortcut. You can see that after the input X has passed through the convolutions on the main path, the result is added to the original input X and then passed through an activation. This gives gradients from deeper layers a direct path back to the shallower layers.

    Implementation details: because of the addition, the output of the main-path convolutions must have the same shape as the input X, which is why this is called an identity block. The implementation below performs the three-layer skip shown in the figure; it is likewise an identity block.

    import numpy as np
    import tensorflow as tf
    from keras import layers
    from keras.layers import Input, Add, Dense, Activation, ZeroPadding2D, BatchNormalization, Flatten, Conv2D, AveragePooling2D, MaxPooling2D, GlobalMaxPooling2D
    from keras.models import Model, load_model
    from keras.preprocessing import image
    from keras.utils import layer_utils
    from keras.utils.data_utils import get_file
    from keras.applications.imagenet_utils import preprocess_input
    from keras.utils.vis_utils import model_to_dot
    from keras.utils import plot_model
    from keras.initializers import glorot_uniform
    import pydot
    from IPython.display import SVG
    import scipy.misc
    from matplotlib.pyplot import imshow
    import keras.backend as K
    K.set_image_data_format('channels_last')
    K.set_learning_phase(1)
    import resnets_utils

    I have to say, Keras is great.

    Conv2D(number of output filters, kernel_size, strides, padding, name, kernel_initializer)(X): performs the convolution in a single call.

    BatchNormalization(axis, name)(X): batch-normalizes along the channel axis; with channels-last data, axis = 3.

    Activation(...)(X): applies the activation.
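
    The Layer(args)(tensor) pattern above is Keras's functional API: constructing a layer returns a callable that is then applied to a tensor. A minimal sketch of how the shapes flow (not from the original notes; it reuses the imports above and assumes a channels-last 64×64 RGB input):

    X_in = Input((64, 64, 3))                      # (None, 64, 64, 3)
    X = Conv2D(32, (7, 7), strides=(1, 1))(X_in)   # valid padding shrinks 64 to 64 - 7 + 1 = 58
    X = BatchNormalization(axis=3)(X)              # normalizes over the 32 channels
    X = Activation('relu')(X)
    print(K.int_shape(X))                          # (None, 58, 58, 32)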

    def identify_block(X, f, filters, stage, block):
        """
        Implements the identity block.

        X -- input tensor of shape (m, n_H_prev, n_W_prev, n_C_prev)
        f -- kernel size of the middle convolution
        filters -- list of integers, the number of filters in each conv layer
        stage -- integer, used to position the block in the network (for layer names)
        block -- string, used to position the block in the network (for layer names)

        Returns:
        X -- output of the identity block, a tensor of shape (n_H, n_W, n_C)
        """
        conv_name_base = 'res' + str(stage) + block + '_branch'
        bn_name_base = 'bn' + str(stage) + block + '_branch'
        F1, F2, F3 = filters  # number of filters in each of the three conv layers
        X_shortcut = X
        # First component of the main path: 1x1 conv
        X = Conv2D(filters=F1, kernel_size=(1, 1), strides=(1, 1), padding='valid', name=conv_name_base + '2a',
                   kernel_initializer=glorot_uniform(seed=0))(X)
        X = BatchNormalization(axis=3, name=bn_name_base + '2a')(X)
        X = Activation('relu')(X)
        # Second component: f x f conv with 'same' padding
        X = Conv2D(filters=F2, kernel_size=(f, f), strides=(1, 1), padding='same', name=conv_name_base + '2b',
                   kernel_initializer=glorot_uniform(seed=0))(X)
        X = BatchNormalization(axis=3, name=bn_name_base + '2b')(X)
        X = Activation('relu')(X)
        # Third component: 1x1 conv, no activation yet
        X = Conv2D(filters=F3, kernel_size=(1, 1), strides=(1, 1), padding='valid', name=conv_name_base + '2c',
                   kernel_initializer=glorot_uniform(seed=0))(X)
        X = BatchNormalization(axis=3, name=bn_name_base + '2c')(X)
        # Add the shortcut to the main path, then apply the activation
        X = Add()([X, X_shortcut])
        X = Activation('relu')(X)
        return X
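
    The Add() layer requires the shortcut and the main path to have the same shape, so the channel count of the input must equal F3. A quick shape sanity check (a sketch added here, not part of the original notes; it reuses the imports above):

    # The identity block preserves the input shape
    X_in = Input((15, 15, 256))
    X_out = identify_block(X_in, f=3, filters=[64, 64, 256], stage=2, block='t')
    print(K.int_shape(X_out))  # (None, 15, 15, 256), same as the input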

    2.2 The convolutional block

    The identity block above requires the main path to keep the same shape through its convolutions, so that the result can be added to the X carried by the shortcut. If the shape does change, we add a convolution on the shortcut as well, so that the shortcut output matches the shape of the main path.

    def convolutional_block(X, f, filters, stage, block, s=2):
        # The arguments have the same meaning as in identify_block;
        # s is the stride used in the first conv of the main path and in the shortcut conv
        conv_name_base = 'res' + str(stage) + block + '_branch'
        bn_name_base = 'bn' + str(stage) + block + '_branch'
        F1, F2, F3 = filters
        X_shortcut = X
        # Main path
        X = Conv2D(filters=F1, kernel_size=(1, 1), strides=(s, s), padding='valid', name=conv_name_base + '2a',
                   kernel_initializer=glorot_uniform(seed=0))(X)
        X = BatchNormalization(axis=3, name=bn_name_base + '2a')(X)
        X = Activation('relu')(X)
        X = Conv2D(filters=F2, kernel_size=(f, f), strides=(1, 1), padding='same', name=conv_name_base + '2b',
                   kernel_initializer=glorot_uniform(seed=0))(X)
        X = BatchNormalization(axis=3, name=bn_name_base + '2b')(X)
        X = Activation('relu')(X)
        X = Conv2D(filters=F3, kernel_size=(1, 1), strides=(1, 1), padding='valid', name=conv_name_base + '2c',
                   kernel_initializer=glorot_uniform(seed=0))(X)
        X = BatchNormalization(axis=3, name=bn_name_base + '2c')(X)
        # Shortcut path: a strided 1x1 conv so the shapes match before the addition
        X_shortcut = Conv2D(filters=F3, kernel_size=(1, 1), strides=(s, s), padding='valid', name=conv_name_base + '1',
                            kernel_initializer=glorot_uniform(seed=0))(X_shortcut)
        X_shortcut = BatchNormalization(axis=3, name=bn_name_base + '1')(X_shortcut)
        X = Add()([X, X_shortcut])
        X = Activation('relu')(X)
        return X
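
    Again a small shape check (a sketch, not part of the original notes; it reuses the imports above): with s=2 both paths are downsampled, so the spatial size is roughly halved and the channel count becomes F3.

    # The convolutional block changes the shape of X
    X_in = Input((15, 15, 64))
    X_out = convolutional_block(X_in, f=3, filters=[64, 64, 256], stage=9, block='t', s=2)
    print(K.int_shape(X_out))  # (None, 8, 8, 256)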

    Building a 50-layer ResNet

    The network structure is as follows:

    ID_BLOCK is the identity block and CONV_BLOCK is the convolutional block; each block contains 3 convolutional layers, giving 50 layers in total (see the count below).
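
    Counting only the layers with weights: 1 (the initial conv1) + 3×3 (stage 2) + 4×3 (stage 3) + 6×3 (stage 4) + 3×3 (stage 5) + 1 (the final fully connected layer) = 1 + 9 + 12 + 18 + 9 + 1 = 50.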

    def ResNet50(input_shape=(64, 64, 3), classes=6):
        """
        CONV2D -> BATCHNORM -> RELU -> MAXPOOL -> CONVBLOCK -> IDBLOCK*2 -> CONVBLOCK -> IDBLOCK*3
        -> CONVBLOCK -> IDBLOCK*5 -> CONVBLOCK -> IDBLOCK*2 -> AVGPOOL -> TOPLAYER

        input_shape -- shape of the input images
        classes -- number of classes
        """
        # Define an input tensor (placeholder)
        X_input = Input(input_shape)
        # Zero-padding
        X = ZeroPadding2D((3, 3))(X_input)
        # Stage 1
        X = Conv2D(filters=64, kernel_size=(7, 7), strides=(2, 2), name='conv1', kernel_initializer=glorot_uniform(seed=0))(X)
        X = BatchNormalization(axis=3, name='bn_conv1')(X)
        X = Activation('relu')(X)
        X = MaxPooling2D(pool_size=(3, 3), strides=(2, 2))(X)
        # Stage 2
        X = convolutional_block(X, f=3, filters=[64, 64, 256], stage=2, block='a', s=1)
        X = identify_block(X, f=3, filters=[64, 64, 256], stage=2, block='b')
        X = identify_block(X, f=3, filters=[64, 64, 256], stage=2, block='c')
        # Stage 3
        X = convolutional_block(X, f=3, filters=[128, 128, 512], stage=3, block='a', s=2)
        X = identify_block(X, f=3, filters=[128, 128, 512], stage=3, block='b')
        X = identify_block(X, f=3, filters=[128, 128, 512], stage=3, block='c')
        X = identify_block(X, f=3, filters=[128, 128, 512], stage=3, block='d')
        # Stage 4
        X = convolutional_block(X, f=3, filters=[256, 256, 1024], stage=4, block='a', s=2)
        X = identify_block(X, f=3, filters=[256, 256, 1024], stage=4, block='b')
        X = identify_block(X, f=3, filters=[256, 256, 1024], stage=4, block='c')
        X = identify_block(X, f=3, filters=[256, 256, 1024], stage=4, block='d')
        X = identify_block(X, f=3, filters=[256, 256, 1024], stage=4, block='e')
        X = identify_block(X, f=3, filters=[256, 256, 1024], stage=4, block='f')
        # Stage 5
        X = convolutional_block(X, f=3, filters=[512, 512, 2048], stage=5, block='a', s=2)
        X = identify_block(X, f=3, filters=[512, 512, 2048], stage=5, block='b')
        X = identify_block(X, f=3, filters=[512, 512, 2048], stage=5, block='c')
        # Average pooling
        X = AveragePooling2D(pool_size=(2, 2), padding='same')(X)
        # Output layer
        X = Flatten()(X)
        X = Dense(classes, activation='softmax', name='fc' + str(classes), kernel_initializer=glorot_uniform(seed=0))(X)
        model = Model(inputs=X_input, outputs=X, name='ResNet50')
        return model

    Create an instance, compile it, and train it. All we need to supply is the shape of the input data:

    model = ResNet50(input_shape=(64,64,3),classes=6)
    model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
    model.fit(X_train,Y_train,epochs=2,batch_size=32)
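
    For completeness, here is how X_train, Y_train, X_test, and Y_test might be prepared. This is a sketch only: it assumes that resnets_utils (imported above, as in the course materials this note follows) provides load_dataset and convert_to_one_hot; adapt it to however your own data is loaded.

    # Assumes resnets_utils provides these helpers (not shown in the original post)
    X_train_orig, Y_train_orig, X_test_orig, Y_test_orig, classes = resnets_utils.load_dataset()
    X_train = X_train_orig / 255.   # normalize pixel values to [0, 1]
    X_test = X_test_orig / 255.
    Y_train = resnets_utils.convert_to_one_hot(Y_train_orig, 6).T   # one-hot labels, shape (m, 6)
    Y_test = resnets_utils.convert_to_one_hot(Y_test_orig, 6).T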

    Evaluate the model:

    preds = model.evaluate(X_test,Y_test)
    print("Loss = " + str(preds[0]))
    print("Accuracy = " + str(preds[1]))

    model.summary() prints the full architecture:

    model.summary()
    __________________________________________________________________________________________________
    Layer (type) Output Shape Param # Connected to
    ==================================================================================================
    input_3 (InputLayer) (None, 64, 64, 3) 0
    __________________________________________________________________________________________________
    zero_padding2d_3 (ZeroPadding2D (None, 70, 70, 3) 0 input_3[0][0]
    __________________________________________________________________________________________________
    conv1 (Conv2D) (None, 32, 32, 64) 9472 zero_padding2d_3[0][0]
    __________________________________________________________________________________________________
    bn_conv1 (BatchNormalization) (None, 32, 32, 64) 256 conv1[0][0]
    __________________________________________________________________________________________________
    activation_66 (Activation) (None, 32, 32, 64) 0 bn_conv1[0][0]
    __________________________________________________________________________________________________
    max_pooling2d_3 (MaxPooling2D) (None, 15, 15, 64) 0 activation_66[0][0]
    __________________________________________________________________________________________________
    res2a_branch2a (Conv2D) (None, 15, 15, 64) 4160 max_pooling2d_3[0][0]
    __________________________________________________________________________________________________
    bn2a_branch2a (BatchNormalizati (None, 15, 15, 64) 256 res2a_branch2a[0][0]
    __________________________________________________________________________________________________
    activation_67 (Activation) (None, 15, 15, 64) 0 bn2a_branch2a[0][0]
    __________________________________________________________________________________________________
    res2a_branch2b (Conv2D) (None, 15, 15, 64) 36928 activation_67[0][0]
    __________________________________________________________________________________________________
    bn2a_branch2b (BatchNormalizati (None, 15, 15, 64) 256 res2a_branch2b[0][0]
    __________________________________________________________________________________________________
    activation_68 (Activation) (None, 15, 15, 64) 0 bn2a_branch2b[0][0]
    __________________________________________________________________________________________________
    res2a_branch2c (Conv2D) (None, 15, 15, 256) 16640 activation_68[0][0]
    __________________________________________________________________________________________________
    res2a_branch1 (Conv2D) (None, 15, 15, 256) 16640 max_pooling2d_3[0][0]
    __________________________________________________________________________________________________
    bn2a_branch2c (BatchNormalizati (None, 15, 15, 256) 1024 res2a_branch2c[0][0]
    __________________________________________________________________________________________________
    bn2a_branch1 (BatchNormalizatio (None, 15, 15, 256) 1024 res2a_branch1[0][0]
    __________________________________________________________________________________________________
    add_22 (Add) (None, 15, 15, 256) 0 bn2a_branch2c[0][0]
    bn2a_branch1[0][0]
    __________________________________________________________________________________________________
    activation_69 (Activation) (None, 15, 15, 256) 0 add_22[0][0]
    __________________________________________________________________________________________________
    res2b_branch2a (Conv2D) (None, 15, 15, 64) 16448 activation_69[0][0]
    __________________________________________________________________________________________________
    bn2b_branch2a (BatchNormalizati (None, 15, 15, 64) 256 res2b_branch2a[0][0]
    __________________________________________________________________________________________________
    activation_70 (Activation) (None, 15, 15, 64) 0 bn2b_branch2a[0][0]
    __________________________________________________________________________________________________
    res2b_branch2b (Conv2D) (None, 15, 15, 64) 36928 activation_70[0][0]
    __________________________________________________________________________________________________
    bn2b_branch2b (BatchNormalizati (None, 15, 15, 64) 256 res2b_branch2b[0][0]
    __________________________________________________________________________________________________
    activation_71 (Activation) (None, 15, 15, 64) 0 bn2b_branch2b[0][0]
    __________________________________________________________________________________________________
    res2b_branch2c (Conv2D) (None, 15, 15, 256) 16640 activation_71[0][0]
    __________________________________________________________________________________________________
    bn2b_branch2c (BatchNormalizati (None, 15, 15, 256) 1024 res2b_branch2c[0][0]
    __________________________________________________________________________________________________
    add_23 (Add) (None, 15, 15, 256) 0 bn2b_branch2c[0][0]
    activation_69[0][0]
    __________________________________________________________________________________________________
    activation_72 (Activation) (None, 15, 15, 256) 0 add_23[0][0]
    __________________________________________________________________________________________________
    res2c_branch2a (Conv2D) (None, 15, 15, 64) 16448 activation_72[0][0]
    __________________________________________________________________________________________________
    bn2c_branch2a (BatchNormalizati (None, 15, 15, 64) 256 res2c_branch2a[0][0]
    __________________________________________________________________________________________________
    activation_73 (Activation) (None, 15, 15, 64) 0 bn2c_branch2a[0][0]
    __________________________________________________________________________________________________
    res2c_branch2b (Conv2D) (None, 15, 15, 64) 36928 activation_73[0][0]
    __________________________________________________________________________________________________
    bn2c_branch2b (BatchNormalizati (None, 15, 15, 64) 256 res2c_branch2b[0][0]
    __________________________________________________________________________________________________
    activation_74 (Activation) (None, 15, 15, 64) 0 bn2c_branch2b[0][0]
    __________________________________________________________________________________________________
    res2c_branch2c (Conv2D) (None, 15, 15, 256) 16640 activation_74[0][0]
    __________________________________________________________________________________________________
    bn2c_branch2c (BatchNormalizati (None, 15, 15, 256) 1024 res2c_branch2c[0][0]
    __________________________________________________________________________________________________
    add_24 (Add) (None, 15, 15, 256) 0 bn2c_branch2c[0][0]
    activation_72[0][0]
    __________________________________________________________________________________________________
    activation_75 (Activation) (None, 15, 15, 256) 0 add_24[0][0]
    __________________________________________________________________________________________________
    res3a_branch2a (Conv2D) (None, 8, 8, 128) 32896 activation_75[0][0]
    __________________________________________________________________________________________________
    bn3a_branch2a (BatchNormalizati (None, 8, 8, 128) 512 res3a_branch2a[0][0]
    __________________________________________________________________________________________________
    activation_76 (Activation) (None, 8, 8, 128) 0 bn3a_branch2a[0][0]
    __________________________________________________________________________________________________
    res3a_branch2b (Conv2D) (None, 8, 8, 128) 147584 activation_76[0][0]
    __________________________________________________________________________________________________
    bn3a_branch2b (BatchNormalizati (None, 8, 8, 128) 512 res3a_branch2b[0][0]
    __________________________________________________________________________________________________
    activation_77 (Activation) (None, 8, 8, 128) 0 bn3a_branch2b[0][0]
    __________________________________________________________________________________________________
    res3a_branch2c (Conv2D) (None, 8, 8, 512) 66048 activation_77[0][0]
    __________________________________________________________________________________________________
    res3a_branch1 (Conv2D) (None, 8, 8, 512) 131584 activation_75[0][0]
    __________________________________________________________________________________________________
    bn3a_branch2c (BatchNormalizati (None, 8, 8, 512) 2048 res3a_branch2c[0][0]
    __________________________________________________________________________________________________
    bn3a_branch1 (BatchNormalizatio (None, 8, 8, 512) 2048 res3a_branch1[0][0]
    __________________________________________________________________________________________________
    add_25 (Add) (None, 8, 8, 512) 0 bn3a_branch2c[0][0]
    bn3a_branch1[0][0]
    __________________________________________________________________________________________________
    activation_78 (Activation) (None, 8, 8, 512) 0 add_25[0][0]
    __________________________________________________________________________________________________
    res3b_branch2a (Conv2D) (None, 8, 8, 128) 65664 activation_78[0][0]
    __________________________________________________________________________________________________
    bn3b_branch2a (BatchNormalizati (None, 8, 8, 128) 512 res3b_branch2a[0][0]
    __________________________________________________________________________________________________
    activation_79 (Activation) (None, 8, 8, 128) 0 bn3b_branch2a[0][0]
    __________________________________________________________________________________________________
    res3b_branch2b (Conv2D) (None, 8, 8, 128) 147584 activation_79[0][0]
    __________________________________________________________________________________________________
    bn3b_branch2b (BatchNormalizati (None, 8, 8, 128) 512 res3b_branch2b[0][0]
    __________________________________________________________________________________________________
    activation_80 (Activation) (None, 8, 8, 128) 0 bn3b_branch2b[0][0]
    __________________________________________________________________________________________________
    res3b_branch2c (Conv2D) (None, 8, 8, 512) 66048 activation_80[0][0]
    __________________________________________________________________________________________________
    bn3b_branch2c (BatchNormalizati (None, 8, 8, 512) 2048 res3b_branch2c[0][0]
    __________________________________________________________________________________________________
    add_26 (Add) (None, 8, 8, 512) 0 bn3b_branch2c[0][0]
    activation_78[0][0]
    __________________________________________________________________________________________________
    activation_81 (Activation) (None, 8, 8, 512) 0 add_26[0][0]
    __________________________________________________________________________________________________
    res3c_branch2a (Conv2D) (None, 8, 8, 128) 65664 activation_81[0][0]
    __________________________________________________________________________________________________
    bn3c_branch2a (BatchNormalizati (None, 8, 8, 128) 512 res3c_branch2a[0][0]
    __________________________________________________________________________________________________
    activation_82 (Activation) (None, 8, 8, 128) 0 bn3c_branch2a[0][0]
    __________________________________________________________________________________________________
    res3c_branch2b (Conv2D) (None, 8, 8, 128) 147584 activation_82[0][0]
    __________________________________________________________________________________________________
    bn3c_branch2b (BatchNormalizati (None, 8, 8, 128) 512 res3c_branch2b[0][0]
    __________________________________________________________________________________________________
    activation_83 (Activation) (None, 8, 8, 128) 0 bn3c_branch2b[0][0]
    __________________________________________________________________________________________________
    res3c_branch2c (Conv2D) (None, 8, 8, 512) 66048 activation_83[0][0]
    __________________________________________________________________________________________________
    bn3c_branch2c (BatchNormalizati (None, 8, 8, 512) 2048 res3c_branch2c[0][0]
    __________________________________________________________________________________________________
    add_27 (Add) (None, 8, 8, 512) 0 bn3c_branch2c[0][0]
    activation_81[0][0]
    __________________________________________________________________________________________________
    activation_84 (Activation) (None, 8, 8, 512) 0 add_27[0][0]
    __________________________________________________________________________________________________
    res3d_branch2a (Conv2D) (None, 8, 8, 128) 65664 activation_84[0][0]
    __________________________________________________________________________________________________
    bn3d_branch2a (BatchNormalizati (None, 8, 8, 128) 512 res3d_branch2a[0][0]
    __________________________________________________________________________________________________
    activation_85 (Activation) (None, 8, 8, 128) 0 bn3d_branch2a[0][0]
    __________________________________________________________________________________________________
    res3d_branch2b (Conv2D) (None, 8, 8, 128) 147584 activation_85[0][0]
    __________________________________________________________________________________________________
    bn3d_branch2b (BatchNormalizati (None, 8, 8, 128) 512 res3d_branch2b[0][0]
    __________________________________________________________________________________________________
    activation_86 (Activation) (None, 8, 8, 128) 0 bn3d_branch2b[0][0]
    __________________________________________________________________________________________________
    res3d_branch2c (Conv2D) (None, 8, 8, 512) 66048 activation_86[0][0]
    __________________________________________________________________________________________________
    bn3d_branch2c (BatchNormalizati (None, 8, 8, 512) 2048 res3d_branch2c[0][0]
    __________________________________________________________________________________________________
    add_28 (Add) (None, 8, 8, 512) 0 bn3d_branch2c[0][0]
    activation_84[0][0]
    __________________________________________________________________________________________________
    activation_87 (Activation) (None, 8, 8, 512) 0 add_28[0][0]
    __________________________________________________________________________________________________
    res4a_branch2a (Conv2D) (None, 4, 4, 256) 131328 activation_87[0][0]
    __________________________________________________________________________________________________
    bn4a_branch2a (BatchNormalizati (None, 4, 4, 256) 1024 res4a_branch2a[0][0]
    __________________________________________________________________________________________________
    activation_88 (Activation) (None, 4, 4, 256) 0 bn4a_branch2a[0][0]
    __________________________________________________________________________________________________
    res4a_branch2b (Conv2D) (None, 4, 4, 256) 590080 activation_88[0][0]
    __________________________________________________________________________________________________
    bn4a_branch2b (BatchNormalizati (None, 4, 4, 256) 1024 res4a_branch2b[0][0]
    __________________________________________________________________________________________________
    activation_89 (Activation) (None, 4, 4, 256) 0 bn4a_branch2b[0][0]
    __________________________________________________________________________________________________
    res4a_branch2c (Conv2D) (None, 4, 4, 1024) 263168 activation_89[0][0]
    __________________________________________________________________________________________________
    res4a_branch1 (Conv2D) (None, 4, 4, 1024) 525312 activation_87[0][0]
    __________________________________________________________________________________________________
    bn4a_branch2c (BatchNormalizati (None, 4, 4, 1024) 4096 res4a_branch2c[0][0]
    __________________________________________________________________________________________________
    bn4a_branch1 (BatchNormalizatio (None, 4, 4, 1024) 4096 res4a_branch1[0][0]
    __________________________________________________________________________________________________
    add_29 (Add) (None, 4, 4, 1024) 0 bn4a_branch2c[0][0]
    bn4a_branch1[0][0]
    __________________________________________________________________________________________________
    activation_90 (Activation) (None, 4, 4, 1024) 0 add_29[0][0]
    __________________________________________________________________________________________________
    res4b_branch2a (Conv2D) (None, 4, 4, 256) 262400 activation_90[0][0]
    __________________________________________________________________________________________________
    bn4b_branch2a (BatchNormalizati (None, 4, 4, 256) 1024 res4b_branch2a[0][0]
    __________________________________________________________________________________________________
    activation_91 (Activation) (None, 4, 4, 256) 0 bn4b_branch2a[0][0]
    __________________________________________________________________________________________________
    res4b_branch2b (Conv2D) (None, 4, 4, 256) 590080 activation_91[0][0]
    __________________________________________________________________________________________________
    bn4b_branch2b (BatchNormalizati (None, 4, 4, 256) 1024 res4b_branch2b[0][0]
    __________________________________________________________________________________________________
    activation_92 (Activation) (None, 4, 4, 256) 0 bn4b_branch2b[0][0]
    __________________________________________________________________________________________________
    res4b_branch2c (Conv2D) (None, 4, 4, 1024) 263168 activation_92[0][0]
    __________________________________________________________________________________________________
    bn4b_branch2c (BatchNormalizati (None, 4, 4, 1024) 4096 res4b_branch2c[0][0]
    __________________________________________________________________________________________________
    add_30 (Add) (None, 4, 4, 1024) 0 bn4b_branch2c[0][0]
    activation_90[0][0]
    __________________________________________________________________________________________________
    activation_93 (Activation) (None, 4, 4, 1024) 0 add_30[0][0]
    __________________________________________________________________________________________________
    res4c_branch2a (Conv2D) (None, 4, 4, 256) 262400 activation_93[0][0]
    __________________________________________________________________________________________________
    bn4c_branch2a (BatchNormalizati (None, 4, 4, 256) 1024 res4c_branch2a[0][0]
    __________________________________________________________________________________________________
    activation_94 (Activation) (None, 4, 4, 256) 0 bn4c_branch2a[0][0]
    __________________________________________________________________________________________________
    res4c_branch2b (Conv2D) (None, 4, 4, 256) 590080 activation_94[0][0]
    __________________________________________________________________________________________________
    bn4c_branch2b (BatchNormalizati (None, 4, 4, 256) 1024 res4c_branch2b[0][0]
    __________________________________________________________________________________________________
    activation_95 (Activation) (None, 4, 4, 256) 0 bn4c_branch2b[0][0]
    __________________________________________________________________________________________________
    res4c_branch2c (Conv2D) (None, 4, 4, 1024) 263168 activation_95[0][0]
    __________________________________________________________________________________________________
    bn4c_branch2c (BatchNormalizati (None, 4, 4, 1024) 4096 res4c_branch2c[0][0]
    __________________________________________________________________________________________________
    add_31 (Add) (None, 4, 4, 1024) 0 bn4c_branch2c[0][0]
    activation_93[0][0]
    __________________________________________________________________________________________________
    activation_96 (Activation) (None, 4, 4, 1024) 0 add_31[0][0]
    __________________________________________________________________________________________________
    res4d_branch2a (Conv2D) (None, 4, 4, 256) 262400 activation_96[0][0]
    __________________________________________________________________________________________________
    bn4d_branch2a (BatchNormalizati (None, 4, 4, 256) 1024 res4d_branch2a[0][0]
    __________________________________________________________________________________________________
    activation_97 (Activation) (None, 4, 4, 256) 0 bn4d_branch2a[0][0]
    __________________________________________________________________________________________________
    res4d_branch2b (Conv2D) (None, 4, 4, 256) 590080 activation_97[0][0]
    __________________________________________________________________________________________________
    bn4d_branch2b (BatchNormalizati (None, 4, 4, 256) 1024 res4d_branch2b[0][0]
    __________________________________________________________________________________________________
    activation_98 (Activation) (None, 4, 4, 256) 0 bn4d_branch2b[0][0]
    __________________________________________________________________________________________________
    res4d_branch2c (Conv2D) (None, 4, 4, 1024) 263168 activation_98[0][0]
    __________________________________________________________________________________________________
    bn4d_branch2c (BatchNormalizati (None, 4, 4, 1024) 4096 res4d_branch2c[0][0]
    __________________________________________________________________________________________________
    add_32 (Add) (None, 4, 4, 1024) 0 bn4d_branch2c[0][0]
    activation_96[0][0]
    __________________________________________________________________________________________________
    activation_99 (Activation) (None, 4, 4, 1024) 0 add_32[0][0]
    __________________________________________________________________________________________________
    res4e_branch2a (Conv2D) (None, 4, 4, 256) 262400 activation_99[0][0]
    __________________________________________________________________________________________________
    bn4e_branch2a (BatchNormalizati (None, 4, 4, 256) 1024 res4e_branch2a[0][0]
    __________________________________________________________________________________________________
    activation_100 (Activation) (None, 4, 4, 256) 0 bn4e_branch2a[0][0]
    __________________________________________________________________________________________________
    res4e_branch2b (Conv2D) (None, 4, 4, 256) 590080 activation_100[0][0]
    __________________________________________________________________________________________________
    bn4e_branch2b (BatchNormalizati (None, 4, 4, 256) 1024 res4e_branch2b[0][0]
    __________________________________________________________________________________________________
    activation_101 (Activation) (None, 4, 4, 256) 0 bn4e_branch2b[0][0]
    __________________________________________________________________________________________________
    res4e_branch2c (Conv2D) (None, 4, 4, 1024) 263168 activation_101[0][0]
    __________________________________________________________________________________________________
    bn4e_branch2c (BatchNormalizati (None, 4, 4, 1024) 4096 res4e_branch2c[0][0]
    __________________________________________________________________________________________________
    add_33 (Add) (None, 4, 4, 1024) 0 bn4e_branch2c[0][0]
    activation_99[0][0]
    __________________________________________________________________________________________________
    activation_102 (Activation) (None, 4, 4, 1024) 0 add_33[0][0]
    __________________________________________________________________________________________________
    res4f_branch2a (Conv2D) (None, 4, 4, 256) 262400 activation_102[0][0]
    __________________________________________________________________________________________________
    bn4f_branch2a (BatchNormalizati (None, 4, 4, 256) 1024 res4f_branch2a[0][0]
    __________________________________________________________________________________________________
    activation_103 (Activation) (None, 4, 4, 256) 0 bn4f_branch2a[0][0]
    __________________________________________________________________________________________________
    res4f_branch2b (Conv2D) (None, 4, 4, 256) 590080 activation_103[0][0]
    __________________________________________________________________________________________________
    bn4f_branch2b (BatchNormalizati (None, 4, 4, 256) 1024 res4f_branch2b[0][0]
    __________________________________________________________________________________________________
    activation_104 (Activation) (None, 4, 4, 256) 0 bn4f_branch2b[0][0]
    __________________________________________________________________________________________________
    res4f_branch2c (Conv2D) (None, 4, 4, 1024) 263168 activation_104[0][0]
    __________________________________________________________________________________________________
    bn4f_branch2c (BatchNormalizati (None, 4, 4, 1024) 4096 res4f_branch2c[0][0]
    __________________________________________________________________________________________________
    add_34 (Add) (None, 4, 4, 1024) 0 bn4f_branch2c[0][0]
    activation_102[0][0]
    __________________________________________________________________________________________________
    activation_105 (Activation) (None, 4, 4, 1024) 0 add_34[0][0]
    __________________________________________________________________________________________________
    res5a_branch2a (Conv2D) (None, 2, 2, 512) 524800 activation_105[0][0]
    __________________________________________________________________________________________________
    bn5a_branch2a (BatchNormalizati (None, 2, 2, 512) 2048 res5a_branch2a[0][0]
    __________________________________________________________________________________________________
    activation_106 (Activation) (None, 2, 2, 512) 0 bn5a_branch2a[0][0]
    __________________________________________________________________________________________________
    res5a_branch2b (Conv2D) (None, 2, 2, 512) 2359808 activation_106[0][0]
    __________________________________________________________________________________________________
    bn5a_branch2b (BatchNormalizati (None, 2, 2, 512) 2048 res5a_branch2b[0][0]
    __________________________________________________________________________________________________
    activation_107 (Activation) (None, 2, 2, 512) 0 bn5a_branch2b[0][0]
    __________________________________________________________________________________________________
    res5a_branch2c (Conv2D) (None, 2, 2, 2048) 1050624 activation_107[0][0]
    __________________________________________________________________________________________________
    res5a_branch1 (Conv2D) (None, 2, 2, 2048) 2099200 activation_105[0][0]
    __________________________________________________________________________________________________
    bn5a_branch2c (BatchNormalizati (None, 2, 2, 2048) 8192 res5a_branch2c[0][0]
    __________________________________________________________________________________________________
    bn5a_branch1 (BatchNormalizatio (None, 2, 2, 2048) 8192 res5a_branch1[0][0]
    __________________________________________________________________________________________________
    add_35 (Add) (None, 2, 2, 2048) 0 bn5a_branch2c[0][0]
    bn5a_branch1[0][0]
    __________________________________________________________________________________________________
    activation_108 (Activation) (None, 2, 2, 2048) 0 add_35[0][0]
    __________________________________________________________________________________________________
    res5b_branch2a (Conv2D) (None, 2, 2, 512) 1049088 activation_108[0][0]
    __________________________________________________________________________________________________
    bn5b_branch2a (BatchNormalizati (None, 2, 2, 512) 2048 res5b_branch2a[0][0]
    __________________________________________________________________________________________________
    activation_109 (Activation) (None, 2, 2, 512) 0 bn5b_branch2a[0][0]
    __________________________________________________________________________________________________
    res5b_branch2b (Conv2D) (None, 2, 2, 512) 2359808 activation_109[0][0]
    __________________________________________________________________________________________________
    bn5b_branch2b (BatchNormalizati (None, 2, 2, 512) 2048 res5b_branch2b[0][0]
    __________________________________________________________________________________________________
    activation_110 (Activation) (None, 2, 2, 512) 0 bn5b_branch2b[0][0]
    __________________________________________________________________________________________________
    res5b_branch2c (Conv2D) (None, 2, 2, 2048) 1050624 activation_110[0][0]
    __________________________________________________________________________________________________
    bn5b_branch2c (BatchNormalizati (None, 2, 2, 2048) 8192 res5b_branch2c[0][0]
    __________________________________________________________________________________________________
    add_36 (Add) (None, 2, 2, 2048) 0 bn5b_branch2c[0][0]
    activation_108[0][0]
    __________________________________________________________________________________________________
    activation_111 (Activation) (None, 2, 2, 2048) 0 add_36[0][0]
    __________________________________________________________________________________________________
    res5c_branch2a (Conv2D) (None, 2, 2, 512) 1049088 activation_111[0][0]
    __________________________________________________________________________________________________
    bn5c_branch2a (BatchNormalizati (None, 2, 2, 512) 2048 res5c_branch2a[0][0]
    __________________________________________________________________________________________________
    activation_112 (Activation) (None, 2, 2, 512) 0 bn5c_branch2a[0][0]
    __________________________________________________________________________________________________
    res5c_branch2b (Conv2D) (None, 2, 2, 512) 2359808 activation_112[0][0]
    __________________________________________________________________________________________________
    bn5c_branch2b (BatchNormalizati (None, 2, 2, 512) 2048 res5c_branch2b[0][0]
    __________________________________________________________________________________________________
    activation_113 (Activation) (None, 2, 2, 512) 0 bn5c_branch2b[0][0]
    __________________________________________________________________________________________________
    res5c_branch2c (Conv2D) (None, 2, 2, 2048) 1050624 activation_113[0][0]
    __________________________________________________________________________________________________
    bn5c_branch2c (BatchNormalizati (None, 2, 2, 2048) 8192 res5c_branch2c[0][0]
    __________________________________________________________________________________________________
    add_37 (Add) (None, 2, 2, 2048) 0 bn5c_branch2c[0][0]
    activation_111[0][0]
    __________________________________________________________________________________________________
    activation_114 (Activation) (None, 2, 2, 2048) 0 add_37[0][0]
    __________________________________________________________________________________________________
    average_pooling2d_2 (AveragePoo (None, 1, 1, 2048) 0 activation_114[0][0]
    __________________________________________________________________________________________________
    flatten_2 (Flatten) (None, 2048) 0 average_pooling2d_2[0][0]
    __________________________________________________________________________________________________
    fc6 (Dense) (None, 6) 12294 flatten_2[0][0]
    ==================================================================================================
    Total params: 23,600,006
    Trainable params: 23,546,886
    Non-trainable params: 53,120

    plot_model(model, to_file='model.png')
    SVG(model_to_dot(model).create(prog='dot', format='svg'))


    Testing with your own data

    Here is the basic workflow:

    from PIL import Image
    import numpy as np
    import matplotlib.pyplot as plt  # plt is used to display the image
    %matplotlib inline
    img_path = 'images/fingers_big/2.jpg'
    my_image = image.load_img(img_path, target_size=(64, 64))
    my_image = image.img_to_array(my_image)
    my_image = np.expand_dims(my_image, axis=0)
    my_image = preprocess_input(my_image)
    print("my_image.shape = " + str(my_image.shape))
    print("class prediction vector [p(0), p(1), p(2), p(3), p(4), p(5)] = ")
    print(model.predict(my_image))
    my_image = scipy.misc.imread(img_path)
    plt.imshow(my_image)
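
    To turn the prediction vector into a single class label, take the argmax of the probabilities. A small addition (not in the original notes); the image is reloaded because the display code above overwrites my_image:

    # Reload and preprocess the image, then report the most probable class
    x = image.img_to_array(image.load_img(img_path, target_size=(64, 64)))
    x = preprocess_input(np.expand_dims(x, axis=0))
    print("predicted class: " + str(int(np.argmax(model.predict(x)))))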

Original article: https://blog.csdn.net/qq_41828351/article/details/90473684