I used some spare time to quickly pick up a bit of Python and Keras. I created an image set with 4,050 images of class a (clover) and 2,358 images of class b (grass). There may be more classes later, so I did not choose the binary class_mode.
The images are organized in one subfolder per class, and I split them randomly into 70% training data and 30% test data with a matching folder structure. The training and test data have not been normalized yet.
I trained the model and saved the result. I get a training accuracy of roughly 90%. When I now try to predict single images (which is the intended use case), the average accuracy of those predictions is about 64%, which is very close to the overall share of class a images (4,050/(4,050+2,358) ≈ 63%). For this test I used random images from the actual dataset, but the same poor results also show up on genuinely new data. Looking at the predictions, the model almost always predicts class a and only occasionally class b. Why is that? I cannot tell what went wrong. Could you take a look?
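For reference, that ~64% is roughly the accuracy a model that always predicts class a would reach on this class distribution; a minimal sanity check using only the image counts given above:

# Majority-class baseline: 4,050 class a images vs. 2,358 class b images
n_a, n_b = 4050, 2358
print(round(n_a / (n_a + n_b) * 100, 1))  # ~63.2, close to the observed ~64%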
The model is built as follows:
import os

from tensorflow import keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Dropout, Flatten, Dense
from tensorflow.keras.preprocessing.image import ImageDataGenerator

epochs = 50
IMG_HEIGHT = 50
IMG_WIDTH = 50

train_image_generator = ImageDataGenerator(
    rescale=1./255,
    rotation_range=45,
    width_shift_range=.15,
    height_shift_range=.15,
    horizontal_flip=True,
    zoom_range=0.1)
validation_image_generator = ImageDataGenerator(rescale=1./255)

train_path = os.path.join(global_dir, "Train")
validate_path = os.path.join(global_dir, "Validate")

train_data_gen = train_image_generator.flow_from_directory(directory=train_path,
                                                            shuffle=True,
                                                            target_size=(IMG_HEIGHT, IMG_WIDTH),
                                                            class_mode='categorical')
val_data_gen = validation_image_generator.flow_from_directory(directory=validate_path,
                                                              shuffle=True,
                                                              target_size=(IMG_HEIGHT, IMG_WIDTH),
                                                              class_mode='categorical')

model = Sequential([
    Conv2D(16, 3, padding='same', activation='relu',
           input_shape=(IMG_HEIGHT, IMG_WIDTH, 3)),
    MaxPooling2D(),
    Conv2D(32, 3, padding='same', activation='relu'),
    MaxPooling2D(),
    Dropout(0.2),
    Conv2D(64, 3, padding='same', activation='relu'),
    MaxPooling2D(),
    Dropout(0.2),
    Flatten(),
    Dense(512, activation='relu'),
    Dense(64, activation='relu'),
    Dense(2, activation='softmax')
])

model.compile(optimizer='adam',
              loss=keras.losses.categorical_crossentropy,
              metrics=['accuracy'])
model.summary()

history = model.fit(
    train_data_gen,
    batch_size=200,
    epochs=epochs,
    validation_data=val_data_gen
)

model.save(global_dir + "/Model/1")
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 50, 50, 16) 448
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 25, 25, 16) 0
_________________________________________________________________
conv2d_1 (Conv2D) (None, 25, 25, 32) 4640
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 12, 12, 32) 0
_________________________________________________________________
dropout (Dropout) (None, 12, 12, 32) 0
_________________________________________________________________
conv2d_2 (Conv2D) (None, 12, 12, 64) 18496
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 6, 6, 64) 0
_________________________________________________________________
dropout_1 (Dropout) (None, 6, 6, 64) 0
_________________________________________________________________
flatten (Flatten) (None, 2304) 0
_________________________________________________________________
dense (Dense) (None, 512) 1180160
_________________________________________________________________
dense_1 (Dense) (None, 64) 32832
_________________________________________________________________
dense_2 (Dense) (None, 2) 130
=================================================================
Total params: 1,236,706
Trainable params: 1,236,706
Non-trainable params: 0
_________________________________________________________________
Epoch 1/50
141/141 [==============================] - 14s 102ms/step - loss: 0.6216 - accuracy: 0.6468 - val_loss: 0.5396 - val_accuracy: 0.7120
Epoch 2/50
141/141 [==============================] - 12s 86ms/step - loss: 0.5129 - accuracy: 0.7488 - val_loss: 0.4427 - val_accuracy: 0.8056
Epoch 3/50
141/141 [==============================] - 12s 86ms/step - loss: 0.4917 - accuracy: 0.7624 - val_loss: 0.5004 - val_accuracy: 0.7705
Epoch 4/50
141/141 [==============================] - 15s 104ms/step - loss: 0.4510 - accuracy: 0.7910 - val_loss: 0.4226 - val_accuracy: 0.8198
Epoch 5/50
141/141 [==============================] - 12s 85ms/step - loss: 0.4056 - accuracy: 0.8219 - val_loss: 0.3439 - val_accuracy: 0.8514
Epoch 6/50
141/141 [==============================] - 12s 84ms/step - loss: 0.3904 - accuracy: 0.8295 - val_loss: 0.3207 - val_accuracy: 0.8646
Epoch 7/50
141/141 [==============================] - 12s 85ms/step - loss: 0.3764 - accuracy: 0.8304 - val_loss: 0.3185 - val_accuracy: 0.8702
Epoch 8/50
141/141 [==============================] - 12s 87ms/step - loss: 0.3695 - accuracy: 0.8362 - val_loss: 0.2958 - val_accuracy: 0.8743
Epoch 9/50
141/141 [==============================] - 12s 84ms/step - loss: 0.3455 - accuracy: 0.8574 - val_loss: 0.3096 - val_accuracy: 0.8687
Epoch 10/50
141/141 [==============================] - 12s 84ms/step - loss: 0.3483 - accuracy: 0.8473 - val_loss: 0.3552 - val_accuracy: 0.8412
Epoch 11/50
141/141 [==============================] - 12s 84ms/step - loss: 0.3362 - accuracy: 0.8616 - val_loss: 0.3004 - val_accuracy: 0.8804
Epoch 12/50
141/141 [==============================] - 12s 85ms/step - loss: 0.3277 - accuracy: 0.8616 - val_loss: 0.2974 - val_accuracy: 0.8733
Epoch 13/50
141/141 [==============================] - 12s 85ms/step - loss: 0.3243 - accuracy: 0.8589 - val_loss: 0.2732 - val_accuracy: 0.8931
Epoch 14/50
141/141 [==============================] - 12s 84ms/step - loss: 0.3324 - accuracy: 0.8563 - val_loss: 0.2568 - val_accuracy: 0.8941
Epoch 15/50
141/141 [==============================] - 12s 84ms/step - loss: 0.3071 - accuracy: 0.8701 - val_loss: 0.2706 - val_accuracy: 0.8911
Epoch 16/50
141/141 [==============================] - 12s 84ms/step - loss: 0.3114 - accuracy: 0.8696 - val_loss: 0.2503 - val_accuracy: 0.9059
Epoch 17/50
141/141 [==============================] - 12s 85ms/step - loss: 0.2978 - accuracy: 0.8794 - val_loss: 0.2853 - val_accuracy: 0.8896
Epoch 18/50
141/141 [==============================] - 12s 85ms/step - loss: 0.3029 - accuracy: 0.8725 - val_loss: 0.2458 - val_accuracy: 0.9033
Epoch 19/50
141/141 [==============================] - 12s 84ms/step - loss: 0.2988 - accuracy: 0.8721 - val_loss: 0.2713 - val_accuracy: 0.8916
Epoch 20/50
141/141 [==============================] - 12s 88ms/step - loss: 0.2960 - accuracy: 0.8747 - val_loss: 0.2649 - val_accuracy: 0.8926
Epoch 21/50
141/141 [==============================] - 13s 92ms/step - loss: 0.2901 - accuracy: 0.8819 - val_loss: 0.2611 - val_accuracy: 0.8957
Epoch 22/50
141/141 [==============================] - 12s 89ms/step - loss: 0.2879 - accuracy: 0.8821 - val_loss: 0.2497 - val_accuracy: 0.8947
Epoch 23/50
141/141 [==============================] - 12s 88ms/step - loss: 0.2831 - accuracy: 0.8817 - val_loss: 0.2396 - val_accuracy: 0.9069
Epoch 24/50
141/141 [==============================] - 12s 89ms/step - loss: 0.2856 - accuracy: 0.8799 - val_loss: 0.2386 - val_accuracy: 0.9059
Epoch 25/50
141/141 [==============================] - 12s 87ms/step - loss: 0.2834 - accuracy: 0.8817 - val_loss: 0.2472 - val_accuracy: 0.9048
Epoch 26/50
141/141 [==============================] - 12s 88ms/step - loss: 0.3038 - accuracy: 0.8768 - val_loss: 0.2792 - val_accuracy: 0.8835
Epoch 27/50
141/141 [==============================] - 13s 91ms/step - loss: 0.2786 - accuracy: 0.8854 - val_loss: 0.2326 - val_accuracy: 0.9079
Epoch 28/50
141/141 [==============================] - 12s 86ms/step - loss: 0.2692 - accuracy: 0.8846 - val_loss: 0.2325 - val_accuracy: 0.9115
Epoch 29/50
141/141 [==============================] - 12s 88ms/step - loss: 0.2770 - accuracy: 0.8841 - val_loss: 0.2507 - val_accuracy: 0.8972
Epoch 30/50
141/141 [==============================] - 13s 92ms/step - loss: 0.2751 - accuracy: 0.8886 - val_loss: 0.2329 - val_accuracy: 0.9104
Epoch 31/50
141/141 [==============================] - 12s 88ms/step - loss: 0.2902 - accuracy: 0.8785 - val_loss: 0.2901 - val_accuracy: 0.8758
Epoch 32/50
141/141 [==============================] - 13s 94ms/step - loss: 0.2665 - accuracy: 0.8915 - val_loss: 0.2314 - val_accuracy: 0.9089
Epoch 33/50
141/141 [==============================] - 13s 91ms/step - loss: 0.2797 - accuracy: 0.8805 - val_loss: 0.2708 - val_accuracy: 0.8921
Epoch 34/50
141/141 [==============================] - 13s 90ms/step - loss: 0.2895 - accuracy: 0.8799 - val_loss: 0.2332 - val_accuracy: 0.9140
Epoch 35/50
141/141 [==============================] - 13s 93ms/step - loss: 0.2696 - accuracy: 0.8857 - val_loss: 0.2512 - val_accuracy: 0.8972
Epoch 36/50
141/141 [==============================] - 13s 90ms/step - loss: 0.2641 - accuracy: 0.8868 - val_loss: 0.2304 - val_accuracy: 0.9104
Epoch 37/50
141/141 [==============================] - 13s 94ms/step - loss: 0.2675 - accuracy: 0.8895 - val_loss: 0.2706 - val_accuracy: 0.8830
Epoch 38/50
141/141 [==============================] - 12s 88ms/step - loss: 0.2699 - accuracy: 0.8839 - val_loss: 0.2285 - val_accuracy: 0.9053
Epoch 39/50
141/141 [==============================] - 12s 87ms/step - loss: 0.2577 - accuracy: 0.8917 - val_loss: 0.2469 - val_accuracy: 0.9043
Epoch 40/50
141/141 [==============================] - 12s 87ms/step - loss: 0.2547 - accuracy: 0.8948 - val_loss: 0.2205 - val_accuracy: 0.9074
Epoch 41/50
141/141 [==============================] - 12s 86ms/step - loss: 0.2553 - accuracy: 0.8930 - val_loss: 0.2494 - val_accuracy: 0.9038
Epoch 42/50
141/141 [==============================] - 14s 97ms/step - loss: 0.2705 - accuracy: 0.8883 - val_loss: 0.2263 - val_accuracy: 0.9109
Epoch 43/50
141/141 [==============================] - 12s 88ms/step - loss: 0.2521 - accuracy: 0.8926 - val_loss: 0.2319 - val_accuracy: 0.9084
Epoch 44/50
141/141 [==============================] - 12s 84ms/step - loss: 0.2694 - accuracy: 0.8850 - val_loss: 0.2199 - val_accuracy: 0.9109
Epoch 45/50
141/141 [==============================] - 12s 83ms/step - loss: 0.2601 - accuracy: 0.8901 - val_loss: 0.2318 - val_accuracy: 0.9079
Epoch 46/50
141/141 [==============================] - 12s 83ms/step - loss: 0.2535 - accuracy: 0.8917 - val_loss: 0.2342 - val_accuracy: 0.9089
Epoch 47/50
141/141 [==============================] - 12s 84ms/step - loss: 0.2584 - accuracy: 0.8897 - val_loss: 0.2238 - val_accuracy: 0.9089
Epoch 48/50
141/141 [==============================] - 12s 83ms/step - loss: 0.2580 - accuracy: 0.8944 - val_loss: 0.2219 - val_accuracy: 0.9120
Epoch 49/50
141/141 [==============================] - 12s 83ms/step - loss: 0.2514 - accuracy: 0.8895 - val_loss: 0.2225 - val_accuracy: 0.9150
Epoch 50/50
141/141 [==============================] - 12s 83ms/step - loss: 0.2483 - accuracy: 0.8977 - val_loss: 0.2370 - val_accuracy: 0.9084
import cv2 as cv
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model(global_dir + "/Model/1")

def predict_image(image):
    image = cv.resize(image, (50, 50))
    image = image.astype('float32') / 255
    image = np.expand_dims(image, axis=0)
    predictions = model.predict(image)
    top = np.array(tf.argmax(predictions, 1))
    result = top[0]
    return result
def test_model():
    dir_good = os.fsencode(global_dir + "/Contours/Clover")
    dir_bad = os.fsencode(global_dir + "/Contours/Grass")
    test = []
    for file2 in os.listdir(dir_good):
        filename2 = os.fsdecode(file2)
        if filename2.endswith(".jpg"):
            test.append([0, os.path.join(global_dir + "/Contours/Clover", filename2)])
    for file2 in os.listdir(dir_bad):
        filename2 = os.fsdecode(file2)
        if filename2.endswith(".jpg"):
            test.append([1, os.path.join(global_dir + "/Contours/Grass", filename2)])
    random.shuffle(test)

    count = 0
    right = 0
    for i in range(0, len(test)):
        tmp = cv.imread(test[i][1])
        result = predict_image(tmp)  # <-- this function is already quoted above
        count += 1
        right += (1 if result == test[i][0] else 0)
        print(str(test[i][0]) + "->" + str(result), count, right, round(right / count * 100, 1))
Best answer
As mentioned in our conversation, you are using cv2.imread to load the images, and it loads them with the color channels in BGR order. The Keras data generators load images internally in RGB order. You have to reverse the channels before inference:
tmp = tmp[...,::-1]
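A minimal sketch of where this fix could land in the predict_image helper quoted above; nothing beyond the code already shown is assumed (cv.cvtColor with cv.COLOR_BGR2RGB is an equivalent way to flip the channel order):

import cv2 as cv
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model(global_dir + "/Model/1")

def predict_image(image):
    # cv2.imread delivers BGR, but the generators fed the model RGB images
    # during training, so reverse the channel order before anything else.
    image = image[..., ::-1]  # equivalently: cv.cvtColor(image, cv.COLOR_BGR2RGB)
    image = cv.resize(image, (50, 50))
    image = image.astype('float32') / 255
    image = np.expand_dims(image, axis=0)
    predictions = model.predict(image)
    top = np.array(tf.argmax(predictions, 1))
    return top[0]

Alternatively, apply the one-liner right after cv.imread in test_model and leave predict_image unchanged; either way, single-image inference then sees the same channel order the model was trained on.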
Regarding python - Keras prediction accuracy does not match training accuracy, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/62292134/