I built a CNN with Keras to classify images into 2 different categories. The problem I'm running into is that after training I can't seem to get correct predictions.
Some background... the dataset has 78,750 examples (roughly 95% Cat. 1 and 5% Cat. 2), which is probably the culprit, since I assume the model is overfitting to Cat. 1 (I believe this is the problem, but changing the dataset size is difficult for many other reasons).
To combat this I added regularization to every convolutional layer, but it had no effect.
My question is... do I absolutely need to change my category sizes, or is there something else I can do to fight the overfitting to Cat. 1?
Here is the code for the CNN:
# Imports implied by the snippet (Keras 2-era API):
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense
from keras.initializers import random_normal
from keras.regularizers import l2
import keras

model = Sequential()
model.add(Conv2D(filters=25,
                 kernel_size=(10, 10),
                 strides=(1, 1),
                 activation='relu',
                 input_shape=input_shape,  # input_shape is defined elsewhere in the asker's script
                 padding="VALID",
                 kernel_initializer=random_normal(mean=0, stddev=.1),
                 kernel_regularizer=l2(.001)))
model.add(MaxPooling2D(pool_size=(2, 2),
                       strides=(2, 2)))
model.add(Conv2D(filters=25,
                 kernel_size=(7, 7),
                 strides=(1, 1),
                 activation='relu',
                 padding="VALID",
                 kernel_initializer=random_normal(mean=0, stddev=.1),
                 kernel_regularizer=l2(.001)))
model.add(MaxPooling2D(pool_size=(2, 2),
                       strides=(2, 2)))
model.add(Conv2D(filters=25,
                 kernel_size=(5, 5),
                 strides=(2, 2),
                 activation='relu',
                 padding="VALID",
                 kernel_initializer=random_normal(mean=0, stddev=.1),
                 kernel_regularizer=l2(.001)))
model.add(MaxPooling2D(pool_size=(2, 2),
                       strides=(1, 1)))
model.add(Conv2D(filters=25,
                 kernel_size=(5, 5),
                 strides=(2, 2),
                 activation='relu',
                 padding="VALID",
                 kernel_initializer=random_normal(mean=0, stddev=.1),
                 kernel_regularizer=l2(.001)))
model.add(Flatten())
model.add(Dense(2, activation='relu', kernel_initializer=random_normal(mean=0, stddev=.1), kernel_regularizer=l2(.001)))
model.add(Dense(2, activation='softmax'))
model.compile(loss=keras.losses.categorical_crossentropy,
              optimizer=keras.optimizers.sgd(lr=.001, momentum=0.9),
              metrics=['accuracy'])
Edit 1
Here is the output from training for 1 epoch...
Epoch 1/2
500/78750 [..............................] - ETA: 664s - loss: 1.3999 - acc: 0.9460
1000/78750 [..............................] - ETA: 652s - loss: 1.3713 - acc: 0.9500
1500/78750 [..............................] - ETA: 648s - loss: 1.3897 - acc: 0.9460
2000/78750 [..............................] - ETA: 648s - loss: 1.3970 - acc: 0.9420
2500/78750 [..............................] - ETA: 646s - loss: 1.3965 - acc: 0.9376
3000/78750 [>.............................] - ETA: 640s - loss: 1.3972 - acc: 0.9373
3500/78750 [>.............................] - ETA: 636s - loss: 1.3886 - acc: 0.9377
4000/78750 [>.............................] - ETA: 628s - loss: 1.3886 - acc: 0.9403
4500/78750 [>.............................] - ETA: 625s - loss: 1.3857 - acc: 0.9400
5000/78750 [>.............................] - ETA: 619s - loss: 1.3813 - acc: 0.9416
5500/78750 [=>............................] - ETA: 612s - loss: 1.3773 - acc: 0.9436
6000/78750 [=>............................] - ETA: 608s - loss: 1.3756 - acc: 0.9447
6500/78750 [=>............................] - ETA: 606s - loss: 1.3735 - acc: 0.9454
7000/78750 [=>............................] - ETA: 602s - loss: 1.3733 - acc: 0.9466
7500/78750 [=>............................] - ETA: 597s - loss: 1.3709 - acc: 0.9481
8000/78750 [==>...........................] - ETA: 594s - loss: 1.3688 - acc: 0.9480
8500/78750 [==>...........................] - ETA: 589s - loss: 1.3672 - acc: 0.9485
9000/78750 [==>...........................] - ETA: 584s - loss: 1.3656 - acc: 0.9491
9500/78750 [==>...........................] - ETA: 580s - loss: 1.3642 - acc: 0.9491
10000/78750 [==>...........................] - ETA: 576s - loss: 1.3629 - acc: 0.9497
10500/78750 [===>..........................] - ETA: 571s - loss: 1.3625 - acc: 0.9494
11000/78750 [===>..........................] - ETA: 567s - loss: 1.3615 - acc: 0.9495
11500/78750 [===>..........................] - ETA: 562s - loss: 1.3604 - acc: 0.9496
12000/78750 [===>..........................] - ETA: 558s - loss: 1.3596 - acc: 0.9496
12500/78750 [===>..........................] - ETA: 554s - loss: 1.3599 - acc: 0.9496
13000/78750 [===>..........................] - ETA: 549s - loss: 1.3591 - acc: 0.9494
13500/78750 [====>.........................] - ETA: 545s - loss: 1.3588 - acc: 0.9496
14000/78750 [====>.........................] - ETA: 541s - loss: 1.3588 - acc: 0.9496
14500/78750 [====>.........................] - ETA: 537s - loss: 1.3581 - acc: 0.9497
15000/78750 [====>.........................] - ETA: 533s - loss: 1.3577 - acc: 0.9497
15500/78750 [====>.........................] - ETA: 529s - loss: 1.3571 - acc: 0.9503
16000/78750 [=====>........................] - ETA: 525s - loss: 1.3568 - acc: 0.9502
16500/78750 [=====>........................] - ETA: 520s - loss: 1.3563 - acc: 0.9498
17000/78750 [=====>........................] - ETA: 515s - loss: 1.3557 - acc: 0.9500
17500/78750 [=====>........................] - ETA: 510s - loss: 1.3552 - acc: 0.9501
18000/78750 [=====>........................] - ETA: 506s - loss: 1.3547 - acc: 0.9504
18500/78750 [======>.......................] - ETA: 502s - loss: 1.3544 - acc: 0.9504
19000/78750 [======>.......................] - ETA: 497s - loss: 1.3540 - acc: 0.9502
19500/78750 [======>.......................] - ETA: 492s - loss: 1.3537 - acc: 0.9502
20000/78750 [======>.......................] - ETA: 488s - loss: 1.3533 - acc: 0.9501
20500/78750 [======>.......................] - ETA: 483s - loss: 1.3529 - acc: 0.9497
21000/78750 [=======>......................] - ETA: 479s - loss: 1.3525 - acc: 0.9496
21500/78750 [=======>......................] - ETA: 475s - loss: 1.3522 - acc: 0.9500
22000/78750 [=======>......................] - ETA: 471s - loss: 1.3518 - acc: 0.9498
22500/78750 [=======>......................] - ETA: 466s - loss: 1.3515 - acc: 0.9497
23000/78750 [=======>......................] - ETA: 462s - loss: 1.3512 - acc: 0.9499
23500/78750 [=======>......................] - ETA: 458s - loss: 1.3509 - acc: 0.9496
24000/78750 [========>.....................] - ETA: 454s - loss: 1.3506 - acc: 0.9495
24500/78750 [========>.....................] - ETA: 450s - loss: 1.3503 - acc: 0.9499
25000/78750 [========>.....................] - ETA: 445s - loss: 1.3501 - acc: 0.9501
25500/78750 [========>.....................] - ETA: 441s - loss: 1.3498 - acc: 0.9500
26000/78750 [========>.....................] - ETA: 437s - loss: 1.3496 - acc: 0.9501
26500/78750 [=========>....................] - ETA: 433s - loss: 1.3494 - acc: 0.9503
27000/78750 [=========>....................] - ETA: 428s - loss: 1.3491 - acc: 0.9501
27500/78750 [=========>....................] - ETA: 424s - loss: 1.3489 - acc: 0.9501
28000/78750 [=========>....................] - ETA: 419s - loss: 1.3487 - acc: 0.9501
28500/78750 [=========>....................] - ETA: 415s - loss: 1.3484 - acc: 0.9503
29000/78750 [==========>...................] - ETA: 411s - loss: 1.3482 - acc: 0.9503
29500/78750 [==========>...................] - ETA: 407s - loss: 1.3480 - acc: 0.9501
30000/78750 [==========>...................] - ETA: 403s - loss: 1.3478 - acc: 0.9503
30500/78750 [==========>...................] - ETA: 399s - loss: 1.3476 - acc: 0.9501
31000/78750 [==========>...................] - ETA: 395s - loss: 1.3474 - acc: 0.9502
31500/78750 [===========>..................] - ETA: 391s - loss: 1.3472 - acc: 0.9501
32000/78750 [===========>..................] - ETA: 387s - loss: 1.3470 - acc: 0.9501
32500/78750 [===========>..................] - ETA: 383s - loss: 1.3468 - acc: 0.9502
33000/78750 [===========>..................] - ETA: 379s - loss: 1.3467 - acc: 0.9501
33500/78750 [===========>..................] - ETA: 375s - loss: 1.3465 - acc: 0.9501
34000/78750 [===========>..................] - ETA: 371s - loss: 1.3464 - acc: 0.9503
34500/78750 [============>.................] - ETA: 367s - loss: 1.3462 - acc: 0.9502
35000/78750 [============>.................] - ETA: 363s - loss: 1.3461 - acc: 0.9503
35500/78750 [============>.................] - ETA: 358s - loss: 1.3459 - acc: 0.9503
36000/78750 [============>.................] - ETA: 354s - loss: 1.3458 - acc: 0.9502
36500/78750 [============>.................] - ETA: 350s - loss: 1.3456 - acc: 0.9504
37000/78750 [=============>................] - ETA: 346s - loss: 1.3455 - acc: 0.9504
37500/78750 [=============>................] - ETA: 341s - loss: 1.3454 - acc: 0.9505
38000/78750 [=============>................] - ETA: 337s - loss: 1.3452 - acc: 0.9506
38500/78750 [=============>................] - ETA: 333s - loss: 1.3451 - acc: 0.9506
39000/78750 [=============>................] - ETA: 329s - loss: 1.3450 - acc: 0.9506
39500/78750 [==============>...............] - ETA: 325s - loss: 1.3449 - acc: 0.9506
40000/78750 [==============>...............] - ETA: 321s - loss: 1.3448 - acc: 0.9508
40500/78750 [==============>...............] - ETA: 317s - loss: 1.3447 - acc: 0.9509
41000/78750 [==============>...............] - ETA: 313s - loss: 1.3445 - acc: 0.9507
41500/78750 [==============>...............] - ETA: 309s - loss: 1.3444 - acc: 0.9506
42000/78750 [===============>..............] - ETA: 304s - loss: 1.3443 - acc: 0.9507
42500/78750 [===============>..............] - ETA: 300s - loss: 1.3442 - acc: 0.9508
43000/78750 [===============>..............] - ETA: 296s - loss: 1.3441 - acc: 0.9508
43500/78750 [===============>..............] - ETA: 292s - loss: 1.3440 - acc: 0.9508
44000/78750 [===============>..............] - ETA: 287s - loss: 1.3439 - acc: 0.9508
44500/78750 [===============>..............] - ETA: 283s - loss: 1.3438 - acc: 0.9509
45000/78750 [================>.............] - ETA: 279s - loss: 1.3438 - acc: 0.9509
45500/78750 [================>.............] - ETA: 275s - loss: 1.3437 - acc: 0.9511
46000/78750 [================>.............] - ETA: 271s - loss: 1.3436 - acc: 0.9510
46500/78750 [================>.............] - ETA: 267s - loss: 1.3435 - acc: 0.9512
47000/78750 [================>.............] - ETA: 263s - loss: 1.3434 - acc: 0.9513
47500/78750 [=================>............] - ETA: 259s - loss: 1.3433 - acc: 0.9512
48000/78750 [=================>............] - ETA: 255s - loss: 1.3432 - acc: 0.9513
48500/78750 [=================>............] - ETA: 250s - loss: 1.3431 - acc: 0.9512
49000/78750 [=================>............] - ETA: 246s - loss: 1.3430 - acc: 0.9511
49500/78750 [=================>............] - ETA: 242s - loss: 1.3429 - acc: 0.9511
50000/78750 [==================>...........] - ETA: 238s - loss: 1.3428 - acc: 0.9513
50500/78750 [==================>...........] - ETA: 233s - loss: 1.3428 - acc: 0.9514
51000/78750 [==================>...........] - ETA: 229s - loss: 1.3427 - acc: 0.9514
51500/78750 [==================>...........] - ETA: 225s - loss: 1.3426 - acc: 0.9514
52000/78750 [==================>...........] - ETA: 221s - loss: 1.3427 - acc: 0.9515
52500/78750 [===================>..........] - ETA: 217s - loss: 1.3426 - acc: 0.9515
53000/78750 [===================>..........] - ETA: 213s - loss: 1.3425 - acc: 0.9515
53500/78750 [===================>..........] - ETA: 209s - loss: 1.3425 - acc: 0.9516
54000/78750 [===================>..........] - ETA: 204s - loss: 1.3424 - acc: 0.9515
54500/78750 [===================>..........] - ETA: 200s - loss: 1.3423 - acc: 0.9513
55000/78750 [===================>..........] - ETA: 196s - loss: 1.3423 - acc: 0.9515
55500/78750 [====================>.........] - ETA: 192s - loss: 1.3422 - acc: 0.9514
56000/78750 [====================>.........] - ETA: 188s - loss: 1.3421 - acc: 0.9513
56500/78750 [====================>.........] - ETA: 184s - loss: 1.3420 - acc: 0.9513
57000/78750 [====================>.........] - ETA: 179s - loss: 1.3420 - acc: 0.9513
57500/78750 [====================>.........] - ETA: 175s - loss: 1.3419 - acc: 0.9513
58000/78750 [=====================>........] - ETA: 171s - loss: 1.3419 - acc: 0.9513
58500/78750 [=====================>........] - ETA: 167s - loss: 1.3418 - acc: 0.9512
59000/78750 [=====================>........] - ETA: 163s - loss: 1.3417 - acc: 0.9510
59500/78750 [=====================>........] - ETA: 159s - loss: 1.3417 - acc: 0.9511
60000/78750 [=====================>........] - ETA: 155s - loss: 1.3416 - acc: 0.9511
60500/78750 [======================>.......] - ETA: 150s - loss: 1.3415 - acc: 0.9512
61000/78750 [======================>.......] - ETA: 146s - loss: 1.3414 - acc: 0.9512
61500/78750 [======================>.......] - ETA: 142s - loss: 1.3414 - acc: 0.9512
62000/78750 [======================>.......] - ETA: 138s - loss: 1.3413 - acc: 0.9512
62500/78750 [======================>.......] - ETA: 134s - loss: 1.3412 - acc: 0.9513
63000/78750 [=======================>......] - ETA: 130s - loss: 1.3412 - acc: 0.9514
63500/78750 [=======================>......] - ETA: 126s - loss: 1.3411 - acc: 0.9514
64000/78750 [=======================>......] - ETA: 121s - loss: 1.3411 - acc: 0.9515
64500/78750 [=======================>......] - ETA: 117s - loss: 1.3411 - acc: 0.9516
65000/78750 [=======================>......] - ETA: 113s - loss: 1.3410 - acc: 0.9516
65500/78750 [=======================>......] - ETA: 109s - loss: 1.3412 - acc: 0.9516
66000/78750 [========================>.....] - ETA: 105s - loss: 1.3411 - acc: 0.9517
66500/78750 [========================>.....] - ETA: 101s - loss: 1.3410 - acc: 0.9516
67000/78750 [========================>.....] - ETA: 97s - loss: 1.3410 - acc: 0.9516
67500/78750 [========================>.....] - ETA: 92s - loss: 1.3409 - acc: 0.9516
68000/78750 [========================>.....] - ETA: 88s - loss: 1.3408 - acc: 0.9515
68500/78750 [=========================>....] - ETA: 84s - loss: 1.3408 - acc: 0.9515
69000/78750 [=========================>....] - ETA: 80s - loss: 1.3407 - acc: 0.9515
69500/78750 [=========================>....] - ETA: 76s - loss: 1.3407 - acc: 0.9515
70000/78750 [=========================>....] - ETA: 72s - loss: 1.3406 - acc: 0.9515
70500/78750 [=========================>....] - ETA: 68s - loss: 1.3405 - acc: 0.9516
71000/78750 [==========================>...] - ETA: 64s - loss: 1.3405 - acc: 0.9516
71500/78750 [==========================>...] - ETA: 59s - loss: 1.3404 - acc: 0.9516
72000/78750 [==========================>...] - ETA: 55s - loss: 1.3404 - acc: 0.9517
72500/78750 [==========================>...] - ETA: 51s - loss: 1.3403 - acc: 0.9518
73000/78750 [==========================>...] - ETA: 47s - loss: 1.3403 - acc: 0.9517
73500/78750 [===========================>..] - ETA: 43s - loss: 1.3402 - acc: 0.9518
74000/78750 [===========================>..] - ETA: 39s - loss: 1.3401 - acc: 0.9517
74500/78750 [===========================>..] - ETA: 35s - loss: 1.3401 - acc: 0.9518
75000/78750 [===========================>..] - ETA: 31s - loss: 1.3400 - acc: 0.9518
75500/78750 [===========================>..] - ETA: 26s - loss: 1.3401 - acc: 0.9519
76000/78750 [===========================>..] - ETA: 22s - loss: 1.3400 - acc: 0.9519
76500/78750 [============================>.] - ETA: 18s - loss: 1.3400 - acc: 0.9519
77000/78750 [============================>.] - ETA: 14s - loss: 1.3399 - acc: 0.9519
77500/78750 [============================>.] - ETA: 10s - loss: 1.3399 - acc: 0.9519
78000/78750 [============================>.] - ETA: 6s - loss: 1.3398 - acc: 0.9518
78500/78750 [============================>.] - ETA: 2s - loss: 1.3398 - acc: 0.9518
78750/78750 [==============================] - 855s - loss: 1.3397 - acc: 0.9518 - val_loss: 1.3321 - val_acc: 0.9523
And here is the model.summary()...
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d_1 (Conv2D)            (None, 72, 72, 25)        2525
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 36, 36, 25)        0
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 30, 30, 25)        30650
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 15, 15, 25)        0
_________________________________________________________________
conv2d_3 (Conv2D)            (None, 6, 6, 25)          15650
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 5, 5, 25)          0
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 1, 1, 25)          15650
_________________________________________________________________
flatten_1 (Flatten)          (None, 25)                0
_________________________________________________________________
dense_1 (Dense)              (None, 2)                 52
_________________________________________________________________
dense_2 (Dense)              (None, 2)                 6
=================================================================
Total params: 64,533
Trainable params: 64,533
Non-trainable params: 0
Best Answer
Your dataset is highly imbalanced, so the model treats the second class as noise and classifies everything as Category 1. (Note that the accuracy in your training log sits at ~0.95, which is exactly the Category 1 prior.) The simplest way to balance the dataset is to oversample the examples of the second class, so that the model sees Category 2 more often.
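A minimal oversampling sketch, assuming the training data lives in NumPy arrays x_train / y_train with one-hot labels (both names are hypothetical, not from the question):

import numpy as np

# Locate each class; column 1 of the one-hot labels marks Cat. 2.
majority_idx = np.where(y_train[:, 0] == 1)[0]
minority_idx = np.where(y_train[:, 1] == 1)[0]

# Draw minority examples with replacement until the classes match.
resampled = np.random.choice(minority_idx, size=len(majority_idx), replace=True)
balanced = np.concatenate([majority_idx, resampled])
np.random.shuffle(balanced)

x_balanced, y_balanced = x_train[balanced], y_train[balanced]

An alternative that avoids duplicating data is the class_weight argument of model.fit(), e.g. class_weight={0: 1., 1: 19.} for a 95/5 split, which reweights the loss instead of the sampling.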
This should fix the class-output problem, but such a model will generalize less well. To improve generalization you can try data augmentation, applying random transformations to the images.
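Keras ships ImageDataGenerator for this; a sketch of how it might be wired in (the transform ranges are illustrative, not tuned, and x_balanced / y_balanced come from the oversampling sketch above):

from keras.preprocessing.image import ImageDataGenerator

# Random transforms applied on the fly to every training batch.
datagen = ImageDataGenerator(rotation_range=15,
                             width_shift_range=0.1,
                             height_shift_range=0.1,
                             horizontal_flip=True)

# Train from the generator instead of the raw arrays (Keras 2 style).
model.fit_generator(datagen.flow(x_balanced, y_balanced, batch_size=32),
                    steps_per_epoch=len(x_balanced) // 32,
                    epochs=2)

Whether horizontal flips are safe depends on the images; drop any transform that could change an example's true category.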
For machine-learning - Keras - unable to get correct class predictions, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/46085413/