
Tutorial: Building and Training a Classification Model with Keras in Python

Updated: 2020-06-12 15:59:54   Author: Wish_97
This article walks through building and training a neural network classification model with Keras in Python. It makes a good practical reference, and we hope it helps you; follow along below.

Without further ado, let's go straight to the code.

Annotated version:

# Classifier example

import numpy as np
# for reproducibility
np.random.seed(1337)
# from keras.datasets import mnist
from keras.utils import np_utils
from keras.models import Sequential
from keras.layers import Dense, Activation
from keras.optimizers import RMSprop

# The data used in this program is the classic MNIST handwritten-digit dataset.
# mnist.load_data() downloads it automatically the first time it is called:
# X shape (60,000, 28, 28), y shape (60,000,)
# (X_train, y_train), (X_test, y_test) = mnist.load_data()
# Alternatively, download mnist.npz manually:
# Link: https://pan.baidu.com/s/1b2ppKDOdzDJxivgmyOoQsA
# Extraction code: y5ir
# and place the downloaded mnist.npz in the current directory.
path = './mnist.npz'
f = np.load(path)
X_train, y_train = f['x_train'], f['y_train']
X_test, y_test = f['x_test'], f['y_test']
f.close()

# data pre-processing
# normalize
# X has shape (60,000, 28, 28): a three-dimensional array that can be read as
# 60,000 rows, each row a 28 x 28 grayscale image.
# X_train.reshape(X_train.shape[0], -1) keeps the first dimension and flattens
# all remaining dimensions, however many there are, into one.
# The -1 argument is used when you don't know how many rows or columns a
# dimension should have: NumPy fixes every other dimension first, then fills
# the -1 position with (total element count) / (product of the fixed dimensions).
# Using -1 here is simply a convenience; it is equivalent to writing 28*28.
# After the reshape the data has 60,000 rows of 784 feature values each.
# The input X becomes a 60,000 x 784 array, which is then divided by 255:
# each pixel lies in [0, 255], so after normalization it lies in [0, 1].
X_train = X_train.reshape(X_train.shape[0], -1) / 255
X_test = X_test.reshape(X_test.shape[0], -1) / 255
# Encode the class labels:
# convert y into one-hot vectors
y_train = np_utils.to_categorical(y_train, num_classes=10)
y_test = np_utils.to_categorical(y_test, num_classes=10)

# Another way to build your neural net
# A two-layer network: the first layer uses the relu activation and the
# output layer uses softmax.
# 32 is the output dimension of the first layer.
model = Sequential([
  Dense(32, input_dim=784),
  Activation('relu'),
  Dense(10),
  Activation('softmax')
])

# Another way to define your optimizer
# The optimization algorithm is RMSprop.
rmsprop = RMSprop(lr=0.001, rho=0.9, epsilon=1e-08, decay=0.0)

# We add metrics to get more results you want to see
# You can also skip the custom optimizer and use the built-in default
# directly: optimizer='rmsprop'.
# Compile the model: model.compile wires the network up for training.
model.compile(
  optimizer=rmsprop,
  loss='categorical_crossentropy',
  metrics=['accuracy']
)

print('Training------------')
# Another way to train the model
# The previous article trained X_train, Y_train batch by batch with
# train_on_batch, which returns the cost by default and printed a result
# every 100 steps. Its output looks different; model.fit() feels clearer.
# That article, "Building and Training a Regression Model with Keras in
# Python", is at:
# https://blog.csdn.net/weixin_45798684/article/details/106503685
model.fit(X_train, y_train, epochs=2, batch_size=32)

print('\nTesting------------')
# Evaluate the model with the metrics we defined earlier
loss, accuracy = model.evaluate(X_test, y_test)

print('test loss:', loss)
print('test accuracy:', accuracy)
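
The comments in the listing contrast model.fit() with the train_on_batch style used in the earlier regression tutorial. For comparison, here is a minimal sketch of that manual loop (not from the original article) on the same data; train_on_batch is the standard Keras call, but the step count and print interval are illustrative choices:

# Hand-rolled training loop in the train_on_batch style (illustrative sketch).
batch_size = 32
for step in range(1001):  # step count chosen arbitrarily for illustration
  start = (step * batch_size) % X_train.shape[0]
  X_batch = X_train[start:start + batch_size]
  y_batch = y_train[start:start + batch_size]
  # Because the model was compiled with metrics=['accuracy'],
  # train_on_batch returns [loss, accuracy] for the batch.
  cost = model.train_on_batch(X_batch, y_batch)
  if step % 100 == 0:
    print('step', step, 'train cost:', cost)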

Output:

Using TensorFlow backend.

Training------------

Epoch 1/2

   32/60000 [..............................] - ETA: 5:03 - loss: 2.4464 - accuracy: 0.0625
  ...
60000/60000 [==============================] - 5s 87us/step - loss: 0.3435 - accuracy: 0.9046

Epoch 2/2

   32/60000 [..............................] - ETA: 11s - loss: 0.0655 - accuracy: 1.0000
  ...
60000/60000 [==============================] - 5s 76us/step - loss: 0.1946 - accuracy: 0.9440

Testing------------

10000/10000 [==============================] - 0s 47us/step
test loss: 0.17407772153392434
test accuracy: 0.9513000249862671
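
To spot-check individual predictions after evaluation, here is a minimal sketch (not part of the original article) using the standard model.predict and np.argmax calls; picking the first test image is an arbitrary choice:

# Predict class probabilities for one test image and recover the digit labels.
probs = model.predict(X_test[:1])  # shape (1, 10): softmax output
print('predicted digit:', np.argmax(probs, axis=1)[0])
print('true digit:', np.argmax(y_test[:1], axis=1)[0])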

Bonus knowledge: building a simple neural network with Keras: a Sequential model for a regression problem

A multi-layer fully connected neural network.

The number of neurons per layer, the number of layers, the activation functions, and so on can all be modified freely.

Using a different loss function adapts the network to other tasks, such as classification.

This is one of the most basic ways to build a neural network model in Keras; Keras also offers more advanced approaches, and the basic usage patterns are documented on the official Keras website.

# The following builds a fully connected network with four hidden layers plus
# an output layer (the input layer is not counted). The functions used and
# their internal parameters can all be changed freely.
from keras.models import Sequential
from keras.layers import Dense, Activation
from keras.callbacks import ReduceLROnPlateau

def ann(X, y):
  '''
  X: training data
  y: labels for the training data
  '''
  
  '''Initialize the model'''
  # First define a Sequential model as the skeleton, then add layers to it.
  # This is one of the most basic ways to build a neural network.
  model = Sequential()
  
  '''Add the layers'''
  # Dense is a fully connected layer; the first layer needs the input
  # dimension, input_shape.
  # Activation sets each layer's activation function; you can pass a
  # predefined activation, or any other advanced activation that follows
  # the interface.
  model.add(Dense(64, input_shape=(X.shape[1],)))
  model.add(Activation('sigmoid'))
  
  model.add(Dense(256))
  model.add(Activation('relu'))
  
  model.add(Dense(256))
  model.add(Activation('tanh'))
  
  model.add(Dense(32))
  model.add(Activation('tanh'))
  
  # Output layer; its dimension depends on the task.
  # Here it is a regression problem with one-dimensional output.
  model.add(Dense(1))
  model.add(Activation('linear'))
  
  '''Compile the model'''
  # optimizer selects the optimizer (free choice); loss selects the loss function.
  model.compile(optimizer='rmsprop', loss='mean_squared_error')
  # Schedule the learning rate (reduce it when the loss plateaus);
  # you can also simply keep the base learning rate fixed.
  reduce_lr = ReduceLROnPlateau(monitor='loss', factor=0.1, patience=10, 
                 verbose=0, mode='auto', min_delta=0.001, 
                 cooldown=0, min_lr=0)
  
  '''Train the model'''
  # The model could also be returned from the function first and trained outside.
  # epochs is the number of training epochs; batch_size is the number of
  # samples per update (mini-batch learning); validation_split is the fraction
  # of the training data held out for validation.
  # callbacks is the list of callbacks, used to observe the model's internal
  # state and statistics during training; each callback's hooks are invoked
  # at the corresponding stage.
  # verbose controls output verbosity; larger values print more detail.
  model.fit(X, y, epochs=100,
       batch_size=50, validation_split=0.0,
       callbacks=[reduce_lr],
       verbose=0)
  
  # Print the model structure
  print(model.summary())

  return model
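
A hypothetical usage sketch (the data below is synthetic, invented purely for illustration): build a toy regression set and hand it to ann:

import numpy as np

# Synthetic regression data: 500 samples, 10 features, noisy linear target.
X = np.random.rand(500, 10)
y = X.sum(axis=1, keepdims=True) + 0.1 * np.random.randn(500, 1)

model = ann(X, y)  # builds, trains, and returns the model
print(model.predict(X[:5]).ravel())  # predictions for the first five samples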

The figure below shows this model's structure; the number after the underscore in each layer name is assigned according to how many times that layer type has been instantiated.

That concludes this tutorial on building and training a classification model with Keras in Python. We hope it serves as a useful reference, and we hope you will continue to support 腳本之家.
