In Keras, a model can be regularized by attaching regularizers to its layers. The regularizer is passed as an argument of each layer, for example:
from keras.models import Sequential
from keras.layers import Dense, Activation
from keras import regularizers

model = Sequential()
# Apply an L2 penalty (coefficient 0.01) to the kernel of each Dense layer
model.add(Dense(64, input_dim=64, kernel_regularizer=regularizers.l2(0.01)))
model.add(Activation('relu'))
model.add(Dense(64, kernel_regularizer=regularizers.l2(0.01)))
model.add(Activation('relu'))
model.add(Dense(10, kernel_regularizer=regularizers.l2(0.01)))
model.add(Activation('softmax'))
In the example above, an L2 regularization term with coefficient 0.01 is added to the kernel of every Dense layer. Other regularizers can be chosen as needed, such as L1 or combined L1+L2 regularization. With a regularizer attached, the corresponding penalty is added to the training loss, which constrains the weights and helps prevent overfitting.
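As a minimal sketch of those alternatives (the 0.01 coefficients are illustrative values, not recommendations), the same Dense layers could use L1, combined L1+L2, or penalties on the bias and layer output instead of the kernel:

from keras import regularizers
from keras.layers import Dense

# L1 penalty on the kernel (tends to drive weights toward sparsity)
Dense(64, kernel_regularizer=regularizers.l1(0.01))

# Combined L1 + L2 penalty on the kernel
Dense(64, kernel_regularizer=regularizers.l1_l2(l1=0.01, l2=0.01))

# Regularizers can also target the bias vector or the layer's output (activity)
Dense(64,
      kernel_regularizer=regularizers.l2(0.01),
      bias_regularizer=regularizers.l2(0.01),
      activity_regularizer=regularizers.l1(0.01))

These layers would be added to the model with model.add() exactly as in the example above; only the regularizer arguments change.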