How to Implement GoogLeNet in PyTorch
GoogLeNet, also known as InceptionNet, was proposed in 2014 and has since evolved to a V4 version. GoogLeNet is deeper than VGGNet, with 22 layers in total, yet it has about 12 times fewer parameters than AlexNet (at roughly four times AlexNet's computational cost). This efficiency comes from its highly effective Inception module and from dropping the large fully connected layers used in AlexNet and VGG in favour of global average pooling.
The key innovation is the Inception module, which applies convolutions of different kernel sizes in parallel to extract feature maps at multiple scales. The original ("naive") Inception module simply concatenates these parallel branches; the improved version, known as Inception V1, places 1×1 convolutions in front of the larger convolutions to reduce the channel dimension of the input feature maps, which cuts both the parameter count and the computational cost of the network.
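To see why the 1×1 "bottleneck" convolution helps, a rough parameter count is enough. The sketch below is illustrative only: the channel numbers (192 in, 16 after reduction, 32 out) are borrowed from the 5×5 branch of the inception (3a) block in the original paper, and bias terms are ignored.

# Rough parameter-count comparison for one 5x5 branch (illustrative numbers).
in_ch, out_ch, reduce_ch = 192, 32, 16

# Naive Inception: a 5x5 conv applied directly to all 192 input channels.
naive = 5 * 5 * in_ch * out_ch

# Inception V1: a 1x1 conv first reduces 192 channels to 16, then the 5x5 conv runs.
reduced = 1 * 1 * in_ch * reduce_ch + 5 * 5 * reduce_ch * out_ch

print(naive)    # 153600
print(reduced)  # 3072 + 12800 = 15872, roughly a 10x reduction for this branch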
In its overall architecture, GoogLeNet keeps conventional convolutions in the lower layers and only starts using Inception modules in the higher layers.
In Inception V2, each 5×5 convolution is replaced by two stacked 3×3 convolutions. The pair covers the same 5×5 receptive field as a single 5×5 kernel, but uses fewer parameters and adds an extra non-linearity. A quick parameter comparison is sketched below, followed by the full PyTorch implementation of GoogLeNet with Inception V2-style modules.
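As a minimal sketch of that saving (assuming an arbitrary channel count of 64 and no bias terms, neither of which comes from the article), the parameter counts of one 5×5 convolution and of two stacked 3×3 convolutions can be compared directly:

import torch.nn as nn

# Two ways to cover a 5x5 receptive field with C input and output channels.
C = 64
one_5x5 = nn.Conv2d(C, C, kernel_size=5, padding=2, bias=False)
two_3x3 = nn.Sequential(
    nn.Conv2d(C, C, kernel_size=3, padding=1, bias=False),
    nn.Conv2d(C, C, kernel_size=3, padding=1, bias=False),
)

count = lambda m: sum(p.numel() for p in m.parameters())
print(count(one_5x5))  # 25 * C * C = 102400
print(count(two_3x3))  # 2 * 9 * C * C = 73728, about 28% fewer parameters

The full GoogLeNet implementation, with each 5×5 branch factorised in this way, is shown next.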
'''GoogLeNet with PyTorch.'''
import torch
import torch.nn as nn
import torch.nn.functional as F


# Conv + BatchNorm + ReLU block
class BasicConv2d(nn.Module):
    def __init__(self, in_channels, out_channals, **kwargs):
        super(BasicConv2d, self).__init__()
        self.conv = nn.Conv2d(in_channels, out_channals, **kwargs)
        self.bn = nn.BatchNorm2d(out_channals)

    def forward(self, x):
        x = self.conv(x)
        x = self.bn(x)
        return F.relu(x)


# Inception module
class Inception(nn.Module):
    def __init__(self, in_planes, n1x1, n3x3red, n3x3, n5x5red, n5x5, pool_planes):
        super(Inception, self).__init__()
        # 1x1 conv branch
        self.b1 = BasicConv2d(in_planes, n1x1, kernel_size=1)

        # 1x1 conv -> 3x3 conv branch
        self.b2_1x1_a = BasicConv2d(in_planes, n3x3red, kernel_size=1)
        self.b2_3x3_b = BasicConv2d(n3x3red, n3x3, kernel_size=3, padding=1)

        # 1x1 conv -> 3x3 conv -> 3x3 conv branch (replaces the original 5x5 conv)
        self.b3_1x1_a = BasicConv2d(in_planes, n5x5red, kernel_size=1)
        self.b3_3x3_b = BasicConv2d(n5x5red, n5x5, kernel_size=3, padding=1)
        self.b3_3x3_c = BasicConv2d(n5x5, n5x5, kernel_size=3, padding=1)

        # 3x3 pool -> 1x1 conv branch
        self.b4_pool = nn.MaxPool2d(3, stride=1, padding=1)
        self.b4_1x1 = BasicConv2d(in_planes, pool_planes, kernel_size=1)

    def forward(self, x):
        y1 = self.b1(x)
        y2 = self.b2_3x3_b(self.b2_1x1_a(x))
        y3 = self.b3_3x3_c(self.b3_3x3_b(self.b3_1x1_a(x)))
        y4 = self.b4_1x1(self.b4_pool(x))
        # Each y has shape [batch_size, channels, H, W]; concatenate the
        # feature maps from the four branches along the channel dimension.
        return torch.cat([y1, y2, y3, y4], 1)


class GoogLeNet(nn.Module):
    def __init__(self):
        super(GoogLeNet, self).__init__()
        self.pre_layers = BasicConv2d(3, 192, kernel_size=3, padding=1)

        self.a3 = Inception(192, 64, 96, 128, 16, 32, 32)
        self.b3 = Inception(256, 128, 128, 192, 32, 96, 64)

        self.maxpool = nn.MaxPool2d(3, stride=2, padding=1)

        self.a4 = Inception(480, 192, 96, 208, 16, 48, 64)
        self.b4 = Inception(512, 160, 112, 224, 24, 64, 64)
        self.c4 = Inception(512, 128, 128, 256, 24, 64, 64)
        self.d4 = Inception(512, 112, 144, 288, 32, 64, 64)
        self.e4 = Inception(528, 256, 160, 320, 32, 128, 128)

        self.a5 = Inception(832, 256, 160, 320, 32, 128, 128)
        self.b5 = Inception(832, 384, 192, 384, 48, 128, 128)

        self.avgpool = nn.AvgPool2d(8, stride=1)
        self.linear = nn.Linear(1024, 10)

    def forward(self, x):
        out = self.pre_layers(x)
        out = self.a3(out)
        out = self.b3(out)
        out = self.maxpool(out)
        out = self.a4(out)
        out = self.b4(out)
        out = self.c4(out)
        out = self.d4(out)
        out = self.e4(out)
        out = self.maxpool(out)
        out = self.a5(out)
        out = self.b5(out)
        out = self.avgpool(out)
        out = out.view(out.size(0), -1)
        out = self.linear(out)
        return out


def test():
    net = GoogLeNet()
    x = torch.randn(1, 3, 32, 32)
    y = net(x)
    print(y.size())


test()
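The implementation above expects 32×32 inputs (e.g. CIFAR-10), since the final 8×8 average pooling reduces the last 8×8 feature maps to 1×1 before the 1024→10 linear layer. As a minimal, hedged sketch of how the model might be trained, the loop below uses CIFAR-10 with illustrative hyperparameters (batch size 128, SGD with learning rate 0.1, two epochs); none of these choices come from the original article.

import torch
import torch.nn as nn
import torchvision
import torchvision.transforms as transforms

# Minimal CIFAR-10 training sketch for the GoogLeNet defined above.
# All hyperparameters and the normalisation constants are illustrative.
device = 'cuda' if torch.cuda.is_available() else 'cpu'

transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2470, 0.2435, 0.2616)),
])
trainset = torchvision.datasets.CIFAR10(root='./data', train=True,
                                        download=True, transform=transform)
trainloader = torch.utils.data.DataLoader(trainset, batch_size=128, shuffle=True)

net = GoogLeNet().to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(net.parameters(), lr=0.1,
                            momentum=0.9, weight_decay=5e-4)

for epoch in range(2):  # a couple of epochs just to verify the pipeline runs
    for inputs, targets in trainloader:
        inputs, targets = inputs.to(device), targets.to(device)
        optimizer.zero_grad()
        loss = criterion(net(inputs), targets)
        loss.backward()
        optimizer.step()
    print(f'epoch {epoch}: last batch loss {loss.item():.4f}')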
That is all for this article on implementing GoogLeNet in PyTorch. I hope it serves as a useful reference, and please continue to support 腳本之家.
相關(guān)文章
python中的hashlib和base64加密模塊使用實例
這篇文章主要介紹了python中的hashlib和base64加密模塊使用實例,hashlib模塊支持的加密算法有md5 sha1 sha224 sha256 sha384 sha512,需要的朋友可以參考下2014-09-09關(guān)于python selenium 運行時彈出窗口問題
最近在做一個網(wǎng)頁代填項目,用到了python的selenium知識,經(jīng)過了各種嘗試與搜索最后終算是較完美的解決了,下面小編給大家?guī)砹藀ython selenium 運行時彈出窗口問題,感興趣的朋友一起看看吧2021-11-11Python使用?TCP協(xié)議實現(xiàn)智能聊天機(jī)器人功能
TCP協(xié)議適用于對效率要求相對較低而準(zhǔn)確性要求很高的場合,下面通過本文給大家介紹基于Python?使用?TCP?實現(xiàn)智能聊天機(jī)器人,需要的朋友可以參考下2022-05-05Django 權(quán)限管理(permissions)與用戶組(group)詳解
這篇文章主要介紹了Django 權(quán)限管理(permissions)與用戶組(group)詳解,文中通過示例代碼介紹的非常詳細(xì),對大家的學(xué)習(xí)或者工作具有一定的參考學(xué)習(xí)價值,需要的朋友們下面隨著小編來一起學(xué)習(xí)學(xué)習(xí)吧2020-11-11Python使用BeautifulSoup爬取網(wǎng)頁數(shù)據(jù)的操作步驟
在網(wǎng)絡(luò)時代,數(shù)據(jù)是最寶貴的資源之一,而爬蟲技術(shù)就是一種獲取數(shù)據(jù)的重要手段,Python 作為一門高效、易學(xué)、易用的編程語言,自然成為了爬蟲技術(shù)的首選語言之一,本文將介紹如何使用 BeautifulSoup 爬取網(wǎng)頁數(shù)據(jù),并提供詳細(xì)的代碼和注釋,幫助讀者快速上手2023-11-11