

PyTorch model migration and transfer learning: importing partial model parameters


1. Transfer learning with resnet18

import torch
from torchvision import models 
if __name__ == "__main__":
  # device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
  device = 'cpu'
  print("-----device:{}".format(device))
  print("-----Pytorch version:{}".format(torch.__version__))
 
  input_tensor = torch.zeros(1, 3, 100, 100)
  print('input_tensor:', input_tensor.shape)
  pretrained_file = "model/resnet18-5c106cde.pth"
  model = models.resnet18()
  model.load_state_dict(torch.load(pretrained_file))
  model.eval()
  out = model(input_tensor)
  print("out:", out.shape, out[0, 0:10])

Output:

input_tensor: torch.Size([1, 3, 100, 100])
out: torch.Size([1, 1000]) tensor([ 0.4010, 0.8436, 0.3072, 0.0627, 0.4446, 0.8470, 0.1882, 0.7012, 0.2988, -0.7574], grad_fn=<SliceBackward>)
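
If resnet18-5c106cde.pth is not already on disk, torchvision can also download the same ImageNet weights automatically. This is a minimal sketch, not part of the original example; the pretrained flag is the older torchvision API, and recent releases use the weights argument instead:

from torchvision import models
 
# older torchvision releases
model = models.resnet18(pretrained=True)
# recent torchvision releases (>= 0.13)
# model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.eval()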

What if we modify the resnet18 architecture? How can the original pretrained parameters (resnet18-5c106cde.pth) be transferred into the new resnet18 network?

For example, change self.layer4 = self._make_layer(block, 512, layers[3], stride=2) in the official resnet18 to self.layer44 = self._make_layer(block, 512, layers[3], stride=2):

class ResNet(nn.Module): 
  def __init__(self, block, layers, num_classes=1000, zero_init_residual=False):
    super(ResNet, self).__init__()
    self.inplanes = 64
    self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3,
                bias=False)
    self.bn1 = nn.BatchNorm2d(64)
    self.relu = nn.ReLU(inplace=True)
    self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
    self.layer1 = self._make_layer(block, 64, layers[0])
    self.layer2 = self._make_layer(block, 128, layers[1], stride=2)
    self.layer3 = self._make_layer(block, 256, layers[2], stride=2)
    self.layer44 = self._make_layer(block, 512, layers[3], stride=2)
    self.avgpool = nn.AdaptiveAvgPool2d((1, 1))
    self.fc = nn.Linear(512 * block.expansion, num_classes)
 
    for m in self.modules():
      if isinstance(m, nn.Conv2d):
        nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu')
      elif isinstance(m, nn.BatchNorm2d):
        nn.init.constant_(m.weight, 1)
        nn.init.constant_(m.bias, 0)
 
    # Zero-initialize the last BN in each residual branch,
    # so that the residual branch starts with zeros, and each residual block behaves like an identity.
    # This improves the model by 0.2~0.3% according to https://arxiv.org/abs/1706.02677
    if zero_init_residual:
      for m in self.modules():
        if isinstance(m, Bottleneck):
          nn.init.constant_(m.bn3.weight, 0)
        elif isinstance(m, BasicBlock):
          nn.init.constant_(m.bn2.weight, 0)
 
  def _make_layer(self, block, planes, blocks, stride=1):
    downsample = None
    if stride != 1 or self.inplanes != planes * block.expansion:
      downsample = nn.Sequential(
        conv1x1(self.inplanes, planes * block.expansion, stride),
        nn.BatchNorm2d(planes * block.expansion),
      )
 
    layers = []
    layers.append(block(self.inplanes, planes, stride, downsample))
    self.inplanes = planes * block.expansion
    for _ in range(1, blocks):
      layers.append(block(self.inplanes, planes))
 
    return nn.Sequential(*layers)
 
  def forward(self, x):
    x = self.conv1(x)
    x = self.bn1(x)
    x = self.relu(x)
    x = self.maxpool(x)
 
    x = self.layer1(x)
    x = self.layer2(x)
    x = self.layer3(x)
    x = self.layer44(x)
 
    x = self.avgpool(x)
    x = x.view(x.size(0), -1)
    x = self.fc(x)
 
    return x

If we now load the weights directly:

  model = models.resnet18()
  model.load_state_dict(torch.load(pretrained_file))

this will certainly fail, with an error along the lines of Missing key(s) in state_dict or Unexpected key(s) in state_dict:

RuntimeError: Error(s) in loading state_dict for ResNet:
Missing key(s) in state_dict: "layer44.0.conv1.weight", "layer44.0.bn1.weight", "layer44.0.bn1.bias", "layer44.0.bn1.running_mean", "layer44.0.bn1.running_var", "layer44.0.conv2.weight", "layer44.0.bn2.weight", "layer44.0.bn2.bias", "layer44.0.bn2.running_mean", "layer44.0.bn2.running_var", "layer44.0.downsample.0.weight", "layer44.0.downsample.1.weight", "layer44.0.downsample.1.bias", "layer44.0.downsample.1.running_mean", "layer44.0.downsample.1.running_var", "layer44.1.conv1.weight", "layer44.1.bn1.weight", "layer44.1.bn1.bias", "layer44.1.bn1.running_mean", "layer44.1.bn1.running_var", "layer44.1.conv2.weight", "layer44.1.bn2.weight", "layer44.1.bn2.bias", "layer44.1.bn2.running_mean", "layer44.1.bn2.running_var".
Unexpected key(s) in state_dict: "layer4.0.conv1.weight", "layer4.0.bn1.running_mean", "layer4.0.bn1.running_var", "layer4.0.bn1.weight", "layer4.0.bn1.bias", "layer4.0.conv2.weight", "layer4.0.bn2.running_mean", "layer4.0.bn2.running_var", "layer4.0.bn2.weight", "layer4.0.bn2.bias", "layer4.0.downsample.0.weight", "layer4.0.downsample.1.running_mean", "layer4.0.downsample.1.running_var", "layer4.0.downsample.1.weight", "layer4.0.downsample.1.bias", "layer4.1.conv1.weight", "layer4.1.bn1.running_mean", "layer4.1.bn1.running_var", "layer4.1.bn1.weight", "layer4.1.bn1.bias", "layer4.1.conv2.weight", "layer4.1.bn2.running_mean", "layer4.1.bn2.running_var", "layer4.1.bn2.weight", "layer4.1.bn2.bias".

We want to transfer the original pretrained parameters (resnet18-5c106cde.pth) into the new resnet18 network. Of course, only the parameters that the two networks have in common can be transferred; the remaining parameters keep their random initialization.
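
As a quick alternative (a sketch, not part of the original code), load_state_dict accepts strict=False, which skips mismatched keys instead of raising an error and returns the lists of missing and unexpected keys:

model = resnet18()  # the modified ResNet with layer44
result = model.load_state_dict(torch.load(pretrained_file), strict=False)
print("missing keys:", result.missing_keys)        # layer44.* keep their random initialization
print("unexpected keys:", result.unexpected_keys)  # layer4.* in the pretrained file are ignored

The helper functions below do the same filtering explicitly and print each key that gets dropped: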

 
def transfer_model(pretrained_file, model):
  '''
  Load only the parameters in pretrained_file that also exist in model.
  dict.update reminder:
    D.update([E, ]**F) -> None. Update D from dict/iterable E and F.
    If E has a .keys() method: for k in E: D[k] = E[k]; otherwise: for k, v in E: D[k] = v.
  :param pretrained_file: path to the pretrained .pth file
  :param model: the (possibly modified) network that receives the weights
  :return: model with the matching parameters loaded
  '''
  pretrained_dict = torch.load(pretrained_file) # get pretrained dict
  model_dict = model.state_dict() # get model dict
  # before merging (update), drop the entries in pretrained_dict that the model does not need
  pretrained_dict = transfer_state_dict(pretrained_dict, model_dict)
  model_dict.update(pretrained_dict) # update (merge) the model parameters
  model.load_state_dict(model_dict)
  return model
 
def transfer_state_dict(pretrained_dict, model_dict):
  '''
  Drop the entries in pretrained_dict that are not present in model_dict,
  so that the remaining weights can be loaded into the new network.
  url: https://blog.csdn.net/qq_34914551/article/details/87871134
  :param pretrained_dict: state dict of the pretrained model
  :param model_dict: state dict of the target model
  :return: the filtered pretrained dict
  '''
  # equivalent one-liner: {k: v for k, v in pretrained_dict.items() if k in model_dict.keys()}
  state_dict = {}
  for k, v in pretrained_dict.items():
    if k in model_dict.keys():
      # state_dict.setdefault(k, v)
      state_dict[k] = v
    else:
      print("Dropping key not present in model_dict: {}".format(k))
  return state_dict
 
if __name__ == "__main__":
 
  input_tensor = torch.zeros(1, 3, 100, 100)
  print('input_tensor:', input_tensor.shape)
  pretrained_file = "model/resnet18-5c106cde.pth"
  # model = resnet18()
  # model.load_state_dict(torch.load(pretrained_file))
  # model.eval()
  # out = model(input_tensor)
  # print("out:", out.shape, out[0, 0:10])
 
  model1 = resnet18()  # the modified ResNet defined above (with layer44)
  model1 = transfer_model(pretrained_file, model1)
  out1 = model1(input_tensor)
  print("out1:", out1.shape, out1[0, 0:10])

2. Renaming a network module and transferring the weights

In the example above, we only changed self.layer4 = self._make_layer(block, 512, layers[3], stride=2) in the official resnet18 to self.layer44 = self._make_layer(block, 512, layers[3], stride=2). Merely renaming one module is already enough to make model.load_state_dict(torch.load(pretrained_file)) fail.

So how can we convert the pretrained checkpoint "model/resnet18-5c106cde.pth" into parameters that fit the new network?

The method is simple: rename every key in resnet18-5c106cde.pth whose prefix is layer4 so that the prefix becomes layer44.
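
In fact, the renaming can be done with a single dict comprehension. A minimal sketch (not part of the original code):

state = torch.load("model/resnet18-5c106cde.pth")
state = {(k.replace("layer4.", "layer44.", 1) if k.startswith("layer4.") else k): v
         for k, v in state.items()}
torch.save(state, "new_model.pth")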

The helper functions modify_model(pretrained_file, model, old_prefix, new_prefix) and modify_state_dict(pretrained_dict, model_dict, old_prefix, new_prefix) are defined as follows:

def string_rename(old_string, new_string, start, end):
  new_string = old_string[:start] + new_string + old_string[end:]
  return new_string
 
def modify_model(pretrained_file, model, old_prefix, new_prefix):
  '''
  Rename the keys in pretrained_file that start with old_prefix, then load the result into model.
  :param pretrained_file: path to the pretrained .pth file
  :param model: the target network
  :param old_prefix: list of key prefixes to rename, e.g. ["layer4"]
  :param new_prefix: list of replacement prefixes, e.g. ["layer44"]
  :return: model with the renamed parameters loaded
  '''
  pretrained_dict = torch.load(pretrained_file)
  model_dict = model.state_dict()
  state_dict = modify_state_dict(pretrained_dict, model_dict, old_prefix, new_prefix)
  model.load_state_dict(state_dict)
  return model 
 
def modify_state_dict(pretrained_dict, model_dict, old_prefix, new_prefix):
  '''
  Rename the keys in pretrained_dict whose prefix is in old_prefix to the corresponding
  new_prefix, so that they match the keys expected by model_dict.
  :param pretrained_dict: state dict of the pretrained model
  :param model_dict: state dict of the target model
  :param old_prefix: list of key prefixes to rename
  :param new_prefix: list of replacement prefixes
  :return: the renamed (and filtered) state dict
  '''
  state_dict = {}
  for k, v in pretrained_dict.items():
    if k in model_dict.keys():
      # state_dict.setdefault(k, v)
      state_dict[k] = v
    else:
      for o, n in zip(old_prefix, new_prefix):
        prefix = k[:len(o)]
        if prefix == o:
          kk = string_rename(old_string=k, new_string=n, start=0, end=len(o))
          print("rename layer modules:{}-->{}".format(k, kk))
          state_dict[kk] = v
  return state_dict

if __name__ == "__main__":
  input_tensor = torch.zeros(1, 3, 100, 100)
  print('input_tensor:', input_tensor.shape)
  pretrained_file = "model/resnet18-5c106cde.pth"
  # model = models.resnet18()
  # model.load_state_dict(torch.load(pretrained_file))
  # model.eval()
  # out = model(input_tensor)
  # print("out:", out.shape, out[0, 0:10])
  #
  # model1 = resnet18()
  # model1 = transfer_model(pretrained_file, model1)
  # out1 = model1(input_tensor)
  # print("out1:", out1.shape, out1[0, 0:10])
  #
  new_file = "new_model.pth"
  model = resnet18()  # again the modified ResNet with layer44
  new_model = modify_model(pretrained_file, model, old_prefix=["layer4"], new_prefix=["layer44"])
  torch.save(new_model.state_dict(), new_file)
 
  model2 = resnet18()
  model2.load_state_dict(torch.load(new_file))
  model2.eval()
  out2 = model2(input_tensor)
  print("out2:", out2.shape, out2[0, 0:10])

Now the output is exactly the same as before:

out: torch.Size([1, 1000]) tensor([ 0.4010, 0.8436, 0.3072, 0.0627, 0.4446, 0.8470, 0.1882, 0.7012, 0.2988, -0.7574], grad_fn=<SliceBackward>)

3. Removing modules from the original model

The following removes the submodules "fc" and "avgpool" without modifying the original model code, using "resnet18.named_children()" and "resnet18.children()":

import torch
import torchvision.models as models
from collections import OrderedDict
 
if __name__=="__main__":
  resnet18 = models.resnet18(False)
  print("resnet18",resnet18)
 
  # use named_children()
  resnet18_v1 = OrderedDict(resnet18.named_children())
  # remove avgpool,fc
  resnet18_v1.pop("avgpool")
  resnet18_v1.pop("fc")
  resnet18_v1 = torch.nn.Sequential(resnet18_v1)
  print("resnet18_v1",resnet18_v1)
  # use children
  resnet18_v2 = torch.nn.Sequential(*list(resnet18.children())[:-2])
  print("resnet18_v2", resnet18_v2)
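
The truncated networks can then be used directly as feature extractors. A short usage sketch, assuming the script above has run:

  x = torch.zeros(1, 3, 224, 224)
  feat = resnet18_v2(x)  # resnet18_v1 behaves the same way
  print(feat.shape)      # torch.Size([1, 512, 7, 7]) for a 224x224 input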

Supplement: importing (partial) model parameters in PyTorch

Background:

My goal was to load the parameters of a pretrained network into my own model, but the pretrained parameters cover only a small part of my model's parameters. Here is how to import them without errors.

Solution

First, extract the small part of your model that should receive the pretrained parameters and redefine it as a new class. For example, to import the parameters of AlexNet's first three layers, redefine those first three layers.

Then import the parameters:

    checkpoint = torch.load(config.pretrained_model)
    # change name and load parameters
    model_dict = model.net1.state_dict()
    checkpoint = {k.replace('features.features', 'featureExtract1'): v for k, v in checkpoint.items()}
    checkpoint = {k:v for k,v in checkpoint.items() if k in model_dict.keys()}
 
    model_dict.update(checkpoint)
    model.net1.load_state_dict(model_dict)

In the code above, the key lines are the third and fourth. The third renames the keys: the key names in someone else's pretrained model will not match the ones you defined, so they must be replaced with your own. The fourth uses an if to keep only the parameters your model actually needs. The remaining lines are essentially boilerplate you can reuse as-is.
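
The same pattern can be demonstrated end-to-end on a small self-contained example. This is only a sketch: it assumes that just AlexNet's first convolution block is reused, and the class name FrontEnd and the attribute name featureExtract1 are illustrative, not taken from the original code.

import torch
import torch.nn as nn
from torchvision import models
 
class FrontEnd(nn.Module):
  # hypothetical wrapper that reuses only AlexNet's first conv block
  def __init__(self):
    super(FrontEnd, self).__init__()
    self.featureExtract1 = nn.Sequential(
      nn.Conv2d(3, 64, kernel_size=11, stride=4, padding=2),
      nn.ReLU(inplace=True),
      nn.MaxPool2d(kernel_size=3, stride=2),
    )
 
  def forward(self, x):
    return self.featureExtract1(x)
 
model = FrontEnd()
checkpoint = models.alexnet(pretrained=True).state_dict()  # pretrained AlexNet weights (older torchvision API)
model_dict = model.state_dict()
# rename "features" -> "featureExtract1", then keep only the keys the new model actually has
checkpoint = {k.replace('features', 'featureExtract1', 1): v for k, v in checkpoint.items()}
checkpoint = {k: v for k, v in checkpoint.items() if k in model_dict}
model_dict.update(checkpoint)
model.load_state_dict(model_dict)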

The above is my personal experience. I hope it serves as a useful reference, and I hope everyone will continue to support 腳本之家. If there are any mistakes or omissions, corrections are welcome.

