Deep Learning Day 20: DenseNet in Practice for Breast Cancer Recognition

 🍨 This post is a study-record entry from the [🔗365天深度学习训练营] (365-Day Deep Learning Training Camp)
 🍖 Original author: [K同学啊 | tutoring and project customization available]

I. Basic Configuration

  • Language: Python 3.8
  • IDE: PyCharm
  • Deep learning environment:
    • torch==1.12.1+cu113
    • torchvision==0.13.1+cu113

II. Preliminaries

1. Set up the GPU

import torch
import torch.nn as nn
from torchvision import transforms,datasets
import pathlib,warnings

warnings.filterwarnings("ignore")

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
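
As an optional sanity check, you can print the selected device and the installed version to confirm the environment matches the configuration listed above:

print(device)             # expect "cuda" on a GPU machine, otherwise "cpu"
print(torch.__version__)  # should report 1.12.1+cu113 in this environment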

2. Import the data

The dataset used in this project is not part of any standard public collection, so you need to place it in the project directory yourself and set the corresponding file path for the steps that follow.

Run the following code:

data_dir = "./data/J3-data"
data_dir = pathlib.Path(data_dir)

data_path = list(data_dir.glob('*'))
classNames = [path.name for path in data_path]  # use each subfolder name as a class name (portable across OSes)
print(classNames)

This prints:

['0', '1']

Next, we preprocess the dataset with transforms.Compose:

train_transforms = transforms.Compose([
    transforms.Resize([224, 224]),      # resize every input image to a uniform size
    # transforms.RandomHorizontalFlip(), # optional augmentation: random horizontal flip
    transforms.ToTensor(),              # convert a PIL Image / numpy.ndarray to a tensor scaled to [0, 1]
    transforms.Normalize(               # standardize each channel to zero mean and unit variance, which helps the model converge
        mean=[0.485, 0.456, 0.406],
        std=[0.229, 0.224, 0.225])      # these mean/std values are the standard ImageNet statistics
])

total_dataset = datasets.ImageFolder(data_dir, transform=train_transforms)
print(total_dataset.class_to_idx)

Output:

{'0': 0, '1': 1}
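
As an optional check that the preprocessing pipeline behaves as expected, you can push a single image through it and inspect the result; the file path below is a hypothetical placeholder, substitute any image from your own dataset:

from PIL import Image

img = Image.open('./data/J3-data/0/example.png').convert('RGB')  # hypothetical path
tensor = train_transforms(img)
print(tensor.shape)  # torch.Size([3, 224, 224])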

3. Split the dataset

Here the dataset is split proportionally, 80% for training and 20% for testing:

train_size = int(0.8*len(total_dataset))
test_size = len(total_dataset) - train_size
train_dataset,test_dataset = torch.utils.data.random_split(total_dataset,[train_size,test_size])
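
Note that random_split draws a fresh random permutation on every run. If you want the 8:2 split to be reproducible across runs, you can pass a seeded generator; a minimal sketch:

generator = torch.Generator().manual_seed(42)  # fix the RNG so the split is repeatable
train_dataset, test_dataset = torch.utils.data.random_split(
    total_dataset, [train_size, test_size], generator=generator)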

Next, wrap the resulting training and test subsets in DataLoaders:

batch_size = 32

train_dl = torch.utils.data.DataLoader(train_dataset,
                                       batch_size = batch_size,
                                       shuffle = True,
                                       num_workers = 0)

test_dl = torch.utils.data.DataLoader(test_dataset,
                                      batch_size = batch_size,
                                      shuffle = True,
                                      num_workers = 0)

Finally, run:

for X,y in test_dl:
    print('Shape of X:',X.shape)
    print('shape of y:',y.shape,y.dtype)
    break

which prints the tensor shapes of one batch from the test set:

Shape of X: torch.Size([32, 3, 224, 224])
shape of y: torch.Size([32]) torch.int64

4. Build the model

First, import the remaining dependencies needed to build the model:

import torch.nn.functional as F
from collections import OrderedDict

1. DenseLayer

class DenseLayer(nn.Sequential):
    def __init__(self, in_channel, growth_rate, bn_size, drop_rate):
        super(DenseLayer, self).__init__()
        self.add_module('norm1', nn.BatchNorm2d(in_channel))
        self.add_module('relu1', nn.ReLU(inplace=True))
        self.add_module('conv1', nn.Conv2d(in_channel, bn_size * growth_rate, kernel_size=1, stride=1))
        self.add_module('norm2', nn.BatchNorm2d(bn_size * growth_rate))
        self.add_module('relu2', nn.ReLU(inplace=True))
        self.add_module('conv2', nn.Conv2d(bn_size * growth_rate, growth_rate, kernel_size=3, stride=1, padding=1))

        self.drop_rate = drop_rate

    def forward(self, x):
        new_feature = super(DenseLayer, self).forward(x)
        if self.drop_rate > 0:
            new_feature = F.dropout(new_feature, p=self.drop_rate, training=self.training)
        return torch.cat([x, new_feature], 1)
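
The defining property of a DenseLayer is that its output carries exactly growth_rate more channels than its input, because the input is concatenated with the newly computed feature maps. A quick sketch with random data verifies this:

layer = DenseLayer(in_channel=64, growth_rate=32, bn_size=4, drop_rate=0)
x = torch.randn(1, 64, 56, 56)
print(layer(x).shape)  # torch.Size([1, 96, 56, 56]): 64 input channels + 32 new ones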

2. DenseBlock

class DenseBlock(nn.Sequential):
    def __init__(self, num_layers, in_channel, bn_size, growth_rate, drop_rate):
        super(DenseBlock, self).__init__()
        for i in range(num_layers):
            layer = DenseLayer(in_channel + i * growth_rate, growth_rate, bn_size, drop_rate)
            self.add_module('denselayer%d' % (i + 1,), layer)
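
Since each layer adds growth_rate channels, a block of num_layers layers outputs in_channel + num_layers * growth_rate channels, which is exactly the bookkeeping the DenseNet constructor below performs through num_features. For example:

block = DenseBlock(num_layers=6, in_channel=64, bn_size=4, growth_rate=32, drop_rate=0)
x = torch.randn(1, 64, 56, 56)
print(block(x).shape)  # torch.Size([1, 256, 56, 56]): 64 + 6 * 32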

3. Transition

class Transition(nn.Sequential):
    def __init__(self, in_channel, out_channel):
        super(Transition, self).__init__()
        self.add_module('norm', nn.BatchNorm2d(in_channel))
        self.add_module('relu', nn.ReLU(inplace=True))
        self.add_module('conv', nn.Conv2d(in_channel, out_channel, kernel_size=1, stride=1))
        self.add_module('pool', nn.AvgPool2d(kernel_size=2, stride=2))
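
A Transition layer compresses the channel count with a 1x1 convolution and halves the spatial resolution with 2x2 average pooling, which keeps memory use in check between dense blocks:

trans = Transition(in_channel=256, out_channel=128)
x = torch.randn(1, 256, 56, 56)
print(trans(x).shape)  # torch.Size([1, 128, 28, 28]): channels compressed, resolution halved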

4. Assemble the DenseNet

class DenseNet(nn.Module):
    def __init__(self, growth_rate=32, block_config=(6, 12, 24, 16), init_channel=64, bn_size=4,
                 compression_rate=0.5, drop_rate=0, num_classes=1000):
        super(DenseNet, self).__init__()
        self.features = nn.Sequential(OrderedDict([
            ('conv0', nn.Conv2d(3, init_channel, kernel_size=7, stride=2, padding=3)),
            ('norm0', nn.BatchNorm2d(init_channel)),
            ('relu0', nn.ReLU(inplace=True)),
            ('pool0', nn.MaxPool2d(kernel_size=3, stride=2, padding=1))
        ]))
        num_features = init_channel
        for i, num_layers in enumerate(block_config):
            block = DenseBlock(num_layers, num_features, bn_size=bn_size, growth_rate=growth_rate, drop_rate=drop_rate)
            self.features.add_module('denseblock%d' % (i + 1), block)
            num_features += num_layers * growth_rate
            if i != len(block_config) - 1:
                transition = Transition(num_features, int(num_features * compression_rate))
                self.features.add_module('transition%d' % (i + 1), transition)
                num_features = int(num_features * compression_rate)

        self.features.add_module('norm5', nn.BatchNorm2d(num_features))
        self.features.add_module('relu5', nn.ReLU(inplace=True))
        self.classifier = nn.Linear(num_features, num_classes)

        for m in self.modules():
            if isinstance(m, nn.Conv2d):
                nn.init.kaiming_normal_(m.weight)
            elif isinstance(m, nn.BatchNorm2d):
                nn.init.constant_(m.bias, 0)
                nn.init.constant_(m.weight, 1)
            elif isinstance(m, nn.Linear):
                nn.init.constant_(m.bias, 0)

    def forward(self, x):
        x = self.features(x)
        x = F.avg_pool2d(x, 7, stride=1).view(x.size(0), -1)
        x = self.classifier(x)
        return x

5. Build DenseNet-121 from the DenseNet class

densenet121 = DenseNet(init_channel=64,growth_rate=32,block_config=(6,12,24,16),num_classes=len(classNames))
model = densenet121.to(device)
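
Before printing the full layer summary, a dummy forward pass is a cheap way to confirm the network is wired correctly end to end; a small sketch using random input:

dummy = torch.randn(2, 3, 224, 224).to(device)
with torch.no_grad():
    print(model(dummy).shape)  # torch.Size([2, 2]): one logit per class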

5. Inspect the model

from torchsummary import summary
summary(model, (3, 224, 224))

This prints the following summary:

----------------------------------------------------------------
        Layer (type)               Output Shape         Param #
================================================================
            Conv2d-1         [-1, 64, 112, 112]           9,472
       BatchNorm2d-2         [-1, 64, 112, 112]             128
              ReLU-3         [-1, 64, 112, 112]               0
         MaxPool2d-4           [-1, 64, 56, 56]               0
       BatchNorm2d-5           [-1, 64, 56, 56]             128
              ReLU-6           [-1, 64, 56, 56]               0
            Conv2d-7          [-1, 128, 56, 56]           8,320
       BatchNorm2d-8          [-1, 128, 56, 56]             256
              ReLU-9          [-1, 128, 56, 56]               0
           Conv2d-10           [-1, 32, 56, 56]          36,896
      BatchNorm2d-11           [-1, 96, 56, 56]             192
             ReLU-12           [-1, 96, 56, 56]               0
           Conv2d-13          [-1, 128, 56, 56]          12,416
      BatchNorm2d-14          [-1, 128, 56, 56]             256
             ReLU-15          [-1, 128, 56, 56]               0
           Conv2d-16           [-1, 32, 56, 56]          36,896
      BatchNorm2d-17          [-1, 128, 56, 56]             256
             ReLU-18          [-1, 128, 56, 56]               0
           Conv2d-19          [-1, 128, 56, 56]          16,512
      BatchNorm2d-20          [-1, 128, 56, 56]             256
             ReLU-21          [-1, 128, 56, 56]               0
           Conv2d-22           [-1, 32, 56, 56]          36,896
      BatchNorm2d-23          [-1, 160, 56, 56]             320
             ReLU-24          [-1, 160, 56, 56]               0
           Conv2d-25          [-1, 128, 56, 56]          20,608
      BatchNorm2d-26          [-1, 128, 56, 56]             256
             ReLU-27          [-1, 128, 56, 56]               0
           Conv2d-28           [-1, 32, 56, 56]          36,896
      BatchNorm2d-29          [-1, 192, 56, 56]             384
             ReLU-30          [-1, 192, 56, 56]               0
           Conv2d-31          [-1, 128, 56, 56]          24,704
      BatchNorm2d-32          [-1, 128, 56, 56]             256
             ReLU-33          [-1, 128, 56, 56]               0
           Conv2d-34           [-1, 32, 56, 56]          36,896
      BatchNorm2d-35          [-1, 224, 56, 56]             448
             ReLU-36          [-1, 224, 56, 56]               0
           Conv2d-37          [-1, 128, 56, 56]          28,800
      BatchNorm2d-38          [-1, 128, 56, 56]             256
             ReLU-39          [-1, 128, 56, 56]               0
           Conv2d-40           [-1, 32, 56, 56]          36,896
      BatchNorm2d-41          [-1, 256, 56, 56]             512
             ReLU-42          [-1, 256, 56, 56]               0
           Conv2d-43          [-1, 128, 56, 56]          32,896
        AvgPool2d-44          [-1, 128, 28, 28]               0
      BatchNorm2d-45          [-1, 128, 28, 28]             256
             ReLU-46          [-1, 128, 28, 28]               0
           Conv2d-47          [-1, 128, 28, 28]          16,512
      BatchNorm2d-48          [-1, 128, 28, 28]             256
             ReLU-49          [-1, 128, 28, 28]               0
           Conv2d-50           [-1, 32, 28, 28]          36,896
      BatchNorm2d-51          [-1, 160, 28, 28]             320
             ReLU-52          [-1, 160, 28, 28]               0
           Conv2d-53          [-1, 128, 28, 28]          20,608
      BatchNorm2d-54          [-1, 128, 28, 28]             256
             ReLU-55          [-1, 128, 28, 28]               0
           Conv2d-56           [-1, 32, 28, 28]          36,896
      BatchNorm2d-57          [-1, 192, 28, 28]             384
             ReLU-58          [-1, 192, 28, 28]               0
           Conv2d-59          [-1, 128, 28, 28]          24,704
      BatchNorm2d-60          [-1, 128, 28, 28]             256
             ReLU-61          [-1, 128, 28, 28]               0
           Conv2d-62           [-1, 32, 28, 28]          36,896
      BatchNorm2d-63          [-1, 224, 28, 28]             448
             ReLU-64          [-1, 224, 28, 28]               0
           Conv2d-65          [-1, 128, 28, 28]          28,800
      BatchNorm2d-66          [-1, 128, 28, 28]             256
             ReLU-67          [-1, 128, 28, 28]               0
           Conv2d-68           [-1, 32, 28, 28]          36,896
      BatchNorm2d-69          [-1, 256, 28, 28]             512
             ReLU-70          [-1, 256, 28, 28]               0
           Conv2d-71          [-1, 128, 28, 28]          32,896
      BatchNorm2d-72          [-1, 128, 28, 28]             256
             ReLU-73          [-1, 128, 28, 28]               0
           Conv2d-74           [-1, 32, 28, 28]          36,896
      BatchNorm2d-75          [-1, 288, 28, 28]             576
             ReLU-76          [-1, 288, 28, 28]               0
           Conv2d-77          [-1, 128, 28, 28]          36,992
      BatchNorm2d-78          [-1, 128, 28, 28]             256
             ReLU-79          [-1, 128, 28, 28]               0
           Conv2d-80           [-1, 32, 28, 28]          36,896
      BatchNorm2d-81          [-1, 320, 28, 28]             640
             ReLU-82          [-1, 320, 28, 28]               0
           Conv2d-83          [-1, 128, 28, 28]          41,088
      BatchNorm2d-84          [-1, 128, 28, 28]             256
             ReLU-85          [-1, 128, 28, 28]               0
           Conv2d-86           [-1, 32, 28, 28]          36,896
      BatchNorm2d-87          [-1, 352, 28, 28]             704
             ReLU-88          [-1, 352, 28, 28]               0
           Conv2d-89          [-1, 128, 28, 28]          45,184
      BatchNorm2d-90          [-1, 128, 28, 28]             256
             ReLU-91          [-1, 128, 28, 28]               0
           Conv2d-92           [-1, 32, 28, 28]          36,896
      BatchNorm2d-93          [-1, 384, 28, 28]             768
             ReLU-94          [-1, 384, 28, 28]               0
           Conv2d-95          [-1, 128, 28, 28]          49,280
      BatchNorm2d-96          [-1, 128, 28, 28]             256
             ReLU-97          [-1, 128, 28, 28]               0
           Conv2d-98           [-1, 32, 28, 28]          36,896
      BatchNorm2d-99          [-1, 416, 28, 28]             832
            ReLU-100          [-1, 416, 28, 28]               0
          Conv2d-101          [-1, 128, 28, 28]          53,376
     BatchNorm2d-102          [-1, 128, 28, 28]             256
            ReLU-103          [-1, 128, 28, 28]               0
          Conv2d-104           [-1, 32, 28, 28]          36,896
     BatchNorm2d-105          [-1, 448, 28, 28]             896
            ReLU-106          [-1, 448, 28, 28]               0
          Conv2d-107          [-1, 128, 28, 28]          57,472
     BatchNorm2d-108          [-1, 128, 28, 28]             256
            ReLU-109          [-1, 128, 28, 28]               0
          Conv2d-110           [-1, 32, 28, 28]          36,896
     BatchNorm2d-111          [-1, 480, 28, 28]             960
            ReLU-112          [-1, 480, 28, 28]               0
          Conv2d-113          [-1, 128, 28, 28]          61,568
     BatchNorm2d-114          [-1, 128, 28, 28]             256
            ReLU-115          [-1, 128, 28, 28]               0
          Conv2d-116           [-1, 32, 28, 28]          36,896
     BatchNorm2d-117          [-1, 512, 28, 28]           1,024
            ReLU-118          [-1, 512, 28, 28]               0
          Conv2d-119          [-1, 256, 28, 28]         131,328
       AvgPool2d-120          [-1, 256, 14, 14]               0
     BatchNorm2d-121          [-1, 256, 14, 14]             512
            ReLU-122          [-1, 256, 14, 14]               0
          Conv2d-123          [-1, 128, 14, 14]          32,896
     BatchNorm2d-124          [-1, 128, 14, 14]             256
            ReLU-125          [-1, 128, 14, 14]               0
          Conv2d-126           [-1, 32, 14, 14]          36,896
     BatchNorm2d-127          [-1, 288, 14, 14]             576
            ReLU-128          [-1, 288, 14, 14]               0
          Conv2d-129          [-1, 128, 14, 14]          36,992
     BatchNorm2d-130          [-1, 128, 14, 14]             256
            ReLU-131          [-1, 128, 14, 14]               0
          Conv2d-132           [-1, 32, 14, 14]          36,896
     BatchNorm2d-133          [-1, 320, 14, 14]             640
            ReLU-134          [-1, 320, 14, 14]               0
          Conv2d-135          [-1, 128, 14, 14]          41,088
     BatchNorm2d-136          [-1, 128, 14, 14]             256
            ReLU-137          [-1, 128, 14, 14]               0
          Conv2d-138           [-1, 32, 14, 14]          36,896
     BatchNorm2d-139          [-1, 352, 14, 14]             704
            ReLU-140          [-1, 352, 14, 14]               0
          Conv2d-141          [-1, 128, 14, 14]          45,184
     BatchNorm2d-142          [-1, 128, 14, 14]             256
            ReLU-143          [-1, 128, 14, 14]               0
          Conv2d-144           [-1, 32, 14, 14]          36,896
     BatchNorm2d-145          [-1, 384, 14, 14]             768
            ReLU-146          [-1, 384, 14, 14]               0
          Conv2d-147          [-1, 128, 14, 14]          49,280
     BatchNorm2d-148          [-1, 128, 14, 14]             256
            ReLU-149          [-1, 128, 14, 14]               0
          Conv2d-150           [-1, 32, 14, 14]          36,896
     BatchNorm2d-151          [-1, 416, 14, 14]             832
            ReLU-152          [-1, 416, 14, 14]               0
          Conv2d-153          [-1, 128, 14, 14]          53,376
     BatchNorm2d-154          [-1, 128, 14, 14]             256
            ReLU-155          [-1, 128, 14, 14]               0
          Conv2d-156           [-1, 32, 14, 14]          36,896
     BatchNorm2d-157          [-1, 448, 14, 14]             896
            ReLU-158          [-1, 448, 14, 14]               0
          Conv2d-159          [-1, 128, 14, 14]          57,472
     BatchNorm2d-160          [-1, 128, 14, 14]             256
            ReLU-161          [-1, 128, 14, 14]               0
          Conv2d-162           [-1, 32, 14, 14]          36,896
     BatchNorm2d-163          [-1, 480, 14, 14]             960
            ReLU-164          [-1, 480, 14, 14]               0
          Conv2d-165          [-1, 128, 14, 14]          61,568
     BatchNorm2d-166          [-1, 128, 14, 14]             256
            ReLU-167          [-1, 128, 14, 14]               0
          Conv2d-168           [-1, 32, 14, 14]          36,896
     BatchNorm2d-169          [-1, 512, 14, 14]           1,024
            ReLU-170          [-1, 512, 14, 14]               0
          Conv2d-171          [-1, 128, 14, 14]          65,664
     BatchNorm2d-172          [-1, 128, 14, 14]             256
            ReLU-173          [-1, 128, 14, 14]               0
          Conv2d-174           [-1, 32, 14, 14]          36,896
     BatchNorm2d-175          [-1, 544, 14, 14]           1,088
            ReLU-176          [-1, 544, 14, 14]               0
          Conv2d-177          [-1, 128, 14, 14]          69,760
     BatchNorm2d-178          [-1, 128, 14, 14]             256
            ReLU-179          [-1, 128, 14, 14]               0
          Conv2d-180           [-1, 32, 14, 14]          36,896
     BatchNorm2d-181          [-1, 576, 14, 14]           1,152
            ReLU-182          [-1, 576, 14, 14]               0
          Conv2d-183          [-1, 128, 14, 14]          73,856
     BatchNorm2d-184          [-1, 128, 14, 14]             256
            ReLU-185          [-1, 128, 14, 14]               0
          Conv2d-186           [-1, 32, 14, 14]          36,896
     BatchNorm2d-187          [-1, 608, 14, 14]           1,216
            ReLU-188          [-1, 608, 14, 14]               0
          Conv2d-189          [-1, 128, 14, 14]          77,952
     BatchNorm2d-190          [-1, 128, 14, 14]             256
            ReLU-191          [-1, 128, 14, 14]               0
          Conv2d-192           [-1, 32, 14, 14]          36,896
     BatchNorm2d-193          [-1, 640, 14, 14]           1,280
            ReLU-194          [-1, 640, 14, 14]               0
          Conv2d-195          [-1, 128, 14, 14]          82,048
     BatchNorm2d-196          [-1, 128, 14, 14]             256
            ReLU-197          [-1, 128, 14, 14]               0
          Conv2d-198           [-1, 32, 14, 14]          36,896
     BatchNorm2d-199          [-1, 672, 14, 14]           1,344
            ReLU-200          [-1, 672, 14, 14]               0
          Conv2d-201          [-1, 128, 14, 14]          86,144
     BatchNorm2d-202          [-1, 128, 14, 14]             256
            ReLU-203          [-1, 128, 14, 14]               0
          Conv2d-204           [-1, 32, 14, 14]          36,896
     BatchNorm2d-205          [-1, 704, 14, 14]           1,408
            ReLU-206          [-1, 704, 14, 14]               0
          Conv2d-207          [-1, 128, 14, 14]          90,240
     BatchNorm2d-208          [-1, 128, 14, 14]             256
            ReLU-209          [-1, 128, 14, 14]               0
          Conv2d-210           [-1, 32, 14, 14]          36,896
     BatchNorm2d-211          [-1, 736, 14, 14]           1,472
            ReLU-212          [-1, 736, 14, 14]               0
          Conv2d-213          [-1, 128, 14, 14]          94,336
     BatchNorm2d-214          [-1, 128, 14, 14]             256
            ReLU-215          [-1, 128, 14, 14]               0
          Conv2d-216           [-1, 32, 14, 14]          36,896
     BatchNorm2d-217          [-1, 768, 14, 14]           1,536
            ReLU-218          [-1, 768, 14, 14]               0
          Conv2d-219          [-1, 128, 14, 14]          98,432
     BatchNorm2d-220          [-1, 128, 14, 14]             256
            ReLU-221          [-1, 128, 14, 14]               0
          Conv2d-222           [-1, 32, 14, 14]          36,896
     BatchNorm2d-223          [-1, 800, 14, 14]           1,600
            ReLU-224          [-1, 800, 14, 14]               0
          Conv2d-225          [-1, 128, 14, 14]         102,528
     BatchNorm2d-226          [-1, 128, 14, 14]             256
            ReLU-227          [-1, 128, 14, 14]               0
          Conv2d-228           [-1, 32, 14, 14]          36,896
     BatchNorm2d-229          [-1, 832, 14, 14]           1,664
            ReLU-230          [-1, 832, 14, 14]               0
          Conv2d-231          [-1, 128, 14, 14]         106,624
     BatchNorm2d-232          [-1, 128, 14, 14]             256
            ReLU-233          [-1, 128, 14, 14]               0
          Conv2d-234           [-1, 32, 14, 14]          36,896
     BatchNorm2d-235          [-1, 864, 14, 14]           1,728
            ReLU-236          [-1, 864, 14, 14]               0
          Conv2d-237          [-1, 128, 14, 14]         110,720
     BatchNorm2d-238          [-1, 128, 14, 14]             256
            ReLU-239          [-1, 128, 14, 14]               0
          Conv2d-240           [-1, 32, 14, 14]          36,896
     BatchNorm2d-241          [-1, 896, 14, 14]           1,792
            ReLU-242          [-1, 896, 14, 14]               0
          Conv2d-243          [-1, 128, 14, 14]         114,816
     BatchNorm2d-244          [-1, 128, 14, 14]             256
            ReLU-245          [-1, 128, 14, 14]               0
          Conv2d-246           [-1, 32, 14, 14]          36,896
     BatchNorm2d-247          [-1, 928, 14, 14]           1,856
            ReLU-248          [-1, 928, 14, 14]               0
          Conv2d-249          [-1, 128, 14, 14]         118,912
     BatchNorm2d-250          [-1, 128, 14, 14]             256
            ReLU-251          [-1, 128, 14, 14]               0
          Conv2d-252           [-1, 32, 14, 14]          36,896
     BatchNorm2d-253          [-1, 960, 14, 14]           1,920
            ReLU-254          [-1, 960, 14, 14]               0
          Conv2d-255          [-1, 128, 14, 14]         123,008
     BatchNorm2d-256          [-1, 128, 14, 14]             256
            ReLU-257          [-1, 128, 14, 14]               0
          Conv2d-258           [-1, 32, 14, 14]          36,896
     BatchNorm2d-259          [-1, 992, 14, 14]           1,984
            ReLU-260          [-1, 992, 14, 14]               0
          Conv2d-261          [-1, 128, 14, 14]         127,104
     BatchNorm2d-262          [-1, 128, 14, 14]             256
            ReLU-263          [-1, 128, 14, 14]               0
          Conv2d-264           [-1, 32, 14, 14]          36,896
     BatchNorm2d-265         [-1, 1024, 14, 14]           2,048
            ReLU-266         [-1, 1024, 14, 14]               0
          Conv2d-267          [-1, 512, 14, 14]         524,800
       AvgPool2d-268            [-1, 512, 7, 7]               0
     BatchNorm2d-269            [-1, 512, 7, 7]           1,024
            ReLU-270            [-1, 512, 7, 7]               0
          Conv2d-271            [-1, 128, 7, 7]          65,664
     BatchNorm2d-272            [-1, 128, 7, 7]             256
            ReLU-273            [-1, 128, 7, 7]               0
          Conv2d-274             [-1, 32, 7, 7]          36,896
     BatchNorm2d-275            [-1, 544, 7, 7]           1,088
            ReLU-276            [-1, 544, 7, 7]               0
          Conv2d-277            [-1, 128, 7, 7]          69,760
     BatchNorm2d-278            [-1, 128, 7, 7]             256
            ReLU-279            [-1, 128, 7, 7]               0
          Conv2d-280             [-1, 32, 7, 7]          36,896
     BatchNorm2d-281            [-1, 576, 7, 7]           1,152
            ReLU-282            [-1, 576, 7, 7]               0
          Conv2d-283            [-1, 128, 7, 7]          73,856
     BatchNorm2d-284            [-1, 128, 7, 7]             256
            ReLU-285            [-1, 128, 7, 7]               0
          Conv2d-286             [-1, 32, 7, 7]          36,896
     BatchNorm2d-287            [-1, 608, 7, 7]           1,216
            ReLU-288            [-1, 608, 7, 7]               0
          Conv2d-289            [-1, 128, 7, 7]          77,952
     BatchNorm2d-290            [-1, 128, 7, 7]             256
            ReLU-291            [-1, 128, 7, 7]               0
          Conv2d-292             [-1, 32, 7, 7]          36,896
     BatchNorm2d-293            [-1, 640, 7, 7]           1,280
            ReLU-294            [-1, 640, 7, 7]               0
          Conv2d-295            [-1, 128, 7, 7]          82,048
     BatchNorm2d-296            [-1, 128, 7, 7]             256
            ReLU-297            [-1, 128, 7, 7]               0
          Conv2d-298             [-1, 32, 7, 7]          36,896
     BatchNorm2d-299            [-1, 672, 7, 7]           1,344
            ReLU-300            [-1, 672, 7, 7]               0
          Conv2d-301            [-1, 128, 7, 7]          86,144
     BatchNorm2d-302            [-1, 128, 7, 7]             256
            ReLU-303            [-1, 128, 7, 7]               0
          Conv2d-304             [-1, 32, 7, 7]          36,896
     BatchNorm2d-305            [-1, 704, 7, 7]           1,408
            ReLU-306            [-1, 704, 7, 7]               0
          Conv2d-307            [-1, 128, 7, 7]          90,240
     BatchNorm2d-308            [-1, 128, 7, 7]             256
            ReLU-309            [-1, 128, 7, 7]               0
          Conv2d-310             [-1, 32, 7, 7]          36,896
     BatchNorm2d-311            [-1, 736, 7, 7]           1,472
            ReLU-312            [-1, 736, 7, 7]               0
          Conv2d-313            [-1, 128, 7, 7]          94,336
     BatchNorm2d-314            [-1, 128, 7, 7]             256
            ReLU-315            [-1, 128, 7, 7]               0
          Conv2d-316             [-1, 32, 7, 7]          36,896
     BatchNorm2d-317            [-1, 768, 7, 7]           1,536
            ReLU-318            [-1, 768, 7, 7]               0
          Conv2d-319            [-1, 128, 7, 7]          98,432
     BatchNorm2d-320            [-1, 128, 7, 7]             256
            ReLU-321            [-1, 128, 7, 7]               0
          Conv2d-322             [-1, 32, 7, 7]          36,896
     BatchNorm2d-323            [-1, 800, 7, 7]           1,600
            ReLU-324            [-1, 800, 7, 7]               0
          Conv2d-325            [-1, 128, 7, 7]         102,528
     BatchNorm2d-326            [-1, 128, 7, 7]             256
            ReLU-327            [-1, 128, 7, 7]               0
          Conv2d-328             [-1, 32, 7, 7]          36,896
     BatchNorm2d-329            [-1, 832, 7, 7]           1,664
            ReLU-330            [-1, 832, 7, 7]               0
          Conv2d-331            [-1, 128, 7, 7]         106,624
     BatchNorm2d-332            [-1, 128, 7, 7]             256
            ReLU-333            [-1, 128, 7, 7]               0
          Conv2d-334             [-1, 32, 7, 7]          36,896
     BatchNorm2d-335            [-1, 864, 7, 7]           1,728
            ReLU-336            [-1, 864, 7, 7]               0
          Conv2d-337            [-1, 128, 7, 7]         110,720
     BatchNorm2d-338            [-1, 128, 7, 7]             256
            ReLU-339            [-1, 128, 7, 7]               0
          Conv2d-340             [-1, 32, 7, 7]          36,896
     BatchNorm2d-341            [-1, 896, 7, 7]           1,792
            ReLU-342            [-1, 896, 7, 7]               0
          Conv2d-343            [-1, 128, 7, 7]         114,816
     BatchNorm2d-344            [-1, 128, 7, 7]             256
            ReLU-345            [-1, 128, 7, 7]               0
          Conv2d-346             [-1, 32, 7, 7]          36,896
     BatchNorm2d-347            [-1, 928, 7, 7]           1,856
            ReLU-348            [-1, 928, 7, 7]               0
          Conv2d-349            [-1, 128, 7, 7]         118,912
     BatchNorm2d-350            [-1, 128, 7, 7]             256
            ReLU-351            [-1, 128, 7, 7]               0
          Conv2d-352             [-1, 32, 7, 7]          36,896
     BatchNorm2d-353            [-1, 960, 7, 7]           1,920
            ReLU-354            [-1, 960, 7, 7]               0
          Conv2d-355            [-1, 128, 7, 7]         123,008
     BatchNorm2d-356            [-1, 128, 7, 7]             256
            ReLU-357            [-1, 128, 7, 7]               0
          Conv2d-358             [-1, 32, 7, 7]          36,896
     BatchNorm2d-359            [-1, 992, 7, 7]           1,984
            ReLU-360            [-1, 992, 7, 7]               0
          Conv2d-361            [-1, 128, 7, 7]         127,104
     BatchNorm2d-362            [-1, 128, 7, 7]             256
            ReLU-363            [-1, 128, 7, 7]               0
          Conv2d-364             [-1, 32, 7, 7]          36,896
     BatchNorm2d-365           [-1, 1024, 7, 7]           2,048
            ReLU-366           [-1, 1024, 7, 7]               0
          Linear-367                    [-1, 2]           2,050
================================================================
Total params: 6,966,146
Trainable params: 6,966,146
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 0.57
Forward/backward pass size (MB): 294.57
Params size (MB): 26.57
Estimated Total Size (MB): 321.72
----------------------------------------------------------------

III. Training the Model

1. Write the training function

def train(dataloader,model,optimizer,loss_fn):
    size = len(dataloader.dataset)  # size of the training set
    num_batches = len(dataloader)   # number of batches

    train_acc,train_loss = 0,0

    for X,y in dataloader:
        X,y = X.to(device),y.to(device)

        # forward pass and loss computation
        pred = model(X)
        loss = loss_fn(pred,y)

        # backpropagation and parameter update
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        # accumulate running loss and accuracy
        train_loss += loss.item()
        train_acc += (pred.argmax(1) == y).type(torch.float).sum().item()

    train_loss /= num_batches
    train_acc /= size

    return train_acc,train_loss

2. Write the test function

The test function closely mirrors the training function, but since no gradient step updates the network weights, it does not need an optimizer.

def test(dataloader, model, loss_fn):
    size = len(dataloader.dataset)  # size of the test set
    num_batches = len(dataloader)   # number of batches (size / batch_size, rounded up)
    test_loss, test_acc = 0, 0

    # Disable gradient tracking during evaluation to save memory and compute
    with torch.no_grad():
        for imgs, target in dataloader:
            imgs, target = imgs.to(device), target.to(device)

            # compute the loss
            target_pred = model(imgs)
            loss = loss_fn(target_pred, target)

            test_loss += loss.item()
            test_acc += (target_pred.argmax(1) == target).type(torch.float).sum().item()

    test_acc /= size
    test_loss /= num_batches

    return test_acc, test_loss
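
Note that test() assumes the caller has already switched the network into evaluation mode (so that BatchNorm uses its running statistics and Dropout is disabled), which the training loop below does via model.eval(). A standalone call would look like:

model.eval()
acc, loss = test(test_dl, model, nn.CrossEntropyLoss())
print(f'acc={acc:.3f}, loss={loss:.3f}')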

3. Run the training

import copy

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()  # create the loss function

epochs = 10

train_loss=[]
train_acc=[]
test_loss=[]
test_acc=[]
best_acc = 0

for epoch in range(epochs):

    model.train()
    epoch_train_acc,epoch_train_loss = train(train_dl,model,optimizer,loss_fn)

    model.eval()
    epoch_test_acc,epoch_test_loss = test(test_dl,model,loss_fn)

    if epoch_test_acc > best_acc:
        best_acc = epoch_test_acc
        best_model = copy.deepcopy(model)

    train_acc.append(epoch_train_acc)
    train_loss.append(epoch_train_loss)
    test_acc.append(epoch_test_acc)
    test_loss.append(epoch_test_loss)

    lr = optimizer.state_dict()['param_groups'][0]['lr']

    template = ('Epoch:{:2d}, Train_acc:{:.1f}%, Train_loss:{:.3f}, Test_acc:{:.1f}%, Test_loss:{:.3f}, Lr:{:.2E}')
    print(template.format(epoch+1, epoch_train_acc*100, epoch_train_loss,
                          epoch_test_acc*100, epoch_test_loss, lr))

# Save the best model to disk
PATH = './best_model.pth'  # filename for the saved weights
torch.save(best_model.state_dict(), PATH)

print('Done')

The training produces:

Epoch: 1, Train_acc:84.3%, Train_loss:0.359, Test_acc:86.7%, Test_loss:0.317, Lr:1.00E-04
Epoch: 2, Train_acc:87.6%, Train_loss:0.292, Test_acc:89.0%, Test_loss:0.270, Lr:1.00E-04
Epoch: 3, Train_acc:89.2%, Train_loss:0.260, Test_acc:89.8%, Test_loss:0.264, Lr:1.00E-04
Epoch: 4, Train_acc:90.2%, Train_loss:0.239, Test_acc:89.7%, Test_loss:0.259, Lr:1.00E-04
Epoch: 5, Train_acc:91.0%, Train_loss:0.222, Test_acc:90.3%, Test_loss:0.228, Lr:1.00E-04
Epoch: 6, Train_acc:91.1%, Train_loss:0.218, Test_acc:90.9%, Test_loss:0.236, Lr:1.00E-04
Epoch: 7, Train_acc:91.7%, Train_loss:0.201, Test_acc:82.4%, Test_loss:0.462, Lr:1.00E-04
Epoch: 8, Train_acc:92.5%, Train_loss:0.184, Test_acc:90.2%, Test_loss:0.264, Lr:1.00E-04
Epoch: 9, Train_acc:93.3%, Train_loss:0.172, Test_acc:90.2%, Test_loss:0.272, Lr:1.00E-04
Epoch:10, Train_acc:93.2%, Train_loss:0.171, Test_acc:90.7%, Test_loss:0.229, Lr:1.00E-04
Done

Process finished with exit code 0
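
To reuse the best checkpoint later for inference, the saved state dict can be loaded back into a freshly constructed network; a minimal sketch, assuming the class definitions above are in scope:

inference_model = DenseNet(init_channel=64, growth_rate=32,
                           block_config=(6, 12, 24, 16),
                           num_classes=len(classNames)).to(device)
inference_model.load_state_dict(torch.load('./best_model.pth', map_location=device))
inference_model.eval()  # required so BatchNorm/Dropout behave correctly at inference time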

IV. Visualizing the Results

1. Loss & Accuracy

import matplotlib.pyplot as plt
import warnings

warnings.filterwarnings("ignore")               # suppress warning messages
plt.rcParams['font.sans-serif']    = ['SimHei'] # render CJK axis labels correctly
plt.rcParams['axes.unicode_minus'] = False      # render the minus sign correctly
plt.rcParams['figure.dpi']         = 100        # figure resolution

epochs_range = range(epochs)

plt.figure(figsize=(12, 3))
plt.subplot(1, 2, 1)

plt.plot(epochs_range, train_acc, label='Training Accuracy')
plt.plot(epochs_range, test_acc, label='Test Accuracy')
plt.legend(loc='lower right')
plt.title('Training and Validation Accuracy')

plt.subplot(1, 2, 2)
plt.plot(epochs_range, train_loss, label='Training Loss')
plt.plot(epochs_range, test_loss, label='Test Loss')
plt.legend(loc='upper right')
plt.title('Training and Validation Loss')
plt.show()

The resulting plots (figure omitted) show Training and Validation Accuracy on the left and Training and Validation Loss on the right.

V. Personal Notes

This is a hands-on post; the code details and the reasoning behind the network were covered in earlier posts in this series, so they are not repeated here.
