G1 - Generative Adversarial Networks (GAN)

  • 🍨 This post is a learning-record blog entry for the 🔗 365-day deep learning training camp
  • 🍖 Original author: K同学啊

Contents

  • Theory
    • Generator
    • Discriminator
    • Basic Principle
  • Environment
  • Steps
    • Environment Setup
    • Data Preparation
    • Model Design
    • Model Training
    • Model Results
  • Summary and Reflections


Theory

Generative Adversarial Networks (GAN) do not refer to one specific neural network, but to a family of neural networks designed around the idea of a game between two players.

A GAN usually consists of two parts: a generator (Generator) and a discriminator (Discriminator). The generator takes random samples drawn from some noise distribution as input and outputs artificial samples that closely resemble the real samples in the training set. The discriminator takes either a real sample or an artificial sample as input, and its goal is to tell the artificial samples apart from the real ones as well as possible.

Ideally, after enough rounds of this game, the discriminator can no longer tell whether a sample is real. At that point the generator's outputs are realistic enough to fool the discriminator, and the game can stop.
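This two-player game is usually written as the minimax objective from Goodfellow et al. (2014):

```latex
\min_G \max_D V(D, G) =
  \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}\left[\log D(x)\right]
  + \mathbb{E}_{z \sim p_z(z)}\left[\log\bigl(1 - D(G(z))\bigr)\right]
```

The discriminator D tries to maximize V while the generator G tries to minimize it; at the theoretical equilibrium, D(x) = 1/2 everywhere.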

Generator

In a GAN, the generator G takes random noise z as input and, through continual fitting, eventually outputs a forged sample G(z) with the same dimensions and a similar distribution as the real samples. The generator is essentially a generative model: it learns the assumed distribution of the data and its parameters, and then samples new examples from the learned model.

From a data perspective, a generative method first makes a distributional assumption about the explicit or latent variables of the real data; it then feeds the real data into the model to train those variables and parameters; finally it obtains a learned approximate distribution, which can be used to generate new data.

From a machine learning perspective, the model makes no explicit distributional assumption. Instead, it keeps learning from real data and correcting itself, and in the end it likewise yields a learned model that can generate samples. Unlike the analytical approach, this learning process is less transparent to humans.

Discriminator

In a GAN, the discriminator D takes a sample x as input and outputs a probability D(x) in [0, 1]. The input x may be a real sample from the original dataset or an artificial sample G(z) from the generator G. By convention, the closer D(x) is to 1, the more likely the sample is real; the smaller the value, the more likely the sample is a forgery. In other words, the discriminator is a binary neural-network classifier whose job is not to predict the original class of the input, but to distinguish real from fake. Note that neither the generator nor the discriminator uses class labels anywhere, which shows that GAN training is an unsupervised process.
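To make this concrete, here is a minimal NumPy sketch (an illustrative assumption, not code from this post) of the binary cross-entropy objective the discriminator minimizes, with label 1 for real samples and 0 for fakes:

```python
import numpy as np

def bce(pred, label, eps=1e-12):
    # binary cross-entropy: -[y*log(p) + (1-y)*log(1-p)], averaged over the batch
    pred = np.clip(pred, eps, 1 - eps)
    return float(-np.mean(label * np.log(pred) + (1 - label) * np.log(1 - pred)))

# a confident, correct discriminator gets a low loss...
low = bce(np.array([0.9, 0.1]), np.array([1.0, 0.0]))
# ...while an uncertain one (outputs near 0.5) gets a higher loss
high = bce(np.array([0.5, 0.5]), np.array([1.0, 0.0]))
print(low < high)  # True
```

This is exactly what `nn.BCELoss()` computes for the discriminator later in this post.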

Basic Principle

GAN is a product of combining game theory with machine learning; it was introduced in Ian Goodfellow's 2014 paper.
(Figure: GAN model architecture diagram)

Environment

Python: 3.11
PyTorch: 2.3.0+cu121
GPU: GTX3070

Steps

Environment Setup

First, set the data directory:

PARENT_DIR = 'GAN01/'

Then import the packages needed for this post:

import torch.nn as nn
import torch
import numpy as np
import os
from torch.autograd import Variable
from torch.utils.data import DataLoader
from torchvision.utils import save_image
from torchvision import transforms, datasets
import torch.optim as optim

Create the folders we will use:

os.makedirs(PARENT_DIR + 'images/', exist_ok=True)   # generated images
os.makedirs(PARENT_DIR + 'save/', exist_ok=True)     # model checkpoints
os.makedirs(PARENT_DIR + 'datasets', exist_ok=True)  # downloaded dataset

Hyperparameter settings:

n_epochs = 50           # number of training epochs
batch_size = 64         # batch size
lr = 2e-4               # learning rate
b1 = 0.5                # Adam beta1
b2 = 0.999              # Adam beta2
n_cpu = 2               # number of CPU workers for data loading
latent_dim = 100        # dimension of the random noise vector
img_size = 28           # image side length
channels = 1            # number of image channels
sample_intervals = 500  # interval (in batches) for saving generated images

img_shape = (channels, img_size, img_size)  # image shape
img_area = np.prod(img_shape)               # flattened pixel count per image

# global device
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
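As a quick sanity check on the shapes above (a small illustrative snippet, independent of the training code): for 1×28×28 MNIST images, img_area is the flattened length that the linear layers below operate on:

```python
import numpy as np

img_shape = (1, 28, 28)             # (channels, img_size, img_size), as above
img_area = int(np.prod(img_shape))  # pixels per image after flattening
print(img_area)  # 784
```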

Data Preparation

Download the dataset:

mnist = datasets.MNIST(root=PARENT_DIR + 'datasets', train=True, download=True, transform=transforms.Compose([
	transforms.Resize(img_size),
	transforms.ToTensor(),
	transforms.Normalize([0.5], [0.5]),
]))
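Normalize([0.5], [0.5]) computes (x - 0.5) / 0.5 per channel, so ToTensor's [0, 1] pixel range is mapped to [-1, 1] — the same range the generator's Tanh output lives in. A small NumPy check of this mapping (illustrative only, independent of torchvision):

```python
import numpy as np

pixels = np.array([0.0, 0.5, 1.0])  # ToTensor output range endpoints and midpoint
normalized = (pixels - 0.5) / 0.5   # what Normalize([0.5], [0.5]) does per channel
print(normalized)  # [-1.  0.  1.]
```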

Configure the data loader:

dataloader = DataLoader(mnist, batch_size=batch_size, shuffle=True)

Model Design

Discriminator module:

class Discriminator(nn.Module):
	def __init__(self):
		super().__init__()
		# MLP: flattened image (784) -> 512 -> 256 -> 1, sigmoid output in [0, 1]
		self.model = nn.Sequential(
			nn.Linear(img_area, 512),
			nn.LeakyReLU(0.2, inplace=True),
			nn.Linear(512, 256),
			nn.LeakyReLU(0.2, inplace=True),
			nn.Linear(256, 1),
			nn.Sigmoid(),
		)
	def forward(self, img):
		img_flat = img.view(img.size(0), -1)  # flatten to (batch, img_area)
		validity = self.model(img_flat)       # probability that the image is real
		return validity

Generator module:

class Generator(nn.Module):
	def __init__(self):
		super().__init__()
		def block(in_feat, out_feat, normalize=True):
			layers = [nn.Linear(in_feat, out_feat)]
			if normalize:
				# 0.8 is meant as the BatchNorm momentum, so pass it by keyword
				# (positionally it would set eps instead)
				layers.append(nn.BatchNorm1d(out_feat, momentum=0.8))
			layers.append(nn.LeakyReLU(0.2, inplace=True))
			return layers
		# MLP: noise (latent_dim) -> 128 -> 256 -> 512 -> 1024 -> image (img_area)
		self.model = nn.Sequential(
			*block(latent_dim, 128, normalize=False),
			*block(128, 256),
			*block(256, 512),
			*block(512, 1024),
			nn.Linear(1024, img_area),
			nn.Tanh(),  # output in [-1, 1], matching the normalized data
		)
	def forward(self, z):
		imgs = self.model(z)
		imgs = imgs.view(imgs.size(0), *img_shape)  # reshape to (batch, C, H, W)
		return imgs

Model Training

Create the model instances:

# generator
generator = Generator().to(device)
# discriminator
discriminator = Discriminator().to(device)
# loss function
criterion = nn.BCELoss()

optimizer_G = optim.Adam(generator.parameters(), lr=lr, betas=(b1, b2))
optimizer_D = optim.Adam(discriminator.parameters(), lr=lr, betas=(b1, b2))

Training loop:

for epoch in range(n_epochs):
	for i, (imgs, _) in enumerate(dataloader):
		imgs = imgs.view(imgs.size(0), -1)  # flatten images to (batch, img_area)
		real_img = Variable(imgs).to(device)
		real_label = Variable(torch.ones(imgs.size(0), 1)).to(device)
		fake_label = Variable(torch.zeros(imgs.size(0), 1)).to(device)

		# train the discriminator - real samples
		real_out = discriminator(real_img)
		loss_real_D = criterion(real_out, real_label)
		real_scores = real_out

		# train the discriminator - fake samples
		z = Variable(torch.randn(imgs.size(0), latent_dim)).to(device)
		fake_img = generator(z).detach()  # detach so the generator gets no gradient here
		fake_out = discriminator(fake_img)
		loss_fake_D = criterion(fake_out, fake_label)
		fake_scores = fake_out

		# update the discriminator
		loss_D = loss_real_D + loss_fake_D
		optimizer_D.zero_grad()
		loss_D.backward()
		optimizer_D.step()

		# update the generator: try to make D label fakes as real
		z = Variable(torch.randn(imgs.size(0), latent_dim)).to(device)
		fake_img = generator(z)
		output = discriminator(fake_img)
		loss_G = criterion(output, real_label)
		optimizer_G.zero_grad()
		loss_G.backward()
		optimizer_G.step()

		# logging
		if (i+1) % 300 == 0:
			print('[Epoch %d/%d] [Batch %d/%d] [D loss: %f] [G loss: %f] [D real: %f] [D fake: %f]'
				% (epoch, n_epochs, i, len(dataloader), loss_D.item(), loss_G.item(),
				   real_scores.data.mean(), fake_scores.data.mean()))

		# save a grid of generated images every sample_intervals batches
		batches_done = epoch * len(dataloader) + i
		if batches_done % sample_intervals == 0:
			save_image(fake_img.data[:25], (PARENT_DIR + 'images/%d.png') % batches_done, nrow=5, normalize=True)
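Note that the generator step above feeds fake images with the real labels into the BCE loss. This implements the standard non-saturating generator loss: instead of minimizing log(1 - D(G(z))), which gives weak gradients early in training when D easily rejects fakes, the generator maximizes log D(G(z)):

```latex
L_G = -\,\mathbb{E}_{z \sim p_z(z)}\left[\log D\bigl(G(z)\bigr)\right]
```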

Training log:

[Epoch 0/50] [Batch 299/938] [D loss: 1.420514] [G loss: 1.961581] [D real: 0.811836] [D fake: 0.694541]
[Epoch 0/50] [Batch 599/938] [D loss: 0.922259] [G loss: 1.839481] [D real: 0.734683] [D fake: 0.444037]
[Epoch 0/50] [Batch 899/938] [D loss: 0.883128] [G loss: 1.256595] [D real: 0.541903] [D fake: 0.187425]
[Epoch 1/50] [Batch 299/938] [D loss: 0.952949] [G loss: 0.963832] [D real: 0.596297] [D fake: 0.311674]
[Epoch 1/50] [Batch 599/938] [D loss: 0.950359] [G loss: 0.834425] [D real: 0.543845] [D fake: 0.203204]
[Epoch 1/50] [Batch 899/938] [D loss: 0.973158] [G loss: 1.313089] [D real: 0.631495] [D fake: 0.311304]
[Epoch 2/50] [Batch 299/938] [D loss: 0.812588] [G loss: 1.251890] [D real: 0.721250] [D fake: 0.340005]
[Epoch 2/50] [Batch 599/938] [D loss: 0.804412] [G loss: 1.442456] [D real: 0.651448] [D fake: 0.206814]
[Epoch 2/50] [Batch 899/938] [D loss: 0.796317] [G loss: 1.303452] [D real: 0.636756] [D fake: 0.235744]
[Epoch 3/50] [Batch 299/938] [D loss: 0.818155] [G loss: 1.293481] [D real: 0.613244] [D fake: 0.196964]
[Epoch 3/50] [Batch 599/938] [D loss: 0.929434] [G loss: 1.275021] [D real: 0.659611] [D fake: 0.259689]
[Epoch 3/50] [Batch 899/938] [D loss: 0.712755] [G loss: 2.305767] [D real: 0.800935] [D fake: 0.339025]
[Epoch 4/50] [Batch 299/938] [D loss: 0.740710] [G loss: 2.199127] [D real: 0.808014] [D fake: 0.370125]
[Epoch 4/50] [Batch 599/938] [D loss: 0.796852] [G loss: 2.494107] [D real: 0.848230] [D fake: 0.427257]
[Epoch 4/50] [Batch 899/938] [D loss: 0.801556] [G loss: 1.366514] [D real: 0.619396] [D fake: 0.125212]
[Epoch 5/50] [Batch 299/938] [D loss: 0.866250] [G loss: 2.395396] [D real: 0.806042] [D fake: 0.434844]
[Epoch 5/50] [Batch 599/938] [D loss: 0.802661] [G loss: 1.157616] [D real: 0.661669] [D fake: 0.212725]
[Epoch 5/50] [Batch 899/938] [D loss: 0.886610] [G loss: 1.533640] [D real: 0.700454] [D fake: 0.274916]
[Epoch 6/50] [Batch 299/938] [D loss: 0.677418] [G loss: 2.137760] [D real: 0.714654] [D fake: 0.156297]
[Epoch 6/50] [Batch 599/938] [D loss: 0.852677] [G loss: 1.679850] [D real: 0.712336] [D fake: 0.336238]
[Epoch 6/50] [Batch 899/938] [D loss: 0.894991] [G loss: 1.345476] [D real: 0.609528] [D fake: 0.158049]
[Epoch 7/50] [Batch 299/938] [D loss: 0.749311] [G loss: 2.185987] [D real: 0.786740] [D fake: 0.332746]
[Epoch 7/50] [Batch 599/938] [D loss: 0.823957] [G loss: 2.364408] [D real: 0.828811] [D fake: 0.423014]
[Epoch 7/50] [Batch 899/938] [D loss: 0.811460] [G loss: 1.441192] [D real: 0.611505] [D fake: 0.110525]
[Epoch 8/50] [Batch 299/938] [D loss: 0.653301] [G loss: 1.886070] [D real: 0.764065] [D fake: 0.245890]
[Epoch 8/50] [Batch 599/938] [D loss: 0.843600] [G loss: 1.917097] [D real: 0.792145] [D fake: 0.408509]
[Epoch 8/50] [Batch 899/938] [D loss: 0.798109] [G loss: 1.314119] [D real: 0.653977] [D fake: 0.185030]
[Epoch 9/50] [Batch 299/938] [D loss: 0.947685] [G loss: 3.152684] [D real: 0.910022] [D fake: 0.502504]
[Epoch 9/50] [Batch 599/938] [D loss: 0.959668] [G loss: 0.570251] [D real: 0.570070] [D fake: 0.106891]
[Epoch 9/50] [Batch 899/938] [D loss: 0.856521] [G loss: 1.218792] [D real: 0.566056] [D fake: 0.080608]
[Epoch 10/50] [Batch 299/938] [D loss: 0.935204] [G loss: 2.985981] [D real: 0.830788] [D fake: 0.465305]
[Epoch 10/50] [Batch 599/938] [D loss: 0.692477] [G loss: 1.852279] [D real: 0.835356] [D fake: 0.337193]
[Epoch 10/50] [Batch 899/938] [D loss: 0.763710] [G loss: 1.751910] [D real: 0.770129] [D fake: 0.313941]
[Epoch 11/50] [Batch 299/938] [D loss: 0.703495] [G loss: 1.861948] [D real: 0.808757] [D fake: 0.338974]
[Epoch 11/50] [Batch 599/938] [D loss: 0.815235] [G loss: 2.208552] [D real: 0.757724] [D fake: 0.324712]
[Epoch 11/50] [Batch 899/938] [D loss: 0.997158] [G loss: 2.022480] [D real: 0.701744] [D fake: 0.380837]
[Epoch 12/50] [Batch 299/938] [D loss: 0.759668] [G loss: 1.911369] [D real: 0.777231] [D fake: 0.329774]
[Epoch 12/50] [Batch 599/938] [D loss: 0.845963] [G loss: 2.053480] [D real: 0.846165] [D fake: 0.441215]
[Epoch 12/50] [Batch 899/938] [D loss: 1.091019] [G loss: 1.313121] [D real: 0.482313] [D fake: 0.064774]
[Epoch 13/50] [Batch 299/938] [D loss: 0.860023] [G loss: 1.465194] [D real: 0.635496] [D fake: 0.124226]
[Epoch 13/50] [Batch 599/938] [D loss: 0.756671] [G loss: 1.716278] [D real: 0.674119] [D fake: 0.216907]
[Epoch 13/50] [Batch 899/938] [D loss: 0.716931] [G loss: 1.802271] [D real: 0.683680] [D fake: 0.195853]
[Epoch 14/50] [Batch 299/938] [D loss: 1.083009] [G loss: 1.358789] [D real: 0.642891] [D fake: 0.376322]
[Epoch 14/50] [Batch 599/938] [D loss: 1.075695] [G loss: 0.908844] [D real: 0.521514] [D fake: 0.123268]
[Epoch 14/50] [Batch 899/938] [D loss: 0.943146] [G loss: 1.356610] [D real: 0.595750] [D fake: 0.180492]
[Epoch 15/50] [Batch 299/938] [D loss: 0.929019] [G loss: 0.617656] [D real: 0.552842] [D fake: 0.151570]
[Epoch 15/50] [Batch 599/938] [D loss: 1.052583] [G loss: 2.127165] [D real: 0.853073] [D fake: 0.523554]
[Epoch 15/50] [Batch 899/938] [D loss: 1.021363] [G loss: 0.625215] [D real: 0.529443] [D fake: 0.186696]
[Epoch 16/50] [Batch 299/938] [D loss: 0.929158] [G loss: 2.104063] [D real: 0.770136] [D fake: 0.399831]
[Epoch 16/50] [Batch 599/938] [D loss: 0.832833] [G loss: 1.665707] [D real: 0.736168] [D fake: 0.343671]
[Epoch 16/50] [Batch 899/938] [D loss: 0.730055] [G loss: 1.724510] [D real: 0.755085] [D fake: 0.289238]
[Epoch 17/50] [Batch 299/938] [D loss: 0.677890] [G loss: 1.755648] [D real: 0.779917] [D fake: 0.276746]
[Epoch 17/50] [Batch 599/938] [D loss: 0.920615] [G loss: 1.416380] [D real: 0.681394] [D fake: 0.310024]
[Epoch 17/50] [Batch 899/938] [D loss: 0.937411] [G loss: 2.415898] [D real: 0.789968] [D fake: 0.450372]
[Epoch 18/50] [Batch 299/938] [D loss: 0.841531] [G loss: 1.211814] [D real: 0.672196] [D fake: 0.268470]
[Epoch 18/50] [Batch 599/938] [D loss: 0.806454] [G loss: 1.246511] [D real: 0.657565] [D fake: 0.237899]
[Epoch 18/50] [Batch 899/938] [D loss: 0.965483] [G loss: 1.558758] [D real: 0.590535] [D fake: 0.202962]
[Epoch 19/50] [Batch 299/938] [D loss: 0.941242] [G loss: 1.201063] [D real: 0.580414] [D fake: 0.159809]
[Epoch 19/50] [Batch 599/938] [D loss: 0.763269] [G loss: 1.117927] [D real: 0.687018] [D fake: 0.217924]
[Epoch 19/50] [Batch 899/938] [D loss: 1.208787] [G loss: 0.476900] [D real: 0.450625] [D fake: 0.153400]
[Epoch 20/50] [Batch 299/938] [D loss: 0.938517] [G loss: 1.020504] [D real: 0.583086] [D fake: 0.211353]
[Epoch 20/50] [Batch 599/938] [D loss: 0.814142] [G loss: 1.717330] [D real: 0.767125] [D fake: 0.357556]
[Epoch 20/50] [Batch 899/938] [D loss: 0.914405] [G loss: 1.084474] [D real: 0.614264] [D fake: 0.197619]
[Epoch 21/50] [Batch 299/938] [D loss: 0.911557] [G loss: 1.857509] [D real: 0.690771] [D fake: 0.324216]
[Epoch 21/50] [Batch 599/938] [D loss: 0.846429] [G loss: 1.522789] [D real: 0.756585] [D fake: 0.380105]
[Epoch 21/50] [Batch 899/938] [D loss: 0.903101] [G loss: 1.311370] [D real: 0.641948] [D fake: 0.270648]
[Epoch 22/50] [Batch 299/938] [D loss: 1.136434] [G loss: 1.967754] [D real: 0.829407] [D fake: 0.539150]
[Epoch 22/50] [Batch 599/938] [D loss: 0.761561] [G loss: 1.451730] [D real: 0.719943] [D fake: 0.253529]
[Epoch 22/50] [Batch 899/938] [D loss: 0.947273] [G loss: 1.578539] [D real: 0.757281] [D fake: 0.402005]
[Epoch 23/50] [Batch 299/938] [D loss: 0.984664] [G loss: 1.381901] [D real: 0.676672] [D fake: 0.345036]
[Epoch 23/50] [Batch 599/938] [D loss: 1.056997] [G loss: 1.273649] [D real: 0.645240] [D fake: 0.341262]
[Epoch 23/50] [Batch 899/938] [D loss: 0.846916] [G loss: 1.618449] [D real: 0.673545] [D fake: 0.255247]
[Epoch 24/50] [Batch 299/938] [D loss: 1.020407] [G loss: 2.467137] [D real: 0.789029] [D fake: 0.483512]
[Epoch 24/50] [Batch 599/938] [D loss: 1.039248] [G loss: 1.711153] [D real: 0.794231] [D fake: 0.498774]
[Epoch 24/50] [Batch 899/938] [D loss: 0.891359] [G loss: 1.549422] [D real: 0.648600] [D fake: 0.242511]
[Epoch 25/50] [Batch 299/938] [D loss: 0.828505] [G loss: 1.678849] [D real: 0.726778] [D fake: 0.317394]
[Epoch 25/50] [Batch 599/938] [D loss: 0.835318] [G loss: 1.619812] [D real: 0.776715] [D fake: 0.385841]
[Epoch 25/50] [Batch 899/938] [D loss: 0.903816] [G loss: 2.057058] [D real: 0.759536] [D fake: 0.398490]
[Epoch 26/50] [Batch 299/938] [D loss: 0.963138] [G loss: 2.443241] [D real: 0.829611] [D fake: 0.456530]
[Epoch 26/50] [Batch 599/938] [D loss: 1.219956] [G loss: 0.801282] [D real: 0.441290] [D fake: 0.112515]
[Epoch 26/50] [Batch 899/938] [D loss: 1.282843] [G loss: 0.742314] [D real: 0.440508] [D fake: 0.091521]
[Epoch 27/50] [Batch 299/938] [D loss: 1.044027] [G loss: 1.633780] [D real: 0.730091] [D fake: 0.396968]
[Epoch 27/50] [Batch 599/938] [D loss: 1.039986] [G loss: 1.568461] [D real: 0.674084] [D fake: 0.377297]
[Epoch 27/50] [Batch 899/938] [D loss: 0.949207] [G loss: 1.193219] [D real: 0.626887] [D fake: 0.216131]
[Epoch 28/50] [Batch 299/938] [D loss: 0.813487] [G loss: 1.266051] [D real: 0.645924] [D fake: 0.218628]
[Epoch 28/50] [Batch 599/938] [D loss: 0.849271] [G loss: 1.476346] [D real: 0.680232] [D fake: 0.284335]
[Epoch 28/50] [Batch 899/938] [D loss: 0.831895] [G loss: 1.817335] [D real: 0.720941] [D fake: 0.326133]
[Epoch 29/50] [Batch 299/938] [D loss: 0.772127] [G loss: 1.586798] [D real: 0.751645] [D fake: 0.311902]
[Epoch 29/50] [Batch 599/938] [D loss: 0.862494] [G loss: 1.736295] [D real: 0.752732] [D fake: 0.362572]
[Epoch 29/50] [Batch 899/938] [D loss: 0.880609] [G loss: 1.480912] [D real: 0.700861] [D fake: 0.325915]
[Epoch 30/50] [Batch 299/938] [D loss: 0.933771] [G loss: 1.715545] [D real: 0.721025] [D fake: 0.349197]
[Epoch 30/50] [Batch 599/938] [D loss: 0.795781] [G loss: 1.756403] [D real: 0.724120] [D fake: 0.293765]
[Epoch 30/50] [Batch 899/938] [D loss: 0.896606] [G loss: 1.373884] [D real: 0.632874] [D fake: 0.245156]
[Epoch 31/50] [Batch 299/938] [D loss: 0.852626] [G loss: 1.650979] [D real: 0.711945] [D fake: 0.285480]
[Epoch 31/50] [Batch 599/938] [D loss: 0.742924] [G loss: 1.555502] [D real: 0.769613] [D fake: 0.325262]
[Epoch 31/50] [Batch 899/938] [D loss: 0.762007] [G loss: 1.213594] [D real: 0.643857] [D fake: 0.145954]
[Epoch 32/50] [Batch 299/938] [D loss: 0.993882] [G loss: 1.768500] [D real: 0.755884] [D fake: 0.419389]
[Epoch 32/50] [Batch 599/938] [D loss: 0.848629] [G loss: 1.113061] [D real: 0.616361] [D fake: 0.175378]
[Epoch 32/50] [Batch 899/938] [D loss: 0.698725] [G loss: 1.573485] [D real: 0.740353] [D fake: 0.234537]
[Epoch 33/50] [Batch 299/938] [D loss: 0.755047] [G loss: 1.388250] [D real: 0.708191] [D fake: 0.241282]
[Epoch 33/50] [Batch 599/938] [D loss: 0.990773] [G loss: 2.253703] [D real: 0.805822] [D fake: 0.437487]
[Epoch 33/50] [Batch 899/938] [D loss: 0.830166] [G loss: 1.082243] [D real: 0.652710] [D fake: 0.227879]
[Epoch 34/50] [Batch 299/938] [D loss: 1.024394] [G loss: 1.954563] [D real: 0.836840] [D fake: 0.475636]
[Epoch 34/50] [Batch 599/938] [D loss: 0.840215] [G loss: 1.340230] [D real: 0.764619] [D fake: 0.369560]
[Epoch 34/50] [Batch 899/938] [D loss: 1.438599] [G loss: 0.660933] [D real: 0.413844] [D fake: 0.109160]
[Epoch 35/50] [Batch 299/938] [D loss: 0.917200] [G loss: 0.927326] [D real: 0.610525] [D fake: 0.210112]
[Epoch 35/50] [Batch 599/938] [D loss: 0.994579] [G loss: 0.908950] [D real: 0.594053] [D fake: 0.224407]
[Epoch 35/50] [Batch 899/938] [D loss: 0.762671] [G loss: 1.456851] [D real: 0.686805] [D fake: 0.211406]
[Epoch 36/50] [Batch 299/938] [D loss: 0.956092] [G loss: 1.296812] [D real: 0.609507] [D fake: 0.261673]
[Epoch 36/50] [Batch 599/938] [D loss: 1.045313] [G loss: 0.625988] [D real: 0.543889] [D fake: 0.148871]
[Epoch 36/50] [Batch 899/938] [D loss: 0.914145] [G loss: 1.017588] [D real: 0.629813] [D fake: 0.258396]
[Epoch 37/50] [Batch 299/938] [D loss: 1.106073] [G loss: 2.715152] [D real: 0.883800] [D fake: 0.552771]
[Epoch 37/50] [Batch 599/938] [D loss: 0.908618] [G loss: 1.260299] [D real: 0.645083] [D fake: 0.216520]
[Epoch 37/50] [Batch 899/938] [D loss: 0.703876] [G loss: 1.610951] [D real: 0.671662] [D fake: 0.161172]
[Epoch 38/50] [Batch 299/938] [D loss: 0.884505] [G loss: 1.696165] [D real: 0.772144] [D fake: 0.350858]
[Epoch 38/50] [Batch 599/938] [D loss: 0.844707] [G loss: 1.694735] [D real: 0.809112] [D fake: 0.404328]
[Epoch 38/50] [Batch 899/938] [D loss: 0.796929] [G loss: 1.719817] [D real: 0.733676] [D fake: 0.300009]
[Epoch 39/50] [Batch 299/938] [D loss: 0.761804] [G loss: 2.002748] [D real: 0.821613] [D fake: 0.367843]
[Epoch 39/50] [Batch 599/938] [D loss: 1.006947] [G loss: 1.178393] [D real: 0.589913] [D fake: 0.217992]
[Epoch 39/50] [Batch 899/938] [D loss: 0.936502] [G loss: 1.313496] [D real: 0.586952] [D fake: 0.150109]
[Epoch 40/50] [Batch 299/938] [D loss: 1.180398] [G loss: 0.819056] [D real: 0.525922] [D fake: 0.152922]
[Epoch 40/50] [Batch 599/938] [D loss: 0.921446] [G loss: 2.024451] [D real: 0.776659] [D fake: 0.412260]
[Epoch 40/50] [Batch 899/938] [D loss: 0.839164] [G loss: 1.452876] [D real: 0.710732] [D fake: 0.266918]
[Epoch 41/50] [Batch 299/938] [D loss: 0.788981] [G loss: 1.553157] [D real: 0.698234] [D fake: 0.259889]
[Epoch 41/50] [Batch 599/938] [D loss: 0.906144] [G loss: 1.927676] [D real: 0.746730] [D fake: 0.321029]
[Epoch 41/50] [Batch 899/938] [D loss: 1.006926] [G loss: 1.514269] [D real: 0.658016] [D fake: 0.297868]
[Epoch 42/50] [Batch 299/938] [D loss: 0.912167] [G loss: 1.337582] [D real: 0.640350] [D fake: 0.238920]
[Epoch 42/50] [Batch 599/938] [D loss: 1.029311] [G loss: 1.269561] [D real: 0.581456] [D fake: 0.176677]
[Epoch 42/50] [Batch 899/938] [D loss: 0.851943] [G loss: 2.247482] [D real: 0.792886] [D fake: 0.387960]
[Epoch 43/50] [Batch 299/938] [D loss: 0.813233] [G loss: 1.892390] [D real: 0.755459] [D fake: 0.335725]
[Epoch 43/50] [Batch 599/938] [D loss: 0.849235] [G loss: 1.451456] [D real: 0.713452] [D fake: 0.277743]
[Epoch 43/50] [Batch 899/938] [D loss: 0.796001] [G loss: 1.534391] [D real: 0.769308] [D fake: 0.326947]
[Epoch 44/50] [Batch 299/938] [D loss: 0.828683] [G loss: 2.295016] [D real: 0.865256] [D fake: 0.432770]
[Epoch 44/50] [Batch 599/938] [D loss: 0.784839] [G loss: 1.292179] [D real: 0.740413] [D fake: 0.307002]
[Epoch 44/50] [Batch 899/938] [D loss: 0.869467] [G loss: 1.554963] [D real: 0.669150] [D fake: 0.236974]
[Epoch 45/50] [Batch 299/938] [D loss: 0.955422] [G loss: 0.962375] [D real: 0.612503] [D fake: 0.210810]
[Epoch 45/50] [Batch 599/938] [D loss: 0.845292] [G loss: 2.265598] [D real: 0.802860] [D fake: 0.385311]
[Epoch 45/50] [Batch 899/938] [D loss: 0.902106] [G loss: 2.050767] [D real: 0.793865] [D fake: 0.394070]
[Epoch 46/50] [Batch 299/938] [D loss: 0.775542] [G loss: 1.365795] [D real: 0.687847] [D fake: 0.196163]
[Epoch 46/50] [Batch 599/938] [D loss: 0.697465] [G loss: 1.983589] [D real: 0.843698] [D fake: 0.332088]
[Epoch 46/50] [Batch 899/938] [D loss: 0.925736] [G loss: 1.990964] [D real: 0.757284] [D fake: 0.314272]
[Epoch 47/50] [Batch 299/938] [D loss: 0.915572] [G loss: 0.853581] [D real: 0.599163] [D fake: 0.172304]
[Epoch 47/50] [Batch 599/938] [D loss: 0.809719] [G loss: 2.042508] [D real: 0.859574] [D fake: 0.436016]
[Epoch 47/50] [Batch 899/938] [D loss: 0.823716] [G loss: 2.142160] [D real: 0.778661] [D fake: 0.324191]
[Epoch 48/50] [Batch 299/938] [D loss: 0.913445] [G loss: 1.654897] [D real: 0.653497] [D fake: 0.254123]
[Epoch 48/50] [Batch 599/938] [D loss: 0.686823] [G loss: 2.443697] [D real: 0.748190] [D fake: 0.237947]
[Epoch 48/50] [Batch 899/938] [D loss: 0.918376] [G loss: 1.329718] [D real: 0.659142] [D fake: 0.210957]
[Epoch 49/50] [Batch 299/938] [D loss: 1.088778] [G loss: 1.547866] [D real: 0.574101] [D fake: 0.215879]
[Epoch 49/50] [Batch 599/938] [D loss: 0.858425] [G loss: 1.300929] [D real: 0.698450] [D fake: 0.276681]
[Epoch 49/50] [Batch 899/938] [D loss: 1.208253] [G loss: 2.096180] [D real: 0.777272] [D fake: 0.505027]

Model Results

Images generated at the start of training:
(Figure: results after epoch 0)
Images generated after 50 epochs:
(Figure: results after epoch 49)

Summary and Reflections

GAN is a very interesting network: it uses a very simple binary classifier as the discriminator, and a model whose output matches the shape of the real samples as the generator. The generator learns the distribution of the given data and mimics it, so that its outputs follow the same distribution as the data.

After 50 epochs of training, the model really does begin to produce images that closely resemble the originals, which I found quite remarkable. Taking a concept from another area of computer science and folding it into machine learning can sometimes yield excellent results.
