
Contents

      • 1. Himmelblau Optimization
      • 2. Multi-class Classification in Practice: MNIST
      • 3. Sequential and GPU Acceleration: MNIST
      • 4. Visdom Visualization

1. Himmelblau Optimization

Himmelblau's function, f(x, y) = (x^2 + y - 11)^2 + (x + y^2 - 7)^2, is a two-dimensional objective with four global minima, all with value 0.

Plotting the surface:

import numpy as np
from matplotlib import pyplot as plt

def himmelblau(x):
    return (x[0] ** 2 + x[1] - 11) ** 2 + (x[0] + x[1] ** 2 - 7) ** 2

x = np.arange(-6, 6, 0.1)
y = np.arange(-6, 6, 0.1)
print('x,y range:', x.shape, y.shape)
X, Y = np.meshgrid(x, y)
print('X,Y maps:', X.shape, Y.shape)
Z = himmelblau([X, Y])

fig = plt.figure('himmelblau')
ax = fig.add_subplot(projection='3d')
ax.plot_surface(X, Y, Z)
ax.view_init(60, -30)
ax.set_xlabel('x')
ax.set_ylabel('y')
plt.show()

Gradient Descent:

# Try different starting points: [1., 0.], [-4., 0.], [4., 0.]
import torch

x = torch.tensor([-4., 0.], requires_grad=True)
optimizer = torch.optim.Adam([x], lr=1e-3)
for step in range(20000):
    pred = himmelblau(x)
    # Clear the gradients of all parameters
    optimizer.zero_grad()
    pred.backward()
    # Optimizer update: x' = x - lr * gradient
    optimizer.step()
    if step % 2000 == 0:
        print('step {}: x = {}, f(x) = {}'.format(step, x.tolist(), pred.item()))

Different initial positions for x yield different converged minima and different iteration counts, which shows that the choice of starting point matters a great deal for both the optimization path and the final result.
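The same effect can be reproduced without autograd. Below is a minimal hand-rolled gradient-descent sketch in plain Python using the analytic gradient of Himmelblau's function; the step size and iteration count are illustrative choices, not the Adam settings used above:

```python
def himmelblau(x, y):
    return (x ** 2 + y - 11) ** 2 + (x + y ** 2 - 7) ** 2

def grad(x, y):
    # Analytic partial derivatives of Himmelblau's function
    dx = 4 * x * (x ** 2 + y - 11) + 2 * (x + y ** 2 - 7)
    dy = 2 * (x ** 2 + y - 11) + 4 * y * (x + y ** 2 - 7)
    return dx, dy

def descend(x, y, lr=1e-3, steps=20000):
    # Plain gradient descent: x' = x - lr * gradient
    for _ in range(steps):
        dx, dy = grad(x, y)
        x, y = x - lr * dx, y - lr * dy
    return x, y

# Different starting points converge to different minima (all with f close to 0)
for start in [(1.0, 0.0), (-4.0, 0.0), (4.0, 0.0)]:
    x, y = descend(*start)
    print('start %s -> minimum (%.3f, %.3f), f = %.2e' % (start, x, y, himmelblau(x, y)))
```

Starting from (-4, 0) the iterates settle into the minimum near (-3.78, -3.28), while (1, 0) lands in the basin of (3, 2), illustrating the sensitivity to initialization discussed above.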

2. Multi-class Classification in Practice: MNIST

import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
from torchvision import datasets, transforms

batch_size = 200
learning_rate = 0.01
epochs = 10

train_loader = torch.utils.data.DataLoader(
    datasets.MNIST('../data', train=True, download=True,
                   transform=transforms.Compose([
                       transforms.ToTensor(),
                       transforms.Normalize((0.1307,), (0.3081,))
                   ])),
    batch_size=batch_size, shuffle=True)
test_loader = torch.utils.data.DataLoader(
    datasets.MNIST('../data', train=False,
                   transform=transforms.Compose([
                       transforms.ToTensor(),
                       transforms.Normalize((0.1307,), (0.3081,))
                   ])),
    batch_size=batch_size, shuffle=True)

# Network architecture
w1, b1 = torch.randn(200, 784, requires_grad=True), \
         torch.zeros(200, requires_grad=True)
w2, b2 = torch.randn(200, 200, requires_grad=True), \
         torch.zeros(200, requires_grad=True)
w3, b3 = torch.randn(10, 200, requires_grad=True), \
         torch.zeros(10, requires_grad=True)

# Kaiming initialization
torch.nn.init.kaiming_normal_(w1)
torch.nn.init.kaiming_normal_(w2)
torch.nn.init.kaiming_normal_(w3)

def forward(x):
    x = x @ w1.t() + b1
    x = F.relu(x)
    x = x @ w2.t() + b2
    x = F.relu(x)
    x = x @ w3.t() + b3
    x = F.relu(x)
    return x

optimizer = optim.SGD([w1, b1, w2, b2, w3, b3], lr=learning_rate)
# CrossEntropyLoss combines softmax, log, and nll_loss
criteon = nn.CrossEntropyLoss()

for epoch in range(epochs):
    for batch_idx, (data, target) in enumerate(train_loader):
        data = data.view(-1, 28*28)
        logits = forward(data)
        loss = criteon(logits, target)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        if batch_idx % 100 == 0:
            print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
                epoch, batch_idx * len(data), len(train_loader.dataset),
                100. * batch_idx / len(train_loader), loss.item()))

    test_loss = 0
    correct = 0
    for data, target in test_loader:
        data = data.view(-1, 28 * 28)
        logits = forward(data)
        test_loss += criteon(logits, target).item()

        pred = logits.data.max(1)[1]
        correct += pred.eq(target.data).sum()

    test_loss /= len(test_loader.dataset)
    print('\nTest set: Average loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\n'.format(
        test_loss, correct, len(test_loader.dataset),
        100. * correct / len(test_loader.dataset)))

Notes:

  • If the batch size is too small, convergence is slow; if it is too large, training tends to fall into sharp minima and generalizes poorly.
  • Do not skip initialization; it is a critical step.
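The comment in the code above says CrossEntropyLoss is equivalent to softmax + log + nll_loss. That identity can be verified in plain Python (a toy sketch with hand-rolled functions, not the PyTorch implementations), using the log-sum-exp trick for numerical stability:

```python
import math

def cross_entropy(logits, target):
    # Direct form: -log(softmax(logits)[target]) via the log-sum-exp trick
    m = max(logits)
    log_sum_exp = m + math.log(sum(math.exp(z - m) for z in logits))
    return log_sum_exp - logits[target]

def log_softmax(logits):
    m = max(logits)
    log_sum_exp = m + math.log(sum(math.exp(z - m) for z in logits))
    return [z - log_sum_exp for z in logits]

def nll_loss(log_probs, target):
    # Negative log-likelihood of the target class
    return -log_probs[target]

logits = [2.0, 1.0, 0.1]
target = 0
a = cross_entropy(logits, target)
b = nll_loss(log_softmax(logits), target)
print(a, b)  # the two values agree
```

This is also why the network above feeds raw logits into criteon: the softmax is applied inside the loss, not in forward().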

3. Sequential and GPU Acceleration: MNIST

import torch
import torch.nn as nn
import torch.optim as optim
from torchvision import datasets, transforms

batch_size = 200
learning_rate = 0.01
epochs = 10

train_loader = torch.utils.data.DataLoader(
    datasets.MNIST('../data', train=True, download=True,
                   transform=transforms.Compose([
                       transforms.ToTensor(),
                       transforms.Normalize((0.1307,), (0.3081,))
                   ])),
    batch_size=batch_size, shuffle=True)
test_loader = torch.utils.data.DataLoader(
    datasets.MNIST('../data', train=False,
                   transform=transforms.Compose([
                       transforms.ToTensor(),
                       transforms.Normalize((0.1307,), (0.3081,))
                   ])),
    batch_size=batch_size, shuffle=True)

class MLP(nn.Module):
    def __init__(self):
        super(MLP, self).__init__()
        self.model = nn.Sequential(
            nn.Linear(784, 200),
            nn.LeakyReLU(inplace=True),
            nn.Linear(200, 200),
            nn.LeakyReLU(inplace=True),
            nn.Linear(200, 10),
            nn.LeakyReLU(inplace=True),
        )

    def forward(self, x):
        x = self.model(x)
        return x

device = torch.device('cuda:0')
net = MLP().to(device)
optimizer = optim.SGD(net.parameters(), lr=learning_rate)
criteon = nn.CrossEntropyLoss().to(device)

for epoch in range(epochs):
    for batch_idx, (data, target) in enumerate(train_loader):
        data = data.view(-1, 28*28)
        data, target = data.to(device), target.to(device)

        logits = net(data)
        loss = criteon(logits, target)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        if batch_idx % 100 == 0:
            print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
                epoch, batch_idx * len(data), len(train_loader.dataset),
                100. * batch_idx / len(train_loader), loss.item()))

    test_loss = 0
    correct = 0
    for data, target in test_loader:
        data = data.view(-1, 28 * 28)
        data, target = data.to(device), target.to(device)

        logits = net(data)
        test_loss += criteon(logits, target).item()

        pred = logits.argmax(dim=1)
        correct += pred.eq(target).float().sum().item()

    test_loss /= len(test_loader.dataset)
    print('\nTest set: Average loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\n'.format(
        test_loss, correct, len(test_loader.dataset),
        100. * correct / len(test_loader.dataset)))

Notes:

  • In the MLP class, super(MLP, self).__init__() initializes the attributes inherited from the parent class, using the parent class's own initializer.
  • nn.Sequential is essentially a module container to which components can be added; the input flows through the resulting pipeline to produce the output.
  • On a single-GPU machine, torch.device('cuda') and torch.device('cuda:0') refer to the same device.
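The pipeline behaviour of nn.Sequential can be sketched in a few lines of plain Python. This is a toy stand-in to show the idea, not PyTorch's implementation (which also registers each component as a submodule so its parameters are tracked):

```python
class Sequential:
    """Toy pipeline: calls each component in order, feeding outputs forward."""
    def __init__(self, *components):
        self.components = list(components)

    def __call__(self, x):
        for component in self.components:
            x = component(x)
        return x

# A "network" of plain functions instead of nn.Linear / nn.LeakyReLU layers
model = Sequential(
    lambda x: 2 * x,        # stand-in for a linear layer
    lambda x: max(x, 0.0),  # stand-in for ReLU
    lambda x: x + 1,
)
print(model(3.0))   # 2*3 -> relu -> +1 = 7.0
print(model(-3.0))  # 2*(-3) -> relu(=0) -> +1 = 1.0
```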

4. Visdom Visualization

import torch
import torch.nn as nn
import torch.optim as optim
from torchvision import datasets, transforms
from visdom import Visdom

batch_size = 200
learning_rate = 0.01
epochs = 10

train_loader = torch.utils.data.DataLoader(
    datasets.MNIST('../data', train=True, download=True,
                   transform=transforms.Compose([
                       transforms.ToTensor(),
                       # transforms.Normalize((0.1307,), (0.3081,))
                   ])),
    batch_size=batch_size, shuffle=True)
test_loader = torch.utils.data.DataLoader(
    datasets.MNIST('../data', train=False,
                   transform=transforms.Compose([
                       transforms.ToTensor(),
                       # transforms.Normalize((0.1307,), (0.3081,))
                   ])),
    batch_size=batch_size, shuffle=True)

class MLP(nn.Module):
    def __init__(self):
        super(MLP, self).__init__()
        self.model = nn.Sequential(
            nn.Linear(784, 200),
            nn.LeakyReLU(inplace=True),
            nn.Linear(200, 200),
            nn.LeakyReLU(inplace=True),
            nn.Linear(200, 10),
            nn.LeakyReLU(inplace=True),
        )

    def forward(self, x):
        x = self.model(x)
        return x

device = torch.device('cuda:0')
net = MLP().to(device)
optimizer = optim.SGD(net.parameters(), lr=learning_rate)
criteon = nn.CrossEntropyLoss().to(device)

viz = Visdom()
viz.line([0.], [0.], win='train_loss', opts=dict(title='train loss'))
viz.line([[0.0, 0.0]], [0.], win='test',
         opts=dict(title='test loss&acc.', legend=['loss', 'acc.']))
global_step = 0

for epoch in range(epochs):
    for batch_idx, (data, target) in enumerate(train_loader):
        data = data.view(-1, 28*28)
        data, target = data.to(device), target.to(device)

        logits = net(data)
        loss = criteon(logits, target)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        global_step += 1
        viz.line([loss.item()], [global_step], win='train_loss', update='append')

        if batch_idx % 100 == 0:
            print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
                epoch, batch_idx * len(data), len(train_loader.dataset),
                100. * batch_idx / len(train_loader), loss.item()))

    test_loss = 0
    correct = 0
    for data, target in test_loader:
        data = data.view(-1, 28 * 28)
        data, target = data.to(device), target.to(device)

        logits = net(data)
        test_loss += criteon(logits, target).item()

        pred = logits.argmax(dim=1)
        correct += pred.eq(target).float().sum().item()

    viz.line([[test_loss, correct / len(test_loader.dataset)]],
             [global_step], win='test', update='append')
    viz.images(data.view(-1, 1, 28, 28), win='x')
    viz.text(str(pred.detach().cpu().numpy()), win='pred',
             opts=dict(title='pred'))

    test_loss /= len(test_loader.dataset)
    print('\nTest set: Average loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\n'.format(
        test_loss, correct, len(test_loader.dataset),
        100. * correct / len(test_loader.dataset)))


