
Table of Contents

      • 1. Himmelblau Optimization
      • 2. Multi-class Classification on MNIST
      • 3. Sequential and GPU Acceleration on MNIST
      • 4. Visualization with Visdom

1. Himmelblau Optimization

Himmelblau's function is a two-dimensional objective with four minima, all of which attain the same optimal value f = 0:

f(x, y) = (x² + y − 11)² + (x + y² − 7)²

[Figure: surface plot of the Himmelblau function and its minima]

Plotting the surface:

import numpy as np
from matplotlib import pyplot as plt

def himmelblau(x):
    return (x[0] ** 2 + x[1] - 11) ** 2 + (x[0] + x[1] ** 2 - 7) ** 2

x = np.arange(-6, 6, 0.1)
y = np.arange(-6, 6, 0.1)
print('x,y range:', x.shape, y.shape)
X, Y = np.meshgrid(x, y)
print('X,Y maps:', X.shape, Y.shape)
Z = himmelblau([X, Y])

fig = plt.figure('himmelblau')
ax = fig.add_subplot(projection='3d')
ax.plot_surface(X, Y, Z)
ax.view_init(60, -30)
ax.set_xlabel('x')
ax.set_ylabel('y')
plt.show()

Gradient descent (here using the Adam optimizer):

import torch

# Try different initial points: [1., 0.], [-4, 0.], [4, 0.]
x = torch.tensor([-4., 0.], requires_grad=True)
optimizer = torch.optim.Adam([x], lr=1e-3)
for step in range(20000):
    pred = himmelblau(x)
    # clear the gradients of all optimized parameters
    optimizer.zero_grad()
    pred.backward()
    # optimizer update: x' = x - lr * gradient
    optimizer.step()
    if step % 2000 == 0:
        print('step {}: x = {}, f(x) = {}'.format(step, x.tolist(), pred.item()))

Different initial positions for x lead to different converged solutions and different numbers of steps, which shows that the choice of starting point matters a great deal for both the optimization process and the final result.
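As a cross-check, the four minimizers of Himmelblau's function are known numerically, and the function evaluates to (essentially) zero at each of them. A quick sketch:

```python
def himmelblau(x):
    # Himmelblau's function, as defined in the plotting code above
    return (x[0] ** 2 + x[1] - 11) ** 2 + (x[0] + x[1] ** 2 - 7) ** 2

# The four known minimizers; f = 0 at each of them
minima = [
    (3.0, 2.0),
    (-2.805118, 3.131312),
    (-3.779310, -3.283186),
    (3.584428, -1.848126),
]
for mx, my in minima:
    print('f({:.6f}, {:.6f}) = {:.2e}'.format(mx, my, himmelblau((mx, my))))
```

Depending on which basin the initial point falls in, gradient descent converges to a different one of these minimizers.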

2. Multi-class Classification on MNIST

import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
from torchvision import datasets, transforms

batch_size = 200
learning_rate = 0.01
epochs = 10

train_loader = torch.utils.data.DataLoader(
    datasets.MNIST('../data', train=True, download=True,
                   transform=transforms.Compose([
                       transforms.ToTensor(),
                       transforms.Normalize((0.1307,), (0.3081,))
                   ])),
    batch_size=batch_size, shuffle=True)
test_loader = torch.utils.data.DataLoader(
    datasets.MNIST('../data', train=False,
                   transform=transforms.Compose([
                       transforms.ToTensor(),
                       transforms.Normalize((0.1307,), (0.3081,))
                   ])),
    batch_size=batch_size, shuffle=True)

# Network architecture: 784 -> 200 -> 200 -> 10
w1, b1 = torch.randn(200, 784, requires_grad=True), \
         torch.zeros(200, requires_grad=True)
w2, b2 = torch.randn(200, 200, requires_grad=True), \
         torch.zeros(200, requires_grad=True)
w3, b3 = torch.randn(10, 200, requires_grad=True), \
         torch.zeros(10, requires_grad=True)
# Kaiming initialization
torch.nn.init.kaiming_normal_(w1)
torch.nn.init.kaiming_normal_(w2)
torch.nn.init.kaiming_normal_(w3)

def forward(x):
    x = x @ w1.t() + b1
    x = F.relu(x)
    x = x @ w2.t() + b2
    x = F.relu(x)
    x = x @ w3.t() + b3
    x = F.relu(x)
    return x

optimizer = optim.SGD([w1, b1, w2, b2, w3, b3], lr=learning_rate)
# CrossEntropyLoss is equivalent to softmax + log + nll_loss combined
criteon = nn.CrossEntropyLoss()

for epoch in range(epochs):
    for batch_idx, (data, target) in enumerate(train_loader):
        data = data.view(-1, 28 * 28)

        logits = forward(data)
        loss = criteon(logits, target)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        if batch_idx % 100 == 0:
            print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
                epoch, batch_idx * len(data), len(train_loader.dataset),
                100. * batch_idx / len(train_loader), loss.item()))

    test_loss = 0
    correct = 0
    for data, target in test_loader:
        data = data.view(-1, 28 * 28)
        logits = forward(data)
        test_loss += criteon(logits, target).item()

        pred = logits.data.max(1)[1]
        correct += pred.eq(target.data).sum()

    test_loss /= len(test_loader.dataset)
    print('\nTest set: Average loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\n'.format(
        test_loss, correct, len(test_loader.dataset),
        100. * correct / len(test_loader.dataset)))
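The comment on criteon notes that CrossEntropyLoss combines softmax, log, and nll_loss. This identity can be checked directly; a minimal sketch on random logits:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(5, 10)        # a fake batch: 5 samples, 10 classes
target = torch.randint(0, 10, (5,))

# cross_entropy == log_softmax followed by nll_loss
ce = F.cross_entropy(logits, target)
manual = F.nll_loss(F.log_softmax(logits, dim=1), target)
print(torch.allclose(ce, manual))  # True
```

This is also why the network outputs raw logits rather than probabilities: the loss applies the softmax internally.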

Notes:

  • If the batch size is too small, convergence is slow; if it is too large, the optimizer tends to fall into sharp minima and generalizes poorly.
  • Pay attention to initialization; it is a critical step.
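On that initialization point: kaiming_normal_ rescales a weight matrix so that its standard deviation is sqrt(2 / fan_in) (PyTorch's default fan_in mode with the ReLU-family gain), rather than the std ≈ 1 of plain randn. A minimal sketch for the first layer's shape:

```python
import torch

w = torch.randn(200, 784)
torch.nn.init.kaiming_normal_(w)

# For a (200, 784) weight, fan_in = 784, so the target std is
# sqrt(2 / 784) ≈ 0.0505 instead of randn's std ≈ 1
print(round(w.std().item(), 3))
```

Without this rescaling, the pre-activations of the 784-input layer would have variance on the order of 784, which quickly saturates or explodes through the ReLU stack.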

3. Sequential and GPU Acceleration on MNIST

import torch
import torch.nn as nn
import torch.optim as optim
from torchvision import datasets, transforms

batch_size = 200
learning_rate = 0.01
epochs = 10

train_loader = torch.utils.data.DataLoader(
    datasets.MNIST('../data', train=True, download=True,
                   transform=transforms.Compose([
                       transforms.ToTensor(),
                       transforms.Normalize((0.1307,), (0.3081,))
                   ])),
    batch_size=batch_size, shuffle=True)
test_loader = torch.utils.data.DataLoader(
    datasets.MNIST('../data', train=False,
                   transform=transforms.Compose([
                       transforms.ToTensor(),
                       transforms.Normalize((0.1307,), (0.3081,))
                   ])),
    batch_size=batch_size, shuffle=True)

class MLP(nn.Module):

    def __init__(self):
        super(MLP, self).__init__()
        self.model = nn.Sequential(
            nn.Linear(784, 200),
            nn.LeakyReLU(inplace=True),
            nn.Linear(200, 200),
            nn.LeakyReLU(inplace=True),
            nn.Linear(200, 10),
            nn.LeakyReLU(inplace=True),
        )

    def forward(self, x):
        x = self.model(x)
        return x

device = torch.device('cuda:0')
net = MLP().to(device)
optimizer = optim.SGD(net.parameters(), lr=learning_rate)
criteon = nn.CrossEntropyLoss().to(device)

for epoch in range(epochs):
    for batch_idx, (data, target) in enumerate(train_loader):
        data = data.view(-1, 28 * 28)
        data, target = data.to(device), target.to(device)

        logits = net(data)
        loss = criteon(logits, target)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        if batch_idx % 100 == 0:
            print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
                epoch, batch_idx * len(data), len(train_loader.dataset),
                100. * batch_idx / len(train_loader), loss.item()))

    test_loss = 0
    correct = 0
    for data, target in test_loader:
        data = data.view(-1, 28 * 28)
        data, target = data.to(device), target.to(device)
        logits = net(data)
        test_loss += criteon(logits, target).item()

        pred = logits.argmax(dim=1)
        correct += pred.eq(target).float().sum().item()

    test_loss /= len(test_loader.dataset)
    print('\nTest set: Average loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\n'.format(
        test_loss, correct, len(test_loader.dataset),
        100. * correct / len(test_loader.dataset)))

Notes:

  • In the MLP class, super(MLP, self).__init__() initializes the attributes inherited from the parent class, using the parent class's own initializer.
  • Sequential is essentially a module to which components can be added; the input passes through the assembled pipeline to produce the output.
  • On a single-GPU machine, torch.device('cuda') is equivalent to torch.device('cuda:0').
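To illustrate the second note: nn.Sequential simply applies its submodules in order, so calling the container is equivalent to chaining the submodules by hand. A small sketch with an arbitrary two-layer stack:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(
    nn.Linear(784, 200),
    nn.LeakyReLU(inplace=True),
    nn.Linear(200, 10),
)

x = torch.randn(4, 784)
y_seq = model(x)                            # run the whole pipeline
y_manual = model[2](model[1](model[0](x)))  # chain the stages by hand
print(torch.equal(y_seq, y_manual))  # True
```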

4. Visualization with Visdom

Note that the Visdom server must already be running before the script starts (python -m visdom.server; it serves http://localhost:8097 by default).

import torch
import torch.nn as nn
import torch.optim as optim
from torchvision import datasets, transforms
from visdom import Visdom

batch_size = 200
learning_rate = 0.01
epochs = 10

train_loader = torch.utils.data.DataLoader(
    datasets.MNIST('../data', train=True, download=True,
                   transform=transforms.Compose([
                       transforms.ToTensor(),
                       # transforms.Normalize((0.1307,), (0.3081,))
                   ])),
    batch_size=batch_size, shuffle=True)
test_loader = torch.utils.data.DataLoader(
    datasets.MNIST('../data', train=False,
                   transform=transforms.Compose([
                       transforms.ToTensor(),
                       # transforms.Normalize((0.1307,), (0.3081,))
                   ])),
    batch_size=batch_size, shuffle=True)

class MLP(nn.Module):

    def __init__(self):
        super(MLP, self).__init__()
        self.model = nn.Sequential(
            nn.Linear(784, 200),
            nn.LeakyReLU(inplace=True),
            nn.Linear(200, 200),
            nn.LeakyReLU(inplace=True),
            nn.Linear(200, 10),
            nn.LeakyReLU(inplace=True),
        )

    def forward(self, x):
        x = self.model(x)
        return x

device = torch.device('cuda:0')
net = MLP().to(device)
optimizer = optim.SGD(net.parameters(), lr=learning_rate)
criteon = nn.CrossEntropyLoss().to(device)

viz = Visdom()
viz.line([0.], [0.], win='train_loss', opts=dict(title='train loss'))
viz.line([[0.0, 0.0]], [0.], win='test',
         opts=dict(title='test loss&acc.', legend=['loss', 'acc.']))
global_step = 0

for epoch in range(epochs):
    for batch_idx, (data, target) in enumerate(train_loader):
        data = data.view(-1, 28 * 28)
        data, target = data.to(device), target.to(device)

        logits = net(data)
        loss = criteon(logits, target)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        global_step += 1
        viz.line([loss.item()], [global_step], win='train_loss', update='append')

        if batch_idx % 100 == 0:
            print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
                epoch, batch_idx * len(data), len(train_loader.dataset),
                100. * batch_idx / len(train_loader), loss.item()))

    test_loss = 0
    correct = 0
    for data, target in test_loader:
        data = data.view(-1, 28 * 28)
        data, target = data.to(device), target.to(device)
        logits = net(data)
        test_loss += criteon(logits, target).item()

        pred = logits.argmax(dim=1)
        correct += pred.eq(target).float().sum().item()

    viz.line([[test_loss, correct / len(test_loader.dataset)]],
             [global_step], win='test', update='append')
    viz.images(data.view(-1, 1, 28, 28), win='x')
    viz.text(str(pred.detach().cpu().numpy()), win='pred',
             opts=dict(title='pred'))

    test_loss /= len(test_loader.dataset)
    print('\nTest set: Average loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\n'.format(
        test_loss, correct, len(test_loader.dataset),
        100. * correct / len(test_loader.dataset)))


