
How to Plot Caffe Network Training Curves

Source: 程序員人生 · Published: 2016-07-04 08:26:41 · Views: 3036


This series of articles is produced by @yhl_leo; please credit the source when reposting.
Article link: http://blog.csdn.net/yhl_leo/article/details/51774966


當(dāng)我們設(shè)計好網(wǎng)絡(luò)結(jié)構(gòu)后,在神經(jīng)網(wǎng)絡(luò)訓(xùn)練的進(jìn)程中,迭代輸出的log信息中,1般包括,迭代次數(shù),訓(xùn)練損失代價,測試損失代價,測試精度等。本文提供1段示例,簡單講述如何繪制訓(xùn)練曲線(training curve)。

First, look at an excerpt of the training log. Skip the section that prints the network's structural parameters and jump straight to the training iterations:

I0627 21:30:06.004370 15558 solver.cpp:242] Iteration 0, loss = 21.6953
I0627 21:30:06.004420 15558 solver.cpp:258]     Train net output #0: loss = 21.6953 (* 1 = 21.6953 loss)
I0627 21:30:06.004426 15558 solver.cpp:571] Iteration 0, lr = 0.01
I0627 21:30:28.592690 15558 solver.cpp:242] Iteration 100, loss = 13.6593
I0627 21:30:28.592730 15558 solver.cpp:258]     Train net output #0: loss = 13.6593 (* 1 = 13.6593 loss)
I0627 21:30:28.592733 15558 solver.cpp:571] Iteration 100, lr = 0.01
...
I0627 21:37:47.926597 15558 solver.cpp:346] Iteration 2000, Testing net (#0)
I0627 21:37:48.588079 15558 blocking_queue.cpp:50] Data layer prefetch queue empty
I0627 21:40:40.575474 15558 solver.cpp:414]     Test net output #0: loss = 13.07728 (* 1 = 13.07728 loss)
I0627 21:40:40.575477 15558 solver.cpp:414]     Test net output #1: loss/top-1 = 0.00226
I0627 21:40:40.575487 15558 solver.cpp:414]     Test net output #2: loss/top-5 = 0.01204
I0627 21:40:40.708261 15558 solver.cpp:242] Iteration 2000, loss = 13.1739
I0627 21:40:40.708302 15558 solver.cpp:258]     Train net output #0: loss = 13.1739 (* 1 = 13.1739 loss)
I0627 21:40:40.708307 15558 solver.cpp:571] Iteration 2000, lr = 0.01
...
I0628 01:28:47.426129 15558 solver.cpp:242] Iteration 49900, loss = 0.960628
I0628 01:28:47.426177 15558 solver.cpp:258]     Train net output #0: loss = 0.960628 (* 1 = 0.960628 loss)
I0628 01:28:47.426182 15558 solver.cpp:571] Iteration 49900, lr = 0.01
I0628 01:29:10.084050 15558 solver.cpp:449] Snapshotting to binary proto file train_net/net_iter_50000.caffemodel
I0628 01:29:10.563587 15558 solver.cpp:734] Snapshotting solver state to binary proto file train_net/net_iter_50000.solverstate
I0628 01:29:10.692239 15558 solver.cpp:346] Iteration 50000, Testing net (#0)
I0628 01:29:13.192075 15558 blocking_queue.cpp:50] Data layer prefetch queue empty
I0628 01:31:00.595120 15558 solver.cpp:414]     Test net output #0: loss = 0.6404232 (* 1 = 0.6404232 loss)
I0628 01:31:00.595124 15558 solver.cpp:414]     Test net output #1: loss/top-1 = 0.953861
I0628 01:31:00.595127 15558 solver.cpp:414]     Test net output #2: loss/top-5 = 0.998659
I0628 01:31:00.727577 15558 solver.cpp:242] Iteration 50000, loss = 0.680951
I0628 01:31:00.727618 15558 solver.cpp:258]     Train net output #0: loss = 0.680951 (* 1 = 0.680951 loss)
I0628 01:31:00.727623 15558 solver.cpp:571] Iteration 50000, lr = 0.0096

This is the output of an ordinary training run with a single loss. From it we can infer some of the parameters in solver.prototxt:

test_interval: 2000
base_lr: 0.01
lr_policy: "step"  # or "multistep"
gamma: 0.96
display: 100
stepsize: 50000    # if "multistep", the first stepvalue is set as 50000
snapshot_prefix: "train_net/net"
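As a sanity check on these inferred parameters, Caffe's "step" policy decays the learning rate by a factor of gamma every stepsize iterations. The small sketch below (an addition, not part of the original post) reproduces the lr values seen in the log:

```python
# Learning rate under the "step" policy:
#   lr = base_lr * gamma ** floor(iteration / stepsize)
base_lr, gamma, stepsize = 0.01, 0.96, 50000

def step_lr(iteration):
    """Learning rate at a given iteration under the "step" policy."""
    return base_lr * gamma ** (iteration // stepsize)

# Up to iteration 49999 the rate stays at base_lr (the log shows
# "lr = 0.01" at iteration 49900); at iteration 50000 it drops by a
# factor of gamma, matching "lr = 0.0096" in the last log line.
```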

Of course, even if you ignore the analysis above, it makes little difference to the code below. Plotting a training curve is essentially file processing. From the log above, we can see that:

  • every line containing the fields "] Iteration" and "loss =" carries a training iteration number and its loss;
  • every line containing the fields "] Iteration" and "Testing net (#0)" carries the training iteration number at which a test pass runs;
  • every line containing the fields "#2:" and "loss/top-5" carries the test top-5 accuracy.
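The extraction behind these three rules can be tried on a single log line first. This sketch (an illustration added here, using a slightly simpler regex than the full script that follows) pulls the iteration number and loss out of one training line:

```python
import re

line = ("I0627 21:40:40.708261 15558 solver.cpp:242] "
        "Iteration 2000, loss = 13.1739")

# Capture the integer after "Iteration" and the float after "loss = ".
m = re.search(r'Iteration (\d+), loss = ([\d.]+)', line)
iteration, loss = int(m.group(1)), float(m.group(2))
print(iteration, loss)
```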

Based on this analysis, the log file can be processed as follows:

import re
import matplotlib.pyplot as plt
from mpl_toolkits.axes_grid1 import host_subplot

# read the log file
fp = open('log.txt', 'r')

train_iterations = []
train_loss = []
test_iterations = []
test_accuracy = []

for ln in fp:
    # get train_iterations and train_loss
    if '] Iteration ' in ln and 'loss = ' in ln:
        arr = re.findall(r'ion \b\d+\b,', ln)
        train_iterations.append(int(arr[0].strip(',')[4:]))
        train_loss.append(float(ln.strip().split(' = ')[-1]))
    # get test_iterations
    if '] Iteration' in ln and 'Testing net (#0)' in ln:
        arr = re.findall(r'ion \b\d+\b,', ln)
        test_iterations.append(int(arr[0].strip(',')[4:]))
    # get test_accuracy
    if '#2:' in ln and 'loss/top-5' in ln:
        test_accuracy.append(float(ln.strip().split(' = ')[-1]))
fp.close()

host = host_subplot(111)
plt.subplots_adjust(right=0.8)  # adjust the right boundary of the plot window
par1 = host.twinx()

# set labels
host.set_xlabel("iterations")
host.set_ylabel("log loss")
par1.set_ylabel("validation accuracy")

# plot curves
p1, = host.plot(train_iterations, train_loss, label="training log loss")
p2, = par1.plot(test_iterations, test_accuracy, label="validation accuracy")

# set location of the legend:
# 1 -> upper right, 2 -> upper left, 3 -> lower left,
# 4 -> lower right, 5 -> center right, ...
host.legend(loc=5)

# set label colors
host.axis["left"].label.set_color(p1.get_color())
par1.axis["right"].label.set_color(p2.get_color())

# set the range of the x axis of host and the y axis of par1
host.set_xlim([-1500, 160000])
par1.set_ylim([0., 1.05])

plt.draw()
plt.show()

The sample code includes brief comments. If your training log output differs from the one listed here, you only need to tweak a few of the settings to plot your own training curves.
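One practical tweak (an addition, not from the original post): on a headless server, plt.show() has no display to draw on, so it is often handier to save the figure to a file instead. The data values below are copied from the log excerpt above purely to keep the snippet self-contained:

```python
import matplotlib
matplotlib.use('Agg')  # headless backend, so this runs without a display
import matplotlib.pyplot as plt

# Sample values taken from the log excerpt earlier in the article.
train_iterations = [0, 100, 2000, 49900, 50000]
train_loss = [21.6953, 13.6593, 13.1739, 0.960628, 0.680951]

plt.plot(train_iterations, train_loss, label="training log loss")
plt.xlabel("iterations")
plt.ylabel("log loss")
plt.legend(loc="upper right")
plt.savefig("train_curve.png")  # save to disk instead of plt.show()
```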

最后附上繪制出的訓(xùn)練曲線圖:

[Figure: train_curve]
