FCN-PCL Application Walkthrough

Preface


FCN (fully convolutional networks) for image semantic segmentation was introduced in the paper Fully Convolutional Networks for Semantic Segmentation; fully convolutional networks first appeared there. As the pioneering work that applied CNN architectures to image semantic segmentation with outstanding results, it received a Best Paper Honorable Mention at CVPR 2015. Image semantic segmentation, in short, means classifying every pixel of an image. The figure below shows an example; different pixel colors denote different classes:

Original image
Semantic segmentation

UC Berkeley's FCN source code on GitHub: https://github.com/shelhamer/fcn.berkeleyvision.org
The source tree contains four families of network models: nyud-fcn, pascalcontext-fcn, siftflow-fcn, and voc-fcn. Depending on which convolutional layers predictions are drawn from, each family is further divided into three or four variants.
Your own data type and format will not necessarily match the (image) data interface that the voc-fcn-alexnet source code provides. For example, the data fed into the network model in this article is point cloud data (.pcd) produced by LiDAR scans. So how do you actually proceed? Let's go through it step by step.

1. LiDAR Data Conversion


1.1 Introduction to LiDAR Point Cloud Data

First, the data format produced by a mechanically rotating LiDAR. A motor inside the LiDAR spins at a fixed angular velocity, and the laser emitters and receivers mounted on it measure the distance from the LiDAR to obstacles. Take the RS-LiDAR-16, a 16-beam LiDAR made by RoboSense (Suteng Innovation), as an example: it completes ten 360° revolutions per second (10 Hz), each revolution scanning the surrounding scene; each of the 16 beams yields 2016 points per revolution, stored in a .pcd file. You can think of a .pcd file like a 2-D color image (e.g. .png): the 16 beams are the image height and the 2016 points are the image width, for 16x2016 = 32256 pixels in total. Each point carries the values [x, y, z, intensity]; just like the RGB channels of a 2-D image, each value is one channel.

RoboSense RS-LiDAR-16

LiDAR point cloud illustration
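To make the layout concrete, here is a minimal sketch of one frame viewed as a 16x2016 "image" with four channels (the zero-filled frame is a placeholder; the real values come from parsing the .pcd file):

import numpy as np

# one revolution: 16 beams x 2016 points, each point = [x, y, z, intensity]
frame = np.zeros((16, 2016, 4), dtype=np.float32)

x, y, z, intensity = (frame[..., i] for i in range(4))  # one array per channel
print(frame.shape[0] * frame.shape[1])                  # 32256 points per frame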

1.2 Point Cloud Preprocessing

The point cloud is preprocessed according to its feature attributes. After processing, each point has the features [row, column, height, range, mark], i.e. the point's row index, column index, height, range, and label. height equals the z value; range is computed as sqrt(x^2 + y^2 + z^2); and mark is the class assigned to the point by a decision tree: obstacle point (obstacle mark) or ground point (ground mark). mark plays the same role as a ground-truth image: as the reference answer for the predicted classification, it is used to compute the loss. In effect, we have manually added extra feature channels to make classification and prediction easier.
The preprocessed data is converted with the cnpy library into .npy binary files so NumPy can read it easily; for a cnpy tutorial, see the cnpy usage notes and the official examples. Each point cloud frame is stored as one .npy file. The simpler the naming scheme the better, to make reading and sorting easy; here the files are simply named by index: [0.npy, 1.npy, ..., n.npy].
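For reference, the preprocessing can be sketched in NumPy as follows. This is a minimal sketch assuming the raw frame has already been parsed into arrays (the actual pipeline writes the .npy files from C++ via cnpy, and the mark argument stands in for the decision tree's output):

import numpy as np

def preprocess(frame, mark):
    # frame: (16, 2016, 4) array of [x, y, z, intensity]
    # mark:  (16, 2016) obstacle/ground labels from the decision tree
    rows, cols = mark.shape
    x, y, z = frame[..., 0], frame[..., 1], frame[..., 2]
    row_idx, col_idx = np.meshgrid(np.arange(rows), np.arange(cols), indexing='ij')
    height = z                          # height equals the z value
    rng = np.sqrt(x**2 + y**2 + z**2)   # range = sqrt(x^2 + y^2 + z^2)
    # channel order matches the comment in pcl_data_layer.py below:
    # [mark, row, col, height, range]
    return np.stack([mark, row_idx, col_idx, height, rng], axis=-1).astype(np.float32)

# one frame per file, named by index for easy sorting:
# np.save('0.npy', preprocess(frame, mark))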

2. Point Cloud Classification with FCN-AlexNet


The FCN-AlexNet point cloud classification project consists of:

  • 5 Python files: pcl_data_layer.py, net.py, solve.py, surgery.py, score.py
  • 3 prototxt files: train.prototxt, val.prototxt, solver.prototxt
  • 1 Caffe model file: fcn-alexnet-pascal.caffemodel

2.1 FCN-AlexNet Data Layer

The file is named pcl_data_layer.py, and it contains the class PCLSegDataLayer:

import caffe
import numpy as np
import random
import os


class PCLSegDataLayer(caffe.Layer):

    def setup(self, bottom, top):
        params = eval(self.param_str)
        self.npy_dir = params["pcl_dir"]
        self.list_name = list()
        # two tops: data and label
        if len(top) != 2:
            raise Exception("Need to define two tops: data and label.")
        # data layers have no bottoms
        if len(bottom) != 0:
            raise Exception("Do not define a bottom.")
        self.load_file_name(self.npy_dir, self.list_name)
        self.idx = 0

    def reshape(self, bottom, top):
        self.data, self.label = self.load_file(self.idx)
        # reshape tops to fit (leading 1 is for batch dimension)
        top[0].reshape(1, *self.data.shape)
        top[1].reshape(1, *self.label.shape)

    def forward(self, bottom, top):
        # assign output
        top[0].data[...] = self.data
        top[1].data[...] = self.label
        # pick next input
        self.idx += 1
        if self.idx == len(self.list_name):
            self.idx = 0

    def backward(self, top, propagate_down, bottom):
        pass

    def load_file(self, idx):
        in_file = np.load(self.list_name[idx])  # channels: [mark, row, col, height, range]
        in_data = in_file[:, :, 1:-1]            # keep the middle feature channels
        in_data = in_data.transpose((2, 0, 1))   # (H, W, C) -> (C, H, W) for Caffe
        in_label = in_file[:, :, 0]              # the mark channel is the label
        return in_data, in_label

    def load_file_name(self, path, list_name):
        for file in os.listdir(path):
            file_path = os.path.join(path, file)
            if os.path.isdir(file_path):
                # recurse into subdirectories (the original mistakenly called
                # os.listdir(file_path, list_name) here)
                self.load_file_name(file_path, list_name)
            else:
                list_name.append(file_path)

  • setup(): one-time setup when the layer is constructed (parses param_str, collects the file list)
  • reshape(): resizes the network's input blobs to fit the incoming data
  • forward(): forward pass; since this is the data input layer, it outputs the raw point cloud data and its classification label
  • backward(): backward pass; a data layer has nothing to backpropagate, so it is a no-op
  • load_file_name(): walks the given directory and collects the .npy file paths into a list
  • load_file(): loads a single .npy file, splits its channels into features and label by their stored order, and returns data and label (see the sanity check below)
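A quick way to sanity-check the layer outside Caffe is to replay the load_file logic by hand on one frame (the file path below is hypothetical). The shapes should match the "Top shape: 1 3 16 2016" and "Top shape: 1 16 2016" lines in the training log of section 3:

import numpy as np

in_file = np.load('../fcn_data_gen/data/npy/0.npy')  # (16, 2016, 5) per frame
in_data = in_file[:, :, 1:-1].transpose((2, 0, 1))   # -> (3, 16, 2016)
in_label = in_file[:, :, 0]                          # -> (16, 2016)
print(in_data.shape, in_label.shape)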

2.2 FCN-AlexNet Model Definition (net.py)

net.py is used to generate the network prototxt files (make_net() below writes train.prototxt and val.prototxt), which define the structure of the whole model and the parameters of every layer. The network structure can of course be exported from the officially trained fcn-alexnet-pascal.caffemodel, or generated yourself with net.py; to keep things simple, this article exports the network structure from fcn-alexnet-pascal.caffemodel.

import sys
sys.path.append('../../python')

import caffe
from caffe import layers as L, params as P
from caffe.coord_map import crop


def conv_relu(bottom, ks, nout, stride=1, pad=0, group=1):
    conv = L.Convolution(bottom, kernel_size=ks, stride=stride,
                         num_output=nout, pad=pad, group=group)
    return conv, L.ReLU(conv, in_place=True)


def max_pool(bottom, ks, stride=1):
    return L.Pooling(bottom, pool=P.Pooling.MAX, kernel_size=ks, stride=stride)


def fcn(split):
    n = caffe.NetSpec()
    pydata_params = dict()
    pydata_params['pcl_dir'] = '../fcn_data_gen/data/npy'  # .npy files path
    pylayer = 'PCLSegDataLayer'
    n.data, n.label = L.Python(module='pcl_data_layer', layer=pylayer,
                               ntop=2, param_str=str(pydata_params))

    # the base net
    n.conv1, n.relu1 = conv_relu(n.data, 11, 96, stride=4, pad=100)
    n.pool1 = max_pool(n.relu1, 3, stride=2)
    n.norm1 = L.LRN(n.pool1, local_size=5, alpha=1e-4, beta=0.75)
    n.conv2, n.relu2 = conv_relu(n.norm1, 5, 256, pad=2, group=2)
    n.pool2 = max_pool(n.relu2, 3, stride=2)
    n.norm2 = L.LRN(n.pool2, local_size=5, alpha=1e-4, beta=0.75)
    n.conv3, n.relu3 = conv_relu(n.norm2, 3, 384, pad=1)
    n.conv4, n.relu4 = conv_relu(n.relu3, 3, 384, pad=1, group=2)
    n.conv5, n.relu5 = conv_relu(n.relu4, 3, 256, pad=1, group=2)
    n.pool5 = max_pool(n.relu5, 3, stride=2)

    # fully conv
    n.fc6, n.relu6 = conv_relu(n.pool5, 6, 4096)
    n.drop6 = L.Dropout(n.relu6, dropout_ratio=0.5, in_place=True)
    n.fc7, n.relu7 = conv_relu(n.drop6, 1, 4096)
    n.drop7 = L.Dropout(n.relu7, dropout_ratio=0.5, in_place=True)
    n.score_fr = L.Convolution(n.drop7, num_output=21, kernel_size=1, pad=0,
        param=[dict(lr_mult=1, decay_mult=1), dict(lr_mult=2, decay_mult=0)])
    n.upscore = L.Deconvolution(n.score_fr,
        convolution_param=dict(num_output=21, kernel_size=63, stride=32,
            bias_term=False),
        param=[dict(lr_mult=0)])
    n.score = crop(n.upscore, n.data)
    n.loss = L.SoftmaxWithLoss(n.score, n.label,
        loss_param=dict(normalize=True, ignore_label=255))
    return n.to_proto()


def make_net():
    with open('train.prototxt', 'w') as f:
        f.write(str(fcn('train')))
    with open('val.prototxt', 'w') as f:
        f.write(str(fcn('seg11valid')))


if __name__ == '__main__':
    make_net()

  • conv_relu(): helper that defines a convolution layer followed by a ReLU
  • max_pool(): helper that defines a max-pooling layer
  • fcn(): defines the network structure of the model

A detailed look at the fcn() model structure

I recommend reading this alongside the original AlexNet paper, ImageNet Classification with Deep Convolutional Neural Networks, and comparing against a diagram of the AlexNet architecture; that makes it much easier to understand what each parameter means.

(1). Data input layer
n = caffe.NetSpec()
pydata_params = dict()
pydata_params['pcl_dir'] = '../fcn_data_gen/data/npy'  # .npy files path
pylayer = 'PCLSegDataLayer'
n.data, n.label = L.Python(module='pcl_data_layer', layer=pylayer,
                           ntop=2, param_str=str(pydata_params))

This locates the PCLSegDataLayer class in pcl_data_layer.py and uses the way that class handles data as the model's data input layer.

(2). First convolutional layer
n.conv1, n.relu1 = conv_relu(n.data, 11, 96, stride=4, pad=100)
n.pool1 = max_pool(n.relu1, 3, stride=2)
n.norm1 = L.LRN(n.pool1, local_size=5, alpha=1e-4, beta=0.75)

For a detailed explanation of why pad=100, see FCN学习:Semantic Segmentation (reference 1).
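A quick way to see what pad=100 buys you is to trace the output sizes. Caffe computes convolution outputs as floor((in + 2*pad - kernel)/stride) + 1, and pooling outputs with ceil instead of floor. Without the extra padding, the 16-row input would shrink to nothing long before fc6's 6x6 kernel; with pad=100, the sketch below reproduces exactly the shapes in the training log of section 3:

def conv_out(n, k, s=1, p=0):
    return (n + 2 * p - k) // s + 1       # Caffe convolutions round down

def pool_out(n, k, s=1, p=0):
    return -((n + 2 * p - k) // -s) + 1   # Caffe pooling rounds up

h, w = 16, 2016                                          # data:  16 x 2016
h, w = conv_out(h, 11, 4, 100), conv_out(w, 11, 4, 100)  # conv1: 52 x 552
h, w = pool_out(h, 3, 2), pool_out(w, 3, 2)              # pool1: 26 x 276
h, w = pool_out(h, 3, 2), pool_out(w, 3, 2)              # pool2: 13 x 138
h, w = pool_out(h, 3, 2), pool_out(w, 3, 2)              # pool5:  6 x 69
# conv2..conv5 keep the spatial size (pad matches kernel), so only the pools shrink it
print(conv_out(h, 6), conv_out(w, 6))                    # fc6:    1 x 64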

(3). Second convolutional layer
n.conv2, n.relu2 = conv_relu(n.norm1, 5, 256, pad=2, group=2)
n.pool2 = max_pool(n.relu2, 3, stride=2)
n.norm2 = L.LRN(n.pool2, local_size=5, alpha=1e-4, beta=0.75)
(4). Third convolutional layer
n.conv3, n.relu3 = conv_relu(n.norm2, 3, 384, pad=1)
(5). Fourth convolutional layer
n.conv4, n.relu4 = conv_relu(n.relu3, 3, 384, pad=1, group=2)
(6). Fifth convolutional layer
n.conv5, n.relu5 = conv_relu(n.relu4, 3, 256, pad=1, group=2)
n.pool5 = max_pool(n.relu5, 3, stride=2)
(7). Sixth (fully connected) layer
n.fc6, n.relu6 = conv_relu(n.pool5, 6, 4096)
n.drop6 = L.Dropout(n.relu6, dropout_ratio=0.5, in_place=True)
(8). Seventh (fully connected) layer
n.fc7, n.relu7 = conv_relu(n.drop6, 1, 4096)
n.drop7 = L.Dropout(n.relu7, dropout_ratio=0.5, in_place=True)
(9). Eighth (fully connected) layer: scoring, upsampling, and loss
n.score_fr = L.Convolution(n.drop7, num_output=21, kernel_size=1, pad=0,
    param=[dict(lr_mult=1, decay_mult=1), dict(lr_mult=2, decay_mult=0)])
n.upscore = L.Deconvolution(n.score_fr,
    convolution_param=dict(num_output=21, kernel_size=63, stride=32,
        bias_term=False),
    param=[dict(lr_mult=0)])
n.score = crop(n.upscore, n.data)
n.loss = L.SoftmaxWithLoss(n.score, n.label,
    loss_param=dict(normalize=True, ignore_label=255))
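The deconvolution inverts the size formula: out = (in - 1) * stride + kernel. Starting from the 1 x 64 score_fr map computed above, this gives exactly the upscore shape seen in the training log, and the Crop layer (offset 18 in the generated prototxt) then cuts it back to the 16 x 2016 input size so that score aligns with label:

def deconv_out(n, k, s):
    return (n - 1) * s + k        # Caffe deconvolution output size

h, w = deconv_out(1, 63, 32), deconv_out(64, 63, 32)
print(h, w)                       # upscore: 63 x 2079, cropped back to 16 x 2016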

2.3 FCN-AlexNet Solver (solve.py)

solve.py is the entry point of the whole model: it ties the individual files together, takes in the external parameters, runs the solver, and outputs the results. The solver.prototxt file it loads defines the solver configuration (learning rate, snapshot schedule, and so on; the effective values appear at the top of the training log in section 3).

import caffe
import surgery, score

import numpy as np
import os
import sys

try:
    import setproctitle
    setproctitle.setproctitle(os.path.basename(os.getcwd()))
except:
    pass

weights = '../ilsvrc-nets/fcn-alexnet-pascal.caffemodel'

# init
# caffe.set_device(int(sys.argv[0]))
# caffe.set_mode_gpu()

solver = caffe.SGDSolver('solver.prototxt')
solver.net.copy_from(weights)

# surgeries
interp_layers = [k for k in solver.net.params.keys() if 'up' in k]
surgery.interp(solver.net, interp_layers)

# scoring
val = np.loadtxt('../data/pascal/seg11valid.txt', dtype=str)

for _ in range(25):
    solver.step(4000)
    score.seg_tests(solver, False, val, layer='score')
  • weights = '../ilsvrc-nets/fcn-alexnet-pascal.caffemodel': loads the pretrained model; you can paste net.prototxt into [Netscope] to visualize the network structure
  • # caffe.set_device(int(sys.argv[0]))
    # caffe.set_mode_gpu(): enables GPU training; it raised errors on my machine, so I left it commented out
  • solver = caffe.SGDSolver('solver.prototxt')
    solver.net.copy_from(weights): builds the SGD solver and copies the pretrained weights into the net
  • # surgeries: (to be filled in; see the sketch after this list)
  • # scoring: (to be filled in)
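Although the surgery step is left to be filled in above, in the upstream fcn.berkeleyvision.org repo surgery.interp(net, layers) initializes each deconvolution layer with fixed bilinear-upsampling weights (fixed here because upscore has lr_mult=0). A sketch of the core filter construction, adapted from that repo's surgery.py:

import numpy as np

def upsample_filt(size):
    # 2-D bilinear interpolation kernel of the given side length
    factor = (size + 1) // 2
    center = factor - 1 if size % 2 == 1 else factor - 0.5
    og = np.ogrid[:size, :size]
    return ((1 - abs(og[0] - center) / factor) *
            (1 - abs(og[1] - center) / factor))

print(upsample_filt(4))  # 4 x 4 bilinear kernel, e.g. for a stride-2 deconv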

3. Point Cloud Segmentation Experiment Results


pydev debugger: process 9249 is connecting
Connected to pydev debugger (build 173.4301.16)
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0313 11:41:39.369604 9249 solver.cpp:45] Initializing solver from parameters:
train_net: "train.prototxt"
test_net: "val.prototxt"
test_iter: 736
test_interval: 999999999
base_lr: 0.0001
display: 20
max_iter: 100000
lr_policy: "fixed"
momentum: 0.9
weight_decay: 0.0005
snapshot: 4000
snapshot_prefix: "snapshot/train"
test_initialization: false
average_loss: 20
iter_size: 20
I0313 11:41:39.369671 9249 solver.cpp:92] Creating training net from train_net file: train.prototxt
I0313 11:41:39.370101 9249 net.cpp:51] Initializing net from parameters:
state {
phase: TRAIN
}
layer {
name: "data"
type: "Python"
top: "data"
top: "label"
python_param {
module: "pcl_data_layer"
layer: "PCLSegDataLayer"
param_str: "{\'pcl_dir\': \'/home/zzy/CLionProjects/ROS_Project/ws/src/fcn_data_gen/data/npy\'}"
}
}
layer {
name: "conv1"
type: "Convolution"
bottom: "data"
top: "conv1"
convolution_param {
num_output: 96
pad: 100
kernel_size: 11
group: 1
stride: 4
}
}
layer {
name: "relu1"
type: "ReLU"
bottom: "conv1"
top: "conv1"
}
layer {
name: "pool1"
type: "Pooling"
bottom: "conv1"
top: "pool1"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "norm1"
type: "LRN"
bottom: "pool1"
top: "norm1"
lrn_param {
local_size: 5
alpha: 0.0001
beta: 0.75
}
}
layer {
name: "conv2"
type: "Convolution"
bottom: "norm1"
top: "conv2"
convolution_param {
num_output: 256
pad: 2
kernel_size: 5
group: 2
stride: 1
}
}
layer {
name: "relu2"
type: "ReLU"
bottom: "conv2"
top: "conv2"
}
layer {
name: "pool2"
type: "Pooling"
bottom: "conv2"
top: "pool2"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "norm2"
type: "LRN"
bottom: "pool2"
top: "norm2"
lrn_param {
local_size: 5
alpha: 0.0001
beta: 0.75
}
}
layer {
name: "conv3"
type: "Convolution"
bottom: "norm2"
top: "conv3"
convolution_param {
num_output: 384
pad: 1
kernel_size: 3
group: 1
stride: 1
}
}
layer {
name: "relu3"
type: "ReLU"
bottom: "conv3"
top: "conv3"
}
layer {
name: "conv4"
type: "Convolution"
bottom: "conv3"
top: "conv4"
convolution_param {
num_output: 384
pad: 1
kernel_size: 3
group: 2
stride: 1
}
}
layer {
name: "relu4"
type: "ReLU"
bottom: "conv4"
top: "conv4"
}
layer {
name: "conv5"
type: "Convolution"
bottom: "conv4"
top: "conv5"
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
group: 2
stride: 1
}
}
layer {
name: "relu5"
type: "ReLU"
bottom: "conv5"
top: "conv5"
}
layer {
name: "pool5"
type: "Pooling"
bottom: "conv5"
top: "pool5"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "fc6"
type: "Convolution"
bottom: "pool5"
top: "fc6"
convolution_param {
num_output: 4096
pad: 0
kernel_size: 6
group: 1
stride: 1
}
}
layer {
name: "relu6"
type: "ReLU"
bottom: "fc6"
top: "fc6"
}
layer {
name: "drop6"
type: "Dropout"
bottom: "fc6"
top: "fc6"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "fc7"
type: "Convolution"
bottom: "fc6"
top: "fc7"
convolution_param {
num_output: 4096
pad: 0
kernel_size: 1
group: 1
stride: 1
}
}
layer {
name: "relu7"
type: "ReLU"
bottom: "fc7"
top: "fc7"
}
layer {
name: "drop7"
type: "Dropout"
bottom: "fc7"
top: "fc7"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "score_fr"
type: "Convolution"
bottom: "fc7"
top: "score_fr"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 21
pad: 0
kernel_size: 1
}
}
layer {
name: "upscore"
type: "Deconvolution"
bottom: "score_fr"
top: "upscore"
param {
lr_mult: 0
}
convolution_param {
num_output: 21
bias_term: false
kernel_size: 63
stride: 32
}
}
layer {
name: "score"
type: "Crop"
bottom: "upscore"
bottom: "data"
top: "score"
crop_param {
axis: 2
offset: 18
}
}
layer {
name: "loss"
type: "SoftmaxWithLoss"
bottom: "score"
bottom: "label"
top: "loss"
loss_param {
ignore_label: 255
normalize: true
}
}
I0313 11:41:39.370163 9249 layer_factory.hpp:77] Creating layer data
I0313 11:41:39.370743 9249 net.cpp:84] Creating Layer data
I0313 11:41:39.370753 9249 net.cpp:380] data -> data
I0313 11:41:39.370759 9249 net.cpp:380] data -> label
I0313 11:41:39.372340 9249 net.cpp:122] Setting up data
I0313 11:41:39.372354 9249 net.cpp:129] Top shape: 1 3 16 2016 (96768)
I0313 11:41:39.372357 9249 net.cpp:129] Top shape: 1 16 2016 (32256)
I0313 11:41:39.372360 9249 net.cpp:137] Memory required for data: 516096
I0313 11:41:39.372364 9249 layer_factory.hpp:77] Creating layer data_data_0_split
I0313 11:41:39.372370 9249 net.cpp:84] Creating Layer data_data_0_split
I0313 11:41:39.372372 9249 net.cpp:406] data_data_0_split <- data
I0313 11:41:39.372376 9249 net.cpp:380] data_data_0_split -> data_data_0_split_0
I0313 11:41:39.372382 9249 net.cpp:380] data_data_0_split -> data_data_0_split_1
I0313 11:41:39.372387 9249 net.cpp:122] Setting up data_data_0_split
I0313 11:41:39.372391 9249 net.cpp:129] Top shape: 1 3 16 2016 (96768)
I0313 11:41:39.372395 9249 net.cpp:129] Top shape: 1 3 16 2016 (96768)
I0313 11:41:39.372397 9249 net.cpp:137] Memory required for data: 1290240
I0313 11:41:39.372400 9249 layer_factory.hpp:77] Creating layer conv1
I0313 11:41:39.372406 9249 net.cpp:84] Creating Layer conv1
I0313 11:41:39.372409 9249 net.cpp:406] conv1 <- data_data_0_split_0
I0313 11:41:39.372412 9249 net.cpp:380] conv1 -> conv1
I0313 11:41:39.372515 9249 net.cpp:122] Setting up conv1
I0313 11:41:39.372521 9249 net.cpp:129] Top shape: 1 96 52 552 (2755584)
I0313 11:41:39.372524 9249 net.cpp:137] Memory required for data: 12312576
I0313 11:41:39.372531 9249 layer_factory.hpp:77] Creating layer relu1
I0313 11:41:39.372535 9249 net.cpp:84] Creating Layer relu1
I0313 11:41:39.372539 9249 net.cpp:406] relu1 <- conv1
I0313 11:41:39.372541 9249 net.cpp:367] relu1 -> conv1 (in-place)
I0313 11:41:39.372546 9249 net.cpp:122] Setting up relu1
I0313 11:41:39.372550 9249 net.cpp:129] Top shape: 1 96 52 552 (2755584)
I0313 11:41:39.372552 9249 net.cpp:137] Memory required for data: 23334912
I0313 11:41:39.372555 9249 layer_factory.hpp:77] Creating layer pool1
I0313 11:41:39.372558 9249 net.cpp:84] Creating Layer pool1
I0313 11:41:39.372560 9249 net.cpp:406] pool1 <- conv1
I0313 11:41:39.372565 9249 net.cpp:380] pool1 -> pool1
I0313 11:41:39.372573 9249 net.cpp:122] Setting up pool1
I0313 11:41:39.372576 9249 net.cpp:129] Top shape: 1 96 26 276 (688896)
I0313 11:41:39.372579 9249 net.cpp:137] Memory required for data: 26090496
I0313 11:41:39.372581 9249 layer_factory.hpp:77] Creating layer norm1
I0313 11:41:39.372586 9249 net.cpp:84] Creating Layer norm1
I0313 11:41:39.372588 9249 net.cpp:406] norm1 <- pool1
I0313 11:41:39.372593 9249 net.cpp:380] norm1 -> norm1
I0313 11:41:39.372599 9249 net.cpp:122] Setting up norm1
I0313 11:41:39.372602 9249 net.cpp:129] Top shape: 1 96 26 276 (688896)
I0313 11:41:39.372604 9249 net.cpp:137] Memory required for data: 28846080
I0313 11:41:39.372607 9249 layer_factory.hpp:77] Creating layer conv2
I0313 11:41:39.372611 9249 net.cpp:84] Creating Layer conv2
I0313 11:41:39.372613 9249 net.cpp:406] conv2 <- norm1
I0313 11:41:39.372617 9249 net.cpp:380] conv2 -> conv2
I0313 11:41:39.373008 9249 net.cpp:122] Setting up conv2
I0313 11:41:39.373013 9249 net.cpp:129] Top shape: 1 256 26 276 (1837056)
I0313 11:41:39.373015 9249 net.cpp:137] Memory required for data: 36194304
I0313 11:41:39.373021 9249 layer_factory.hpp:77] Creating layer relu2
I0313 11:41:39.373025 9249 net.cpp:84] Creating Layer relu2
I0313 11:41:39.373028 9249 net.cpp:406] relu2 <- conv2
I0313 11:41:39.373030 9249 net.cpp:367] relu2 -> conv2 (in-place)
I0313 11:41:39.373034 9249 net.cpp:122] Setting up relu2
I0313 11:41:39.373039 9249 net.cpp:129] Top shape: 1 256 26 276 (1837056)
I0313 11:41:39.373040 9249 net.cpp:137] Memory required for data: 43542528
I0313 11:41:39.373042 9249 layer_factory.hpp:77] Creating layer pool2
I0313 11:41:39.373046 9249 net.cpp:84] Creating Layer pool2
I0313 11:41:39.373049 9249 net.cpp:406] pool2 <- conv2
I0313 11:41:39.373052 9249 net.cpp:380] pool2 -> pool2
I0313 11:41:39.373057 9249 net.cpp:122] Setting up pool2
I0313 11:41:39.373061 9249 net.cpp:129] Top shape: 1 256 13 138 (459264)
I0313 11:41:39.373064 9249 net.cpp:137] Memory required for data: 45379584
I0313 11:41:39.373065 9249 layer_factory.hpp:77] Creating layer norm2
I0313 11:41:39.373070 9249 net.cpp:84] Creating Layer norm2
I0313 11:41:39.373071 9249 net.cpp:406] norm2 <- pool2
I0313 11:41:39.373075 9249 net.cpp:380] norm2 -> norm2
I0313 11:41:39.373080 9249 net.cpp:122] Setting up norm2
I0313 11:41:39.373082 9249 net.cpp:129] Top shape: 1 256 13 138 (459264)
I0313 11:41:39.373085 9249 net.cpp:137] Memory required for data: 47216640
I0313 11:41:39.373087 9249 layer_factory.hpp:77] Creating layer conv3
I0313 11:41:39.373091 9249 net.cpp:84] Creating Layer conv3
I0313 11:41:39.373093 9249 net.cpp:406] conv3 <- norm2
I0313 11:41:39.373096 9249 net.cpp:380] conv3 -> conv3
I0313 11:41:39.373900 9249 net.cpp:122] Setting up conv3
I0313 11:41:39.373906 9249 net.cpp:129] Top shape: 1 384 13 138 (688896)
I0313 11:41:39.373909 9249 net.cpp:137] Memory required for data: 49972224
I0313 11:41:39.373914 9249 layer_factory.hpp:77] Creating layer relu3
I0313 11:41:39.373919 9249 net.cpp:84] Creating Layer relu3
I0313 11:41:39.373921 9249 net.cpp:406] relu3 <- conv3
I0313 11:41:39.373924 9249 net.cpp:367] relu3 -> conv3 (in-place)
I0313 11:41:39.373929 9249 net.cpp:122] Setting up relu3
I0313 11:41:39.373931 9249 net.cpp:129] Top shape: 1 384 13 138 (688896)
I0313 11:41:39.373934 9249 net.cpp:137] Memory required for data: 52727808
I0313 11:41:39.373936 9249 layer_factory.hpp:77] Creating layer conv4
I0313 11:41:39.373941 9249 net.cpp:84] Creating Layer conv4
I0313 11:41:39.373944 9249 net.cpp:406] conv4 <- conv3
I0313 11:41:39.373947 9249 net.cpp:380] conv4 -> conv4
I0313 11:41:39.374778 9249 net.cpp:122] Setting up conv4
I0313 11:41:39.374783 9249 net.cpp:129] Top shape: 1 384 13 138 (688896)
I0313 11:41:39.374785 9249 net.cpp:137] Memory required for data: 55483392
I0313 11:41:39.374789 9249 layer_factory.hpp:77] Creating layer relu4
I0313 11:41:39.374794 9249 net.cpp:84] Creating Layer relu4
I0313 11:41:39.374795 9249 net.cpp:406] relu4 <- conv4
I0313 11:41:39.374800 9249 net.cpp:367] relu4 -> conv4 (in-place)
I0313 11:41:39.374804 9249 net.cpp:122] Setting up relu4
I0313 11:41:39.374807 9249 net.cpp:129] Top shape: 1 384 13 138 (688896)
I0313 11:41:39.374809 9249 net.cpp:137] Memory required for data: 58238976
I0313 11:41:39.374811 9249 layer_factory.hpp:77] Creating layer conv5
I0313 11:41:39.374816 9249 net.cpp:84] Creating Layer conv5
I0313 11:41:39.374819 9249 net.cpp:406] conv5 <- conv4
I0313 11:41:39.374824 9249 net.cpp:380] conv5 -> conv5
I0313 11:41:39.375376 9249 net.cpp:122] Setting up conv5
I0313 11:41:39.375382 9249 net.cpp:129] Top shape: 1 256 13 138 (459264)
I0313 11:41:39.375385 9249 net.cpp:137] Memory required for data: 60076032
I0313 11:41:39.375392 9249 layer_factory.hpp:77] Creating layer relu5
I0313 11:41:39.375397 9249 net.cpp:84] Creating Layer relu5
I0313 11:41:39.375399 9249 net.cpp:406] relu5 <- conv5
I0313 11:41:39.375402 9249 net.cpp:367] relu5 -> conv5 (in-place)
I0313 11:41:39.375406 9249 net.cpp:122] Setting up relu5
I0313 11:41:39.375409 9249 net.cpp:129] Top shape: 1 256 13 138 (459264)
I0313 11:41:39.375412 9249 net.cpp:137] Memory required for data: 61913088
I0313 11:41:39.375414 9249 layer_factory.hpp:77] Creating layer pool5
I0313 11:41:39.375421 9249 net.cpp:84] Creating Layer pool5
I0313 11:41:39.375422 9249 net.cpp:406] pool5 <- conv5
I0313 11:41:39.375425 9249 net.cpp:380] pool5 -> pool5
I0313 11:41:39.375432 9249 net.cpp:122] Setting up pool5
I0313 11:41:39.375434 9249 net.cpp:129] Top shape: 1 256 6 69 (105984)
I0313 11:41:39.375437 9249 net.cpp:137] Memory required for data: 62337024
I0313 11:41:39.375439 9249 layer_factory.hpp:77] Creating layer fc6
I0313 11:41:39.375444 9249 net.cpp:84] Creating Layer fc6
I0313 11:41:39.375447 9249 net.cpp:406] fc6 <- pool5
I0313 11:41:39.375452 9249 net.cpp:380] fc6 -> fc6
I0313 11:41:39.404399 9249 net.cpp:122] Setting up fc6
I0313 11:41:39.404426 9249 net.cpp:129] Top shape: 1 4096 1 64 (262144)
I0313 11:41:39.404430 9249 net.cpp:137] Memory required for data: 63385600
I0313 11:41:39.404438 9249 layer_factory.hpp:77] Creating layer relu6
I0313 11:41:39.404445 9249 net.cpp:84] Creating Layer relu6
I0313 11:41:39.404449 9249 net.cpp:406] relu6 <- fc6
I0313 11:41:39.404453 9249 net.cpp:367] relu6 -> fc6 (in-place)
I0313 11:41:39.404460 9249 net.cpp:122] Setting up relu6
I0313 11:41:39.404464 9249 net.cpp:129] Top shape: 1 4096 1 64 (262144)
I0313 11:41:39.404466 9249 net.cpp:137] Memory required for data: 64434176
I0313 11:41:39.404469 9249 layer_factory.hpp:77] Creating layer drop6
I0313 11:41:39.404474 9249 net.cpp:84] Creating Layer drop6
I0313 11:41:39.404476 9249 net.cpp:406] drop6 <- fc6
I0313 11:41:39.404481 9249 net.cpp:367] drop6 -> fc6 (in-place)
I0313 11:41:39.404486 9249 net.cpp:122] Setting up drop6
I0313 11:41:39.404489 9249 net.cpp:129] Top shape: 1 4096 1 64 (262144)
I0313 11:41:39.404492 9249 net.cpp:137] Memory required for data: 65482752
I0313 11:41:39.404495 9249 layer_factory.hpp:77] Creating layer fc7
I0313 11:41:39.404500 9249 net.cpp:84] Creating Layer fc7
I0313 11:41:39.404503 9249 net.cpp:406] fc7 <- fc6
I0313 11:41:39.404506 9249 net.cpp:380] fc7 -> fc7
I0313 11:41:39.417629 9249 net.cpp:122] Setting up fc7
I0313 11:41:39.417654 9249 net.cpp:129] Top shape: 1 4096 1 64 (262144)
I0313 11:41:39.417657 9249 net.cpp:137] Memory required for data: 66531328
I0313 11:41:39.417665 9249 layer_factory.hpp:77] Creating layer relu7
I0313 11:41:39.417672 9249 net.cpp:84] Creating Layer relu7
I0313 11:41:39.417676 9249 net.cpp:406] relu7 <- fc7
I0313 11:41:39.417680 9249 net.cpp:367] relu7 -> fc7 (in-place)
I0313 11:41:39.417687 9249 net.cpp:122] Setting up relu7
I0313 11:41:39.417690 9249 net.cpp:129] Top shape: 1 4096 1 64 (262144)
I0313 11:41:39.417693 9249 net.cpp:137] Memory required for data: 67579904
I0313 11:41:39.417696 9249 layer_factory.hpp:77] Creating layer drop7
I0313 11:41:39.417703 9249 net.cpp:84] Creating Layer drop7
I0313 11:41:39.417706 9249 net.cpp:406] drop7 <- fc7
I0313 11:41:39.417709 9249 net.cpp:367] drop7 -> fc7 (in-place)
I0313 11:41:39.417713 9249 net.cpp:122] Setting up drop7
I0313 11:41:39.417716 9249 net.cpp:129] Top shape: 1 4096 1 64 (262144)
I0313 11:41:39.417719 9249 net.cpp:137] Memory required for data: 68628480
I0313 11:41:39.417721 9249 layer_factory.hpp:77] Creating layer score_fr
I0313 11:41:39.417727 9249 net.cpp:84] Creating Layer score_fr
I0313 11:41:39.417729 9249 net.cpp:406] score_fr <- fc7
I0313 11:41:39.417734 9249 net.cpp:380] score_fr -> score_fr
I0313 11:41:39.417858 9249 net.cpp:122] Setting up score_fr
I0313 11:41:39.417865 9249 net.cpp:129] Top shape: 1 21 1 64 (1344)
I0313 11:41:39.417867 9249 net.cpp:137] Memory required for data: 68633856
I0313 11:41:39.417871 9249 layer_factory.hpp:77] Creating layer upscore
I0313 11:41:39.417877 9249 net.cpp:84] Creating Layer upscore
I0313 11:41:39.417881 9249 net.cpp:406] upscore <- score_fr
I0313 11:41:39.417884 9249 net.cpp:380] upscore -> upscore
I0313 11:41:39.419461 9249 net.cpp:122] Setting up upscore
I0313 11:41:39.419472 9249 net.cpp:129] Top shape: 1 21 63 2079 (2750517)
I0313 11:41:39.419476 9249 net.cpp:137] Memory required for data: 79635924
I0313 11:41:39.419484 9249 layer_factory.hpp:77] Creating layer score
I0313 11:41:39.419497 9249 net.cpp:84] Creating Layer score
I0313 11:41:39.419499 9249 net.cpp:406] score <- upscore
I0313 11:41:39.419503 9249 net.cpp:406] score <- data_data_0_split_1
I0313 11:41:39.419507 9249 net.cpp:380] score -> score
I0313 11:41:39.419517 9249 net.cpp:122] Setting up score
I0313 11:41:39.419539 9249 net.cpp:129] Top shape: 1 21 16 2016 (677376)
I0313 11:41:39.419543 9249 net.cpp:137] Memory required for data: 82345428
I0313 11:41:39.419544 9249 layer_factory.hpp:77] Creating layer loss
I0313 11:41:39.419554 9249 net.cpp:84] Creating Layer loss
I0313 11:41:39.419558 9249 net.cpp:406] loss <- score
I0313 11:41:39.419560 9249 net.cpp:406] loss <- label
I0313 11:41:39.419564 9249 net.cpp:380] loss -> loss
I0313 11:41:39.419572 9249 layer_factory.hpp:77] Creating layer loss
I0313 11:41:39.420116 9249 net.cpp:122] Setting up loss
I0313 11:41:39.420122 9249 net.cpp:129] Top shape: (1)
I0313 11:41:39.420125 9249 net.cpp:132] with loss weight 1
I0313 11:41:39.420135 9249 net.cpp:137] Memory required for data: 82345432
I0313 11:41:39.420137 9249 net.cpp:198] loss needs backward computation.
I0313 11:41:39.420140 9249 net.cpp:198] score needs backward computation.
I0313 11:41:39.420143 9249 net.cpp:198] upscore needs backward computation.
I0313 11:41:39.420145 9249 net.cpp:198] score_fr needs backward computation.
I0313 11:41:39.420148 9249 net.cpp:198] drop7 needs backward computation.
I0313 11:41:39.420151 9249 net.cpp:198] relu7 needs backward computation.
I0313 11:41:39.420155 9249 net.cpp:198] fc7 needs backward computation.
I0313 11:41:39.420156 9249 net.cpp:198] drop6 needs backward computation.
I0313 11:41:39.420159 9249 net.cpp:198] relu6 needs backward computation.
I0313 11:41:39.420161 9249 net.cpp:198] fc6 needs backward computation.
I0313 11:41:39.420164 9249 net.cpp:198] pool5 needs backward computation.
I0313 11:41:39.420167 9249 net.cpp:198] relu5 needs backward computation.
I0313 11:41:39.420169 9249 net.cpp:198] conv5 needs backward computation.
I0313 11:41:39.420172 9249 net.cpp:198] relu4 needs backward computation.
I0313 11:41:39.420176 9249 net.cpp:198] conv4 needs backward computation.
I0313 11:41:39.420177 9249 net.cpp:198] relu3 needs backward computation.
I0313 11:41:39.420181 9249 net.cpp:198] conv3 needs backward computation.
I0313 11:41:39.420183 9249 net.cpp:198] norm2 needs backward computation.
I0313 11:41:39.420186 9249 net.cpp:198] pool2 needs backward computation.
I0313 11:41:39.420189 9249 net.cpp:198] relu2 needs backward computation.
I0313 11:41:39.420192 9249 net.cpp:198] conv2 needs backward computation.
I0313 11:41:39.420194 9249 net.cpp:198] norm1 needs backward computation.
I0313 11:41:39.420197 9249 net.cpp:198] pool1 needs backward computation.
I0313 11:41:39.420200 9249 net.cpp:198] relu1 needs backward computation.
I0313 11:41:39.420203 9249 net.cpp:198] conv1 needs backward computation.
I0313 11:41:39.420207 9249 net.cpp:200] data_data_0_split does not need backward computation.
I0313 11:41:39.420210 9249 net.cpp:200] data does not need backward computation.
I0313 11:41:39.420212 9249 net.cpp:242] This network produces output loss
I0313 11:41:39.420224 9249 net.cpp:255] Network initialization done.
I0313 11:41:39.420586 9249 solver.cpp:190] Creating test net (#0) specified by test_net file: val.prototxt
I0313 11:41:39.420764 9249 net.cpp:51] Initializing net from parameters:
state {
phase: TEST
}
layer {
name: "data"
type: "Python"
top: "data"
top: "label"
python_param {
module: "pcl_data_layer"
layer: "PCLSegDataLayer"
param_str: "{\'pcl_dir\': \'/home/zzy/CLionProjects/ROS_Project/ws/src/fcn_data_gen/data/npy\'}"
}
}
layer {
name: "conv1"
type: "Convolution"
bottom: "data"
top: "conv1"
convolution_param {
num_output: 96
pad: 100
kernel_size: 11
group: 1
stride: 4
}
}
layer {
name: "relu1"
type: "ReLU"
bottom: "conv1"
top: "conv1"
}
layer {
name: "pool1"
type: "Pooling"
bottom: "conv1"
top: "pool1"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "norm1"
type: "LRN"
bottom: "pool1"
top: "norm1"
lrn_param {
local_size: 5
alpha: 0.0001
beta: 0.75
}
}
layer {
name: "conv2"
type: "Convolution"
bottom: "norm1"
top: "conv2"
convolution_param {
num_output: 256
pad: 2
kernel_size: 5
group: 2
stride: 1
}
}
layer {
name: "relu2"
type: "ReLU"
bottom: "conv2"
top: "conv2"
}
layer {
name: "pool2"
type: "Pooling"
bottom: "conv2"
top: "pool2"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "norm2"
type: "LRN"
bottom: "pool2"
top: "norm2"
lrn_param {
local_size: 5
alpha: 0.0001
beta: 0.75
}
}
layer {
name: "conv3"
type: "Convolution"
bottom: "norm2"
top: "conv3"
convolution_param {
num_output: 384
pad: 1
kernel_size: 3
group: 1
stride: 1
}
}
layer {
name: "relu3"
type: "ReLU"
bottom: "conv3"
top: "conv3"
}
layer {
name: "conv4"
type: "Convolution"
bottom: "conv3"
top: "conv4"
convolution_param {
num_output: 384
pad: 1
kernel_size: 3
group: 2
stride: 1
}
}
layer {
name: "relu4"
type: "ReLU"
bottom: "conv4"
top: "conv4"
}
layer {
name: "conv5"
type: "Convolution"
bottom: "conv4"
top: "conv5"
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
group: 2
stride: 1
}
}
layer {
name: "relu5"
type: "ReLU"
bottom: "conv5"
top: "conv5"
}
layer {
name: "pool5"
type: "Pooling"
bottom: "conv5"
top: "pool5"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "fc6"
type: "Convolution"
bottom: "pool5"
top: "fc6"
convolution_param {
num_output: 4096
pad: 0
kernel_size: 6
group: 1
stride: 1
}
}
layer {
name: "relu6"
type: "ReLU"
bottom: "fc6"
top: "fc6"
}
layer {
name: "drop6"
type: "Dropout"
bottom: "fc6"
top: "fc6"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "fc7"
type: "Convolution"
bottom: "fc6"
top: "fc7"
convolution_param {
num_output: 4096
pad: 0
kernel_size: 1
group: 1
stride: 1
}
}
layer {
name: "relu7"
type: "ReLU"
bottom: "fc7"
top: "fc7"
}
layer {
name: "drop7"
type: "Dropout"
bottom: "fc7"
top: "fc7"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "score_fr"
type: "Convolution"
bottom: "fc7"
top: "score_fr"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 21
pad: 0
kernel_size: 1
}
}
layer {
name: "upscore"
type: "Deconvolution"
bottom: "score_fr"
top: "upscore"
param {
lr_mult: 0
}
convolution_param {
num_output: 21
bias_term: false
kernel_size: 63
stride: 32
}
}
layer {
name: "score"
type: "Crop"
bottom: "upscore"
bottom: "data"
top: "score"
crop_param {
axis: 2
offset: 18
}
}
layer {
name: "loss"
type: "SoftmaxWithLoss"
bottom: "score"
bottom: "label"
top: "loss"
loss_param {
ignore_label: 255
normalize: true
}
}
I0313 11:41:39.420830 9249 layer_factory.hpp:77] Creating layer data
I0313 11:41:39.420866 9249 net.cpp:84] Creating Layer data
I0313 11:41:39.420871 9249 net.cpp:380] data -> data
I0313 11:41:39.420877 9249 net.cpp:380] data -> label
I0313 11:41:39.422286 9249 net.cpp:122] Setting up data
I0313 11:41:39.422296 9249 net.cpp:129] Top shape: 1 3 16 2016 (96768)
I0313 11:41:39.422299 9249 net.cpp:129] Top shape: 1 16 2016 (32256)
I0313 11:41:39.422302 9249 net.cpp:137] Memory required for data: 516096
I0313 11:41:39.422305 9249 layer_factory.hpp:77] Creating layer data_data_0_split
I0313 11:41:39.422310 9249 net.cpp:84] Creating Layer data_data_0_split
I0313 11:41:39.422313 9249 net.cpp:406] data_data_0_split <- data
I0313 11:41:39.422317 9249 net.cpp:380] data_data_0_split -> data_data_0_split_0
I0313 11:41:39.422322 9249 net.cpp:380] data_data_0_split -> data_data_0_split_1
I0313 11:41:39.422327 9249 net.cpp:122] Setting up data_data_0_split
I0313 11:41:39.422332 9249 net.cpp:129] Top shape: 1 3 16 2016 (96768)
I0313 11:41:39.422334 9249 net.cpp:129] Top shape: 1 3 16 2016 (96768)
I0313 11:41:39.422338 9249 net.cpp:137] Memory required for data: 1290240
I0313 11:41:39.422339 9249 layer_factory.hpp:77] Creating layer conv1
I0313 11:41:39.422346 9249 net.cpp:84] Creating Layer conv1
I0313 11:41:39.422349 9249 net.cpp:406] conv1 <- data_data_0_split_0
I0313 11:41:39.422353 9249 net.cpp:380] conv1 -> conv1
I0313 11:41:39.422446 9249 net.cpp:122] Setting up conv1
I0313 11:41:39.422451 9249 net.cpp:129] Top shape: 1 96 52 552 (2755584)
I0313 11:41:39.422454 9249 net.cpp:137] Memory required for data: 12312576
I0313 11:41:39.422461 9249 layer_factory.hpp:77] Creating layer relu1
I0313 11:41:39.422466 9249 net.cpp:84] Creating Layer relu1
I0313 11:41:39.422469 9249 net.cpp:406] relu1 <- conv1
I0313 11:41:39.422472 9249 net.cpp:367] relu1 -> conv1 (in-place)
I0313 11:41:39.422477 9249 net.cpp:122] Setting up relu1
I0313 11:41:39.422479 9249 net.cpp:129] Top shape: 1 96 52 552 (2755584)
I0313 11:41:39.422482 9249 net.cpp:137] Memory required for data: 23334912
I0313 11:41:39.422484 9249 layer_factory.hpp:77] Creating layer pool1
I0313 11:41:39.422488 9249 net.cpp:84] Creating Layer pool1
I0313 11:41:39.422490 9249 net.cpp:406] pool1 <- conv1
I0313 11:41:39.422495 9249 net.cpp:380] pool1 -> pool1
I0313 11:41:39.422502 9249 net.cpp:122] Setting up pool1
I0313 11:41:39.422504 9249 net.cpp:129] Top shape: 1 96 26 276 (688896)
I0313 11:41:39.422507 9249 net.cpp:137] Memory required for data: 26090496
I0313 11:41:39.422508 9249 layer_factory.hpp:77] Creating layer norm1
I0313 11:41:39.422513 9249 net.cpp:84] Creating Layer norm1
I0313 11:41:39.422516 9249 net.cpp:406] norm1 <- pool1
I0313 11:41:39.422519 9249 net.cpp:380] norm1 -> norm1
I0313 11:41:39.422524 9249 net.cpp:122] Setting up norm1
I0313 11:41:39.422528 9249 net.cpp:129] Top shape: 1 96 26 276 (688896)
I0313 11:41:39.422529 9249 net.cpp:137] Memory required for data: 28846080
I0313 11:41:39.422531 9249 layer_factory.hpp:77] Creating layer conv2
I0313 11:41:39.422536 9249 net.cpp:84] Creating Layer conv2
I0313 11:41:39.422539 9249 net.cpp:406] conv2 <- norm1
I0313 11:41:39.422543 9249 net.cpp:380] conv2 -> conv2
I0313 11:41:39.422933 9249 net.cpp:122] Setting up conv2
I0313 11:41:39.422940 9249 net.cpp:129] Top shape: 1 256 26 276 (1837056)
I0313 11:41:39.422941 9249 net.cpp:137] Memory required for data: 36194304
I0313 11:41:39.422947 9249 layer_factory.hpp:77] Creating layer relu2
I0313 11:41:39.422951 9249 net.cpp:84] Creating Layer relu2
I0313 11:41:39.422955 9249 net.cpp:406] relu2 <- conv2
I0313 11:41:39.422958 9249 net.cpp:367] relu2 -> conv2 (in-place)
I0313 11:41:39.422962 9249 net.cpp:122] Setting up relu2
I0313 11:41:39.422966 9249 net.cpp:129] Top shape: 1 256 26 276 (1837056)
I0313 11:41:39.422967 9249 net.cpp:137] Memory required for data: 43542528
I0313 11:41:39.422971 9249 layer_factory.hpp:77] Creating layer pool2
I0313 11:41:39.422973 9249 net.cpp:84] Creating Layer pool2
I0313 11:41:39.422976 9249 net.cpp:406] pool2 <- conv2
I0313 11:41:39.422979 9249 net.cpp:380] pool2 -> pool2
I0313 11:41:39.422984 9249 net.cpp:122] Setting up pool2
I0313 11:41:39.422988 9249 net.cpp:129] Top shape: 1 256 13 138 (459264)
I0313 11:41:39.422991 9249 net.cpp:137] Memory required for data: 45379584
I0313 11:41:39.422992 9249 layer_factory.hpp:77] Creating layer norm2
I0313 11:41:39.422997 9249 net.cpp:84] Creating Layer norm2
I0313 11:41:39.422999 9249 net.cpp:406] norm2 <- pool2
I0313 11:41:39.423003 9249 net.cpp:380] norm2 -> norm2
I0313 11:41:39.423008 9249 net.cpp:122] Setting up norm2
I0313 11:41:39.423012 9249 net.cpp:129] Top shape: 1 256 13 138 (459264)
I0313 11:41:39.423013 9249 net.cpp:137] Memory required for data: 47216640
I0313 11:41:39.423015 9249 layer_factory.hpp:77] Creating layer conv3
I0313 11:41:39.423020 9249 net.cpp:84] Creating Layer conv3
I0313 11:41:39.423023 9249 net.cpp:406] conv3 <- norm2
I0313 11:41:39.423027 9249 net.cpp:380] conv3 -> conv3
I0313 11:41:39.423882 9249 net.cpp:122] Setting up conv3
I0313 11:41:39.423888 9249 net.cpp:129] Top shape: 1 384 13 138 (688896)
I0313 11:41:39.423892 9249 net.cpp:137] Memory required for data: 49972224
I0313 11:41:39.423897 9249 layer_factory.hpp:77] Creating layer relu3
I0313 11:41:39.423902 9249 net.cpp:84] Creating Layer relu3
I0313 11:41:39.423904 9249 net.cpp:406] relu3 <- conv3
I0313 11:41:39.423907 9249 net.cpp:367] relu3 -> conv3 (in-place)
I0313 11:41:39.423912 9249 net.cpp:122] Setting up relu3
I0313 11:41:39.423914 9249 net.cpp:129] Top shape: 1 384 13 138 (688896)
I0313 11:41:39.423918 9249 net.cpp:137] Memory required for data: 52727808
I0313 11:41:39.423919 9249 layer_factory.hpp:77] Creating layer conv4
I0313 11:41:39.423923 9249 net.cpp:84] Creating Layer conv4
I0313 11:41:39.423925 9249 net.cpp:406] conv4 <- conv3
I0313 11:41:39.423930 9249 net.cpp:380] conv4 -> conv4
I0313 11:41:39.424738 9249 net.cpp:122] Setting up conv4
I0313 11:41:39.424744 9249 net.cpp:129] Top shape: 1 384 13 138 (688896)
I0313 11:41:39.424747 9249 net.cpp:137] Memory required for data: 55483392
I0313 11:41:39.424751 9249 layer_factory.hpp:77] Creating layer relu4
I0313 11:41:39.424756 9249 net.cpp:84] Creating Layer relu4
I0313 11:41:39.424757 9249 net.cpp:406] relu4 <- conv4
I0313 11:41:39.424762 9249 net.cpp:367] relu4 -> conv4 (in-place)
I0313 11:41:39.424764 9249 net.cpp:122] Setting up relu4
I0313 11:41:39.424767 9249 net.cpp:129] Top shape: 1 384 13 138 (688896)
I0313 11:41:39.424770 9249 net.cpp:137] Memory required for data: 58238976
I0313 11:41:39.424772 9249 layer_factory.hpp:77] Creating layer conv5
I0313 11:41:39.424777 9249 net.cpp:84] Creating Layer conv5
I0313 11:41:39.424779 9249 net.cpp:406] conv5 <- conv4
I0313 11:41:39.424784 9249 net.cpp:380] conv5 -> conv5
I0313 11:41:39.425376 9249 net.cpp:122] Setting up conv5
I0313 11:41:39.425384 9249 net.cpp:129] Top shape: 1 256 13 138 (459264)
I0313 11:41:39.425385 9249 net.cpp:137] Memory required for data: 60076032
I0313 11:41:39.425393 9249 layer_factory.hpp:77] Creating layer relu5
I0313 11:41:39.425397 9249 net.cpp:84] Creating Layer relu5
I0313 11:41:39.425400 9249 net.cpp:406] relu5 <- conv5
I0313 11:41:39.425403 9249 net.cpp:367] relu5 -> conv5 (in-place)
I0313 11:41:39.425406 9249 net.cpp:122] Setting up relu5
I0313 11:41:39.425410 9249 net.cpp:129] Top shape: 1 256 13 138 (459264)
I0313 11:41:39.425412 9249 net.cpp:137] Memory required for data: 61913088
I0313 11:41:39.425415 9249 layer_factory.hpp:77] Creating layer pool5
I0313 11:41:39.425420 9249 net.cpp:84] Creating Layer pool5
I0313 11:41:39.425423 9249 net.cpp:406] pool5 <- conv5
I0313 11:41:39.425426 9249 net.cpp:380] pool5 -> pool5
I0313 11:41:39.425432 9249 net.cpp:122] Setting up pool5
I0313 11:41:39.425436 9249 net.cpp:129] Top shape: 1 256 6 69 (105984)
I0313 11:41:39.425437 9249 net.cpp:137] Memory required for data: 62337024
I0313 11:41:39.425441 9249 layer_factory.hpp:77] Creating layer fc6
I0313 11:41:39.425446 9249 net.cpp:84] Creating Layer fc6
I0313 11:41:39.425448 9249 net.cpp:406] fc6 <- pool5
I0313 11:41:39.425452 9249 net.cpp:380] fc6 -> fc6
I0313 11:41:39.454087 9249 net.cpp:122] Setting up fc6
I0313 11:41:39.454115 9249 net.cpp:129] Top shape: 1 4096 1 64 (262144)
I0313 11:41:39.454118 9249 net.cpp:137] Memory required for data: 63385600
I0313 11:41:39.454126 9249 layer_factory.hpp:77] Creating layer relu6
I0313 11:41:39.454134 9249 net.cpp:84] Creating Layer relu6
I0313 11:41:39.454138 9249 net.cpp:406] relu6 <- fc6
I0313 11:41:39.454143 9249 net.cpp:367] relu6 -> fc6 (in-place)
I0313 11:41:39.454149 9249 net.cpp:122] Setting up relu6
I0313 11:41:39.454152 9249 net.cpp:129] Top shape: 1 4096 1 64 (262144)
I0313 11:41:39.454155 9249 net.cpp:137] Memory required for data: 64434176
I0313 11:41:39.454157 9249 layer_factory.hpp:77] Creating layer drop6
I0313 11:41:39.454162 9249 net.cpp:84] Creating Layer drop6
I0313 11:41:39.454165 9249 net.cpp:406] drop6 <- fc6
I0313 11:41:39.454169 9249 net.cpp:367] drop6 -> fc6 (in-place)
I0313 11:41:39.454174 9249 net.cpp:122] Setting up drop6
I0313 11:41:39.454177 9249 net.cpp:129] Top shape: 1 4096 1 64 (262144)
I0313 11:41:39.454180 9249 net.cpp:137] Memory required for data: 65482752
I0313 11:41:39.454182 9249 layer_factory.hpp:77] Creating layer fc7
I0313 11:41:39.454188 9249 net.cpp:84] Creating Layer fc7
I0313 11:41:39.454190 9249 net.cpp:406] fc7 <- fc6
I0313 11:41:39.454195 9249 net.cpp:380] fc7 -> fc7
I0313 11:41:39.467375 9249 net.cpp:122] Setting up fc7
I0313 11:41:39.467401 9249 net.cpp:129] Top shape: 1 4096 1 64 (262144)
I0313 11:41:39.467403 9249 net.cpp:137] Memory required for data: 66531328
I0313 11:41:39.467411 9249 layer_factory.hpp:77] Creating layer relu7
I0313 11:41:39.467418 9249 net.cpp:84] Creating Layer relu7
I0313 11:41:39.467422 9249 net.cpp:406] relu7 <- fc7
I0313 11:41:39.467427 9249 net.cpp:367] relu7 -> fc7 (in-place)
I0313 11:41:39.467433 9249 net.cpp:122] Setting up relu7
I0313 11:41:39.467437 9249 net.cpp:129] Top shape: 1 4096 1 64 (262144)
I0313 11:41:39.467439 9249 net.cpp:137] Memory required for data: 67579904
I0313 11:41:39.467442 9249 layer_factory.hpp:77] Creating layer drop7
I0313 11:41:39.467449 9249 net.cpp:84] Creating Layer drop7
I0313 11:41:39.467452 9249 net.cpp:406] drop7 <- fc7
I0313 11:41:39.467455 9249 net.cpp:367] drop7 -> fc7 (in-place)
I0313 11:41:39.467460 9249 net.cpp:122] Setting up drop7
I0313 11:41:39.467463 9249 net.cpp:129] Top shape: 1 4096 1 64 (262144)
I0313 11:41:39.467465 9249 net.cpp:137] Memory required for data: 68628480
I0313 11:41:39.467468 9249 layer_factory.hpp:77] Creating layer score_fr
I0313 11:41:39.467474 9249 net.cpp:84] Creating Layer score_fr
I0313 11:41:39.467476 9249 net.cpp:406] score_fr <- fc7
I0313 11:41:39.467481 9249 net.cpp:380] score_fr -> score_fr
I0313 11:41:39.467617 9249 net.cpp:122] Setting up score_fr
I0313 11:41:39.467622 9249 net.cpp:129] Top shape: 1 21 1 64 (1344)
I0313 11:41:39.467624 9249 net.cpp:137] Memory required for data: 68633856
I0313 11:41:39.467629 9249 layer_factory.hpp:77] Creating layer upscore
I0313 11:41:39.467635 9249 net.cpp:84] Creating Layer upscore
I0313 11:41:39.467638 9249 net.cpp:406] upscore <- score_fr
I0313 11:41:39.467643 9249 net.cpp:380] upscore -> upscore
I0313 11:41:39.469235 9249 net.cpp:122] Setting up upscore
I0313 11:41:39.469246 9249 net.cpp:129] Top shape: 1 21 63 2079 (2750517)
I0313 11:41:39.469249 9249 net.cpp:137] Memory required for data: 79635924
I0313 11:41:39.469259 9249 layer_factory.hpp:77] Creating layer score
I0313 11:41:39.469266 9249 net.cpp:84] Creating Layer score
I0313 11:41:39.469269 9249 net.cpp:406] score <- upscore
I0313 11:41:39.469272 9249 net.cpp:406] score <- data_data_0_split_1
I0313 11:41:39.469276 9249 net.cpp:380] score -> score
I0313 11:41:39.469285 9249 net.cpp:122] Setting up score
I0313 11:41:39.469288 9249 net.cpp:129] Top shape: 1 21 16 2016 (677376)
I0313 11:41:39.469290 9249 net.cpp:137] Memory required for data: 82345428
I0313 11:41:39.469293 9249 layer_factory.hpp:77] Creating layer loss
I0313 11:41:39.469300 9249 net.cpp:84] Creating Layer loss
I0313 11:41:39.469301 9249 net.cpp:406] loss <- score
I0313 11:41:39.469305 9249 net.cpp:406] loss <- label
I0313 11:41:39.469308 9249 net.cpp:380] loss -> loss
I0313 11:41:39.469314 9249 layer_factory.hpp:77] Creating layer loss
I0313 11:41:39.469894 9249 net.cpp:122] Setting up loss
I0313 11:41:39.469900 9249 net.cpp:129] Top shape: (1)
I0313 11:41:39.469903 9249 net.cpp:132] with loss weight 1
I0313 11:41:39.469913 9249 net.cpp:137] Memory required for data: 82345432
I0313 11:41:39.469915 9249 net.cpp:198] loss needs backward computation.
I0313 11:41:39.469918 9249 net.cpp:198] score needs backward computation.
I0313 11:41:39.469920 9249 net.cpp:198] upscore needs backward computation.
I0313 11:41:39.469923 9249 net.cpp:198] score_fr needs backward computation.
I0313 11:41:39.469926 9249 net.cpp:198] drop7 needs backward computation.
I0313 11:41:39.469929 9249 net.cpp:198] relu7 needs backward computation.
I0313 11:41:39.469931 9249 net.cpp:198] fc7 needs backward computation.
I0313 11:41:39.469934 9249 net.cpp:198] drop6 needs backward computation.
I0313 11:41:39.469936 9249 net.cpp:198] relu6 needs backward computation.
I0313 11:41:39.469939 9249 net.cpp:198] fc6 needs backward computation.
I0313 11:41:39.469943 9249 net.cpp:198] pool5 needs backward computation.
I0313 11:41:39.469945 9249 net.cpp:198] relu5 needs backward computation.
I0313 11:41:39.469947 9249 net.cpp:198] conv5 needs backward computation.
I0313 11:41:39.469950 9249 net.cpp:198] relu4 needs backward computation.
I0313 11:41:39.469952 9249 net.cpp:198] conv4 needs backward computation.
I0313 11:41:39.469955 9249 net.cpp:198] relu3 needs backward computation.
I0313 11:41:39.469957 9249 net.cpp:198] conv3 needs backward computation.
I0313 11:41:39.469960 9249 net.cpp:198] norm2 needs backward computation.
I0313 11:41:39.469964 9249 net.cpp:198] pool2 needs backward computation.
I0313 11:41:39.469965 9249 net.cpp:198] relu2 needs backward computation.
I0313 11:41:39.469969 9249 net.cpp:198] conv2 needs backward computation.
I0313 11:41:39.469971 9249 net.cpp:198] norm1 needs backward computation.
I0313 11:41:39.469974 9249 net.cpp:198] pool1 needs backward computation.
I0313 11:41:39.469979 9249 net.cpp:198] relu1 needs backward computation.
I0313 11:41:39.469981 9249 net.cpp:198] conv1 needs backward computation.
I0313 11:41:39.469985 9249 net.cpp:200] data_data_0_split does not need backward computation.
I0313 11:41:39.469987 9249 net.cpp:200] data does not need backward computation.
I0313 11:41:39.469990 9249 net.cpp:242] This network produces output loss
I0313 11:41:39.470001 9249 net.cpp:255] Network initialization done.
I0313 11:41:39.470055 9249 solver.cpp:57] Solver scaffolding done.
I0313 11:42:40.745103 9249 solver.cpp:239] Iteration 0 (-1.4013e-45 iter/s, 61.136s/20 iters), loss = 4.54161
I0313 11:42:40.745129 9249 solver.cpp:258] Train net output #0: loss = 4.00278 (* 1 = 4.00278 loss)
I0313 11:42:40.745136 9249 sgd_solver.cpp:112] Iteration 0, lr = 0.0001
I0313 12:02:52.273387 9249 solver.cpp:239] Iteration 20 (0.0165081 iter/s, 1211.53s/20 iters), loss = 17.0233
I0313 12:02:52.273416 9249 solver.cpp:258] Train net output #0: loss = 19.2508 (* 1 = 19.2508 loss)
I0313 12:02:52.273422 9249 sgd_solver.cpp:112] Iteration 20, lr = 0.0001
I0313 12:23:09.810516 9249 solver.cpp:239] Iteration 40 (0.0164266 iter/s, 1217.54s/20 iters), loss = 26.7316
I0313 12:23:09.810544 9249 solver.cpp:258] Train net output #0: loss = 30.1355 (* 1 = 30.1355 loss)
I0313 12:23:09.810550 9249 sgd_solver.cpp:112] Iteration 40, lr = 0.0001
I0313 12:43:32.716285 9249 solver.cpp:239] Iteration 60 (0.0163545 iter/s, 1222.91s/20 iters), loss = 30.2106
I0313 12:43:32.716313 9249 solver.cpp:258] Train net output #0: loss = 22.8696 (* 1 = 22.8696 loss)
I0313 12:43:32.716320 9249 sgd_solver.cpp:112] Iteration 60, lr = 0.0001
I0313 13:03:49.434516 9249 solver.cpp:239] Iteration 80 (0.0164377 iter/s, 1216.72s/20 iters), loss = 31.0818
I0313 13:03:49.434543 9249 solver.cpp:258] Train net output #0: loss = 23.1428 (* 1 = 23.1428 loss)
I0313 13:03:49.434551 9249 sgd_solver.cpp:112] Iteration 80, lr = 0.0001
I0313 13:23:51.860294 9249 solver.cpp:239] Iteration 100 (0.0166331 iter/s, 1202.43s/20 iters), loss = 32.5238
I0313 13:23:51.860322 9249 solver.cpp:258] Train net output #0: loss = 35.1909 (* 1 = 35.1909 loss)
I0313 13:23:51.860328 9249 sgd_solver.cpp:112] Iteration 100, lr = 0.0001
I0313 13:43:38.481149 9249 solver.cpp:239] Iteration 120 (0.0168546 iter/s, 1186.62s/20 iters), loss = 33.0024
I0313 13:43:38.481176 9249 solver.cpp:258] Train net output #0: loss = 40.9104 (* 1 = 40.9104 loss)
I0313 13:43:38.481182 9249 sgd_solver.cpp:112] Iteration 120, lr = 0.0001
I0313 14:03:27.667078 9249 solver.cpp:239] Iteration 140 (0.0168182 iter/s, 1189.19s/20 iters), loss = 36.4908
I0313 14:03:27.667104 9249 solver.cpp:258] Train net output #0: loss = 53.9975 (* 1 = 53.9975 loss)
I0313 14:03:27.667111 9249 sgd_solver.cpp:112] Iteration 140, lr = 0.0001
I0313 14:23:25.009404 9249 solver.cpp:239] Iteration 160 (0.0167037 iter/s, 1197.34s/20 iters), loss = 52.2285
I0313 14:23:25.009431 9249 solver.cpp:258] Train net output #0: loss = 26.9314 (* 1 = 26.9314 loss)
I0313 14:23:25.009438 9249 sgd_solver.cpp:112] Iteration 160, lr = 0.0001
I0313 14:43:35.026921 9249 solver.cpp:239] Iteration 180 (0.0165287 iter/s, 1210.02s/20 iters), loss = 33.087
I0313 14:43:35.026950 9249 solver.cpp:258] Train net output #0: loss = 44.6887 (* 1 = 44.6887 loss)
I0313 14:43:35.026957 9249 sgd_solver.cpp:112] Iteration 180, lr = 0.0001
I0313 15:03:45.718956 9249 solver.cpp:239] Iteration 200 (0.0165195 iter/s, 1210.69s/20 iters), loss = 33.0793
I0313 15:03:45.718984 9249 solver.cpp:258] Train net output #0: loss = 34.2235 (* 1 = 34.2235 loss)
I0313 15:03:45.718991 9249 sgd_solver.cpp:112] Iteration 200, lr = 0.0001
I0313 15:24:27.503715 9249 solver.cpp:239] Iteration 220 (0.0161059 iter/s, 1241.78s/20 iters), loss = 33.1698
I0313 15:24:27.503741 9249 solver.cpp:258] Train net output #0: loss = 45.0323 (* 1 = 45.0323 loss)
I0313 15:24:27.503748 9249 sgd_solver.cpp:112] Iteration 220, lr = 0.0001
I0313 15:44:53.585564 9249 solver.cpp:239] Iteration 240 (0.0163121 iter/s, 1226.08s/20 iters), loss = 35.7697
I0313 15:44:53.585592 9249 solver.cpp:258] Train net output #0: loss = 45.5302 (* 1 = 45.5302 loss)
I0313 15:44:53.585598 9249 sgd_solver.cpp:112] Iteration 240, lr = 0.0001
I0313 16:04:44.744935 9249 solver.cpp:239] Iteration 260 (0.0167904 iter/s, 1191.16s/20 iters), loss = 29.4003
I0313 16:04:44.744963 9249 solver.cpp:258] Train net output #0: loss = 19.9242 (* 1 = 19.9242 loss)
I0313 16:04:44.744969 9249 sgd_solver.cpp:112] Iteration 260, lr = 0.0001
I0313 16:24:00.216655 9249 solver.cpp:239] Iteration 280 (0.017309 iter/s, 1155.47s/20 iters), loss = 24.0391
I0313 16:24:00.216681 9249 solver.cpp:258] Train net output #0: loss = 35.6398 (* 1 = 35.6398 loss)
I0313 16:24:00.216687 9249 sgd_solver.cpp:112] Iteration 280, lr = 0.0001
I0313 16:43:22.672458 9249 solver.cpp:239] Iteration 300 (0.017205 iter/s, 1162.45s/20 iters), loss = 33.2369
I0313 16:43:22.672485 9249 solver.cpp:258] Train net output #0: loss = 38.8301 (* 1 = 38.8301 loss)
I0313 16:43:22.672492 9249 sgd_solver.cpp:112] Iteration 300, lr = 0.0001
I0313 17:02:56.876072 9249 solver.cpp:239] Iteration 320 (0.0170328 iter/s, 1174.2s/20 iters), loss = 33.7243
I0313 17:02:56.876101 9249 solver.cpp:258] Train net output #0: loss = 33.6585 (* 1 = 33.6585 loss)
I0313 17:02:56.876106 9249 sgd_solver.cpp:112] Iteration 320, lr = 0.0001

As the log shows, with the other network parameters left unchanged the loss stays stubbornly high. This article has focused on understanding the Data layer; the next article will optimize the internals of the network structure so that the loss on point cloud data reaches the expected level.


References:

  1. FCN学习:Semantic Segmentation
  2. AlexNet: ImageNet Classification with Deep Convolutional Neural Networks