loss_ae_wtv.txt
2020-04-29 07:28:43.127712: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2020-04-29 07:28:44.070691: I tensorflow/compiler/xla/service/service.cc:150] XLA service 0x5590dd640700 executing computations on platform CUDA. Devices:
2020-04-29 07:28:44.070753: I tensorflow/compiler/xla/service/service.cc:158] StreamExecutor device (0): GeForce GTX 980, Compute Capability 5.2
2020-04-29 07:28:44.070770: I tensorflow/compiler/xla/service/service.cc:158] StreamExecutor device (1): GeForce GTX 980, Compute Capability 5.2
2020-04-29 07:28:44.075114: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 2399750000 Hz
2020-04-29 07:28:44.077652: I tensorflow/compiler/xla/service/service.cc:150] XLA service 0x5590dd738eb0 executing computations on platform Host. Devices:
2020-04-29 07:28:44.077690: I tensorflow/compiler/xla/service/service.cc:158] StreamExecutor device (0): <undefined>, <undefined>
2020-04-29 07:28:44.078250: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1433] Found device 0 with properties:
name: GeForce GTX 980 major: 5 minor: 2 memoryClockRate(GHz): 1.2785
pciBusID: 0000:03:00.0
totalMemory: 3.95GiB freeMemory: 3.87GiB
2020-04-29 07:28:44.078660: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1433] Found device 1 with properties:
name: GeForce GTX 980 major: 5 minor: 2 memoryClockRate(GHz): 1.2785
pciBusID: 0000:83:00.0
totalMemory: 3.95GiB freeMemory: 3.87GiB
2020-04-29 07:28:44.078725: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1512] Adding visible gpu devices: 0, 1
2020-04-29 07:28:44.080971: I tensorflow/core/common_runtime/gpu/gpu_device.cc:984] Device interconnect StreamExecutor with strength 1 edge matrix:
2020-04-29 07:28:44.080999: I tensorflow/core/common_runtime/gpu/gpu_device.cc:990] 0 1
2020-04-29 07:28:44.081011: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1003] 0: N N
2020-04-29 07:28:44.081021: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1003] 1: N N
2020-04-29 07:28:44.081857: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1115] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 3665 MB memory) -> physical GPU (device: 0, name: GeForce GTX 980, pci bus id: 0000:03:00.0, compute capability: 5.2)
2020-04-29 07:28:44.082543: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1115] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:1 with 3665 MB memory) -> physical GPU (device: 1, name: GeForce GTX 980, pci bus id: 0000:83:00.0, compute capability: 5.2)
Epoch: 0
WARNING:tensorflow:From /home/getalp/leferrae/.local/lib/python3.6/site-packages/tensorflow/python/ops/resource_variable_ops.py:642: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.
Instructions for updating:
Colocations handled automatically by placer.
2020-04-29 07:28:51.702216: I tensorflow/stream_executor/dso_loader.cc:152] successfully opened CUDA library libcublas.so.10.0 locally
WARNING:tensorflow:From /home/getalp/leferrae/.local/lib/python3.6/site-packages/tensorflow/python/ops/losses/losses_impl.py:667: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use tf.cast instead.
batch nb. 0 loss: 0.024470722302794456
batch nb. 1 loss: 0.025331217795610428
batch nb. 2 loss: 0.024176480248570442
batch nb. 3 loss: 0.021502574905753136
batch nb. 4 loss: 0.02113870158791542
batch nb. 5 loss: 0.021304551512002945
batch nb. 6 loss: 0.023636961355805397
batch nb. 7 loss: 0.020748233422636986
batch nb. 8 loss: 0.017970005050301552
batch nb. 9 loss: 0.01769545115530491
batch nb. 10 loss: 0.016202406957745552
batch nb. 11 loss: 0.017922647297382355
batch nb. 12 loss: 0.017406396567821503
batch nb. 13 loss: 0.016209589317440987
batch nb. 14 loss: 0.015728024765849113
batch nb. 15 loss: 0.015459691174328327
validation: Loss: 0.01635664328932762 train loss : 0.015459691174328327
WARNING:tensorflow:From /home/getalp/leferrae/.local/lib/python3.6/site-packages/tensorflow/python/keras/engine/network.py:1436: update_checkpoint_state (from tensorflow.python.training.checkpoint_management) is deprecated and will be removed in a future version.
Instructions for updating:
Use tf.train.CheckpointManager to manage checkpoints rather than manually editing the Checkpoint proto.
new best model
Epoch: 1
batch nb. 0 loss: 0.017899414524435997
batch nb. 1 loss: 0.017715975642204285
batch nb. 2 loss: 0.01669376716017723
batch nb. 3 loss: 0.014909137040376663
batch nb. 4 loss: 0.014883449301123619
batch nb. 5 loss: 0.015185076743364334
batch nb. 6 loss: 0.01711972989141941
batch nb. 7 loss: 0.015013769268989563
batch nb. 8 loss: 0.013415331952273846
batch nb. 9 loss: 0.014363721013069153
batch nb. 10 loss: 0.012929555028676987
batch nb. 11 loss: 0.01513123232871294
batch nb. 12 loss: 0.015023265965282917
batch nb. 13 loss: 0.012972449883818626
batch nb. 14 loss: 0.012301400303840637
batch nb. 15 loss: 0.011672711931169033
validation: Loss: 0.013088245876133442 train loss : 0.01356620155274868
new best model
Epoch: 2
batch nb. 0 loss: 0.014277933165431023
batch nb. 1 loss: 0.013683647848665714
batch nb. 2 loss: 0.012755041942000389
batch nb. 3 loss: 0.011792146600782871
batch nb. 4 loss: 0.01201306376606226
batch nb. 5 loss: 0.012362349778413773
batch nb. 6 loss: 0.014152449555695057
batch nb. 7 loss: 0.012590783648192883
batch nb. 8 loss: 0.01175472978502512
batch nb. 9 loss: 0.013084858655929565
batch nb. 10 loss: 0.011786707676947117
batch nb. 11 loss: 0.013828197494149208
batch nb. 12 loss: 0.013633256778120995
batch nb. 13 loss: 0.011373063549399376
batch nb. 14 loss: 0.010821053758263588
batch nb. 15 loss: 0.010224228724837303
validation: Loss: 0.011649330146610737 train loss : 0.012452210299670696
new best model
Epoch: 3
batch nb. 0 loss: 0.012602549046278
batch nb. 1 loss: 0.01196136511862278
batch nb. 2 loss: 0.011026577092707157
batch nb. 3 loss: 0.010410690680146217
batch nb. 4 loss: 0.0105265649035573
batch nb. 5 loss: 0.010883078910410404
batch nb. 6 loss: 0.012446831911802292
batch nb. 7 loss: 0.011054811999201775
batch nb. 8 loss: 0.010394318029284477
batch nb. 9 loss: 0.011816209182143211
batch nb. 10 loss: 0.010694770142436028
batch nb. 11 loss: 0.012709403410553932
batch nb. 12 loss: 0.012633572332561016
batch nb. 13 loss: 0.01029148418456316
batch nb. 14 loss: 0.009592454880475998
batch nb. 15 loss: 0.009057138115167618
validation: Loss: 0.010434899479150772 train loss : 0.011603442020714283
new best model
Epoch: 4
batch nb. 0 loss: 0.0111545380204916
batch nb. 1 loss: 0.01031928975135088
batch nb. 2 loss: 0.009466400370001793
batch nb. 3 loss: 0.009088999591767788
batch nb. 4 loss: 0.009390421211719513
batch nb. 5 loss: 0.009572207927703857
batch nb. 6 loss: 0.011011932976543903
batch nb. 7 loss: 0.009875641204416752
batch nb. 8 loss: 0.009412962011992931
batch nb. 9 loss: 0.010838303714990616
batch nb. 10 loss: 0.009879549965262413
batch nb. 11 loss: 0.011567538604140282
batch nb. 12 loss: 0.011519607156515121
batch nb. 13 loss: 0.009367058984935284
batch nb. 14 loss: 0.008710665628314018
batch nb. 15 loss: 0.008001739159226418
validation: Loss: 0.009415630251169205 train loss : 0.01088310219347477
new best model
Epoch: 5
batch nb. 0 loss: 0.009957903064787388
batch nb. 1 loss: 0.009055249392986298
batch nb. 2 loss: 0.008284499868750572
batch nb. 3 loss: 0.00804929994046688
batch nb. 4 loss: 0.00831550545990467
batch nb. 5 loss: 0.008529036305844784
batch nb. 6 loss: 0.009834506548941135
batch nb. 7 loss: 0.008834157139062881
batch nb. 8 loss: 0.008551513776183128
batch nb. 9 loss: 0.010001440532505512
batch nb. 10 loss: 0.009175009094178677
batch nb. 11 loss: 0.01073745358735323
batch nb. 12 loss: 0.01066673919558525
batch nb. 13 loss: 0.008631023578345776
batch nb. 14 loss: 0.008037206716835499
batch nb. 15 loss: 0.007425585761666298
validation: Loss: 0.00863307248800993 train loss : 0.010306849144399166
new best model
Epoch: 6
batch nb. 0 loss: 0.008989992551505566
batch nb. 1 loss: 0.008196684531867504
batch nb. 2 loss: 0.007493111304938793
batch nb. 3 loss: 0.007271199952811003
batch nb. 4 loss: 0.007576193194836378
batch nb. 5 loss: 0.007809434086084366
batch nb. 6 loss: 0.009029528126120567
batch nb. 7 loss: 0.008080772124230862
batch nb. 8 loss: 0.008048094809055328
batch nb. 9 loss: 0.009536915458738804
batch nb. 10 loss: 0.008673997595906258
batch nb. 11 loss: 0.010051276534795761
batch nb. 12 loss: 0.010063119232654572
batch nb. 13 loss: 0.00811722595244646
batch nb. 14 loss: 0.0075994315557181835
batch nb. 15 loss: 0.007047909777611494
validation: Loss: 0.008196822367608547 train loss : 0.009841285645961761
new best model
Epoch: 7
batch nb. 0 loss: 0.00846685841679573
batch nb. 1 loss: 0.007635884918272495
batch nb. 2 loss: 0.006889748852699995
batch nb. 3 loss: 0.006746202707290649
batch nb. 4 loss: 0.007040264550596476
batch nb. 5 loss: 0.007289724424481392
batch nb. 6 loss: 0.008387950249016285
batch nb. 7 loss: 0.007621095050126314
batch nb. 8 loss: 0.007732384372502565
batch nb. 9 loss: 0.009266325272619724
batch nb. 10 loss: 0.008494831621646881
batch nb. 11 loss: 0.00971982255578041
batch nb. 12 loss: 0.009681403636932373
batch nb. 13 loss: 0.0077205197885632515
batch nb. 14 loss: 0.00715650524944067
batch nb. 15 loss: 0.006760300602763891
validation: Loss: 0.007817105390131474 train loss : 0.00945616327226162
new best model
Epoch: 8
batch nb. 0 loss: 0.008072895929217339
batch nb. 1 loss: 0.00723741902038455
batch nb. 2 loss: 0.006527725141495466
batch nb. 3 loss: 0.006341307424008846
batch nb. 4 loss: 0.0066343192011117935
batch nb. 5 loss: 0.006799777038395405
batch nb. 6 loss: 0.007885688915848732
batch nb. 7 loss: 0.007152928970754147
batch nb. 8 loss: 0.007373111322522163
batch nb. 9 loss: 0.008933275938034058
batch nb. 10 loss: 0.008123793639242649
batch nb. 11 loss: 0.009372924454510212
batch nb. 12 loss: 0.009296970441937447
batch nb. 13 loss: 0.007468522526323795
batch nb. 14 loss: 0.006822030525654554
batch nb. 15 loss: 0.006419990211725235
validation: Loss: 0.0074564921669662 train loss : 0.009118810296058655
new best model
Epoch: 9
batch nb. 0 loss: 0.007628282066434622
batch nb. 1 loss: 0.006811571307480335
batch nb. 2 loss: 0.006178501062095165
batch nb. 3 loss: 0.006102773826569319
batch nb. 4 loss: 0.006304780021309853
batch nb. 5 loss: 0.006440846715122461
batch nb. 6 loss: 0.007435767445713282
batch nb. 7 loss: 0.0067710732109844685
batch nb. 8 loss: 0.0071443067863583565
batch nb. 9 loss: 0.008654661476612091
batch nb. 10 loss: 0.007892907597124577
batch nb. 11 loss: 0.008944745175540447
batch nb. 12 loss: 0.008999978192150593
batch nb. 13 loss: 0.007085008546710014
batch nb. 14 loss: 0.006557251326739788
batch nb. 15 loss: 0.006142199970781803
validation: Loss: 0.0071064792573452 train loss : 0.008821149356663227
new best model
Epoch: 10
batch nb. 0 loss: 0.007219783961772919
batch nb. 1 loss: 0.006408684886991978
batch nb. 2 loss: 0.005779493134468794
batch nb. 3 loss: 0.005748336669057608
batch nb. 4 loss: 0.005950812250375748
batch nb. 5 loss: 0.006140845827758312
batch nb. 6 loss: 0.0070937699638307095
batch nb. 7 loss: 0.00649395352229476
batch nb. 8 loss: 0.00697792274877429
batch nb. 9 loss: 0.008574509993195534
batch nb. 10 loss: 0.007685781456530094
batch nb. 11 loss: 0.008757064118981361
batch nb. 12 loss: 0.008716427721083164
batch nb. 13 loss: 0.006914800964295864
batch nb. 14 loss: 0.0063716815784573555
batch nb. 15 loss: 0.006071747280657291
validation: Loss: 0.007075706962496042 train loss : 0.008571203798055649
new best model
Epoch: 11
batch nb. 0 loss: 0.007160734385251999
batch nb. 1 loss: 0.006235400680452585
batch nb. 2 loss: 0.005487178917974234
batch nb. 3 loss: 0.005583802703768015
batch nb. 4 loss: 0.005848987028002739
batch nb. 5 loss: 0.006060885731130838
batch nb. 6 loss: 0.0069711096584796906
batch nb. 7 loss: 0.006304615177214146
batch nb. 8 loss: 0.006808036472648382
batch nb. 9 loss: 0.008374427445232868
batch nb. 10 loss: 0.007488773204386234
batch nb. 11 loss: 0.008614234626293182
batch nb. 12 loss: 0.008559498935937881
batch nb. 13 loss: 0.006747888401150703
batch nb. 14 loss: 0.0063201361335814
batch nb. 15 loss: 0.00617142952978611
validation: Loss: 0.007003473583608866 train loss : 0.008371221832931042
new best model
Epoch: 12
batch nb. 0 loss: 0.007191007491201162
batch nb. 1 loss: 0.006149348337203264
batch nb. 2 loss: 0.005349976476281881
batch nb. 3 loss: 0.005428984295576811
batch nb. 4 loss: 0.005662466865032911
batch nb. 5 loss: 0.0058356053195893764
batch nb. 6 loss: 0.006830490194261074
batch nb. 7 loss: 0.006267140153795481
batch nb. 8 loss: 0.006804171949625015
batch nb. 9 loss: 0.008449731394648552
batch nb. 10 loss: 0.0074921962805092335
batch nb. 11 loss: 0.008502420969307423
batch nb. 12 loss: 0.008405406028032303
batch nb. 13 loss: 0.006559161003679037
batch nb. 14 loss: 0.006107384338974953
batch nb. 15 loss: 0.005788586102426052
validation: Loss: 0.006960554979741573 train loss : 0.008172557689249516
new best model
Epoch: 13
batch nb. 0 loss: 0.007034188136458397
batch nb. 1 loss: 0.006185072939842939
batch nb. 2 loss: 0.005447643343359232
batch nb. 3 loss: 0.005423560738563538
batch nb. 4 loss: 0.005603328347206116
batch nb. 5 loss: 0.0056887282989919186
batch nb. 6 loss: 0.00659868074581027
batch nb. 7 loss: 0.00597715750336647
batch nb. 8 loss: 0.0064704143442213535
batch nb. 9 loss: 0.008032113313674927
batch nb. 10 loss: 0.007286414038389921
batch nb. 11 loss: 0.008365392684936523
batch nb. 12 loss: 0.008289329707622528
batch nb. 13 loss: 0.006461780983954668
batch nb. 14 loss: 0.005977097433060408
batch nb. 15 loss: 0.005670465994626284
validation: Loss: 0.00659077288582921 train loss : 0.007993836887180805
new best model
Epoch: 14
batch nb. 0 loss: 0.006617662496864796
batch nb. 1 loss: 0.005818316247314215
batch nb. 2 loss: 0.0052040209993720055
batch nb. 3 loss: 0.005121894180774689
batch nb. 4 loss: 0.00542178051546216
batch nb. 5 loss: 0.005604549311101437
batch nb. 6 loss: 0.006382578518241644
batch nb. 7 loss: 0.005868713837116957
batch nb. 8 loss: 0.006461977027356625
batch nb. 9 loss: 0.007851125672459602
batch nb. 10 loss: 0.00712656369432807
batch nb. 11 loss: 0.008076434955000877
batch nb. 12 loss: 0.008036740124225616
batch nb. 13 loss: 0.006396941840648651
batch nb. 14 loss: 0.005788401234894991
batch nb. 15 loss: 0.005512813106179237
validation: Loss: 0.006470098625868559 train loss : 0.00782843492925167
new best model
Epoch: 15
batch nb. 0 loss: 0.006479969713836908
batch nb. 1 loss: 0.005692703649401665
batch nb. 2 loss: 0.005076732952147722
batch nb. 3 loss: 0.005123353097587824
batch nb. 4 loss: 0.005334744695574045
batch nb. 5 loss: 0.005410060286521912
batch nb. 6 loss: 0.006196442060172558
batch nb. 7 loss: 0.005674488376826048
batch nb. 8 loss: 0.006207487545907497
batch nb. 9 loss: 0.007573969662189484
batch nb. 10 loss: 0.006971360649913549
batch nb. 11 loss: 0.00792989693582058
batch nb. 12 loss: 0.007890518754720688
batch nb. 13 loss: 0.006111846771091223
batch nb. 14 loss: 0.005612273700535297
batch nb. 15 loss: 0.00538759445771575
validation: Loss: 0.006238073576241732 train loss : 0.007675883360207081
new best model
Epoch: 16
batch nb. 0 loss: 0.00631038099527359
batch nb. 1 loss: 0.005487065762281418
batch nb. 2 loss: 0.004856203217059374
batch nb. 3 loss: 0.0048814453184604645
batch nb. 4 loss: 0.005147062242031097
batch nb. 5 loss: 0.005266082938760519
batch nb. 6 loss: 0.006018262356519699
batch nb. 7 loss: 0.005511217284947634
batch nb. 8 loss: 0.006035695783793926
batch nb. 9 loss: 0.007368688937276602
batch nb. 10 loss: 0.006732914596796036
batch nb. 11 loss: 0.007685898337513208
batch nb. 12 loss: 0.007656891830265522
batch nb. 13 loss: 0.005916074849665165
batch nb. 14 loss: 0.005417114123702049
batch nb. 15 loss: 0.0051877135410904884
validation: Loss: 0.0059651220217347145 train loss : 0.007529520895332098
new best model
Epoch: 17
batch nb. 0 loss: 0.006013455335050821
batch nb. 1 loss: 0.005296105984598398
batch nb. 2 loss: 0.004639300983399153
batch nb. 3 loss: 0.004669651389122009
batch nb. 4 loss: 0.004938766825944185
batch nb. 5 loss: 0.0050260936841368675
batch nb. 6 loss: 0.005812617018818855
batch nb. 7 loss: 0.005310556851327419
batch nb. 8 loss: 0.0058661652728915215
batch nb. 9 loss: 0.007228092290461063
batch nb. 10 loss: 0.006482618860900402
batch nb. 11 loss: 0.00747564435005188
batch nb. 12 loss: 0.0075029730796813965
batch nb. 13 loss: 0.005742610897868872
batch nb. 14 loss: 0.005277122836560011
batch nb. 15 loss: 0.005090269260108471
validation: Loss: 0.00582860317081213 train loss : 0.007394006475806236
new best model
Epoch: 18
batch nb. 0 loss: 0.00581356743350625
batch nb. 1 loss: 0.005145502742379904
batch nb. 2 loss: 0.004589777905493975
batch nb. 3 loss: 0.0045512826181948185
batch nb. 4 loss: 0.004831759724766016
batch nb. 5 loss: 0.004958732519298792
batch nb. 6 loss: 0.005655972752720118
batch nb. 7 loss: 0.005198017694056034
batch nb. 8 loss: 0.005742750596255064
batch nb. 9 loss: 0.007083812262862921
batch nb. 10 loss: 0.006462676916271448
batch nb. 11 loss: 0.007396611850708723
batch nb. 12 loss: 0.00730295991525054
batch nb. 13 loss: 0.005674291402101517
batch nb. 14 loss: 0.005218631122261286
batch nb. 15 loss: 0.004988570231944323
validation: Loss: 0.005663326941430569 train loss : 0.007267404347658157
new best model
Epoch: 19
batch nb. 0 loss: 0.0056978738866746426
batch nb. 1 loss: 0.004871135577559471
batch nb. 2 loss: 0.004355413839221001
batch nb. 3 loss: 0.004459313116967678
batch nb. 4 loss: 0.00466623529791832
batch nb. 5 loss: 0.004799460526555777
batch nb. 6 loss: 0.0055860485881567
batch nb. 7 loss: 0.005110404919832945
batch nb. 8 loss: 0.005610867869108915
batch nb. 9 loss: 0.006955380085855722
batch nb. 10 loss: 0.006343675311654806
batch nb. 11 loss: 0.007146098185330629
batch nb. 12 loss: 0.007163702975958586
batch nb. 13 loss: 0.005538501311093569
batch nb. 14 loss: 0.005107241682708263
batch nb. 15 loss: 0.0048480406403541565
validation: Loss: 0.005632428918033838 train loss : 0.0071464357897639275
new best model
Epoch: 20
batch nb. 0 loss: 0.005616549868136644
batch nb. 1 loss: 0.004922253079712391
batch nb. 2 loss: 0.00432028342038393
batch nb. 3 loss: 0.004392704460769892
batch nb. 4 loss: 0.004634103272110224
batch nb. 5 loss: 0.004722656216472387
batch nb. 6 loss: 0.005445368122309446
batch nb. 7 loss: 0.005006559193134308
batch nb. 8 loss: 0.005468366201967001
batch nb. 9 loss: 0.006798201706260443
batch nb. 10 loss: 0.006243635900318623
batch nb. 11 loss: 0.00705695291981101
batch nb. 12 loss: 0.007018644362688065
batch nb. 13 loss: 0.005370850209146738
batch nb. 14 loss: 0.004998291842639446
batch nb. 15 loss: 0.004749232437461615
validation: Loss: 0.005517322104424238 train loss : 0.007032283581793308
new best model
Epoch: 21
batch nb. 0 loss: 0.005461407825350761
batch nb. 1 loss: 0.004730882588773966
batch nb. 2 loss: 0.0042040543630719185
batch nb. 3 loss: 0.004284985363483429
batch nb. 4 loss: 0.004508182872086763
batch nb. 5 loss: 0.004591912496834993
batch nb. 6 loss: 0.005314519163221121
batch nb. 7 loss: 0.004893930163234472
batch nb. 8 loss: 0.005381621886044741
batch nb. 9 loss: 0.0066403052769601345
batch nb. 10 loss: 0.0060931844636797905
batch nb. 11 loss: 0.0069283368065953255
batch nb. 12 loss: 0.006954590789973736
batch nb. 13 loss: 0.005263644270598888
batch nb. 14 loss: 0.004853060934692621
batch nb. 15 loss: 0.004658335819840431
validation: Loss: 0.005397578235715628 train loss : 0.006924377288669348
new best model
Epoch: 22
batch nb. 0 loss: 0.00539391627535224
batch nb. 1 loss: 0.004641256295144558
batch nb. 2 loss: 0.004059921484440565
batch nb. 3 loss: 0.004176619928330183
batch nb. 4 loss: 0.004449550528079271
batch nb. 5 loss: 0.004536850843578577
batch nb. 6 loss: 0.005216512829065323
batch nb. 7 loss: 0.004823701456189156
batch nb. 8 loss: 0.005369833204895258
batch nb. 9 loss: 0.0066528962925076485
batch nb. 10 loss: 0.0059970952570438385
batch nb. 11 loss: 0.006812083534896374
batch nb. 12 loss: 0.0068954420275986195
batch nb. 13 loss: 0.00523813022300601
batch nb. 14 loss: 0.004849135875701904
batch nb. 15 loss: 0.004653577692806721
validation: Loss: 0.0054526859894394875 train loss : 0.006825646851211786
Epoch: 23
batch nb. 0 loss: 0.005413792096078396
batch nb. 1 loss: 0.004658329766243696
batch nb. 2 loss: 0.004079846199601889
batch nb. 3 loss: 0.004157883580774069
batch nb. 4 loss: 0.004395690746605396
batch nb. 5 loss: 0.004507545847445726
batch nb. 6 loss: 0.005210088100284338
batch nb. 7 loss: 0.004809959791600704
batch nb. 8 loss: 0.005317512433975935
batch nb. 9 loss: 0.006616527214646339
batch nb. 10 loss: 0.005983801558613777
batch nb. 11 loss: 0.006779790390282869
batch nb. 12 loss: 0.006815710105001926
batch nb. 13 loss: 0.005192216020077467
batch nb. 14 loss: 0.004820924252271652
batch nb. 15 loss: 0.004628678783774376
validation: Loss: 0.005364927463233471 train loss : 0.006734106689691544
new best model
Epoch: 24
batch nb. 0 loss: 0.00532815745100379
batch nb. 1 loss: 0.0046068099327385426
batch nb. 2 loss: 0.0040270304307341576
batch nb. 3 loss: 0.004095663782209158
batch nb. 4 loss: 0.004317000973969698
batch nb. 5 loss: 0.004445978906005621
batch nb. 6 loss: 0.005129610653966665
batch nb. 7 loss: 0.004752733279019594
batch nb. 8 loss: 0.005236259661614895
batch nb. 9 loss: 0.0065098353661596775
batch nb. 10 loss: 0.005890578497201204
batch nb. 11 loss: 0.006729251239448786
batch nb. 12 loss: 0.0067084613256156445
batch nb. 13 loss: 0.005080949980765581
batch nb. 14 loss: 0.004747400060296059
batch nb. 15 loss: 0.004499940667301416
validation: Loss: 0.005280972458422184 train loss : 0.006644740235060453
new best model
Epoch: 25
batch nb. 0 loss: 0.005251607857644558
batch nb. 1 loss: 0.004500462207943201
batch nb. 2 loss: 0.00395760266110301
batch nb. 3 loss: 0.004050069488584995
batch nb. 4 loss: 0.0042250980623066425
batch nb. 5 loss: 0.004350471775978804
batch nb. 6 loss: 0.005037564784288406
batch nb. 7 loss: 0.004664067644625902
batch nb. 8 loss: 0.005141113884747028
batch nb. 9 loss: 0.006404061336070299
batch nb. 10 loss: 0.005838536657392979
batch nb. 11 loss: 0.006655495148152113
batch nb. 12 loss: 0.006638264283537865
batch nb. 13 loss: 0.004972136579453945
batch nb. 14 loss: 0.004652284551411867
batch nb. 15 loss: 0.004434110596776009
validation: Loss: 0.005124243907630444 train loss : 0.006559715606272221
new best model
Epoch: 26
batch nb. 0 loss: 0.0050755157135427
batch nb. 1 loss: 0.004408714827150106
batch nb. 2 loss: 0.003913138527423143
batch nb. 3 loss: 0.00402207113802433
batch nb. 4 loss: 0.004198156297206879
batch nb. 5 loss: 0.004264783579856157
batch nb. 6 loss: 0.004939218517392874
batch nb. 7 loss: 0.004540989641100168
batch nb. 8 loss: 0.005006244871765375
batch nb. 9 loss: 0.006275587249547243
batch nb. 10 loss: 0.005782569758594036
batch nb. 11 loss: 0.006580614950507879
batch nb. 12 loss: 0.006574766710400581
batch nb. 13 loss: 0.004926740191876888
batch nb. 14 loss: 0.0045807091519236565
batch nb. 15 loss: 0.004427014850080013
validation: Loss: 0.005050119012594223 train loss : 0.006480726879090071
new best model
Epoch: 27
batch nb. 0 loss: 0.004986047279089689
batch nb. 1 loss: 0.00421152962371707
batch nb. 2 loss: 0.003749303286895156
batch nb. 3 loss: 0.00395719800144434
batch nb. 4 loss: 0.004152256995439529
batch nb. 5 loss: 0.004204041324555874
batch nb. 6 loss: 0.004838801920413971
batch nb. 7 loss: 0.0044814590364694595
batch nb. 8 loss: 0.005009880755096674
batch nb. 9 loss: 0.0062348502688109875
batch nb. 10 loss: 0.005744745954871178
batch nb. 11 loss: 0.006491347681730986
batch nb. 12 loss: 0.006515595596283674
batch nb. 13 loss: 0.0049394783563911915
batch nb. 14 loss: 0.004579977132380009
batch nb. 15 loss: 0.004424023907631636
validation: Loss: 0.005094241816550493 train loss : 0.006407273001968861
Epoch: 28
batch nb. 0 loss: 0.004962916951626539
batch nb. 1 loss: 0.004321999382227659
batch nb. 2 loss: 0.0037675760686397552
batch nb. 3 loss: 0.003919751383364201
batch nb. 4 loss: 0.004154241178184748
batch nb. 5 loss: 0.004219207912683487
batch nb. 6 loss: 0.004793589934706688
batch nb. 7 loss: 0.004486612975597382
batch nb. 8 loss: 0.005037196446210146
batch nb. 9 loss: 0.006180077325552702
batch nb. 10 loss: 0.005732065998017788
batch nb. 11 loss: 0.006483354605734348
batch nb. 12 loss: 0.006455959752202034
batch nb. 13 loss: 0.00487758032977581
batch nb. 14 loss: 0.004526105709373951
batch nb. 15 loss: 0.0043786149471998215
validation: Loss: 0.005018447060137987 train loss : 0.006337319500744343
new best model
Epoch: 29
batch nb. 0 loss: 0.0049408357590436935
batch nb. 1 loss: 0.004251376260071993
batch nb. 2 loss: 0.0037432382814586163
batch nb. 3 loss: 0.003872171975672245
batch nb. 4 loss: 0.004092457704246044
batch nb. 5 loss: 0.0041811163537204266
batch nb. 6 loss: 0.004775750916451216
batch nb. 7 loss: 0.004388943314552307
batch nb. 8 loss: 0.005003234837204218
batch nb. 9 loss: 0.006160939112305641
batch nb. 10 loss: 0.005594531074166298
batch nb. 11 loss: 0.006401990074664354
batch nb. 12 loss: 0.006474616471678019
batch nb. 13 loss: 0.004799703136086464
batch nb. 14 loss: 0.0044832075946033
batch nb. 15 loss: 0.004287251271307468
validation: Loss: 0.004918577149510384 train loss : 0.006268984172493219
new best model
Epoch: 30
batch nb. 0 loss: 0.004853112157434225
batch nb. 1 loss: 0.004200206138193607
batch nb. 2 loss: 0.0036357096396386623
batch nb. 3 loss: 0.0037703935522586107
batch nb. 4 loss: 0.004029382020235062
batch nb. 5 loss: 0.0040893093682825565
batch nb. 6 loss: 0.004701647441834211
batch nb. 7 loss: 0.004374779295176268
batch nb. 8 loss: 0.004971537739038467
batch nb. 9 loss: 0.006183377001434565
batch nb. 10 loss: 0.00557539751753211
batch nb. 11 loss: 0.006276627071201801
batch nb. 12 loss: 0.006374146323651075
batch nb. 13 loss: 0.004758827853947878
batch nb. 14 loss: 0.004493942949920893
batch nb. 15 loss: 0.00441401032730937
validation: Loss: 0.004929578397423029 train loss : 0.006209146231412888
Epoch: 31
batch nb. 0 loss: 0.004842640832066536
batch nb. 1 loss: 0.004205112345516682
batch nb. 2 loss: 0.003704011905938387
batch nb. 3 loss: 0.003751336596906185
batch nb. 4 loss: 0.003989750519394875
batch nb. 5 loss: 0.0041028461419045925
batch nb. 6 loss: 0.004699185490608215
batch nb. 7 loss: 0.004384810570627451
batch nb. 8 loss: 0.0049662464298307896
batch nb. 9 loss: 0.006104763131588697
batch nb. 10 loss: 0.0055834646336734295
batch nb. 11 loss: 0.006281149107962847
batch nb. 12 loss: 0.006315375678241253
batch nb. 13 loss: 0.004725489765405655
batch nb. 14 loss: 0.004393679089844227
batch nb. 15 loss: 0.0043194834142923355
validation: Loss: 0.0049648592248559 train loss : 0.006150093860924244
Epoch: 32
batch nb. 0 loss: 0.004898104351013899
batch nb. 1 loss: 0.004144329112023115
batch nb. 2 loss: 0.0036732207518070936
batch nb. 3 loss: 0.003786528715863824
batch nb. 4 loss: 0.004016099497675896
batch nb. 5 loss: 0.004073001444339752
batch nb. 6 loss: 0.004649726673960686
batch nb. 7 loss: 0.004326303023844957
batch nb. 8 loss: 0.00488175917416811
batch nb. 9 loss: 0.006026922725141048
batch nb. 10 loss: 0.00552004249766469
batch nb. 11 loss: 0.006191718857735395
batch nb. 12 loss: 0.0062672714702785015
batch nb. 13 loss: 0.004720467142760754
batch nb. 14 loss: 0.004346237517893314
batch nb. 15 loss: 0.004242495633661747
validation: Loss: 0.004879834596067667 train loss : 0.006092287600040436
new best model
Epoch: 33
batch nb. 0 loss: 0.004823631141334772
batch nb. 1 loss: 0.004187012556940317
batch nb. 2 loss: 0.0036875088699162006
batch nb. 3 loss: 0.003801818937063217
batch nb. 4 loss: 0.003964269068092108
batch nb. 5 loss: 0.004120965953916311
batch nb. 6 loss: 0.004701296798884869
batch nb. 7 loss: 0.0042765638791024685
batch nb. 8 loss: 0.004841185174882412
batch nb. 9 loss: 0.00601442065089941
batch nb. 10 loss: 0.005488826427608728
batch nb. 11 loss: 0.0062071033753454685
batch nb. 12 loss: 0.006200670730322599
batch nb. 13 loss: 0.004627702757716179
batch nb. 14 loss: 0.0043576923198997974
batch nb. 15 loss: 0.004327762871980667
validation: Loss: 0.004861526191234589 train loss : 0.0060403901152312756
new best model
Epoch: 34
batch nb. 0 loss: 0.004864390939474106
batch nb. 1 loss: 0.004122292157262564
batch nb. 2 loss: 0.0036674889270216227
batch nb. 3 loss: 0.003773152595385909
batch nb. 4 loss: 0.003910399507731199
batch nb. 5 loss: 0.004032545257359743
batch nb. 6 loss: 0.004651930183172226
batch nb. 7 loss: 0.004272702615708113
batch nb. 8 loss: 0.004866440314799547
batch nb. 9 loss: 0.006041080690920353
batch nb. 10 loss: 0.005459373351186514
batch nb. 11 loss: 0.006180560681968927
batch nb. 12 loss: 0.006164903286844492
batch nb. 13 loss: 0.00462930416688323
batch nb. 14 loss: 0.004293074831366539
batch nb. 15 loss: 0.004291026387363672
validation: Loss: 0.004953472875058651 train loss : 0.005990408360958099
Epoch: 35
batch nb. 0 loss: 0.004895645659416914
batch nb. 1 loss: 0.004115359857678413
batch nb. 2 loss: 0.003645988879725337
batch nb. 3 loss: 0.0037113225553184748
batch nb. 4 loss: 0.0039003901183605194
batch nb. 5 loss: 0.00400477135553956
batch nb. 6 loss: 0.004589979536831379
batch nb. 7 loss: 0.004216175526380539
batch nb. 8 loss: 0.004762634634971619
batch nb. 9 loss: 0.0059770760126411915
batch nb. 10 loss: 0.005434478633105755
batch nb. 11 loss: 0.006096040364354849
batch nb. 12 loss: 0.006137790624052286
batch nb. 13 loss: 0.004596452694386244
batch nb. 14 loss: 0.004271116107702255
batch nb. 15 loss: 0.00424271309748292
validation: Loss: 0.0047928341664373875 train loss : 0.005941861309111118
new best model
Epoch: 36
batch nb. 0 loss: 0.004745100624859333
batch nb. 1 loss: 0.004093522671610117
batch nb. 2 loss: 0.00363266677595675
batch nb. 3 loss: 0.0036860292311757803
batch nb. 4 loss: 0.0038828488904982805
batch nb. 5 loss: 0.00399219524115324
batch nb. 6 loss: 0.004557618405669928
batch nb. 7 loss: 0.004200841765850782
batch nb. 8 loss: 0.00475789699703455
batch nb. 9 loss: 0.005896746646612883
batch nb. 10 loss: 0.005401273258030415
batch nb. 11 loss: 0.006111383903771639
batch nb. 12 loss: 0.006075304001569748
batch nb. 13 loss: 0.004549009259790182
batch nb. 14 loss: 0.0042426697909832
batch nb. 15 loss: 0.004147445783019066
validation: Loss: 0.0047704740427434444 train loss : 0.005893363151699305
new best model
Epoch: 37
batch nb. 0 loss: 0.004729399923235178
batch nb. 1 loss: 0.00403842655941844
batch nb. 2 loss: 0.0035235690884292126
batch nb. 3 loss: 0.003655016887933016
batch nb. 4 loss: 0.0038508560974150896
batch nb. 5 loss: 0.0038346070796251297
batch nb. 6 loss: 0.004423145204782486
batch nb. 7 loss: 0.00416217278689146
batch nb. 8 loss: 0.004773569293320179
batch nb. 9 loss: 0.005876679439097643
batch nb. 10 loss: 0.005348830483853817
batch nb. 11 loss: 0.006071445532143116
batch nb. 12 loss: 0.006057871039956808
batch nb. 13 loss: 0.004593174438923597
batch nb. 14 loss: 0.004210367798805237
batch nb. 15 loss: 0.004111068788915873
validation: Loss: 0.004708717577159405 train loss : 0.0058464608155190945
new best model
Epoch: 38
batch nb. 0 loss: 0.004578007385134697
batch nb. 1 loss: 0.003974806517362595
batch nb. 2 loss: 0.003545224666595459
batch nb. 3 loss: 0.003648757701739669
batch nb. 4 loss: 0.0038673775270581245
batch nb. 5 loss: 0.003857932984828949
batch nb. 6 loss: 0.004366571549326181
batch nb. 7 loss: 0.004086468834429979
batch nb. 8 loss: 0.004716203082352877
batch nb. 9 loss: 0.00585594167932868
batch nb. 10 loss: 0.005310419946908951
batch nb. 11 loss: 0.0059988973662257195
batch nb. 12 loss: 0.0060435812920331955
batch nb. 13 loss: 0.00445537781342864
batch nb. 14 loss: 0.004171279259026051
batch nb. 15 loss: 0.004112730734050274
validation: Loss: 0.004659693688154221 train loss : 0.005802006460726261
new best model
Epoch: 39
batch nb. 0 loss: 0.004591943230479956
batch nb. 1 loss: 0.003915685229003429
batch nb. 2 loss: 0.0034969181288033724
batch nb. 3 loss: 0.003594992682337761
batch nb. 4 loss: 0.0038064473774284124
batch nb. 5 loss: 0.0038489876314997673
batch nb. 6 loss: 0.004373500123620033
batch nb. 7 loss: 0.00404260354116559
batch nb. 8 loss: 0.004634778946638107
batch nb. 9 loss: 0.0058035580441355705
batch nb. 10 loss: 0.005261845886707306
batch nb. 11 loss: 0.005884262267500162
batch nb. 12 loss: 0.005962250754237175
batch nb. 13 loss: 0.004423828795552254
batch nb. 14 loss: 0.004153917543590069
batch nb. 15 loss: 0.004128183703869581
validation: Loss: 0.004642379470169544 train loss : 0.005760160740464926
new best model
Epoch: 40
batch nb. 0 loss: 0.004602557048201561
batch nb. 1 loss: 0.0038878133054822683
batch nb. 2 loss: 0.0034452970139682293
batch nb. 3 loss: 0.003608579048886895
batch nb. 4 loss: 0.003830250119790435
batch nb. 5 loss: 0.0038789391983300447
batch nb. 6 loss: 0.004381868988275528
batch nb. 7 loss: 0.0040181358344852924
batch nb. 8 loss: 0.004569450858980417
batch nb. 9 loss: 0.005754216108471155
batch nb. 10 loss: 0.005252111237496138
batch nb. 11 loss: 0.005888802465051413
batch nb. 12 loss: 0.005896673072129488
batch nb. 13 loss: 0.004427258390933275
batch nb. 14 loss: 0.004151785746216774
batch nb. 15 loss: 0.004197815898805857
validation: Loss: 0.004688691347837448 train loss : 0.005722055211663246
Epoch: 41
batch nb. 0 loss: 0.004671463742852211
batch nb. 1 loss: 0.0038746288046240807
batch nb. 2 loss: 0.003421350382268429
batch nb. 3 loss: 0.003561131889000535
batch nb. 4 loss: 0.003804160514846444
batch nb. 5 loss: 0.0038851730059832335
batch nb. 6 loss: 0.0044014169834554195
batch nb. 7 loss: 0.0040618353523314
batch nb. 8 loss: 0.004613775294274092
batch nb. 9 loss: 0.005760221276432276
batch nb. 10 loss: 0.005281345918774605
batch nb. 11 loss: 0.005877786315977573
batch nb. 12 loss: 0.005856696981936693
batch nb. 13 loss: 0.004348904360085726
batch nb. 14 loss: 0.004087409935891628
batch nb. 15 loss: 0.0041442918591201305
validation: Loss: 0.004705579951405525 train loss : 0.005684489384293556
Epoch: 42
batch nb. 0 loss: 0.004661109764128923
batch nb. 1 loss: 0.003939797170460224
batch nb. 2 loss: 0.003397769760340452
batch nb. 3 loss: 0.0035194354131817818
batch nb. 4 loss: 0.003747259033843875
batch nb. 5 loss: 0.0037944838404655457
batch nb. 6 loss: 0.004335024859756231
batch nb. 7 loss: 0.004038140177726746
batch nb. 8 loss: 0.004664374515414238
batch nb. 9 loss: 0.005757075268775225
batch nb. 10 loss: 0.0052355993539094925
batch nb. 11 loss: 0.0058647459372878075
batch nb. 12 loss: 0.0058952863328158855
batch nb. 13 loss: 0.004368916619569063
batch nb. 14 loss: 0.004078466910868883
batch nb. 15 loss: 0.004076111130416393
validation: Loss: 0.00466301292181015 train loss : 0.005647085141390562
Epoch: 43
batch nb. 0 loss: 0.004634522367268801
batch nb. 1 loss: 0.003956140018999577
batch nb. 2 loss: 0.0034400199074298143
batch nb. 3 loss: 0.003528262721374631
batch nb. 4 loss: 0.00374523364007473
batch nb. 5 loss: 0.003865674836561084
batch nb. 6 loss: 0.004350658971816301
batch nb. 7 loss: 0.00403249217197299
batch nb. 8 loss: 0.0047013466246426105
batch nb. 9 loss: 0.005829336121678352
batch nb. 10 loss: 0.005194406025111675
batch nb. 11 loss: 0.0058086710050702095
batch nb. 12 loss: 0.0058923810720443726
batch nb. 13 loss: 0.004382550250738859
batch nb. 14 loss: 0.0041149817407131195
batch nb. 15 loss: 0.004149540327489376
validation: Loss: 0.004630205687135458 train loss : 0.005613049957901239
new best model
Epoch: 44
batch nb. 0 loss: 0.004666999448090792
batch nb. 1 loss: 0.003942089155316353
batch nb. 2 loss: 0.003451813943684101
batch nb. 3 loss: 0.0035318806767463684
batch nb. 4 loss: 0.00369568751193583
batch nb. 5 loss: 0.003836465999484062
batch nb. 6 loss: 0.004356995224952698
batch nb. 7 loss: 0.004046599380671978
batch nb. 8 loss: 0.0047242953442037106
batch nb. 9 loss: 0.005794825032353401
batch nb. 10 loss: 0.005227950401604176
batch nb. 11 loss: 0.005885257385671139
batch nb. 12 loss: 0.00583520857617259
batch nb. 13 loss: 0.0043644532561302185
batch nb. 14 loss: 0.004064209759235382
batch nb. 15 loss: 0.004140171688050032
validation: Loss: 0.0046999393962323666 train loss : 0.005580319091677666
Epoch: 45
batch nb. 0 loss: 0.004652262665331364
batch nb. 1 loss: 0.00394436763599515
batch nb. 2 loss: 0.003407600335776806
batch nb. 3 loss: 0.003487071953713894
batch nb. 4 loss: 0.0036959731951355934
batch nb. 5 loss: 0.003779852529987693
batch nb. 6 loss: 0.004344205837696791
batch nb. 7 loss: 0.004042079672217369
batch nb. 8 loss: 0.004691058304160833
batch nb. 9 loss: 0.0057850368320941925
batch nb. 10 loss: 0.005219279322773218
batch nb. 11 loss: 0.005875040777027607
batch nb. 12 loss: 0.005931057967245579
batch nb. 13 loss: 0.004375117365270853
batch nb. 14 loss: 0.0040947743691504
batch nb. 15 loss: 0.00405451375991106
validation: Loss: 0.004576635546982288 train loss : 0.005547149572521448
new best model
Epoch: 46
batch nb. 0 loss: 0.004510317929089069
batch nb. 1 loss: 0.0037979776971042156
batch nb. 2 loss: 0.0033704889938235283
batch nb. 3 loss: 0.003438229439780116
batch nb. 4 loss: 0.003655575215816498
batch nb. 5 loss: 0.0037335038650780916
batch nb. 6 loss: 0.004283795598894358
batch nb. 7 loss: 0.003951942548155785
batch nb. 8 loss: 0.004593323916196823
batch nb. 9 loss: 0.005716141778975725
batch nb. 10 loss: 0.005137795582413673
batch nb. 11 loss: 0.005846318788826466
batch nb. 12 loss: 0.005837941076606512
batch nb. 13 loss: 0.004357486963272095
batch nb. 14 loss: 0.004047021269798279
batch nb. 15 loss: 0.004047180525958538
validation: Loss: 0.0046754078939557076 train loss : 0.005515235476195812
Epoch: 47
batch nb. 0 loss: 0.004604985937476158
batch nb. 1 loss: 0.003873794572427869
batch nb. 2 loss: 0.0033726058900356293
batch nb. 3 loss: 0.0034928759559988976
batch nb. 4 loss: 0.0037025471683591604
batch nb. 5 loss: 0.0037684522103518248
batch nb. 6 loss: 0.00436586607247591
batch nb. 7 loss: 0.003980346955358982
batch nb. 8 loss: 0.0045802961103618145
batch nb. 9 loss: 0.00569156464189291
batch nb. 10 loss: 0.005096666514873505
batch nb. 11 loss: 0.0057868678122758865
batch nb. 12 loss: 0.005794620141386986
batch nb. 13 loss: 0.004278801381587982
batch nb. 14 loss: 0.003994081635028124
batch nb. 15 loss: 0.003975547384470701
validation: Loss: 0.004603471606969833 train loss : 0.005483158398419619
Epoch: 48
batch nb. 0 loss: 0.004465366713702679
batch nb. 1 loss: 0.003866933984681964
batch nb. 2 loss: 0.0033667946700006723
batch nb. 3 loss: 0.003411791520193219
batch nb. 4 loss: 0.0036589952651411295
batch nb. 5 loss: 0.003709646873176098
batch nb. 6 loss: 0.004221481271088123
batch nb. 7 loss: 0.003914453089237213
batch nb. 8 loss: 0.004499201662838459
batch nb. 9 loss: 0.00561688793823123
batch nb. 10 loss: 0.005078027490526438
batch nb. 11 loss: 0.005688461009413004
batch nb. 12 loss: 0.005703904200345278
batch nb. 13 loss: 0.0042367372661828995
batch nb. 14 loss: 0.003940201364457607
batch nb. 15 loss: 0.003972713369876146
validation: Loss: 0.004563864786177874 train loss : 0.005452333018183708
new best model
Epoch: 49
batch nb. 0 loss: 0.00443801237270236
batch nb. 1 loss: 0.0038473038002848625
batch nb. 2 loss: 0.0033501258585602045
batch nb. 3 loss: 0.003396740648895502
batch nb. 4 loss: 0.0036603782791644335
batch nb. 5 loss: 0.003717163810506463
batch nb. 6 loss: 0.00420430488884449
batch nb. 7 loss: 0.003897553775459528
batch nb. 8 loss: 0.004478983581066132
batch nb. 9 loss: 0.0055504487827420235
batch nb. 10 loss: 0.005043806973844767
batch nb. 11 loss: 0.005699956323951483
batch nb. 12 loss: 0.005704803392291069
batch nb. 13 loss: 0.004226431716233492
batch nb. 14 loss: 0.0039215837605297565
batch nb. 15 loss: 0.003978630993515253
validation: Loss: 0.004580771084874868 train loss : 0.005422858987003565
Epoch: 50
batch nb. 0 loss: 0.004492454696446657
batch nb. 1 loss: 0.003816191339865327
batch nb. 2 loss: 0.0032953398767858744
batch nb. 3 loss: 0.0033833691850304604
batch nb. 4 loss: 0.0035965535789728165
batch nb. 5 loss: 0.0036539307329803705
batch nb. 6 loss: 0.004187095444649458
batch nb. 7 loss: 0.003862404264509678
batch nb. 8 loss: 0.004429968539625406
batch nb. 9 loss: 0.005490139126777649
batch nb. 10 loss: 0.005006314255297184
batch nb. 11 loss: 0.005663239397108555
batch nb. 12 loss: 0.0056609041057527065
batch nb. 13 loss: 0.004202993120998144
batch nb. 14 loss: 0.0038903544191271067
batch nb. 15 loss: 0.003901661606505513
validation: Loss: 0.004506959579885006 train loss : 0.005393031984567642
new best model
Epoch: 51
batch nb. 0 loss: 0.004375149495899677