output.2x2_2neurons_batch3_0.01_1layer_5000epochs_model1
162 lines (149 loc) · 5.31 KB
/usr/local/google/home/hylick/bin/anaconda2/envs/hylick-dev/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
from ._conv import register_converters as _register_converters
2018-04-30 14:51:26.464231: I tensorflow/core/platform/cpu_feature_guard.cc:140] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.1 SSE4.2 AVX AVX2 FMA
E0430 14:51:26.465232371 67188 tcp_server_posix.cc:65] check for SO_REUSEPORT: {"created":"@1525125086.465222938","description":"SO_REUSEPORT unavailable on compiling system","file":"external/grpc/src/core/lib/iomgr/socket_utils_common_posix.cc","file_line":162}
2018-04-30 14:51:26.465358: I tensorflow/core/distributed_runtime/rpc/grpc_channel.cc:215] Initialize GrpcChannelCache for job ps -> {0 -> localhost:2222}
2018-04-30 14:51:26.465374: I tensorflow/core/distributed_runtime/rpc/grpc_channel.cc:215] Initialize GrpcChannelCache for job worker -> {0 -> localhost:2222}
2018-04-30 14:51:26.466116: I tensorflow/core/distributed_runtime/rpc/grpc_server_lib.cc:333] Started server with target: grpc://localhost:2222
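The two GrpcChannelCache lines above describe a single-process TF 1.x cluster in which both the `ps` and `worker` jobs map to `localhost:2222`. A minimal reconstruction of that cluster layout as plain data (in a TF 1.x script this dict would be passed to `tf.train.ClusterSpec`; job names and the port are taken from the log, everything else is an assumption):

```python
# Cluster layout reconstructed from the GrpcChannelCache log lines above.
# Shown as a plain dict so the job -> address mapping is explicit; the
# actual script presumably feeds this to tf.train.ClusterSpec.
cluster = {
    "ps": ["localhost:2222"],      # parameter-server job
    "worker": ["localhost:2222"],  # worker job (same process and port here)
}
```

With both jobs in one process, a single `tf.train.Server` started on that target serves the whole cluster, which matches the lone "Started server with target: grpc://localhost:2222" line.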
Model and variables restored.
...continuing training
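The cadence visible in the log below, a loss line every 50 epochs and a checkpoint every 1000, can be sketched as a plain loop. The intervals and the 5000-epoch count are taken from the log and the run name; the training step itself is a placeholder, since the actual script is not shown:

```python
LOG_EVERY = 50           # matches the "epoch: N - loss: ..." cadence
CHECKPOINT_EVERY = 1000  # matches the "Writing checkpoint N" cadence
EPOCHS = 5000            # from the run name: ...5000epochs...

def train_schedule(epochs=EPOCHS):
    """Return the epochs at which a loss line and a checkpoint are emitted."""
    logged, checkpointed = [], []
    for epoch in range(epochs):
        # (placeholder: one optimizer step over the batch would go here)
        if epoch % LOG_EVERY == 0:
            logged.append(epoch)
        if epoch > 0 and epoch % CHECKPOINT_EVERY == 0:
            checkpointed.append(epoch)
    return logged, checkpointed
```

Note that with `range(5000)` the last logged epoch is 4950 and the last intermediate checkpoint is 4000, consistent with the log ending on "epoch: 4950" followed by a final explicit model save rather than a "Writing checkpoint 5000" line.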
epoch: 0 - loss: 0.025454
epoch: 50 - loss: 0.008994
epoch: 100 - loss: 0.007239
epoch: 150 - loss: 0.024456
epoch: 200 - loss: 0.008444
epoch: 250 - loss: 0.006328
epoch: 300 - loss: 0.017185
epoch: 350 - loss: 0.016627
epoch: 400 - loss: 0.005608
epoch: 450 - loss: 0.005995
epoch: 500 - loss: 0.001100
epoch: 550 - loss: 0.003820
epoch: 600 - loss: 0.002854
epoch: 650 - loss: 0.001214
epoch: 700 - loss: 0.002507
epoch: 750 - loss: 0.002753
epoch: 800 - loss: 0.004467
epoch: 850 - loss: 0.002904
epoch: 900 - loss: 0.001952
epoch: 950 - loss: 0.000214
epoch: 1000 - loss: 0.001045
Writing checkpoint 1000
[[ 9. 2.]
[13. 3.]
[29. 7.]] [[0 0]
[4 1]] [[1 0]]
[[ 8.926876 1.9988328]
[12.974272 2.9995892]
[29.016075 7.0002565]] [[7.5496915e-03 1.2045149e-04]
[4.0108209e+00 1.0001729e+00]] [ 0.9062497 -0.00149652]
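The first 3x2 matrices in the printout above are target/prediction pairs. Recomputing a plain mean-squared error over those six printed values reproduces the logged loss at epoch 1000 (0.001045), which suggests, though the script itself is not shown, that the reported loss is the batch-mean squared error. All values below are copied verbatim from the log:

```python
# Targets and predictions copied from the epoch-1000 printout above.
targets = [[9.0, 2.0], [13.0, 3.0], [29.0, 7.0]]
preds = [[8.926876, 1.9988328],
         [12.974272, 2.9995892],
         [29.016075, 7.0002565]]

def mse(a, b):
    """Mean of squared element-wise differences over all entries."""
    flat = [(x - y) ** 2 for ra, rb in zip(a, b) for x, y in zip(ra, rb)]
    return sum(flat) / len(flat)

print(round(mse(targets, preds), 6))  # close to the logged 0.001045
```

The same check applies to the later printouts (epochs 2000, 3000, 4000), where the shrinking prediction error tracks the logged loss toward zero.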
epoch: 1050 - loss: 0.000764
epoch: 1100 - loss: 0.000018
epoch: 1150 - loss: 0.001290
epoch: 1200 - loss: 0.000380
epoch: 1250 - loss: 0.000324
epoch: 1300 - loss: 0.000304
epoch: 1350 - loss: 0.000360
epoch: 1400 - loss: 0.000023
epoch: 1450 - loss: 0.000212
epoch: 1500 - loss: 0.000289
epoch: 1550 - loss: 0.000200
epoch: 1600 - loss: 0.000075
epoch: 1650 - loss: 0.000029
epoch: 1700 - loss: 0.000121
epoch: 1750 - loss: 0.000037
epoch: 1800 - loss: 0.000080
epoch: 1850 - loss: 0.000083
epoch: 1900 - loss: 0.000057
epoch: 1950 - loss: 0.000062
epoch: 2000 - loss: 0.000044
Writing checkpoint 2000
[[29. 7.]
[37. 9.]
[ 1. 0.]] [[0 0]
[4 1]] [[1 0]]
[[ 2.9007343e+01 7.0001178e+00]
[ 3.7008877e+01 9.0001421e+00]
[ 9.8866475e-01 -1.8102929e-04]] [[1.2609453e-03 2.0079136e-05]
[4.0011115e+00 1.0000178e+00]] [ 9.8238117e-01 -2.8124428e-04]
epoch: 2050 - loss: 0.000015
epoch: 2100 - loss: 0.000003
epoch: 2150 - loss: 0.000007
epoch: 2200 - loss: 0.000005
epoch: 2250 - loss: 0.000022
epoch: 2300 - loss: 0.000011
epoch: 2350 - loss: 0.000004
epoch: 2400 - loss: 0.000008
epoch: 2450 - loss: 0.000003
epoch: 2500 - loss: 0.000002
epoch: 2550 - loss: 0.000005
epoch: 2600 - loss: 0.000002
epoch: 2650 - loss: 0.000005
epoch: 2700 - loss: 0.000004
epoch: 2750 - loss: 0.000003
epoch: 2800 - loss: 0.000001
epoch: 2850 - loss: 0.000001
epoch: 2900 - loss: 0.000001
epoch: 2950 - loss: 0.000001
epoch: 3000 - loss: 0.000002
Writing checkpoint 3000
[[25. 6.]
[37. 9.]
[ 1. 0.]] [[0 0]
[4 1]] [[1 0]]
[[ 2.4999136e+01 5.9999862e+00]
[ 3.7002258e+01 9.0000362e+00]
[ 9.9798751e-01 -3.1935044e-05]] [[2.5951475e-04 4.1899029e-06]
[4.0003295e+00 1.0000052e+00]] [ 9.9657303e-01 -5.4793039e-05]
epoch: 3050 - loss: 0.000002
epoch: 3100 - loss: 0.000000
epoch: 3150 - loss: 0.000000
epoch: 3200 - loss: 0.000000
epoch: 3250 - loss: 0.000001
epoch: 3300 - loss: 0.000000
epoch: 3350 - loss: 0.000000
epoch: 3400 - loss: 0.000001
epoch: 3450 - loss: 0.000000
epoch: 3500 - loss: 0.000000
epoch: 3550 - loss: 0.000000
epoch: 3600 - loss: 0.000000
epoch: 3650 - loss: 0.000000
epoch: 3700 - loss: 0.000000
epoch: 3750 - loss: 0.000000
epoch: 3800 - loss: 0.000000
epoch: 3850 - loss: 0.000000
epoch: 3900 - loss: 0.000000
epoch: 3950 - loss: 0.000000
epoch: 4000 - loss: 0.000000
Writing checkpoint 4000
[[37. 9.]
[25. 6.]
[ 1. 0.]] [[0 0]
[4 1]] [[1 0]]
[[ 3.7000423e+01 9.0000067e+00]
[ 2.5000027e+01 6.0000010e+00]
[ 9.9957842e-01 -6.7176552e-06]] [[6.3528685e-05 1.0030017e-06]
[4.0000496e+00 1.0000008e+00]] [ 9.9936962e-01 -1.0033617e-05]
epoch: 4050 - loss: 0.000000
epoch: 4100 - loss: 0.000000
epoch: 4150 - loss: 0.000000
epoch: 4200 - loss: 0.000000
epoch: 4250 - loss: 0.000000
epoch: 4300 - loss: 0.000000
epoch: 4350 - loss: 0.000000
epoch: 4400 - loss: 0.000000
epoch: 4450 - loss: 0.000000
epoch: 4500 - loss: 0.000000
epoch: 4550 - loss: 0.000000
epoch: 4600 - loss: 0.000000
epoch: 4650 - loss: 0.000000
epoch: 4700 - loss: 0.000000
epoch: 4750 - loss: 0.000000
epoch: 4800 - loss: 0.000000
epoch: 4850 - loss: 0.000000
epoch: 4900 - loss: 0.000000
epoch: 4950 - loss: 0.000000
Model saved to /usr/local/google/home/hylick/Projects/2018/anaconda/tf/slope_transform/checkpoints/model1.ckpt
real 0m4.339s
user 0m4.908s
sys 0m1.200s