059_DLbyABr_05-RecurrentNetworks(Python)

SDS-2.x, Scalable Data Engineering Science

This is a 2019 augmentation and update of Adam Breindel's initial notebooks.

Please feel free to refer to basic concepts here:

Archived YouTube video of this live unedited lab-lecture:


Entering the 4th Dimension

Networks for Understanding Time-Oriented Patterns in Data

Common time-based problems include

  • Sequence modeling: "What comes next?"
    • Likely next letter, word, phrase, category, count, action, value
  • Sequence-to-Sequence modeling: "What alternative sequence is a pattern match?" (i.e., similar probability distribution)
    • Machine translation, text-to-speech/speech-to-text, connected handwriting (specific scripts)

Simplified Approaches

  • If we know all of the sequence states and the probabilities of state transition...

    • ... then we have a simple Markov Chain model.
  • If we don't know all of the states or probabilities (yet) but can make constraining assumptions and acquire solid information from observing (sampling) them...

    • ... we can use a Hidden Markov Model approach.

These approaches have only limited capacity because they are effectively stateless and so suffer from a kind of "extreme retrograde amnesia."
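As a quick illustration of the first idea, here is a minimal sketch of sampling from a first-order Markov chain; the two weather states and the transition probabilities below are made up purely for illustration.

import numpy

# Purely illustrative two-state chain (made-up probabilities).
states = ["sunny", "rainy"]
P = numpy.array([[0.8, 0.2],   # transition probabilities out of "sunny"
                 [0.4, 0.6]])  # transition probabilities out of "rainy"

numpy.random.seed(0)
s = 0                          # start in state "sunny"
seq = [states[s]]
for _ in range(10):            # sample 10 transitions
    s = numpy.random.choice(len(states), p=P[s])
    seq.append(states[s])
print(seq)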

Can we use a neural network to learn the "next" record in a sequence?

First approach, using what we already know, might look like

  • Clamp input sequence to a vector of neurons in a feed-forward network
  • Learn a model on the class of the next input record

Let's try it! This can work in some situations, although it's more of a setup and starting point for our next development.

We will make up a simple example using the English alphabet, where we try to predict the next letter from a sequence of length 3.

alphabet = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
char_to_int = dict((c, i) for i, c in enumerate(alphabet))
int_to_char = dict((i, c) for i, c in enumerate(alphabet))

seq_length = 3
dataX = []
dataY = []
for i in range(0, len(alphabet) - seq_length, 1):
    seq_in = alphabet[i:i + seq_length]
    seq_out = alphabet[i + seq_length]
    dataX.append([char_to_int[char] for char in seq_in])
    dataY.append(char_to_int[seq_out])
    print (seq_in, '->', seq_out)
('ABC', '->', 'D') ('BCD', '->', 'E') ('CDE', '->', 'F') ('DEF', '->', 'G') ('EFG', '->', 'H') ('FGH', '->', 'I') ('GHI', '->', 'J') ('HIJ', '->', 'K') ('IJK', '->', 'L') ('JKL', '->', 'M') ('KLM', '->', 'N') ('LMN', '->', 'O') ('MNO', '->', 'P') ('NOP', '->', 'Q') ('OPQ', '->', 'R') ('PQR', '->', 'S') ('QRS', '->', 'T') ('RST', '->', 'U') ('STU', '->', 'V') ('TUV', '->', 'W') ('UVW', '->', 'X') ('VWX', '->', 'Y') ('WXY', '->', 'Z')
# dataX is just the letters re-indexed as integers, in consecutive triplets
dataX
Out[2]: [[0, 1, 2], [1, 2, 3], [2, 3, 4], [3, 4, 5], [4, 5, 6], [5, 6, 7], [6, 7, 8], [7, 8, 9], [8, 9, 10], [9, 10, 11], [10, 11, 12], [11, 12, 13], [12, 13, 14], [13, 14, 15], [14, 15, 16], [15, 16, 17], [16, 17, 18], [17, 18, 19], [18, 19, 20], [19, 20, 21], [20, 21, 22], [21, 22, 23], [22, 23, 24]]
dataY # the integer index of the letter that follows each consecutive triplet
Out[3]: [3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25]

Train a network on that data:

import numpy
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import LSTM # <- this is the Long-Short-term memory layer
from keras.utils import np_utils

# begin data generation ------------------------------------------
# this is just a repeat of what we did above
alphabet = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
char_to_int = dict((c, i) for i, c in enumerate(alphabet))
int_to_char = dict((i, c) for i, c in enumerate(alphabet))

seq_length = 3
dataX = []
dataY = []
for i in range(0, len(alphabet) - seq_length, 1):
    seq_in = alphabet[i:i + seq_length]
    seq_out = alphabet[i + seq_length]
    dataX.append([char_to_int[char] for char in seq_in])
    dataY.append(char_to_int[seq_out])
    print (seq_in, '->', seq_out)
# end data generation ---------------------------------------------

X = numpy.reshape(dataX, (len(dataX), seq_length))
X = X / float(len(alphabet)) # normalize the integer-encoded letters into [0, 1]
y = np_utils.to_categorical(dataY) # one-hot encode the output we want to predict

# keras architecture of a feed-forward, dense (fully connected) neural network
model = Sequential()
# draw the architecture of the network given by next two lines, hint: X.shape[1] = 3, y.shape[1] = 26
model.add(Dense(30, input_dim=X.shape[1], kernel_initializer='normal', activation='relu'))
model.add(Dense(y.shape[1], activation='softmax'))

# keras compiling and fitting
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(X, y, epochs=1000, batch_size=5, verbose=2)

scores = model.evaluate(X, y)
print("Model Accuracy: %.2f " % scores[1])

for pattern in dataX:
    x = numpy.reshape(pattern, (1, len(pattern)))
    x = x / float(len(alphabet))
    prediction = model.predict(x, verbose=0) # get prediction from fitted model
    index = numpy.argmax(prediction)
    result = int_to_char[index]
    seq_in = [int_to_char[value] for value in pattern]
    print (seq_in, "->", result) # print the predicted outputs
Using TensorFlow backend.
('ABC', '->', 'D') ('BCD', '->', 'E') ... ('WXY', '->', 'Z')
Epoch 1/1000 - 0s - loss: 3.2628 - acc: 0.0000e+00
Epoch 2/1000 - 0s - loss: 3.2587 - acc: 0.0000e+00
...
Epoch 999/1000 - 0s - loss: 1.7594 - acc: 0.6522
Epoch 1000/1000 - 0s - loss: 1.7587 - acc: 0.6522
23/23 [==============================] - 0s 1ms/step
Model Accuracy: 0.70
(['A', 'B', 'C'], '->', 'D') (['B', 'C', 'D'], '->', 'D') (['C', 'D', 'E'], '->', 'F') (['D', 'E', 'F'], '->', 'G') (['E', 'F', 'G'], '->', 'H') (['F', 'G', 'H'], '->', 'I') (['G', 'H', 'I'], '->', 'J') (['H', 'I', 'J'], '->', 'K') (['I', 'J', 'K'], '->', 'L') (['J', 'K', 'L'], '->', 'L') (['K', 'L', 'M'], '->', 'N') (['L', 'M', 'N'], '->', 'O') (['M', 'N', 'O'], '->', 'P') (['N', 'O', 'P'], '->', 'R') (['O', 'P', 'Q'], '->', 'R') (['P', 'Q', 'R'], '->', 'T') (['Q', 'R', 'S'], '->', 'T') (['R', 'S', 'T'], '->', 'U') (['S', 'T', 'U'], '->', 'W') (['T', 'U', 'V'], '->', 'W') (['U', 'V', 'W'], '->', 'Z') (['V', 'W', 'X'], '->', 'Z') (['W', 'X', 'Y'], '->', 'Z')
X.shape[1], y.shape[1] # get a sense of the shapes to understand the network architecture
Out[5]: (3, 26)
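If you want to check the architecture suggested by the hint in the code above (3 inputs feeding a hidden layer of 30 ReLU units, then 26 softmax outputs), one simple way is to ask Keras for a summary of the model:

model.summary()  # prints each layer with its output shape and parameter count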

The network does learn, and can be trained to good accuracy. But what's really going on here?

Let's leave aside for a moment the simplistic training data (one fun experiment would be to create corrupted sequences and augment the data with those, forcing the network to pay attention to the whole sequence).

Because the model is fundamentally symmetric and stateless (in terms of the sequence; naturally it has weights), this model would need to learn every sequential feature relative to every single sequence position. That seems difficult, inflexible, and inefficient.

Maybe we could add layers, neurons, and extra connections to mitigate parts of the problem. We could also do things like a 1D convolution to pick up frequencies and some patterns.

But instead, it might make more sense to explicitly model the sequential nature of the data (a bit like how we explicitly modeled the 2D nature of image data with CNNs).

Recurrent Neural Network Concept

Let's take the neuron's output from one time (t) and feed it into that same neuron at a later time (t+1), in combination with other relevant inputs. Then we would have a neuron with memory.

We can weight the "return" of that value and train the weight -- so the neuron learns how important the previous value is relative to the current one.

Different neurons might learn to "remember" different amounts of prior history.

This concept is called a Recurrent Neural Network (RNN), originally developed in the 1980s.
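In symbols, the recurrence is roughly h_t = tanh(W_x * x_t + W_h * h_{t-1} + b). Here is a minimal numpy sketch of that loop with random, untrained weights, purely to show how the hidden state h carries information forward from one time step to the next:

import numpy

numpy.random.seed(0)
n_in, n_hidden = 1, 4
W_x = numpy.random.randn(n_hidden, n_in) * 0.1      # input -> hidden weights
W_h = numpy.random.randn(n_hidden, n_hidden) * 0.1  # hidden -> hidden ("remember") weights
b = numpy.zeros(n_hidden)

h = numpy.zeros(n_hidden)        # initial memory
for x_t in [0.0, 1.0, 2.0]:      # a toy input sequence
    h = numpy.tanh(W_x.dot([x_t]) + W_h.dot(h) + b)
    print(h)                     # the same neuron's output, evolving over time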

Let's recall some pointers from the crash intro to Deep learning.

Watch the following videos now (about 12 minutes) for the fastest introduction to RNNs and LSTMs:

Udacity: Deep Learning by Vincent Vanhoucke - Recurrent Neural Network

Recurrent neural network
http://colah.github.io/posts/2015-08-Understanding-LSTMs/
http://karpathy.github.io/2015/05/21/rnn-effectiveness/

LSTM - Long short term memory

GRU - Gated recurrent unit
http://arxiv.org/pdf/1406.1078v3.pdf

Training a Recurrent Neural Network

We can train an RNN using backpropagation with a minor twist: an RNN neuron's states over time can be "unrolled" into a sequence of copies of that neuron, with the "remember" weight linking each copy at time (t) forward to the copy at (t+1). We can then backpropagate through time as well as through the physical layers of the network.

This is, in fact, called Backpropagation Through Time (BPTT)

The idea is sound but -- since it creates patterns similar to very deep networks -- it suffers from the same challenges:

  • Vanishing gradient
  • Exploding gradient
  • Saturation
  • etc.

i.e., many of the same problems with early deep feed-forward networks having lots of weights.

10 steps back in time for a single layer is not as bad as 10 layers (since there are fewer connections and, hence, weights), but it does get expensive.
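
In Keras this unrolling through time is handled for you; the recurrent layers even expose an unroll flag that trades memory for speed on short sequences. A minimal sketch (using SimpleRNN just to show the flag; the shapes are placeholders):

from keras.models import Sequential
from keras.layers import SimpleRNN, Dense

# Sketch only: a plain recurrent layer over 10 time steps of 1 feature each.
# unroll=True expands the recurrence into a static graph -- faster for short sequences, at a memory cost.
rnn_sketch = Sequential()
rnn_sketch.add(SimpleRNN(16, input_shape=(10, 1), unroll=True))
rnn_sketch.add(Dense(1))
rnn_sketch.compile(loss='mse', optimizer='adam')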


ASIDE: Hierarchical and Recursive Networks, Bidirectional RNN

Network topologies can be built to reflect the relative structure of the data we are modeling. E.g., for natural language, grammar constraints mean that both hierarchy and (limited) recursion may allow a physically smaller model to achieve more effective capacity.

A bi-directional RNN includes values from both previous and subsequent time steps. This is less strange than it sounds at first: after all, in many problems, such as sentence translation (where BiRNNs are very popular), we usually have the entire source sequence at one time. In that case, a BiRNN is really just saying that both prior and subsequent words can influence the interpretation of each word, something we humans take for granted.

Recent versions of neural net libraries have support for bidirectional networks, although you may need to write (or locate) a little code yourself if you want to experiment with hierarchical networks.
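
In Keras, for instance, wrapping a recurrent layer in the built-in Bidirectional wrapper is all it takes -- a sketch with placeholder shapes:

from keras.models import Sequential
from keras.layers import LSTM, Bidirectional, Dense

# Sketch: the wrapper runs the LSTM forwards and backwards over the sequence and merges the outputs.
# The (40, 59) input shape is just a placeholder (e.g., 40 time steps, 59 features per step).
bi_model = Sequential()
bi_model.add(Bidirectional(LSTM(64), input_shape=(40, 59)))
bi_model.add(Dense(59, activation='softmax'))
bi_model.compile(loss='categorical_crossentropy', optimizer='adam')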


Long Short-Term Memory (LSTM)

"Pure" RNNs were never very successful. Sepp Hochreiter and Jürgen Schmidhuber (1997) made a game-changing contribution with the publication of the Long Short-Term Memory unit. How game changing? It's effectively state of the art today.

(Credit and much thanks to Chris Olah, http://colah.github.io/about.html, Research Scientist at Google Brain, for publishing the following excellent diagrams!)

In the following diagrams, note that the output value is "split" purely for graphical purposes -- the two h arrows/signals coming out are the same signal.

RNN Cell:

LSTM Cell:

An LSTM unit is a neuron with some bonus features:

  • Cell state propagated across time
  • Input, Output, Forget gates
  • Learns retention/discard of cell state
  • Admixture of new data
  • Output partly distinct from state
  • Use of addition (not multiplication) to combine input and cell state allows state to propagate unimpeded across time (addition of gradient)
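
To make that last point concrete, here is the standard LSTM update in the notation of Chris Olah's post (where \(\sigma\) is the sigmoid and \(*\) is element-wise multiplication):

$$
\begin{aligned}
f_t &= \sigma(W_f \cdot [h_{t-1}, x_t] + b_f) &&\text{(forget gate)}\\
i_t &= \sigma(W_i \cdot [h_{t-1}, x_t] + b_i) &&\text{(input gate)}\\
\tilde{C}_t &= \tanh(W_C \cdot [h_{t-1}, x_t] + b_C) &&\text{(candidate state)}\\
C_t &= f_t * C_{t-1} + i_t * \tilde{C}_t &&\text{(new cell state: addition, not multiplication)}\\
o_t &= \sigma(W_o \cdot [h_{t-1}, x_t] + b_o) &&\text{(output gate)}\\
h_t &= o_t * \tanh(C_t) &&\text{(output / hidden state)}
\end{aligned}
$$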

ASIDE: Variations on LSTM

... include "peephole" variants, where the gate functions have direct access to the cell state; convolutional variants; and bidirectional variants, where we can "cheat" by letting neurons learn from future time steps as well as from previous ones.


Slow down ... exactly what's getting added to where? For a step-by-step walk through, read Chris Olah's full post http://colah.github.io/posts/2015-08-Understanding-LSTMs/

Do LSTMs Work Reasonably Well?

Yes! These architectures are in production (2017) for deep-learning-enabled products at Baidu, Google, Microsoft, Apple, and elsewhere. They are used to solve problems in time series analysis, speech recognition and generation, connected handwriting, grammar, music, and robot control systems.

Let's Code an LSTM Variant of our Sequence Lab

(this great demo example courtesy of Jason Brownlee: http://machinelearningmastery.com/understanding-stateful-lstm-recurrent-neural-networks-python-keras/)

import numpy
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import LSTM
from keras.utils import np_utils

alphabet = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
char_to_int = dict((c, i) for i, c in enumerate(alphabet))
int_to_char = dict((i, c) for i, c in enumerate(alphabet))

seq_length = 3
dataX = []
dataY = []
for i in range(0, len(alphabet) - seq_length, 1):
    seq_in = alphabet[i:i + seq_length]
    seq_out = alphabet[i + seq_length]
    dataX.append([char_to_int[char] for char in seq_in])
    dataY.append(char_to_int[seq_out])
    print (seq_in, '->', seq_out)

# reshape X to be [samples, time steps, features]
X = numpy.reshape(dataX, (len(dataX), seq_length, 1))
X = X / float(len(alphabet))
y = np_utils.to_categorical(dataY)
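# X now has shape (23, 3, 1): 23 samples, 3 time steps, 1 feature per step, scaled into [0, 1);
# y is a (23, 26) one-hot matrix -- one column per letter of the alphabet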

# Let’s define an LSTM network with 32 units and an output layer with a softmax activation function for making predictions. 
# a naive implementation of LSTM
model = Sequential()
model.add(LSTM(32, input_shape=(X.shape[1], X.shape[2]))) # <- LSTM layer...
model.add(Dense(y.shape[1], activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(X, y, epochs=400, batch_size=1, verbose=2)

scores = model.evaluate(X, y)
print("Model Accuracy: %.2f%%" % (scores[1]*100))

for pattern in dataX:
    x = numpy.reshape(pattern, (1, len(pattern), 1))
    x = x / float(len(alphabet))
    prediction = model.predict(x, verbose=0)
    index = numpy.argmax(prediction)
    result = int_to_char[index]
    seq_in = [int_to_char[value] for value in pattern]
    print (seq_in, "->", result)
('ABC', '->', 'D') ('BCD', '->', 'E') ('CDE', '->', 'F') ('DEF', '->', 'G') ('EFG', '->', 'H') ('FGH', '->', 'I') ('GHI', '->', 'J') ('HIJ', '->', 'K') ('IJK', '->', 'L') ('JKL', '->', 'M') ('KLM', '->', 'N') ('LMN', '->', 'O') ('MNO', '->', 'P') ('NOP', '->', 'Q') ('OPQ', '->', 'R') ('PQR', '->', 'S') ('QRS', '->', 'T') ('RST', '->', 'U') ('STU', '->', 'V') ('TUV', '->', 'W') ('UVW', '->', 'X') ('VWX', '->', 'Y') ('WXY', '->', 'Z') Epoch 1/400 - 1s - loss: 3.2700 - acc: 0.0435 Epoch 2/400 - 0s - loss: 3.2571 - acc: 0.0435 Epoch 3/400 - 0s - loss: 3.2487 - acc: 0.0435 Epoch 4/400 - 0s - loss: 3.2421 - acc: 0.0000e+00 Epoch 5/400 - 0s - loss: 3.2351 - acc: 0.0000e+00 Epoch 6/400 - 0s - loss: 3.2264 - acc: 0.0435 Epoch 7/400 - 0s - loss: 3.2183 - acc: 0.0435 Epoch 8/400 - 0s - loss: 3.2088 - acc: 0.0435 Epoch 9/400 - 0s - loss: 3.1985 - acc: 0.0435 Epoch 10/400 - 0s - loss: 3.1886 - acc: 0.0435 Epoch 11/400 - 0s - loss: 3.1747 - acc: 0.0000e+00 Epoch 12/400 - 0s - loss: 3.1634 - acc: 0.0000e+00 Epoch 13/400 - 0s - loss: 3.1470 - acc: 0.0435 Epoch 14/400 - 0s - loss: 3.1335 - acc: 0.0000e+00 Epoch 15/400 - 0s - loss: 3.1170 - acc: 0.0435 Epoch 16/400 - 0s - loss: 3.1059 - acc: 0.0435 Epoch 17/400 - 0s - loss: 3.0910 - acc: 0.0435 Epoch 18/400 - 0s - loss: 3.0745 - acc: 0.0435 Epoch 19/400 - 0s - loss: 3.0629 - acc: 0.0435 Epoch 20/400 - 0s - loss: 3.0464 - acc: 0.0435 Epoch 21/400 - 0s - loss: 3.0348 - acc: 0.0435 Epoch 22/400 - 0s - loss: 3.0188 - acc: 0.0435 Epoch 23/400 - 0s - loss: 2.9978 - acc: 0.0870 Epoch 24/400 - 0s - loss: 2.9811 - acc: 0.0870 Epoch 25/400 - 0s - loss: 2.9580 - acc: 0.1304 Epoch 26/400 - 0s - loss: 2.9354 - acc: 0.1304 Epoch 27/400 - 0s - loss: 2.9085 - acc: 0.1304 Epoch 28/400 - 0s - loss: 2.8796 - acc: 0.0870 Epoch 29/400 - 0s - loss: 2.8474 - acc: 0.0870 Epoch 30/400 - 0s - loss: 2.8132 - acc: 0.0870 Epoch 31/400 - 0s - loss: 2.7738 - acc: 0.0870 Epoch 32/400 - 0s - loss: 2.7349 - acc: 0.0870 Epoch 33/400 - 0s - loss: 2.6934 - acc: 0.0870 Epoch 34/400 - 0s - loss: 2.6521 - acc: 0.1304 Epoch 35/400 - 0s - loss: 2.6122 - acc: 0.1304 Epoch 36/400 - 0s - loss: 2.5801 - acc: 0.1304 Epoch 37/400 - 0s - loss: 2.5397 - acc: 0.1304 Epoch 38/400 - 0s - loss: 2.5086 - acc: 0.1304 Epoch 39/400 - 0s - loss: 2.4795 - acc: 0.1304 Epoch 40/400 - 0s - loss: 2.4550 - acc: 0.1304 Epoch 41/400 - 0s - loss: 2.4292 - acc: 0.1304 Epoch 42/400 - 0s - loss: 2.4077 - acc: 0.1304 Epoch 43/400 - 0s - loss: 2.3840 - acc: 0.1304 Epoch 44/400 - 0s - loss: 2.3469 - acc: 0.0870 Epoch 45/400 - 0s - loss: 2.3263 - acc: 0.1304 Epoch 46/400 - 0s - loss: 2.3055 - acc: 0.1739 Epoch 47/400 - 0s - loss: 2.2763 - acc: 0.2174 Epoch 48/400 - 0s - loss: 2.2538 - acc: 0.1739 Epoch 49/400 - 0s - loss: 2.2317 - acc: 0.2174 Epoch 50/400 - 0s - loss: 2.2094 - acc: 0.2174 Epoch 51/400 - 0s - loss: 2.1879 - acc: 0.3043 Epoch 52/400 - 0s - loss: 2.1696 - acc: 0.1739 Epoch 53/400 - 0s - loss: 2.1457 - acc: 0.2174 Epoch 54/400 - 0s - loss: 2.1258 - acc: 0.3043 Epoch 55/400 - 0s - loss: 2.1104 - acc: 0.2174 Epoch 56/400 - 0s - loss: 2.0886 - acc: 0.2609 Epoch 57/400 - 0s - loss: 2.0668 - acc: 0.3043 Epoch 58/400 - 0s - loss: 2.0455 - acc: 0.1739 Epoch 59/400 - 0s - loss: 2.0293 - acc: 0.2609 Epoch 60/400 - 0s - loss: 2.0047 - acc: 0.3043 Epoch 61/400 - 0s - loss: 1.9879 - acc: 0.2609 Epoch 62/400 - 0s - loss: 1.9686 - acc: 0.3478 Epoch 63/400 - 0s - loss: 1.9593 - acc: 0.3043 Epoch 64/400 - 0s - loss: 1.9303 - acc: 0.3043 Epoch 65/400 - 0s - loss: 1.9121 - acc: 0.3043 Epoch 66/400 - 0s - loss: 1.9020 - acc: 0.3913 
Epoch 67/400 - 0s - loss: 1.8824 - acc: 0.4348 Epoch 68/400 - 0s - loss: 1.8558 - acc: 0.3913 Epoch 69/400 - 0s - loss: 1.8411 - acc: 0.3478 Epoch 70/400 - 0s - loss: 1.8279 - acc: 0.3043 Epoch 71/400 - 0s - loss: 1.8132 - acc: 0.3913 Epoch 72/400 - 0s - loss: 1.8013 - acc: 0.2174 Epoch 73/400 - 0s - loss: 1.7770 - acc: 0.4783 Epoch 74/400 - 0s - loss: 1.7667 - acc: 0.5217 Epoch 75/400 - 0s - loss: 1.7527 - acc: 0.3913 Epoch 76/400 - 0s - loss: 1.7348 - acc: 0.4348 Epoch 77/400 - 0s - loss: 1.7280 - acc: 0.4348 Epoch 78/400 - 0s - loss: 1.7210 - acc: 0.4348 Epoch 79/400 - 0s - loss: 1.7112 - acc: 0.3478 Epoch 80/400 - 0s - loss: 1.6950 - acc: 0.5217 Epoch 81/400 - 0s - loss: 1.6850 - acc: 0.4348 Epoch 82/400 - 0s - loss: 1.6693 - acc: 0.6522 Epoch 83/400 - 0s - loss: 1.6653 - acc: 0.5217 Epoch 84/400 - 0s - loss: 1.6577 - acc: 0.4783 Epoch 85/400 - 0s - loss: 1.6542 - acc: 0.4348 Epoch 86/400 - 0s - loss: 1.6342 - acc: 0.6087 Epoch 87/400 - 0s - loss: 1.6243 - acc: 0.6087 Epoch 88/400 - 0s - loss: 1.6082 - acc: 0.5652 Epoch 89/400 - 0s - loss: 1.6039 - acc: 0.5652 Epoch 90/400 - 0s - loss: 1.5904 - acc: 0.6087 Epoch 91/400 - 0s - loss: 1.5900 - acc: 0.6087 Epoch 92/400 - 0s - loss: 1.5783 - acc: 0.6522 Epoch 93/400 - 0s - loss: 1.5709 - acc: 0.5652 Epoch 94/400 - 0s - loss: 1.5657 - acc: 0.6087 Epoch 95/400 - 0s - loss: 1.5536 - acc: 0.6957 Epoch 96/400 - 0s - loss: 1.5404 - acc: 0.6087 Epoch 97/400 - 0s - loss: 1.5343 - acc: 0.6522 Epoch 98/400 - 0s - loss: 1.5350 - acc: 0.6957 Epoch 99/400 - 0s - loss: 1.5248 - acc: 0.6522 Epoch 100/400 - 0s - loss: 1.5107 - acc: 0.6522 Epoch 101/400 - 0s - loss: 1.5164 - acc: 0.6522 Epoch 102/400 - 0s - loss: 1.5022 - acc: 0.7391 Epoch 103/400 - 0s - loss: 1.4993 - acc: 0.6522 Epoch 104/400 - 0s - loss: 1.4899 - acc: 0.6957 Epoch 105/400 - 0s - loss: 1.4783 - acc: 0.6087 Epoch 106/400 - 0s - loss: 1.4775 - acc: 0.6957 Epoch 107/400 - 0s - loss: 1.4671 - acc: 0.6957 Epoch 108/400 - 0s - loss: 1.4592 - acc: 0.6522 Epoch 109/400 - 0s - loss: 1.4471 - acc: 0.7391 Epoch 110/400 - 0s - loss: 1.4504 - acc: 0.6522 Epoch 111/400 - 0s - loss: 1.4382 - acc: 0.6957 Epoch 112/400 - 0s - loss: 1.4293 - acc: 0.6957 Epoch 113/400 - 0s - loss: 1.4247 - acc: 0.6522 Epoch 114/400 - 0s - loss: 1.4213 - acc: 0.7391 Epoch 115/400 - 0s - loss: 1.4119 - acc: 0.7826 Epoch 116/400 - 0s - loss: 1.4104 - acc: 0.7391 Epoch 117/400 - 0s - loss: 1.4021 - acc: 0.7826 Epoch 118/400 - 0s - loss: 1.3972 - acc: 0.6957 Epoch 119/400 - 0s - loss: 1.3958 - acc: 0.7826 Epoch 120/400 - 0s - loss: 1.3845 - acc: 0.6957 Epoch 121/400 - 0s - loss: 1.3750 - acc: 0.7826 Epoch 122/400 - 0s - loss: 1.3739 - acc: 0.7391 Epoch 123/400 - 0s - loss: 1.3685 - acc: 0.6957 Epoch 124/400 - 0s - loss: 1.3669 - acc: 0.8261 Epoch 125/400 - 0s - loss: 1.3516 - acc: 0.7826 Epoch 126/400 - 0s - loss: 1.3492 - acc: 0.7826 Epoch 127/400 - 0s - loss: 1.3475 - acc: 0.7391 Epoch 128/400 - 0s - loss: 1.3382 - acc: 0.7826 Epoch 129/400 - 0s - loss: 1.3325 - acc: 0.7391 Epoch 130/400 - 0s - loss: 1.3248 - acc: 0.7826 Epoch 131/400 - 0s - loss: 1.3221 - acc: 0.7391 Epoch 132/400 - 0s - loss: 1.3160 - acc: 0.7391 Epoch 133/400 - 0s - loss: 1.3084 - acc: 0.8261 Epoch 134/400 - 0s - loss: 1.3038 - acc: 0.7826 Epoch 135/400 - 0s - loss: 1.2940 - acc: 0.8261 Epoch 136/400 - 0s - loss: 1.2957 - acc: 0.8261 Epoch 137/400 - 0s - loss: 1.2829 - acc: 0.8696 Epoch 138/400 - 0s - loss: 1.2848 - acc: 0.8696 Epoch 139/400 - 0s - loss: 1.2740 - acc: 0.7826 Epoch 140/400 - 0s - loss: 1.2734 - acc: 0.8696 Epoch 141/400 - 0s - loss: 1.2665 - 
acc: 0.8261 Epoch 142/400 - 0s - loss: 1.2666 - acc: 0.7391 Epoch 143/400 - 0s - loss: 1.2513 - acc: 0.8696 Epoch 144/400 - 0s - loss: 1.2467 - acc: 0.8261 Epoch 145/400 - 0s - loss: 1.2423 - acc: 0.7826 Epoch 146/400 - 0s - loss: 1.2397 - acc: 0.8261 Epoch 147/400 - 0s - loss: 1.2355 - acc: 0.7391 Epoch 148/400 - 0s - loss: 1.2315 - acc: 0.7826 Epoch 149/400 - 0s - loss: 1.2264 - acc: 0.8261 Epoch 150/400 - 0s - loss: 1.2203 - acc: 0.8261 Epoch 151/400 - 0s - loss: 1.2151 - acc: 0.7826 Epoch 152/400 - 0s - loss: 1.2093 - acc: 0.7826 Epoch 153/400 - 0s - loss: 1.2016 - acc: 0.7826 Epoch 154/400 - 0s - loss: 1.1963 - acc: 0.7826 Epoch 155/400 - 0s - loss: 1.1937 - acc: 0.7826 Epoch 156/400 - 0s - loss: 1.1903 - acc: 0.7826 Epoch 157/400 - 0s - loss: 1.1800 - acc: 0.8261 Epoch 158/400 - 0s - loss: 1.1790 - acc: 0.8261 Epoch 159/400 - 0s - loss: 1.1705 - acc: 0.8696 Epoch 160/400 - 0s - loss: 1.1732 - acc: 0.8261 Epoch 161/400 - 0s - loss: 1.1691 - acc: 0.7826 Epoch 162/400 - 0s - loss: 1.1570 - acc: 0.8696 Epoch 163/400 - 0s - loss: 1.1520 - acc: 0.8261 Epoch 164/400 - 0s - loss: 1.1521 - acc: 0.7826 Epoch 165/400 - 0s - loss: 1.1386 - acc: 0.8261 Epoch 166/400 - 0s - loss: 1.1367 - acc: 0.8261 Epoch 167/400 - 0s - loss: 1.1301 - acc: 0.8696 Epoch 168/400 - 0s - loss: 1.1222 - acc: 0.8696 Epoch 169/400 - 0s - loss: 1.1183 - acc: 0.8696 Epoch 170/400 - 0s - loss: 1.1136 - acc: 0.8696 Epoch 171/400 - 0s - loss: 1.1142 - acc: 0.8696 Epoch 172/400 - 0s - loss: 1.1108 - acc: 0.8696 Epoch 173/400 - 0s - loss: 1.1035 - acc: 0.8261 Epoch 174/400 - 0s - loss: 1.0969 - acc: 0.8261 Epoch 175/400 - 0s - loss: 1.0952 - acc: 0.8696 Epoch 176/400 - 0s - loss: 1.0889 - acc: 0.8261 Epoch 177/400 - 0s - loss: 1.0837 - acc: 0.8696 Epoch 178/400 - 0s - loss: 1.0764 - acc: 0.9130 Epoch 179/400 - 0s - loss: 1.0694 - acc: 0.7826 Epoch 180/400 - 0s - loss: 1.0641 - acc: 0.8261 Epoch 181/400 - 0s - loss: 1.0591 - acc: 0.8696 Epoch 182/400 - 0s - loss: 1.0514 - acc: 0.8261 Epoch 183/400 - 0s - loss: 1.0475 - acc: 0.8696 Epoch 184/400 - 0s - loss: 1.0517 - acc: 0.8696 Epoch 185/400 - 0s - loss: 1.0406 - acc: 0.8261 Epoch 186/400 - 0s - loss: 1.0358 - acc: 0.8696 Epoch 187/400 - 0s - loss: 1.0309 - acc: 0.9130 Epoch 188/400 - 0s - loss: 1.0216 - acc: 0.9130 Epoch 189/400 - 0s - loss: 1.0152 - acc: 0.8696 Epoch 190/400 - 0s - loss: 1.0104 - acc: 0.8696 Epoch 191/400 - 0s - loss: 1.0106 - acc: 0.9130 Epoch 192/400 - 0s - loss: 1.0142 - acc: 0.9130 Epoch 193/400 - 0s - loss: 0.9995 - acc: 0.8696 Epoch 194/400 - 0s - loss: 0.9959 - acc: 0.9130 Epoch 195/400 - 0s - loss: 0.9976 - acc: 0.8696 Epoch 196/400 - 0s - loss: 0.9885 - acc: 0.8696 Epoch 197/400 - 0s - loss: 0.9796 - acc: 0.9130 Epoch 198/400 - 0s - loss: 0.9734 - acc: 0.9130 Epoch 199/400 - 0s - loss: 0.9726 - acc: 0.9130 Epoch 200/400 - 0s - loss: 0.9674 - acc: 0.9130 Epoch 201/400 - 0s - loss: 0.9665 - acc: 0.9565 Epoch 202/400 - 0s - loss: 0.9579 - acc: 0.9565 Epoch 203/400 - 0s - loss: 0.9562 - acc: 0.8696 Epoch 204/400 - 0s - loss: 0.9499 - acc: 0.8696 Epoch 205/400 - 0s - loss: 0.9439 - acc: 0.9130 Epoch 206/400 - 0s - loss: 0.9406 - acc: 0.9565 Epoch 207/400 - 0s - loss: 0.9371 - acc: 0.8696 Epoch 208/400 - 0s - loss: 0.9254 - acc: 0.9130 Epoch 209/400 - 0s - loss: 0.9280 - acc: 0.8696 Epoch 210/400 - 0s - loss: 0.9228 - acc: 0.9565 Epoch 211/400 - 0s - loss: 0.9183 - acc: 0.9130 Epoch 212/400 - 0s - loss: 0.9142 - acc: 0.9565 Epoch 213/400 - 0s - loss: 0.9087 - acc: 0.9130 Epoch 214/400 - 0s - loss: 0.9067 - acc: 0.9565 Epoch 215/400 - 0s - loss: 0.8983 - 
acc: 0.8696 Epoch 216/400 - 0s - loss: 0.8991 - acc: 0.9130 Epoch 217/400 - 0s - loss: 0.8967 - acc: 0.8696 Epoch 218/400 - 0s - loss: 0.8863 - acc: 0.9130 Epoch 219/400 - 0s - loss: 0.8871 - acc: 0.9130 Epoch 220/400 - 0s - loss: 0.8801 - acc: 0.9130 Epoch 221/400 - 0s - loss: 0.8789 - acc: 0.9565 Epoch 222/400 - 0s - loss: 0.8688 - acc: 0.9130 Epoch 223/400 - 0s - loss: 0.8718 - acc: 0.8696 Epoch 224/400 - 0s - loss: 0.8636 - acc: 0.8696 Epoch 225/400 - 0s - loss: 0.8575 - acc: 0.9565 Epoch 226/400 - 0s - loss: 0.8586 - acc: 0.8261 Epoch 227/400 - 0s - loss: 0.8493 - acc: 0.9565 Epoch 228/400 - 0s - loss: 0.8572 - acc: 0.8261 Epoch 229/400 - 0s - loss: 0.8467 - acc: 0.9130 Epoch 230/400 - 0s - loss: 0.8415 - acc: 0.8696 Epoch 231/400 - 0s - loss: 0.8364 - acc: 0.9130 Epoch 232/400 - 0s - loss: 0.8325 - acc: 0.9565 Epoch 233/400 - 0s - loss: 0.8268 - acc: 0.9130 Epoch 234/400 - 0s - loss: 0.8224 - acc: 0.9565 Epoch 235/400 - 0s - loss: 0.8160 - acc: 0.9565 Epoch 236/400 - 0s - loss: 0.8142 - acc: 0.9565 Epoch 237/400 - 0s - loss: 0.8138 - acc: 0.9130 Epoch 238/400 - 0s - loss: 0.8108 - acc: 0.9130 Epoch 239/400 - 0s - loss: 0.8038 - acc: 0.9565 Epoch 240/400 - 0s - loss: 0.7985 - acc: 0.9565 Epoch 241/400 - 0s - loss: 0.7971 - acc: 0.9565 Epoch 242/400 - 0s - loss: 0.7888 - acc: 0.9130 Epoch 243/400 - 0s - loss: 0.7871 - acc: 0.9130 Epoch 244/400 - 0s - loss: 0.7815 - acc: 0.9130 Epoch 245/400 - 0s - loss: 0.7786 - acc: 0.8696 Epoch 246/400 - 0s - loss: 0.7797 - acc: 0.9565 Epoch 247/400 - 0s - loss: 0.7680 - acc: 0.9565 Epoch 248/400 - 0s - loss: 0.7774 - acc: 0.9130 Epoch 249/400 - 0s - loss: 0.7747 - acc: 0.9565 Epoch 250/400 - 0s - loss: 0.7618 - acc: 0.9565 Epoch 251/400 - 0s - loss: 0.7521 - acc: 0.9565 Epoch 252/400 - 0s - loss: 0.7583 - acc: 0.9565 Epoch 253/400 - 0s - loss: 0.7483 - acc: 0.9565 Epoch 254/400 - 0s - loss: 0.7431 - acc: 0.9565 Epoch 255/400 - 0s - loss: 0.7441 - acc: 0.9565 Epoch 256/400 - 0s - loss: 0.7441 - acc: 0.9565 Epoch 257/400 - 0s - loss: 0.7327 - acc: 0.9130 Epoch 258/400 - 0s - loss: 0.7317 - acc: 0.9565 Epoch 259/400 - 0s - loss: 0.7294 - acc: 1.0000 Epoch 260/400 - 0s - loss: 0.7250 - acc: 0.9565 Epoch 261/400 - 0s - loss: 0.7238 - acc: 0.9565 Epoch 262/400 - 0s - loss: 0.7167 - acc: 0.9565 Epoch 263/400 - 0s - loss: 0.7123 - acc: 0.9565 Epoch 264/400 - 0s - loss: 0.7117 - acc: 0.9565 Epoch 265/400 - 0s - loss: 0.7076 - acc: 0.9565 Epoch 266/400 - 0s - loss: 0.7069 - acc: 0.9565 Epoch 267/400 - 0s - loss: 0.6952 - acc: 0.9565 Epoch 268/400 - 0s - loss: 0.6963 - acc: 1.0000 Epoch 269/400 - 0s - loss: 0.7027 - acc: 0.9565 Epoch 270/400 - 0s - loss: 0.6940 - acc: 0.9565 Epoch 271/400 - 0s - loss: 0.6958 - acc: 0.9565 Epoch 272/400 - 0s - loss: 0.6858 - acc: 0.9565 Epoch 273/400 - 0s - loss: 0.6802 - acc: 0.9565 Epoch 274/400 - 0s - loss: 0.6743 - acc: 0.9130 Epoch 275/400 - 0s - loss: 0.6768 - acc: 0.9565 Epoch 276/400 - 0s - loss: 0.6646 - acc: 0.9565 Epoch 277/400 - 0s - loss: 0.6719 - acc: 0.9565 Epoch 278/400 - 0s - loss: 0.6626 - acc: 1.0000 Epoch 279/400 - 0s - loss: 0.6562 - acc: 1.0000 Epoch 280/400 - 0s - loss: 0.6583 - acc: 0.9565 Epoch 281/400 - 0s - loss: 0.6497 - acc: 0.9565 Epoch 282/400 - 0s - loss: 0.6453 - acc: 0.9565 Epoch 283/400 - 0s - loss: 0.6444 - acc: 0.9565 Epoch 284/400 - 0s - loss: 0.6446 - acc: 0.9565 Epoch 285/400 - 0s - loss: 0.6357 - acc: 0.9130 Epoch 286/400 - 0s - loss: 0.6336 - acc: 0.9565 Epoch 287/400 - 0s - loss: 0.6308 - acc: 1.0000 Epoch 288/400 - 0s - loss: 0.6266 - acc: 1.0000 Epoch 289/400 - 0s - loss: 0.6326 - 
acc: 0.9565 Epoch 290/400 - 0s - loss: 0.6296 - acc: 1.0000 Epoch 291/400 - 0s - loss: 0.6194 - acc: 0.9565 Epoch 292/400 - 0s - loss: 0.6223 - acc: 1.0000 Epoch 293/400 - 0s - loss: 0.6140 - acc: 0.9565 Epoch 294/400 - 0s - loss: 0.6106 - acc: 0.9565 Epoch 295/400 - 0s - loss: 0.6033 - acc: 0.9565 Epoch 296/400 - 0s - loss: 0.6008 - acc: 0.9565 Epoch 297/400 - 0s - loss: 0.6024 - acc: 0.9565 Epoch 298/400 - 0s - loss: 0.5991 - acc: 0.9565 Epoch 299/400 - 0s - loss: 0.5921 - acc: 0.9565 Epoch 300/400 - 0s - loss: 0.5929 - acc: 0.9565 Epoch 301/400 - 0s - loss: 0.5957 - acc: 1.0000 Epoch 302/400 - 0s - loss: 0.5845 - acc: 0.9565 Epoch 303/400 - 0s - loss: 0.5856 - acc: 0.9565 Epoch 304/400 - 0s - loss: 0.5790 - acc: 1.0000 Epoch 305/400 - 0s - loss: 0.5757 - acc: 0.9565 Epoch 306/400 - 0s - loss: 0.5758 - acc: 1.0000 Epoch 307/400 - 0s - loss: 0.5734 - acc: 1.0000 Epoch 308/400 - 0s - loss: 0.5695 - acc: 0.9565 Epoch 309/400 - 0s - loss: 0.5619 - acc: 0.9565 Epoch 310/400 - 0s - loss: 0.5639 - acc: 0.9565 Epoch 311/400 - 0s - loss: 0.5621 - acc: 1.0000 Epoch 312/400 - 0s - loss: 0.5492 - acc: 1.0000 Epoch 313/400 - 0s - loss: 0.5541 - acc: 0.9565 Epoch 314/400 - 0s - loss: 0.5514 - acc: 0.9565 Epoch 315/400 - 0s - loss: 0.5455 - acc: 0.9565 Epoch 316/400 - 0s - loss: 0.5474 - acc: 0.9565 Epoch 317/400 - 0s - loss: 0.5455 - acc: 0.9565 Epoch 318/400 - 0s - loss: 0.5395 - acc: 0.9565 Epoch 319/400 - 0s - loss: 0.5363 - acc: 1.0000 Epoch 320/400 - 0s - loss: 0.5324 - acc: 1.0000 Epoch 321/400 - 0s - loss: 0.5350 - acc: 1.0000 Epoch 322/400 - 0s - loss: 0.5297 - acc: 1.0000 Epoch 323/400 - 0s - loss: 0.5286 - acc: 0.9565 Epoch 324/400 - 0s - loss: 0.5233 - acc: 0.9565 Epoch 325/400 - 0s - loss: 0.5220 - acc: 1.0000 Epoch 326/400 - 0s - loss: 0.5171 - acc: 1.0000 Epoch 327/400 - 0s - loss: 0.5108 - acc: 0.9565 Epoch 328/400 - 0s - loss: 0.5162 - acc: 0.9565 Epoch 329/400 - 0s - loss: 0.5093 - acc: 0.9565 Epoch 330/400 - 0s - loss: 0.5037 - acc: 0.9565 Epoch 331/400 - 0s - loss: 0.5028 - acc: 1.0000 Epoch 332/400 - 0s - loss: 0.4975 - acc: 1.0000 Epoch 333/400 - 0s - loss: 0.5036 - acc: 1.0000 Epoch 334/400 - 0s - loss: 0.4997 - acc: 0.9565 Epoch 335/400 - 0s - loss: 0.4905 - acc: 1.0000 Epoch 336/400 - 0s - loss: 0.4904 - acc: 0.9565 Epoch 337/400 - 0s - loss: 0.4901 - acc: 0.9565 Epoch 338/400 - 0s - loss: 0.4896 - acc: 0.9565 Epoch 339/400 - 0s - loss: 0.4858 - acc: 1.0000 Epoch 340/400 - 0s - loss: 0.4835 - acc: 0.9565 Epoch 341/400 - 0s - loss: 0.4769 - acc: 0.9565 Epoch 342/400 - 0s - loss: 0.4696 - acc: 1.0000 Epoch 343/400 - 0s - loss: 0.4733 - acc: 0.9565 Epoch 344/400 - 0s - loss: 0.4685 - acc: 1.0000 Epoch 345/400 - 0s - loss: 0.4689 - acc: 1.0000 Epoch 346/400 - 0s - loss: 0.4637 - acc: 1.0000 Epoch 347/400 - 0s - loss: 0.4647 - acc: 0.9565 Epoch 348/400 - 0s - loss: 0.4635 - acc: 1.0000 Epoch 349/400 - 0s - loss: 0.4734 - acc: 0.9565 Epoch 350/400 - 0s - loss: 0.4602 - acc: 1.0000 Epoch 351/400 - 0s - loss: 0.4538 - acc: 0.9565 Epoch 352/400 - 0s - loss: 0.4492 - acc: 1.0000 Epoch 353/400 - 0s - loss: 0.4602 - acc: 1.0000 Epoch 354/400 - 0s - loss: 0.4488 - acc: 0.9565 Epoch 355/400 - 0s - loss: 0.4531 - acc: 0.9565 Epoch 356/400 - 0s - loss: 0.4456 - acc: 0.9565 Epoch 357/400 - 0s - loss: 0.4390 - acc: 0.9565 Epoch 358/400 - 0s - loss: 0.4345 - acc: 1.0000 Epoch 359/400 - 0s - loss: 0.4323 - acc: 1.0000 Epoch 360/400 - 0s - loss: 0.4319 - acc: 1.0000 Epoch 361/400 - 0s - loss: 0.4265 - acc: 1.0000 Epoch 362/400 - 0s - loss: 0.4321 - acc: 1.0000 Epoch 363/400 - 0s - loss: 0.4188 - 
acc: 1.0000 Epoch 364/400 - 0s - loss: 0.4216 - acc: 1.0000 Epoch 365/400 - 0s - loss: 0.4186 - acc: 1.0000 Epoch 366/400 - 0s - loss: 0.4185 - acc: 0.9565 Epoch 367/400 - 0s - loss: 0.4188 - acc: 1.0000 Epoch 368/400 - 0s - loss: 0.4101 - acc: 1.0000 Epoch 369/400 - 0s - loss: 0.4110 - acc: 0.9565 Epoch 370/400 - 0s - loss: 0.4090 - acc: 1.0000 Epoch 371/400 - 0s - loss: 0.4106 - acc: 1.0000 Epoch 372/400 - 0s - loss: 0.4070 - acc: 0.9565 Epoch 373/400 - 0s - loss: 0.4003 - acc: 1.0000 Epoch 374/400 - 0s - loss: 0.4008 - acc: 1.0000 Epoch 375/400 - 0s - loss: 0.4029 - acc: 0.9565 Epoch 376/400 - 0s - loss: 0.3933 - acc: 0.9565 Epoch 377/400 - 0s - loss: 0.3899 - acc: 1.0000 Epoch 378/400 - 0s - loss: 0.3917 - acc: 1.0000 Epoch 379/400 - 0s - loss: 0.3895 - acc: 1.0000 Epoch 380/400 - 0s - loss: 0.3878 - acc: 0.9565 Epoch 381/400 - 0s - loss: 0.3828 - acc: 1.0000 Epoch 382/400 - 0s - loss: 0.3869 - acc: 0.9565 Epoch 383/400 - 0s - loss: 0.3790 - acc: 0.9565 Epoch 384/400 - 0s - loss: 0.3786 - acc: 1.0000 Epoch 385/400 - 0s - loss: 0.3750 - acc: 1.0000 Epoch 386/400 - 0s - loss: 0.3719 - acc: 1.0000 Epoch 387/400 - 0s - loss: 0.3727 - acc: 1.0000 Epoch 388/400 - 0s - loss: 0.3758 - acc: 1.0000 Epoch 389/400 - 0s - loss: 0.3808 - acc: 1.0000 Epoch 390/400 - 0s - loss: 0.3746 - acc: 1.0000 Epoch 391/400 - 0s - loss: 0.3701 - acc: 1.0000 Epoch 392/400 - 0s - loss: 0.3620 - acc: 0.9565 Epoch 393/400 - 0s - loss: 0.3633 - acc: 1.0000 Epoch 394/400 - 0s - loss: 0.3570 - acc: 1.0000 Epoch 395/400 - 0s - loss: 0.3582 - acc: 0.9565 Epoch 396/400 - 0s - loss: 0.3537 - acc: 1.0000 Epoch 397/400 - 0s - loss: 0.3527 - acc: 1.0000 Epoch 398/400 - 0s - loss: 0.3507 - acc: 1.0000 Epoch 399/400 - 0s - loss: 0.3470 - acc: 1.0000 Epoch 400/400 - 0s - loss: 0.3487 - acc: 0.9565 23/23 [==============================] - 0s 5ms/step Model Accuracy: 100.00% (['A', 'B', 'C'], '->', 'D') (['B', 'C', 'D'], '->', 'E') (['C', 'D', 'E'], '->', 'F') (['D', 'E', 'F'], '->', 'G') (['E', 'F', 'G'], '->', 'H') (['F', 'G', 'H'], '->', 'I') (['G', 'H', 'I'], '->', 'J') (['H', 'I', 'J'], '->', 'K') (['I', 'J', 'K'], '->', 'L') (['J', 'K', 'L'], '->', 'M') (['K', 'L', 'M'], '->', 'N') (['L', 'M', 'N'], '->', 'O') (['M', 'N', 'O'], '->', 'P') (['N', 'O', 'P'], '->', 'Q') (['O', 'P', 'Q'], '->', 'R') (['P', 'Q', 'R'], '->', 'S') (['Q', 'R', 'S'], '->', 'T') (['R', 'S', 'T'], '->', 'U') (['S', 'T', 'U'], '->', 'V') (['T', 'U', 'V'], '->', 'W') (['U', 'V', 'W'], '->', 'X') (['V', 'W', 'X'], '->', 'Y') (['W', 'X', 'Y'], '->', 'Z')
(X.shape[1], X.shape[2]) # the input shape to LSTM layer with 32 neurons is given by dimensions of time-steps and features
Out[7]: (3, 1)
X.shape[0], y.shape[1] # number of examples and number of categorical outputs
Out[8]: (23, 26)

Memory and context

If this network is learning the way we would like, it should be robust to noise and also understand the relative context (in this case, where a prior letter occurs in the sequence).

I.e., we should be able to give it corrupted sequences, and it should produce reasonably correct predictions.

Make the following change to the code to test this out:

You Try!

  • We'll use "W" for our erroneous/corrupted data element
  • Add code at the end to predict on the following sequences:
    • 'WBC', 'WKL', 'WTU', 'DWF', 'MWO', 'VWW', 'GHW', 'JKW', 'PQW'
  • Notice any pattern? It's hard to tell from a small sample, but if you play with it (trying sequences from different places in the alphabet, or different "corruption" letters), you'll notice patterns that give a hint at what the network is learning.

The solution is in 060_DLByABr_05a-LSTM-Solution if you are lazy right now or get stuck.
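
If you just want to see the mechanics of the prediction step, a minimal sketch looks like this (assuming model, char_to_int, int_to_char, and alphabet from the LSTM cell above are still in scope):

import numpy

# Sketch: feed deliberately corrupted 3-letter sequences to the trained LSTM and inspect its predictions.
for seq in ['WBC', 'WKL', 'WTU', 'DWF', 'MWO', 'VWW', 'GHW', 'JKW', 'PQW']:
    pattern = [char_to_int[c] for c in seq]
    x = numpy.reshape(pattern, (1, len(pattern), 1)) / float(len(alphabet))
    prediction = model.predict(x, verbose=0)
    print(seq, '->', int_to_char[numpy.argmax(prediction)])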

Pretty cool... BUT

This alphabet example does seem a bit like "tennis without the net" since the original goal was to develop networks that could extract patterns from complex, ambiguous content like natural language or music, and we've been playing with a sequence (Roman alphabet) that is 100% deterministic and tiny in size.

First, go ahead and start 061_DLByABr_05b-LSTM-Language since it will take several minutes to produce its first output.

This latter script is taken 100% as-is from the Keras library examples folder (https://github.com/fchollet/keras/blob/master/examples/lstm_text_generation.py) and uses precisely the logic we just learned, in order to learn and synthesize English-language text from a single-author corpus. The amazing thing is that the text is learned and generated one letter at a time, just like we did with the alphabet.

Compared to our earlier examples...

  • there is a minor difference in the way the inputs are encoded, using 1-hot vectors
  • and there is a significant difference in the way the outputs (predictions) are generated: instead of taking just the most likely output class (character) via argmax as we did before, this time we are treating the output as a distribution and sampling from the distribution.

Let's take a look at the code ... but even so, this will probably be something to come back to after fika or a long break, as the training takes about 5 minutes per epoch (late 2013 MBP CPU) and we need around 20 epochs (well over an hour!) to get good output.

import sys
sys.exit(0) #just to keep from accidentally running this code (that is already in 061_DLByABr_05b-LSTM-Language) HERE

'''Example script to generate text from Nietzsche's writings.

At least 20 epochs are required before the generated text
starts sounding coherent.

It is recommended to run this script on GPU, as recurrent
networks are quite computationally intensive.

If you try this script on new data, make sure your corpus
has at least ~100k characters. ~1M is better.
'''

from keras.models import Sequential
from keras.layers import Dense, Activation
from keras.layers import LSTM
from keras.optimizers import RMSprop
from keras.utils.data_utils import get_file
import numpy as np
import random
import sys

path = "../data/nietzsche.txt"
text = open(path).read().lower()
print('corpus length:', len(text))

chars = sorted(list(set(text)))
print('total chars:', len(chars))
char_indices = dict((c, i) for i, c in enumerate(chars))
indices_char = dict((i, c) for i, c in enumerate(chars))

# cut the text in semi-redundant sequences of maxlen characters
maxlen = 40
step = 3
sentences = []
next_chars = []
for i in range(0, len(text) - maxlen, step):
    sentences.append(text[i: i + maxlen])
    next_chars.append(text[i + maxlen])
print('nb sequences:', len(sentences))

print('Vectorization...')
X = np.zeros((len(sentences), maxlen, len(chars)), dtype=np.bool)
y = np.zeros((len(sentences), len(chars)), dtype=np.bool)
for i, sentence in enumerate(sentences):
    for t, char in enumerate(sentence):
        X[i, t, char_indices[char]] = 1
    y[i, char_indices[next_chars[i]]] = 1


# build the model: a single LSTM
print('Build model...')
model = Sequential()
model.add(LSTM(128, input_shape=(maxlen, len(chars))))
model.add(Dense(len(chars)))
model.add(Activation('softmax'))

optimizer = RMSprop(lr=0.01)
model.compile(loss='categorical_crossentropy', optimizer=optimizer)


def sample(preds, temperature=1.0):
    # helper function to sample an index from a probability array
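    # the temperature (the "diversity" values used below) rescales the log-probabilities before
    # re-normalizing: values < 1 sharpen the distribution (safer, more repetitive text),
    # values > 1 flatten it (riskier, more surprising text)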
    preds = np.asarray(preds).astype('float64')
    preds = np.log(preds) / temperature
    exp_preds = np.exp(preds)
    preds = exp_preds / np.sum(exp_preds)
    probas = np.random.multinomial(1, preds, 1)
    return np.argmax(probas)

# train the model, output generated text after each iteration
for iteration in range(1, 60):
    print()
    print('-' * 50)
    print('Iteration', iteration)
    model.fit(X, y, batch_size=128, epochs=1)

    start_index = random.randint(0, len(text) - maxlen - 1)

    for diversity in [0.2, 0.5, 1.0, 1.2]:
        print()
        print('----- diversity:', diversity)

        generated = ''
        sentence = text[start_index: start_index + maxlen]
        generated += sentence
        print('----- Generating with seed: "' + sentence + '"')
        sys.stdout.write(generated)

        for i in range(400):
            x = np.zeros((1, maxlen, len(chars)))
            for t, char in enumerate(sentence):
                x[0, t, char_indices[char]] = 1.

            preds = model.predict(x, verbose=0)[0]
            next_index = sample(preds, diversity)
            next_char = indices_char[next_index]

            generated += next_char
            sentence = sentence[1:] + next_char

            sys.stdout.write(next_char)
            sys.stdout.flush()
        print()
To exit: use 'exit', 'quit', or Ctrl-D.
SystemExit: 0

Gated Recurrent Unit (GRU)

In 2014, a new, promising design for RNN units, the Gated Recurrent Unit (GRU), was published (https://arxiv.org/abs/1412.3555).

GRUs have performed similarly to LSTMs, but are slightly simpler in design:

  • GRU has just two gates: "update" and "reset" (instead of the input, output, and forget in LSTM)
  • update controls how to modify (weight and keep) cell state
  • reset controls how new input is mixed (weighted) with/against memorized state
  • there is no output gate, so the cell state is propagated out -- i.e., there is no "hidden" state that is separate from the generated output state

Which one should you use for which applications? The jury is still out -- this is an area for experimentation!

Using GRUs in Keras

... is as simple as using the built-in GRU class (https://keras.io/layers/recurrent/)

If you are working with RNNs, spend some time with the docs to go deeper -- we have just barely scratched the surface here, and there are many "knobs" to turn that will help things go right (or wrong).
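
For example, swapping the LSTM for a GRU in our alphabet model is essentially a one-line change -- a sketch, assuming the same X and y as in the alphabet cell above:

from keras.models import Sequential
from keras.layers import GRU, Dense

# Sketch: same architecture as the LSTM alphabet model, with the recurrent layer swapped for a GRU.
gru_model = Sequential()
gru_model.add(GRU(32, input_shape=(X.shape[1], X.shape[2])))   # (time steps, features) = (3, 1)
gru_model.add(Dense(y.shape[1], activation='softmax'))
gru_model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
# gru_model.fit(X, y, epochs=400, batch_size=1, verbose=2)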

'''Example script to generate text from Nietzsche's writings.

At least 20 epochs are required before the generated text
starts sounding coherent.

It is recommended to run this script on GPU, as recurrent
networks are quite computationally intensive.

If you try this script on new data, make sure your corpus
has at least ~100k characters. ~1M is better.
'''

from __future__ import print_function
from keras.models import Sequential
from keras.layers import Dense, Activation
from keras.layers import LSTM
from keras.optimizers import RMSprop
from keras.utils.data_utils import get_file
import numpy as np
import random
import sys

path = get_file('nietzsche.txt', origin='https://s3.amazonaws.com/text-datasets/nietzsche.txt')
text = open(path).read().lower()
print('corpus length:', len(text))

chars = sorted(list(set(text)))
print('total chars:', len(chars))
char_indices = dict((c, i) for i, c in enumerate(chars))
indices_char = dict((i, c) for i, c in enumerate(chars))

# cut the text in semi-redundant sequences of maxlen characters
maxlen = 40
step = 3
sentences = []
next_chars = []
for i in range(0, len(text) - maxlen, step):
    sentences.append(text[i: i + maxlen])
    next_chars.append(text[i + maxlen])
print('nb sequences:', len(sentences))

print('Vectorization...')
X = np.zeros((len(sentences), maxlen, len(chars)), dtype=np.bool)
y = np.zeros((len(sentences), len(chars)), dtype=np.bool)
for i, sentence in enumerate(sentences):
    for t, char in enumerate(sentence):
        X[i, t, char_indices[char]] = 1
    y[i, char_indices[next_chars[i]]] = 1
Downloading data from https://s3.amazonaws.com/text-datasets/nietzsche.txt 16384/600901 [..............................] - ETA: 0s 24576/600901 [>.............................] - ETA: 1s 57344/600901 [=>............................] - ETA: 1s 122880/600901 [=====>........................] - ETA: 0s 303104/600901 [==============>...............] - ETA: 0s 606208/600901 [==============================] - 0s 0us/step 614400/600901 [==============================] - 0s 0us/step corpus length: 600901 total chars: 59 nb sequences: 200287 Vectorization...
%sh
ls
cifar-10-batches-py cifar-10-python.tar.gz conf derby.log eventlogs ganglia library-install-logs logs movie.gif
len(sentences), maxlen, len(chars)
Out[13]: (200287, 40, 59)
X
Out[14]: array([[[False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False], ..., [False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False]], [[False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False], ..., [False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False]], [[False, False, False, ..., False, False, False], [ True, False, False, ..., False, False, False], [ True, False, False, ..., False, False, False], ..., [False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False]], ..., [[False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False], ..., [False, True, False, ..., False, False, False], [False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False]], [[False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False], ..., [False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False]], [[False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False], ..., [False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False], [False, False, False, ..., False, False, False]]])

What Does Our Nietzsche Generator Produce?

Here are snapshots from middle and late in a training run.

Iteration 19

Iteration 19
Epoch 1/1
200287/200287 [==============================] - 262s - loss: 1.3908     

----- diversity: 0.2
----- Generating with seed: " apart from the value of such assertions"
 apart from the value of such assertions of the present of the supersially and the soul. the spirituality of the same of the soul. the protect and in the states to the supersially and the soul, in the supersially the supersially and the concerning and in the most conscience of the soul. the soul. the concerning and the substances, and the philosophers in the sing"--that is the most supersiall and the philosophers of the supersially of t

----- diversity: 0.5
----- Generating with seed: " apart from the value of such assertions"
 apart from the value of such assertions are more there is the scientific modern to the head in the concerning in the same old will of the excited of science. many all the possible concerning such laugher according to when the philosophers sense of men of univerself, the most lacked same depresse in the point, which is desires of a "good (who has senses on that one experiencess which use the concerning and in the respect of the same ori

----- diversity: 1.0
----- Generating with seed: " apart from the value of such assertions"
 apart from the value of such assertions expressions--are interest person from indeed to ordinapoon as or one of
the uphamy, state is rivel stimromannes are lot man of soul"--modile what he woulds hope in a riligiation, is conscience, and you amy, surposit to advanced torturily
and whorlon and perressing for accurcted with a lot us in view, of its own vanity of their natest"--learns, and dis predeceared from and leade, for oted those wi

----- diversity: 1.2
----- Generating with seed: " apart from the value of such assertions"
 apart from the value of such assertions of
rutould chinates
rested exceteds to more saarkgs testure carevan, accordy owing before fatherly rifiny,
thrurgins of novelts "frous inventive earth as dire!ition he
shate out of itst sacrifice, in this
mectalical
inworle, you
adome enqueres to its ighter. he often. once even with ded threaten"! an eebirelesifist.

lran innoting
with we canone acquire at them crarulents who had prote will out t

Iteration 32

Iteration 32
Epoch 1/1
200287/200287 [==============================] - 255s - loss: 1.3830     

----- diversity: 0.2
----- Generating with seed: " body, as a part of this external
world,"
 body, as a part of this external
world, and in the great present of the sort of the strangern that is and in the sologies and the experiences and the present of the present and science of the probably a subject of the subject of the morality and morality of the soul the experiences the morality of the experiences of the conscience in the soul and more the experiences the strangere and present the rest the strangere and individual of th

----- diversity: 0.5
----- Generating with seed: " body, as a part of this external
world,"
 body, as a part of this external
world, and in the morality of which we knows upon the english and insigning things be exception of
consequences of the man and explained its more in the senses for the same ordinary and the sortarians and subjects and simily in a some longing the destiny ordinary. man easily that has been the some subject and say, and and and and does not to power as all the reasonable and distinction of this one betray

----- diversity: 1.0
----- Generating with seed: " body, as a part of this external
world,"
 body, as a part of this external
world, surrespossifilice view and life fundamental worthing more sirer. holestly
and whan to be
dream. in whom hand that one downgk edplenius will almost eyes brocky that we wills stupid dor
oborbbill to be dimorable
great excet of ifysabless. the good take the historical yet right by guntend, and which fuens the irrelias in literals in finally to the same flild, conditioned when where prom. it has behi

----- diversity: 1.2
----- Generating with seed: " body, as a part of this external
world,"
 body, as a part of this external
world, easily achosed time mantur makeches on this
vanity, obcame-scompleises. but inquire-calr ever powerfully smorais: too-wantse; when thoue
conducting
unconstularly without least gainstyfyerfulled to wo
has upos
among uaxqunct what is mell "loves and
lamacity what mattery of upon the a. and which oasis seour schol
to power: the passion sparabrated will. in his europers raris! what seems to these her

Take a look at the anomalous behavior that started late in the training on one run ... What might have happened?

Iteration 38

Iteration 38
Epoch 1/1
200287/200287 [==============================] - 256s - loss: 7.6662     

----- diversity: 0.2
----- Generating with seed: "erable? for there is no
longer any ought"
erable? for there is no
longer any oughteesen a a  a= at ae i is es4 iei aatee he a a ac  oyte  in ioie  aan a atoe aie ion a atias a ooe o e tin exanat moe ao is aon e a ntiere t i in ate an on a  e as the a ion aisn ost  aed i  i ioiesn les?ane i ee to i o ate   o igice thi io an a xen an ae an teane one ee e alouieis asno oie on i a a ae s as n io a an e a ofe e  oe ehe it aiol  s a aeio st ior ooe an io e  ot io  o i  aa9em aan ev a

----- diversity: 0.5
----- Generating with seed: "erable? for there is no
longer any ought"
erable? for there is no
longer any oughteese a on eionea] aooooi ate uo e9l hoe atae s in eaae an  on io]e nd ast aais  ta e  od iia ng ac ee er ber  in ==st a se is ao  o e as aeian iesee tee otiane o oeean a ieatqe o  asnone anc 
 oo a t
tee sefiois to an at in ol asnse an o e e oo  ie oae asne at a ait iati oese se a e p ie peen iei ien   o oot inees engied evone t oen oou atipeem a sthen ion assise ti a a s itos io ae an  eees as oi

----- diversity: 1.0
----- Generating with seed: "erable? for there is no
longer any ought"
erable? for there is no
longer any oughteena te e ore te beosespeehsha ieno atit e ewge ou ino oo oee coatian aon ie ac aalle e a o  die eionae oa att uec a acae ao a  an eess as
 o  i a io  a   oe a  e is as oo in ene xof o  oooreeg ta m eon al iii n p daesaoe n ite o ane tio oe anoo t ane
s i e tioo ise s a asi e ana ooe ote soueeon io on atieaneyc ei it he se it is ao e an ime  ane on eronaa ee itouman io e ato an ale  a mae taoa ien

----- diversity: 1.2
----- Generating with seed: "erable? for there is no
longer any ought"
erable? for there is no
longer any oughti o aa e2senoees yi i e datssateal toeieie e a o zanato aal arn aseatli oeene aoni le eoeod t aes a isoee tap  e o . is  oi astee an ea titoe e a exeeee thui itoan ain eas a e bu inen ao ofa ie e e7n anae ait ie a ve  er inen  ite
as oe of  heangi eestioe orasb e fie o o o  a  eean o ot odeerean io io oae ooe ne " e  istee esoonae e terasfioees asa ehainoet at e ea ai esoon   ano a p eesas e aitie

(raaz)

'Mind the Hype' around AI

Pay attention to biases in various media.

When your managers get "psyched" about how AI will solve all the problems and your sales teams are dreaming hard, keep it cool and manage their expectations as a practical data scientist who is humbled by the hard reality of the additions, multiplications, and conditionals under the hood.

'Mind the Ethics'

Don't forget to ask how your data science pipelines could adversely affect peoples: