Exp 7 Text Sequence Generator LSTM

This document tokenizes and encodes a short text sample to create training data for an LSTM next-word prediction model. It 1) tokenizes the text into words, 2) encodes the words as integers, 3) splits the encoded text into overlapping n-gram sequences of increasing length, and 4) constructs the padded inputs and one-hot targets used to train the model.


exp-7-text-sequence-generator-lstm

September 25, 2023

1 Sequence to Sequence Prediction of Text


[1]: import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, Embedding

[2]: t_data = 'I am a beautiful person who has been given all the best things in life but was not given power to work on them.'

[3]: t_data

[3]: 'I am a beautiful person who has been given all the best things in life but was
not given power to work on them.'

2 Tokenization and Encoding


[4]: tokenizer = tf.keras.preprocessing.text.Tokenizer()
tokenizer.fit_on_texts([t_data])              # build the word -> index vocabulary
total_words = len(tokenizer.word_index) + 1   # +1 reserves index 0 for padding

[5]: word_ids = tokenizer.word_index
print(word_ids)

{'given': 1, 'i': 2, 'am': 3, 'a': 4, 'beautiful': 5, 'person': 6, 'who': 7,
'has': 8, 'been': 9, 'all': 10, 'the': 11, 'best': 12, 'things': 13, 'in': 14,
'life': 15, 'but': 16, 'was': 17, 'not': 18, 'power': 19, 'to': 20, 'work': 21,
'on': 22, 'them': 23}
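
Note: the default Tokenizer lower-cases text and filters punctuation, which is why 'I' is indexed as 'i' and the trailing '.' never appears in word_index. A quick check (a sketch reusing the tokenizer fitted above; not part of the original run):

# Punctuation is stripped and text is lower-cased by the default filters.
print(tokenizer.texts_to_sequences(["I am a beautiful person."]))
# -> [[2, 3, 4, 5, 6]]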

3 Convert the text into sequences for training the model


[6]: input_seq = []
for line in t_data.split("."):
    token_list = tokenizer.texts_to_sequences([line])[0]
    print(token_list)
    print(len(token_list))
    for i in range(1, len(token_list)):
        num_of_words_per_seq = token_list[:i + 1]   # n-gram prefix of the sentence, length i + 1
        input_seq.append(num_of_words_per_seq)
print(input_seq)

[2, 3, 4, 5, 6, 7, 8, 9, 1, 10, 11, 12, 13, 14, 15, 16, 17, 18, 1, 19, 20, 21,
22, 23]
24
[]
0
[[2, 3], [2, 3, 4], [2, 3, 4, 5], [2, 3, 4, 5, 6], [2, 3, 4, 5, 6, 7], [2, 3, 4,
5, 6, 7, 8], [2, 3, 4, 5, 6, 7, 8, 9], [2, 3, 4, 5, 6, 7, 8, 9, 1], [2, 3, 4, 5,
6, 7, 8, 9, 1, 10], [2, 3, 4, 5, 6, 7, 8, 9, 1, 10, 11], [2, 3, 4, 5, 6, 7, 8,
9, 1, 10, 11, 12], [2, 3, 4, 5, 6, 7, 8, 9, 1, 10, 11, 12, 13], [2, 3, 4, 5, 6,
7, 8, 9, 1, 10, 11, 12, 13, 14], [2, 3, 4, 5, 6, 7, 8, 9, 1, 10, 11, 12, 13, 14,
15], [2, 3, 4, 5, 6, 7, 8, 9, 1, 10, 11, 12, 13, 14, 15, 16], [2, 3, 4, 5, 6, 7,
8, 9, 1, 10, 11, 12, 13, 14, 15, 16, 17], [2, 3, 4, 5, 6, 7, 8, 9, 1, 10, 11,
12, 13, 14, 15, 16, 17, 18], [2, 3, 4, 5, 6, 7, 8, 9, 1, 10, 11, 12, 13, 14, 15,
16, 17, 18, 1], [2, 3, 4, 5, 6, 7, 8, 9, 1, 10, 11, 12, 13, 14, 15, 16, 17, 18,
1, 19], [2, 3, 4, 5, 6, 7, 8, 9, 1, 10, 11, 12, 13, 14, 15, 16, 17, 18, 1, 19,
20], [2, 3, 4, 5, 6, 7, 8, 9, 1, 10, 11, 12, 13, 14, 15, 16, 17, 18, 1, 19, 20,
21], [2, 3, 4, 5, 6, 7, 8, 9, 1, 10, 11, 12, 13, 14, 15, 16, 17, 18, 1, 19, 20,
21, 22], [2, 3, 4, 5, 6, 7, 8, 9, 1, 10, 11, 12, 13, 14, 15, 16, 17, 18, 1, 19,
20, 21, 22, 23]]

[7]: print(len(input_seq))

23
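
The count follows from the n-gram construction: a sentence of n tokens yields n - 1 prefixes of length 2..n, so the 24-token sentence above gives 23 training sequences. A one-line sanity check (a sketch; not part of the original run):

# A sentence of n tokens produces n - 1 n-gram prefixes.
n_tokens = 24
assert len(input_seq) == n_tokens - 1   # 23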

[8]: maximum_length_of_seq = max(len(seq) for seq in input_seq)
maximum_length_of_seq

[8]: 24

[9]: input_seq_padded = tf.keras.preprocessing.sequence.pad_sequences(
    input_seq, maxlen=maximum_length_of_seq, padding='pre')

print(input_seq_padded)

[[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3]
[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4]
[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5]
[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6]
[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7]
[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8]
[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9]
[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1]
[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10]
[ 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11]
[ 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12]
[ 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13]
[ 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14]
[ 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15]
[ 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15 16]
[ 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15 16 17]
[ 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15 16 17 18]
[ 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15 16 17 18 1]
[ 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15 16 17 18 1 19]
[ 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15 16 17 18 1 19 20]
[ 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15 16 17 18 1 19 20 21]
[ 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15 16 17 18 1 19 20 21 22]
[ 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15 16 17 18 1 19 20 21 22 23]]

4 Construct the training data (inputs X and targets y)


[10]: X, y = input_seq_padded[:, :-1], input_seq_padded[:, -1]  # inputs are all but the last token; the target is the last token

[11]: y = tf.keras.utils.to_categorical(y, num_classes=total_words)

[12]: print(X)

[[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2]
[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3]
[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4]
[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5]
[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6]
[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7]
[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8]
[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9]
[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1]
[ 0 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10]
[ 0 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11]
[ 0 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12]
[ 0 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13]
[ 0 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14]
[ 0 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15]
[ 0 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15 16]
[ 0 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15 16 17]
[ 0 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15 16 17 18]
[ 0 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15 16 17 18 1]
[ 0 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15 16 17 18 1 19]
[ 0 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15 16 17 18 1 19 20]
[ 0 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15 16 17 18 1 19 20 21]
[ 2 3 4 5 6 7 8 9 1 10 11 12 13 14 15 16 17 18 1 19 20 21 22]]

[13]: print(y)

[[0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0.]
[0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1.]]
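
For reference, the resulting arrays have 23 samples, 23 input timesteps and 24 output classes (total_words). A quick shape check (a sketch; not part of the original run):

print(X.shape)   # (23, 23) -- 23 sequences, each with 23 input tokens after dropping the target
print(y.shape)   # (23, 24) -- one-hot targets over the 24-word vocabulary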

5 LSTM Model Building using the TensorFlow Library


[14]: model = Sequential()
model.add(Embedding(total_words, 50, input_length=maximum_length_of_seq - 1))  # 50-dimensional word embeddings
model.add(LSTM(100))                                                           # 100 LSTM units
model.add(Dense(total_words, activation='softmax'))                            # probability distribution over the vocabulary

[15]: model.summary()

Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
embedding (Embedding) (None, 23, 50) 1200

lstm (LSTM) (None, 100) 60400

dense (Dense) (None, 24) 2424

=================================================================
Total params: 64,024
Trainable params: 64,024
Non-trainable params: 0
_________________________________________________________________
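
The parameter counts above can be reproduced by hand (a sketch assuming total_words = 24, embedding dimension 50 and 100 LSTM units, as defined earlier):

embedding_params = 24 * 50                            # 1,200
lstm_params = 4 * ((50 + 100) * 100 + 100)            # 60,400 -- 4 gates, each with input kernel, recurrent kernel and bias
dense_params = 100 * 24 + 24                          # 2,424 -- weights plus biases
print(embedding_params + lstm_params + dense_params)  # 64,024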

6 Compile the model


[16]: model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
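
Because y was one-hot encoded with to_categorical, categorical_crossentropy is the matching loss. An alternative (a hedged sketch, not used in this run) keeps the targets as integer indices and uses sparse_categorical_crossentropy, which skips the to_categorical step:

# y_int = input_seq_padded[:, -1]   # integer class ids instead of one-hot vectors
# model.compile(loss='sparse_categorical_crossentropy', optimizer='adam', metrics=['accuracy'])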

7 Training the model


[37]: model.fit(X, y, epochs=99, verbose=1)

Epoch 1/99
1/1 [==============================] - 0s 16ms/step - loss: 0.4417 - accuracy:
0.9565
Epoch 2/99
1/1 [==============================] - 0s 17ms/step - loss: 0.5865 - accuracy:
0.8261
Epoch 3/99
1/1 [==============================] - 0s 16ms/step - loss: 0.6158 - accuracy:
0.7391
Epoch 4/99
1/1 [==============================] - 0s 15ms/step - loss: 0.6683 - accuracy:
0.7391
Epoch 5/99
1/1 [==============================] - 0s 12ms/step - loss: 0.4220 - accuracy:
1.0000
Epoch 6/99
1/1 [==============================] - 0s 12ms/step - loss: 0.6846 - accuracy:
0.5217
Epoch 7/99
1/1 [==============================] - 0s 4ms/step - loss: 0.3843 - accuracy:
1.0000
Epoch 8/99
1/1 [==============================] - 0s 6ms/step - loss: 0.5402 - accuracy:
0.7826
Epoch 9/99
1/1 [==============================] - 0s 6ms/step - loss: 0.4710 - accuracy:
0.9130
Epoch 10/99
1/1 [==============================] - 0s 16ms/step - loss: 0.3784 - accuracy:
1.0000
Epoch 11/99
1/1 [==============================] - 0s 18ms/step - loss: 0.4984 - accuracy:
0.8696
Epoch 12/99
1/1 [==============================] - 0s 15ms/step - loss: 0.4065 - accuracy:
0.9565
Epoch 13/99
1/1 [==============================] - 0s 17ms/step - loss: 0.3802 - accuracy:
1.0000
Epoch 14/99
1/1 [==============================] - 0s 16ms/step - loss: 0.4489 - accuracy:
0.9565
Epoch 15/99
1/1 [==============================] - 0s 16ms/step - loss: 0.3820 - accuracy:
1.0000
Epoch 16/99
1/1 [==============================] - 0s 15ms/step - loss: 0.3602 - accuracy:
1.0000
Epoch 17/99
1/1 [==============================] - 0s 14ms/step - loss: 0.3982 - accuracy:
0.9565
Epoch 18/99
1/1 [==============================] - 0s 14ms/step - loss: 0.3717 - accuracy:
1.0000
Epoch 19/99
1/1 [==============================] - 0s 13ms/step - loss: 0.3408 - accuracy:
1.0000
Epoch 20/99
1/1 [==============================] - 0s 1ms/step - loss: 0.3602 - accuracy:
1.0000
Epoch 21/99
1/1 [==============================] - 0s 4ms/step - loss: 0.3641 - accuracy:
1.0000
Epoch 22/99
1/1 [==============================] - 0s 18ms/step - loss: 0.3372 - accuracy:
1.0000
Epoch 23/99
1/1 [==============================] - 0s 18ms/step - loss: 0.3319 - accuracy:
1.0000
Epoch 24/99
1/1 [==============================] - 0s 16ms/step - loss: 0.3473 - accuracy:
1.0000
Epoch 25/99
1/1 [==============================] - 0s 15ms/step - loss: 0.3393 - accuracy:
1.0000
Epoch 26/99
1/1 [==============================] - 0s 17ms/step - loss: 0.3222 - accuracy:
1.0000
Epoch 27/99
1/1 [==============================] - 0s 16ms/step - loss: 0.3257 - accuracy:
1.0000
Epoch 28/99
1/1 [==============================] - 0s 15ms/step - loss: 0.3330 - accuracy:
1.0000
Epoch 29/99
1/1 [==============================] - 0s 13ms/step - loss: 0.3230 - accuracy:
1.0000
Epoch 30/99
1/1 [==============================] - 0s 12ms/step - loss: 0.3125 - accuracy:
1.0000
Epoch 31/99
1/1 [==============================] - 0s 3ms/step - loss: 0.3156 - accuracy:
1.0000
Epoch 32/99
1/1 [==============================] - 0s 7ms/step - loss: 0.3179 - accuracy:
1.0000
Epoch 33/99
1/1 [==============================] - 0s 5ms/step - loss: 0.3089 - accuracy:
1.0000
Epoch 34/99
1/1 [==============================] - 0s 20ms/step - loss: 0.3028 - accuracy:
1.0000
Epoch 35/99
1/1 [==============================] - 0s 16ms/step - loss: 0.3048 - accuracy:
1.0000
Epoch 36/99
1/1 [==============================] - 0s 17ms/step - loss: 0.3044 - accuracy:
1.0000
Epoch 37/99
1/1 [==============================] - 0s 14ms/step - loss: 0.2979 - accuracy:
1.0000
Epoch 38/99
1/1 [==============================] - 0s 19ms/step - loss: 0.2938 - accuracy:
1.0000
Epoch 39/99
1/1 [==============================] - 0s 16ms/step - loss: 0.2947 - accuracy:
1.0000
Epoch 40/99
1/1 [==============================] - 0s 17ms/step - loss: 0.2935 - accuracy:
1.0000
Epoch 41/99
1/1 [==============================] - 0s 17ms/step - loss: 0.2885 - accuracy:
1.0000
Epoch 42/99
1/1 [==============================] - 0s 15ms/step - loss: 0.2859 - accuracy:
1.0000
Epoch 43/99
1/1 [==============================] - 0s 16ms/step - loss: 0.2860 - accuracy:
1.0000
Epoch 44/99
1/1 [==============================] - 0s 18ms/step - loss: 0.2842 - accuracy:
1.0000
Epoch 45/99
1/1 [==============================] - 0s 6ms/step - loss: 0.2804 - accuracy:
1.0000
Epoch 46/99
1/1 [==============================] - 0s 7ms/step - loss: 0.2782 - accuracy:
1.0000
Epoch 47/99
1/1 [==============================] - 0s 5ms/step - loss: 0.2777 - accuracy:
1.0000
Epoch 48/99
1/1 [==============================] - 0s 6ms/step - loss: 0.2756 - accuracy:
1.0000
Epoch 49/99
1/1 [==============================] - 0s 2ms/step - loss: 0.2723 - accuracy:
1.0000
Epoch 50/99
1/1 [==============================] - 0s 8ms/step - loss: 0.2705 - accuracy:
1.0000
Epoch 51/99
1/1 [==============================] - 0s 6ms/step - loss: 0.2695 - accuracy:
1.0000
Epoch 52/99
1/1 [==============================] - 0s 8ms/step - loss: 0.2672 - accuracy:
1.0000
Epoch 53/99
1/1 [==============================] - 0s 8ms/step - loss: 0.2644 - accuracy:
1.0000
Epoch 54/99
1/1 [==============================] - 0s 7ms/step - loss: 0.2628 - accuracy:
1.0000
Epoch 55/99
1/1 [==============================] - 0s 9ms/step - loss: 0.2616 - accuracy:
1.0000
Epoch 56/99
1/1 [==============================] - 0s 7ms/step - loss: 0.2595 - accuracy:
1.0000
Epoch 57/99
1/1 [==============================] - 0s 17ms/step - loss: 0.2572 - accuracy:
1.0000
Epoch 58/99
1/1 [==============================] - 0s 13ms/step - loss: 0.2557 - accuracy:
1.0000
Epoch 59/99
1/1 [==============================] - 0s 15ms/step - loss: 0.2542 - accuracy:
1.0000
Epoch 60/99
1/1 [==============================] - 0s 12ms/step - loss: 0.2522 - accuracy:
1.0000
Epoch 61/99
1/1 [==============================] - 0s 16ms/step - loss: 0.2502 - accuracy:
1.0000
Epoch 62/99
1/1 [==============================] - 0s 13ms/step - loss: 0.2487 - accuracy:
1.0000
Epoch 63/99
1/1 [==============================] - 0s 1ms/step - loss: 0.2471 - accuracy:
1.0000
Epoch 64/99
1/1 [==============================] - 0s 5ms/step - loss: 0.2451 - accuracy:
1.0000
Epoch 65/99
1/1 [==============================] - 0s 5ms/step - loss: 0.2433 - accuracy:
1.0000
Epoch 66/99
1/1 [==============================] - 0s 6ms/step - loss: 0.2418 - accuracy:
1.0000
Epoch 67/99
1/1 [==============================] - 0s 17ms/step - loss: 0.2401 - accuracy:
1.0000
Epoch 68/99
1/1 [==============================] - 0s 13ms/step - loss: 0.2382 - accuracy:
1.0000
Epoch 69/99
1/1 [==============================] - 0s 2ms/step - loss: 0.2366 - accuracy:
1.0000
Epoch 70/99
1/1 [==============================] - 0s 5ms/step - loss: 0.2351 - accuracy:
1.0000
Epoch 71/99
1/1 [==============================] - 0s 17ms/step - loss: 0.2333 - accuracy:
1.0000
Epoch 72/99
1/1 [==============================] - 0s 17ms/step - loss: 0.2316 - accuracy:
1.0000
Epoch 73/99
1/1 [==============================] - 0s 17ms/step - loss: 0.2301 - accuracy:
1.0000
Epoch 74/99
1/1 [==============================] - 0s 17ms/step - loss: 0.2284 - accuracy:
1.0000
Epoch 75/99
1/1 [==============================] - 0s 15ms/step - loss: 0.2267 - accuracy:
1.0000
Epoch 76/99
1/1 [==============================] - 0s 14ms/step - loss: 0.2251 - accuracy:
1.0000
Epoch 77/99
1/1 [==============================] - 0s 12ms/step - loss: 0.2236 - accuracy:
1.0000
Epoch 78/99
1/1 [==============================] - 0s 12ms/step - loss: 0.2219 - accuracy:
1.0000
Epoch 79/99
1/1 [==============================] - 0s 6ms/step - loss: 0.2203 - accuracy:
1.0000
Epoch 80/99
1/1 [==============================] - 0s 6ms/step - loss: 0.2188 - accuracy:
1.0000
Epoch 81/99
1/1 [==============================] - 0s 17ms/step - loss: 0.2172 - accuracy:
1.0000
Epoch 82/99
1/1 [==============================] - 0s 17ms/step - loss: 0.2156 - accuracy:
1.0000
Epoch 83/99
1/1 [==============================] - 0s 17ms/step - loss: 0.2140 - accuracy:
1.0000
Epoch 84/99
1/1 [==============================] - 0s 12ms/step - loss: 0.2125 - accuracy:
1.0000
Epoch 85/99
1/1 [==============================] - 0s 13ms/step - loss: 0.2109 - accuracy:
1.0000
Epoch 86/99
1/1 [==============================] - 0s 15ms/step - loss: 0.2094 - accuracy:
1.0000
Epoch 87/99
1/1 [==============================] - 0s 0s/step - loss: 0.2079 - accuracy:
1.0000
Epoch 88/99
1/1 [==============================] - 0s 5ms/step - loss: 0.2063 - accuracy:
1.0000
Epoch 89/99
1/1 [==============================] - 0s 16ms/step - loss: 0.2048 - accuracy:
1.0000
Epoch 90/99
1/1 [==============================] - 0s 17ms/step - loss: 0.2033 - accuracy:
1.0000
Epoch 91/99
1/1 [==============================] - 0s 17ms/step - loss: 0.2018 - accuracy:
1.0000
Epoch 92/99
1/1 [==============================] - 0s 16ms/step - loss: 0.2002 - accuracy:
1.0000
Epoch 93/99
1/1 [==============================] - 0s 16ms/step - loss: 0.1988 - accuracy:
1.0000
Epoch 94/99
1/1 [==============================] - 0s 14ms/step - loss: 0.1973 - accuracy:
1.0000
Epoch 95/99
1/1 [==============================] - 0s 866us/step - loss: 0.1958 - accuracy:
1.0000
Epoch 96/99
1/1 [==============================] - 0s 4ms/step - loss: 0.1943 - accuracy:
1.0000
Epoch 97/99
1/1 [==============================] - 0s 18ms/step - loss: 0.1928 - accuracy:
1.0000
Epoch 98/99
1/1 [==============================] - 0s 16ms/step - loss: 0.1914 - accuracy:
1.0000
Epoch 99/99
1/1 [==============================] - 0s 17ms/step - loss: 0.1899 - accuracy:
1.0000

[37]: <keras.callbacks.History at 0x15625b2dcd0>

8 Now let's check how the model generates text


[38]: your_seed_text = "You:I"
next_words = 23

[39]: for _ in range(next_words):
    token_list = tokenizer.texts_to_sequences([your_seed_text])[0]
    token_list = tf.keras.preprocessing.sequence.pad_sequences(
        [token_list], maxlen=maximum_length_of_seq - 1, padding='pre')
    predicted = model.predict(token_list, verbose=0)
    # greedy decoding: take the highest-probability class and look up its word
    predicted_word_index = tf.argmax(predicted, axis=-1)
    output_word = ""
    for word, index in tokenizer.word_index.items():
        if index == predicted_word_index:
            output_word = word
            break
    your_seed_text += " " + output_word
print(f"Model: {your_seed_text.replace('You:', '')}")

Model: I am a beautiful person who has been given all the best things in life
but was not given power to work on them
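
The inner loop that scans word_index for the predicted index can be replaced by the tokenizer's reverse lookup table. An equivalent greedy-decoding sketch (assuming TensorFlow 2.x and the objects defined above; not part of the original run):

seed_text = "You:I"
for _ in range(next_words):
    token_list = tokenizer.texts_to_sequences([seed_text])[0]
    token_list = tf.keras.preprocessing.sequence.pad_sequences(
        [token_list], maxlen=maximum_length_of_seq - 1, padding='pre')
    # pick the highest-probability next word (greedy decoding)
    predicted_id = int(tf.argmax(model.predict(token_list, verbose=0), axis=-1)[0])
    seed_text += " " + tokenizer.index_word.get(predicted_id, "")
print(seed_text.replace("You:", ""))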

[40]: print(your_seed_text)

You:I am a beautiful person who has been given all the best things in life but
was not given power to work on them
