DAY 48-100 DAYS MLCODE: RNN Example

December 28, 2018 | 100-Days-Of-ML-Code blog

In the previous blog, we discussed RNNs and walked through a simple example using TensorFlow. In this blog, we'll try to predict a time series using an RNN.

First, let's generate some time series data.

Generate time series data for the equation y = sin(t) + 0.2:

def sin_wave(t):
    # A sine wave shifted up by 0.2
    return np.sin(t) + 0.2
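The snippets in this post also rely on a few imports and on the constants t_min, t_max and resolution, which are never defined explicitly. A minimal setup (the specific values are assumptions, chosen to match the plot ranges used below) would be:

import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt

# Assumed time range and sampling step for the series (not given in the post)
t_min, t_max = 0, 30
resolution = 0.1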

Create a routine that generates batch data from the time series:

def generate_batch(batch_size, n_steps):
    # Random starting points, leaving room for n_steps samples within [t_min, t_max]
    t0 = np.random.rand(batch_size, 1) * (t_max - t_min - n_steps * resolution)
    Ts = t0 + np.arange(0., n_steps + 1) * resolution
    ys = sin_wave(Ts)
    # Inputs are the series; targets are the same series shifted one step ahead
    return ys[:, :-1].reshape(-1, n_steps, 1), ys[:, 1:].reshape(-1, n_steps, 1)
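As a quick sanity check (not from the original post): a batch of 50 windows of 25 steps comes back as two arrays of shape (50, 25, 1), the inputs and their one-step-ahead targets:

X_batch, y_batch = generate_batch(batch_size=50, n_steps=25)
print(X_batch.shape, y_batch.shape)  # (50, 25, 1) (50, 25, 1)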

Now plot the time series data to review it:

t = np.linspace(t_min, t_max, int((t_max - t_min) / resolution))

n_steps = 25
time_instance = np.linspace(12.2, 12.2 + resolution * (n_steps + 1), n_steps + 1)

plt.figure(figsize=(11, 4))
plt.subplot(121)
plt.title("A time series (generated)", fontsize=14)
plt.plot(t, sin_wave(t), label=r"sin(t)+.2")
plt.plot(time_instance[:-1], sin_wave(time_instance[:-1]), "b-", linewidth=3, label="A training instance")
plt.legend(loc="lower left", fontsize=14)
plt.axis([0, 30, -1.5, 1.5])  # y-limits sized to the series' actual range
plt.xlabel("Time")
plt.ylabel("Value")

plt.subplot(122)
plt.title("A training instance", fontsize=14)
plt.plot(time_instance[:-1], sin_wave(time_instance[:-1]), "bo", markersize=10, label="instance")
plt.plot(time_instance[1:], sin_wave(time_instance[1:]), "w*", markersize=10, label="target")
plt.legend(loc="upper left")
plt.xlabel("Time")
plt.show()

Figure: Time series data

Declare the variables and placeholders for the model. The first dimension of each placeholder is None so that any batch size can be fed in:

n_steps = 25     # time steps per training instance
n_inputs = 1     # one feature (the series value) per time step
n_neurons = 100  # hidden units in the RNN cell
n_outputs = 1    # one predicted value per time step
tf.reset_default_graph()

X = tf.placeholder(tf.float32, [None, n_steps, n_inputs])
y = tf.placeholder(tf.float32, [None, n_steps, n_outputs])

Let's create the RNN cell. The OutputProjectionWrapper adds a fully connected layer (with no activation) on top of each time step's output, projecting the 100-unit hidden state down to a single output value:

cell = tf.contrib.rnn.OutputProjectionWrapper(
    tf.contrib.rnn.BasicRNNCell(num_units=n_neurons, activation=tf.nn.relu),
    output_size=n_outputs)

Unroll the network through time using the dynamic_rnn() method:

outputs, states = tf.nn.dynamic_rnn(cell, X, dtype=tf.float32)
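As an aside (a sketch, not part of the original post): the OutputProjectionWrapper is convenient but applies the projection cell by cell. An equivalent and typically faster TF 1.x pattern is to run the plain cell, stack all the outputs, apply a single dense layer, and reshape back:

cell = tf.contrib.rnn.BasicRNNCell(num_units=n_neurons, activation=tf.nn.relu)
rnn_outputs, states = tf.nn.dynamic_rnn(cell, X, dtype=tf.float32)

# Flatten [batch, n_steps, n_neurons] -> [batch * n_steps, n_neurons],
# project to one value per step, then restore the time dimension
stacked_rnn_outputs = tf.reshape(rnn_outputs, [-1, n_neurons])
stacked_outputs = tf.layers.dense(stacked_rnn_outputs, n_outputs)
outputs = tf.reshape(stacked_outputs, [-1, n_steps, n_outputs])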

We'll use MSE as the loss and the Adam optimizer to minimize it:

learning_rate = 0.001

loss = tf.reduce_mean(tf.square(outputs - y))  # MSE
optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate)
training_op = optimizer.minimize(loss)

init = tf.global_variables_initializer()
saver = tf.train.Saver()

Now let's train the model on the time series data:

n_iterations = 1500
batch_size = 50

with tf.Session() as sess:
    init.run()
    for iteration in range(n_iterations):
        X_batch, y_batch = generate_batch(batch_size, n_steps)
        sess.run(training_op, feed_dict={X: X_batch, y: y_batch})
        if iteration % 100 == 0:
            mse = loss.eval(feed_dict={X: X_batch, y: y_batch})
            print(iteration, "\tMSE:", mse)

    saver.save(sess, "./my_time_series_model")

Output:
0 MSE: 0.44488195
100 MSE: 0.009171729
200 MSE: 0.0031511707
300 MSE: 0.0008719797
400 MSE: 0.00057317223
500 MSE: 0.0002798651
600 MSE: 0.00043289844
700 MSE: 0.0004396626
800 MSE: 0.00043403797
900 MSE: 0.00041935992
1000 MSE: 0.00035199794
1100 MSE: 0.0004281518
1200 MSE: 0.00047081066
1300 MSE: 0.00031640596
1400 MSE: 0.0003123126

Now let's predict values with our trained model:

with tf.Session() as sess:
    saver.restore(sess, "./my_time_series_model")

    X_new = sin_wave(time_instance[:-1].reshape(-1, n_steps, n_inputs))
    y_pred = sess.run(outputs, feed_dict={X: X_new})

Plot the predicted data to verify the result.

plt.title("Testing the model", fontsize=14)
plt.plot(time_instance[:-1], sin_wave(time_instance[:-1]), "bo", markersize=10, label="instance")
plt.plot(time_instance[1:], sin_wave(time_instance[1:]), "w*", markersize=10, label="target")
plt.plot(time_instance[1:], y_pred[0, :, 0], "r.", markersize=10, label="prediction")
plt.legend(loc="upper left")
plt.xlabel("Time")
plt.show()

Figure: Prediction by RNN
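As a further usage sketch (not part of the original post), the same model can produce a multi-step forecast by feeding each prediction back in as the next input. A minimal version, assuming the variables defined above:

# Iterative forecasting: seed with a known window, then append predictions
sequence = list(sin_wave(time_instance[:-1]))

with tf.Session() as sess:
    saver.restore(sess, "./my_time_series_model")
    for step in range(50):
        # Feed the most recent n_steps values; keep only the last prediction
        X_in = np.array(sequence[-n_steps:]).reshape(1, n_steps, 1)
        y_out = sess.run(outputs, feed_dict={X: X_in})
        sequence.append(y_out[0, -1, 0])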

As the plot shows, the model's predictions track the target values closely. This brings us to the end of today's blog. You can find the entire code here.