Recurrent Neural Network (RNN)

Recurrent Neural Networks (RNNs) are specialized neural networks designed for processing sequential data, such as time series and natural language. They utilize hidden states to retain memory of previous inputs, allowing them to understand and predict sequences effectively. Common activation functions used in RNNs include Sigmoid, Tanh, and ReLU, each with its own advantages and challenges.

Recurrent Neural Network (RNN)

Presented by: Most. Jannatul Firdousi Zoti (ID: 222-15-6145), Department of Computer Science & Engineering, Daffodil International University

Presented to: Dr. Md Alamgir Kabir, Assistant Professor, Department of Computer Science & Engineering, Daffodil International University
What is a Recurrent Neural Network (RNN)?
Recurrent Neural Networks (RNNs) are a type of artificial neural network designed to process sequences of data. They work especially well for tasks involving sequences, such as time series data, speech, and natural language.
• Input Layer: Receives the initial element of the sequence.
• Hidden Layer: Processes the current input and the previous hidden state.
• Activation Function: Adds non-linearity to help the network learn complex patterns.
• Output Layer: Produces predictions, such as the next word in a sequence.
• Recurrent Connection: Links the hidden layers across time steps (summarized by the update equations below).
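In equation form, the standard RNN update can be written as follows (the weight and bias symbols are a common convention, not taken from the slides):

h_t = \tanh(W_{xh} x_t + W_{hh} h_{t-1} + b_h)
y_t = W_{hy} h_t + b_y

Here x_t is the input at time step t, h_{t-1} is the previous hidden state, and the same weight matrices and biases are reused at every time step.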
Recurrent Neural Network Architecture

RNNs are neural networks with hidden states that allow information from past inputs to be used when processing later ones. A single pass usually goes like this:
• Each RNN cell takes two inputs: the current input x and the previous hidden state h.
• These are combined using weight matrices, summed, and passed through a tanh activation function to produce the new hidden state.
• The hidden state is passed to the next cell and is also used to generate the output.
• By repeating this across time steps (x0, x1, x2, ...), the RNN keeps a memory of previous inputs, helping it understand sequences. A minimal code sketch of one cell step follows.
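A minimal NumPy sketch of a single cell step, assuming small illustrative sizes and weight names (W_xh, W_hh, W_hy) that are not taken from the slides:

import numpy as np

def rnn_cell_step(x_t, h_prev, W_xh, W_hh, W_hy, b_h, b_y):
    # Combine the current input and the previous hidden state, then apply tanh.
    h_new = np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)
    # The new hidden state is also used to produce this step's output.
    y_t = W_hy @ h_new + b_y
    return h_new, y_t

# Illustrative sizes: 4 input features, 8 hidden units, 3 output values.
rng = np.random.default_rng(0)
W_xh = rng.standard_normal((8, 4))
W_hh = rng.standard_normal((8, 8))
W_hy = rng.standard_normal((3, 8))
b_h, b_y = np.zeros(8), np.zeros(3)

x_t = rng.standard_normal(4)   # current input
h_prev = np.zeros(8)           # previous hidden state (all zeros at the first step)
h_new, y_t = rnn_cell_step(x_t, h_prev, W_xh, W_hh, W_hy, b_h, b_y)

The same function is applied at every time step; only the input and the hidden state change.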
How Does a Recurrent Neural Network Work?
• The input layer ‘x’ takes in the input to the neural network, processes it, and passes it on to the middle layer.
• The middle layer ‘h’ can consist of multiple hidden layers, each with its own activation functions, weights, and biases. In an ordinary feedforward network these layers have independent parameters and no memory of earlier inputs; a recurrent neural network instead uses the same parameters at every time step and feeds the hidden state forward, which is what gives it memory (see the unrolled sketch below).
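A sketch of the forward pass unrolled over a whole sequence, reusing one set of weights at every step (names and sizes are again illustrative, not from the slides):

import numpy as np

def rnn_forward(xs, W_xh, W_hh, W_hy, b_h, b_y):
    # xs holds the sequence x0, x1, ..., processed one step at a time.
    h = np.zeros(W_hh.shape[0])                   # initial hidden state
    outputs = []
    for x_t in xs:                                # the same weights are reused at every step
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)  # hidden state carries the memory forward
        outputs.append(W_hy @ h + b_y)            # per-step output
    return np.array(outputs), h

rng = np.random.default_rng(0)
xs = rng.standard_normal((5, 4))                  # a sequence of 5 inputs with 4 features each
W_xh, W_hh, W_hy = rng.standard_normal((8, 4)), rng.standard_normal((8, 8)), rng.standard_normal((3, 8))
ys, h_final = rnn_forward(xs, W_xh, W_hh, W_hy, np.zeros(8), np.zeros(3))

Because the hidden state is updated and passed along, the output at any step depends on every earlier input, which is the memory described above.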
Common Activation Functions of Recurrent Neural Networks
• Sigmoid: used to interpret outputs as probabilities or to control gates that decide how much information to retain or forget. However, the sigmoid function is prone to the vanishing gradient problem (explained after this), which makes it less ideal for deeper networks.
• Tanh (Hyperbolic Tangent): often used because it outputs values centered around zero, which helps with better gradient flow and easier learning of long-term dependencies.
• ReLU (Rectified Linear Unit): can cause issues with exploding gradients due to its unbounded nature. However, variants such as Leaky ReLU and Parametric ReLU have been used to mitigate some of these issues. All three functions are sketched in code below.
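A small NumPy sketch of these activation functions (the Leaky ReLU slope value is illustrative):

import numpy as np

def sigmoid(z):
    # Squashes values into (0, 1); useful for probability-like or gate-like outputs.
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # Squashes values into (-1, 1), centered around zero.
    return np.tanh(z)

def relu(z):
    # Unbounded above, which is why gradients can explode in recurrent use.
    return np.maximum(0.0, z)

def leaky_relu(z, alpha=0.01):
    # Keeps a small slope for negative inputs instead of zeroing them out.
    return np.where(z > 0, z, alpha * z)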
Types of Recurrent Neural Networks
There are four types of Recurrent Neural Networks, distinguished by how many inputs map to how many outputs (see the shape sketch after this list):
• One to One
• One to Many
• Many to One
• Many to Many
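A sketch of how the four configurations differ, using PyTorch's nn.RNN only to make the shapes concrete (PyTorch is not mentioned in the slides; the NumPy loop above would work the same way):

import torch
import torch.nn as nn

rnn = nn.RNN(input_size=8, hidden_size=16, batch_first=True)
x = torch.zeros(2, 5, 8)      # batch of 2 sequences, 5 time steps, 8 features per step
outputs, h_n = rnn(x)         # outputs: (2, 5, 16), h_n: (1, 2, 16)

# Many to Many: use `outputs`, one vector per input time step (e.g. tagging every word).
# Many to One:  keep only the final state h_n (or outputs[:, -1]), e.g. for classifying a whole sequence.
# One to Many:  feed a single input, then feed each generated output back in as the next input.
# One to One:   a single step in and a single output out, like a plain feedforward network.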
Applications of Recurrent Neural Networks
Thank You
