Lecture Notes 3
The output needs to be one of several actions: turn, speed up or slow down, apply the brakes, and so on. As you might expect, the relationship between the input variables and the output variables is complex in nature, so traditional machine learning algorithms find it hard to map this kind of relationship. Deep learning outperforms traditional machine learning algorithms in such situations because it is able to learn these nonlinear features as well.
• Dendrites
• Cell body
• Terminals
Chapter 1 Introduction to Machine Learning
Let’s say we have two binary inputs (X1, X2) and the weights of their respective connections (W1, W2). The weights can be considered similar to the coefficients of the input variables in traditional machine learning; they indicate how important a particular input feature is in the model. The summation function calculates the total sum of the inputs. The activation function then takes this summed value and produces a certain output, as shown in Figure 1-12. Activation is a sort of decision-making function: based on the type of activation function used, it gives an output accordingly. There are different types of activation functions that can be used in a neural network layer.
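The summation-then-activation flow described above can be sketched in a few lines of Python. This is a minimal illustration, not code from the text: the weight values and the step-style activation (with an assumed threshold of 0.5) are chosen here purely for demonstration.

```python
def neuron(x1, x2, w1, w2, threshold=0.5):
    """A single neuron: weighted sum of two binary inputs,
    followed by a simple step activation (illustrative threshold)."""
    # Summation function: total weighted input
    total = w1 * x1 + w2 * x2
    # Step activation: output 1 if the sum crosses the threshold, else 0
    return 1 if total > threshold else 0

# Both inputs active: 0.4 + 0.3 = 0.7 > 0.5, so the neuron fires
print(neuron(1, 1, w1=0.4, w2=0.3))  # 1
# Only one input active: 0.4 <= 0.5, so it does not fire
print(neuron(1, 0, w1=0.4, w2=0.3))  # 0
```

Swapping the step function for a smooth activation such as the sigmoid or tanh covered next is what lets networks of such neurons learn nonlinear relationships.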
Activation Functions
Activation functions play a critical role in neural networks, as the output varies based on the type of activation function used. Four main activation functions are widely used; we will briefly cover these in this section.
The first is the sigmoid activation function, which squashes the output to a value between 0 and 1:

f(x) = 1 / (1 + e^(-x))
Hyperbolic Tangent
The other activation function is known as the hyperbolic tangent activation function, or tanh. This function ensures the output value remains between -1 and 1, irrespective of the input, as shown in Figure 1-14. The formula of the tanh activation function is as follows:
f(x) = (e^(2x) - 1) / (e^(2x) + 1)
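Both formulas above are straightforward to implement directly. The sketch below uses only the standard library; the sample input values are arbitrary, chosen to show the squashing ranges of the two functions.

```python
import math

def sigmoid(x):
    # Sigmoid: squashes any real input into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # Hyperbolic tangent: squashes any real input into the range (-1, 1)
    return (math.exp(2 * x) - 1) / (math.exp(2 * x) + 1)

print(sigmoid(0))          # 0.5, the midpoint of the sigmoid's range
print(tanh(0))             # 0.0, the midpoint of tanh's range
print(0 < sigmoid(5) < 1)  # True: large inputs stay inside (0, 1)
print(-1 < tanh(-5) < 1)   # True: large negative inputs stay inside (-1, 1)
```

Note that the hand-rolled tanh matches the standard library's math.tanh; it is written out here only to mirror the formula in the text.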