
Machine Learning

Part 2 (Deep Learning)


2- Introducing Artificial Neural Networks
Montassar Ben Messaoud, PhD
The biological inspiration

▪ Neurons in your cerebral cortex are connected via axons.

▪ A neuron « fires » to the neurons it's connected to when enough of its input signals are activated.

▪ Very simple at the individual neuron level, but layers of neurons connected in this way can yield learning behavior.

▪ Billions of neurons, each with thousands of connections, yield a mind.

IT430: Machine Learning Montassar Ben Messaoud, PhD 2


Cortical columns

▪ Neurons in your cortex seem to be arranged into many stacks, or « columns », that process information in parallel.

▪ « Mini-columns » of around 100 neurons are organized into larger « hyper-columns ». There are 100 million mini-columns in your cortex.

▪ This is coincidentally similar to how GPUs work.


The first artificial neurons (1943)

▪ An artificial neuron « fires » if more than N input connections are active.

▪ Depending on the number of connections from each input neuron, and on whether a connection activates or suppresses a neuron, you can construct AND, OR and NOT logical constructs this way.

[Diagram: input neurons A and B, each connected to neuron C]

This example would implement C = A OR B if the threshold is 2 inputs being active (each of A and B contributes two connections to C, so either one firing alone reaches the threshold).
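The threshold idea above can be sketched in a few lines of Python. This is an illustrative McCulloch-Pitts-style model, not code from the course; the function names are my own. It assumes the slide's wiring, where A and B each contribute two connections to C.

```python
# Minimal sketch of a 1943-style artificial neuron: it fires when
# the number of active input connections reaches a threshold.

def mcp_neuron(active_connections, threshold):
    """Return 1 if at least `threshold` input connections are active."""
    return 1 if sum(active_connections) >= threshold else 0

def c_equals_a_or_b(a, b):
    # A and B each contribute two connections to C, so with a
    # threshold of 2, either input firing alone activates C: C = A OR B.
    connections = [a, a, b, b]
    return mcp_neuron(connections, threshold=2)
```

With a single connection from each input, the same threshold of 2 would instead implement C = A AND B, which is the point of the slide: wiring and thresholds determine the logic.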
The Linear Threshold Unit (LTU)
▪ 1957

▪ Adds weights to the inputs

▪ Sums up the products of the inputs and their weights

▪ Output is given by a step function: output 1 if the sum is greater than or equal to 0

[Diagram: Input 1 and Input 2 feeding a weighted sum unit]
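The LTU described above is easy to write out directly. This is a minimal sketch with illustrative names, not the course's code:

```python
def ltu(inputs, weights):
    """Linear Threshold Unit: weighted sum of inputs, then a step function.
    Outputs 1 if the sum is greater than or equal to 0, else 0."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= 0 else 0
```

A bias can be modeled by appending a constant input of 1.0 with its own weight, which is exactly how the bias neuron appears in the perceptron diagrams on the next slides.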
The Perceptron

▪ A layer of LTUs

▪ A perceptron can learn by reinforcing weights that lead to correct behavior during training

▪ This too has a biological basis (« cells that fire together, wire together »)

[Diagram: Input 1, Input 2 and a bias neuron (1.0) connected by weights to a layer of summation units]
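The "reinforce weights that lead to correct behavior" idea is the perceptron learning rule. A minimal sketch for a single LTU, assuming 0/1 targets and a fixed learning rate (the slide does not give these details):

```python
def train_perceptron(samples, epochs=10, lr=0.1):
    """Perceptron learning rule: nudge weights toward each mistake's fix.
    samples: list of ((x1, x2), target) pairs with targets in {0, 1}."""
    w = [0.0, 0.0]
    b = 0.0  # bias: a weight on a constant input of 1.0
    for _ in range(epochs):
        for x, target in samples:
            y = 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else 0
            err = target - y          # 0 when correct: no change
            w[0] += lr * err * x[0]   # reinforce (or weaken) each weight
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

# Learning OR, which is linearly separable, so the rule converges.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(data)
```

Weights are only adjusted when the output is wrong, and each adjustment moves the decision boundary toward classifying that sample correctly.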
Multi-Layer Perceptrons
▪ Addition of « hidden layers »

▪ This is a Deep Neural Network

▪ Training them is trickier, but we'll talk about that.

[Diagram: Input 1, Input 2 and bias neurons (1.0) feeding a hidden layer of summation units, which feeds an output layer]
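One classic way to see why hidden layers matter: XOR cannot be computed by any single LTU, but a two-layer network handles it. The weights below are hand-picked for illustration (my own choice, not from the slides):

```python
def step(s):
    """Step activation: 1 if the sum is >= 0, else 0."""
    return 1 if s >= 0 else 0

def xor_mlp(x1, x2):
    """Tiny multi-layer perceptron computing XOR with fixed weights."""
    h_or  = step(x1 + x2 - 0.5)      # hidden unit 1: fires on A OR B
    h_and = step(x1 + x2 - 1.5)      # hidden unit 2: fires on A AND B
    return step(h_or - h_and - 0.5)  # output: OR but not AND, i.e. XOR
```

A single LTU can only draw one straight decision boundary; the hidden layer combines two such boundaries, which is what makes the network strictly more powerful.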
A Modern Deep Neural Network

▪ Replace the step activation function with something better

▪ Apply softmax to the output

▪ Training using gradient descent

[Diagram: a deep network with Input 1, Input 2 and bias neurons (1.0) feeding hidden layers of summation units, with softmax applied to the output layer]
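Softmax turns the network's raw output scores into a probability distribution over classes. A minimal sketch (the max-subtraction is a standard numerical-stability trick, not something the slide specifies):

```python
import math

def softmax(scores):
    """Map raw output scores to probabilities that sum to 1.
    Larger scores get larger probabilities; subtracting the max
    before exp() avoids overflow without changing the result."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
```

Because softmax is smooth (unlike the step function), its output can be differentiated, which is what makes training by gradient descent possible.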


Thank you for your attention

[email protected]
