Difference Between Logistic Regression, Shallow Neural Network and Deep Neural Network
By Chode Amarnath
Learning Objective:
→ Why do we use supervised learning instead of unsupervised learning for neural networks?
→ Why do we use logistic regression as a starting point for neural networks?
Supervised learning
→ In supervised learning, we are given a data set and already know what our
correct output should look like, having the idea that there is a relationship between the
input and the output.
→ Supervised learning problems are categorized into
1) Regression.
2) Classification problems.
Classification
→ In a classification problem, we are instead trying to predict results in a
discrete output. In other words, we are trying to map input variables into discrete
categories.
→ The main goal of classification is to predict the target class (Yes/No).
→ The classification problem is just like the regression problem, except that
the values we now want to predict take on only a small number of discrete values.
For now, we will focus on the binary classification problem in which y can take on
only two values, 0 and 1.
Types of classification:
Binary classification
When there are only two classes to predict, usually labeled 1 or 0.
Multi-class classification
When there are more than two class labels to predict, we call it a multi-class classification
task (see the sketch below).
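A minimal Python illustration of what the two kinds of label sets look like; the label values are made up for illustration only:

# Binary classification: the target takes one of only two values.
y_binary = [1, 0, 0, 1, 1]                    # e.g. spam (1) vs. not spam (0)

# Multi-class classification: more than two possible class labels.
y_multiclass = ["cat", "dog", "bird", "cat"]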
Regression
→ In a regression problem, we are trying to predict results within a continuous
output, meaning that we are trying to map input variables to some continuous function.
→ In a regression task, the target value is a continuously varying variable, such
as a country’s GDP or the price of a house.
Cost Function
→ We can measure the accuracy of our hypothesis function by using a cost
function.
→ It is also called the squared error function.
→ Our objective is to get the best possible line. The best possible line will be
such that the average squared vertical distance of the scattered points from the line
is the least.
→ Ideally, the line should pass through all the points of our training data set. In
such a case, the value of J(theta_0, theta_1) will be 0.
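As a rough sketch of the idea (not code from the slides), the squared-error cost for a one-variable linear hypothesis h(x) = theta_0 + theta_1 * x can be computed like this in NumPy; the toy data and variable names are illustrative only:

import numpy as np

def squared_error_cost(theta0, theta1, x, y):
    # J(theta0, theta1) = 1/(2m) * sum((h(x) - y)^2)
    m = len(y)                            # number of training examples
    predictions = theta0 + theta1 * x     # hypothesis h(x) = theta0 + theta1 * x
    return np.sum((predictions - y) ** 2) / (2 * m)

# Toy points that lie exactly on y = 1 + 2x, so the best line gives J = 0.
x = np.array([1.0, 2.0, 3.0])
y = np.array([3.0, 5.0, 7.0])
print(squared_error_cost(1.0, 2.0, x, y))   # 0.0: the line passes through every point
print(squared_error_cost(0.0, 0.0, x, y))   # a much larger cost for a poor fit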
We want an efficient algorithm or a piece of software for automatically finding theta_0
and theta_1.
Logistic regression model
→ We would like our classifier to output values between 0 and 1, so we come up
with a hypothesis that satisfies this property, that is, its predictions are between 0
and 1.
→ As z goes to minus infinity, g(z) approaches zero.
→ As z goes to plus infinity, g(z) approaches one.
→ hθ(x) gives us the probability that our output is 1.
For example, hθ(x)=0.7 gives us a probability of 70% that our output is 1. Our
probability that our prediction is 0 is just the complement of our probability that it is 1
(e.g. if probability that it is 1 is 70%, then the probability that it is 0 is 30%).
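A small sketch, assuming the standard sigmoid g(z) = 1 / (1 + e^(-z)) that the slides refer to, showing the limiting behaviour described above:

import numpy as np

def sigmoid(z):
    # g(z) = 1 / (1 + exp(-z)), always strictly between 0 and 1
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(-10.0))   # ~0.000045 -> approaches 0 as z goes to minus infinity
print(sigmoid(0.0))     # 0.5
print(sigmoid(10.0))    # ~0.999955 -> approaches 1 as z goes to plus infinity

# If h(x) = 0.7, the model assigns P(y = 1 | x) = 0.7,
# and therefore P(y = 0 | x) = 1 - 0.7 = 0.3.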
Decision Boundary
→ The decision boundary is the set of points where hθ(x) = 0.5, that is, where θᵀx = 0;
it separates the region predicted as y = 1 from the region predicted as y = 0.
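A minimal sketch of this thresholding rule for a 2-feature logistic model; the weights and bias below are hypothetical values chosen only so the boundary x1 + x2 = 3 is easy to see:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.array([1.0, 1.0])   # hypothetical weights (illustrative only)
b = -3.0                   # hypothetical bias

def predict(x):
    # Predict 1 when sigmoid(w.x + b) >= 0.5, which happens exactly when w.x + b >= 0.
    return int(sigmoid(np.dot(w, x) + b) >= 0.5)

print(predict(np.array([2.0, 2.0])))   # 1: this point lies on the y = 1 side of x1 + x2 = 3
print(predict(np.array([0.5, 0.5])))   # 0: this point lies on the other side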
Logistic Regression
→ Given an input feature vector x, perhaps corresponding to an image that we
want to recognize as either a cat picture or not a cat picture.
→ We want an algorithm that can output a prediction ŷ, which is an
estimate of the probability that y = 1.
→ x is an n-dimensional vector, and the parameters of logistic regression are an
n-dimensional weight vector w and a scalar bias b.
→ So, given an input x and the parameters w and b, how do we generate the
output ŷ?
→ We usually keep the parameters w and b separate.
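A minimal sketch of how ŷ is generated from x, w, and b; the feature dimension and parameter values below are illustrative, not taken from the slides:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict_proba(x, w, b):
    # y_hat = sigmoid(w^T x + b): the estimated probability that y = 1 (e.g. "is a cat").
    z = np.dot(w, x) + b   # the weight vector w and the bias b are kept as separate parameters
    return sigmoid(z)

n = 4                      # illustrative feature dimension
x = np.random.rand(n)      # an n-dimensional feature vector
w = np.zeros(n)            # n-dimensional weights
b = 0.0                    # scalar bias
print(predict_proba(x, w, b))   # 0.5 with zero-initialised parameters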
Logistic Regression Cost Function
→ What loss function or error function can we use to measure how well the
algorithm is doing?
→ The cost function measures how well the parameters w and b are doing on
the training set.
→ To train the parameters w and b of logistic regression, we need to define a
cost function.
→ The loss function is defined with respect to a single training example.
→ The cost function measures how well you are doing on the entire training set.
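A compact sketch of the loss and cost in the cross-entropy form usually used with logistic regression (a standard choice, not spelled out on these slides); the example labels and probabilities are made up:

import numpy as np

def loss(y_hat, y):
    # Loss for a single training example: L(y_hat, y) = -(y*log(y_hat) + (1-y)*log(1-y_hat))
    return -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

def cost(y_hat, y):
    # Cost J(w, b): the average of the per-example losses over the whole training set.
    return np.mean(loss(y_hat, y))

y     = np.array([1, 0, 1])          # true labels
y_hat = np.array([0.9, 0.2, 0.6])    # predicted probabilities
print(cost(y_hat, y))                # small when the predictions match the labels well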
Gradient Descent
→ How can we use the gradient descent algorithm to train, or learn, the
parameters w and b on the training set?
→ Find w and b that make the cost function as small as possible.
→ The height of the surface represents the value of J(w, b) at a given point.
→ We want to find the values of w and b that correspond to the minimum of the cost
function J.
→ Initialize w and b to some values (denoted by the little red dot); initializing to zero
works, and random initialization also works.
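A compact sketch of batch gradient descent for logistic regression, using the usual update rules w := w - alpha*dw and b := b - alpha*db; the toy data, learning rate, and iteration count are illustrative:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, alpha=0.1, iterations=1000):
    # Minimise the cost J(w, b) by repeatedly stepping downhill on the cost surface.
    m, n = X.shape
    w, b = np.zeros(n), 0.0          # zero initialisation (random initialisation also works)
    for _ in range(iterations):
        y_hat = sigmoid(X @ w + b)   # predictions on the whole training set
        dz = y_hat - y               # derivative of the cost with respect to z
        dw = X.T @ dz / m            # gradient with respect to w
        db = dz.mean()               # gradient with respect to b
        w -= alpha * dw
        b -= alpha * db
    return w, b

# Tiny toy set: the label is 1 exactly when the single feature is positive.
X = np.array([[-2.0], [-1.0], [1.0], [2.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
w, b = gradient_descent(X, y)
print(sigmoid(X @ w + b).round(2))   # probabilities close to the labels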
Shallow Neural Network
→ In logistic regression, we have a z computation followed by an activation (a) computation.
→ In a neural network, we just do this multiple times: z followed by an activation, layer after layer.
→ Finally, we compute the loss at the end.
→ In logistic regression, we have a backward calculation in order to compute
derivatives.
→ In a neural network, we likewise end up doing a backward calculation through the layers.
→ The circle representing logistic regression really represents two steps of computation:
→ First, we compute z.
→ Second, we compute the activation as a sigmoid function of z.
→ A neural network does this many times (see the sketch below).
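A minimal forward-pass sketch of a one-hidden-layer (shallow) network, showing the "compute z, then the activation" step repeated once per layer; the layer sizes and the tanh/sigmoid choice are illustrative assumptions:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_x, n_h = 3, 4                               # input size and hidden-layer size (illustrative)
W1, b1 = np.random.randn(n_h, n_x) * 0.01, np.zeros((n_h, 1))
W2, b2 = np.random.randn(1, n_h) * 0.01, np.zeros((1, 1))

x = np.random.randn(n_x, 1)                   # one input example, as a column vector

z1 = W1 @ x + b1                              # first "circle": compute z ...
a1 = np.tanh(z1)                              # ... then its activation
z2 = W2 @ a1 + b2                             # second "circle": same two steps again
a2 = sigmoid(z2)                              # the network's output probability
print(a2)                                     # the loss would then be computed from a2 and y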
Activation Function
- When you build your neural network, one of the choices you get to make is which
activation function to use in the hidden layers, as well as at the output
unit of your neural network.
- If you let g(z) = tanh(z), this almost always works better than the sigmoid
function, because its values lie between +1 and -1, so the activations that come
out of the hidden layer are close to having a 0 mean.
- That is, the mean of the data flowing into the next layer is close to 0 rather than 0.5.
- This actually makes learning for the next layer a little bit easier.
One downside of both the sigmoid and the tanh functions is:
- If z is very large or very small, then the gradient (the derivative, or slope) of the
function becomes very small (see the sketch below).
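A short sketch comparing the slopes of sigmoid and tanh at small and large |z|, which is the saturation effect described above; the values in the comments are approximate:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad(z):
    s = sigmoid(z)
    return s * (1.0 - s)          # derivative of the sigmoid

def tanh_grad(z):
    return 1.0 - np.tanh(z) ** 2  # derivative of tanh

for z in (0.0, 5.0, 10.0):
    print(z, sigmoid_grad(z), tanh_grad(z))
# At z = 0 both slopes are sizeable (0.25 and 1.0); at |z| = 10 both are essentially 0,
# which is what makes gradient-descent learning slow for saturated units.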

