Lec 9 Supervised Learning Final
Applications
Supervised and Deep Learning
Week 9
Dr Shoab Khan
Outline Week 8
• GMM
• Generative AI
• Clustering Evaluation
• Supervised Learning
• Naïve Bayes Classifier
This week is dedicated to Dr Andrew Ng
Outline Week 9
1. Dr Andrew Ng: AI and Future of Work
2. Examples of Naïve Bayes Classifier
3. Supervised Learning: Class of Algorithms
4. k-Nearest Neighbours (k-NN) Classifier
5. Supervised Learning: Training, Validation and Testing
6. Image Classification and Neural Networks
7. Deep Learning: Convolutional Neural Networks (CNN)
1
AI and Future of Work
Dr Andrew NG
Andrew Ng
• Computer scientist and entrepreneur, widely recognized as one of the world's leading experts in artificial intelligence
• Founder of deeplearning.ai
• Co-founder of Coursera
• Former vice president and chief scientist at Baidu
• Co-founder of Google Brain
• Founder of Landing AI
• Founder of AI Fund
1
Value from AI technologies: Today → 3 Years
[Chart: value created by AI technologies today versus in three years, comparing Supervised learning (labeling things), Generative AI, Unsupervised learning, and Reinforcement Learning]
Supervised and Unsupervised Learning
[Diagram: a model maps inputs to a model output, which in supervised learning is compared against the actual (labelled) output]
Prompting is Revolutionizing AI Application Development
• Supervised learning: Get Labelled Data → Train AI Model → Deploy Model
• Prompt-based AI: Input Prompt to ChatGPT → Deploy Model
2
Examples of Naïve Bayes Classifier
Supervised Learning
Naïve Bayes Classifier
•A supervised learning technique
•The Naïve Bayes classifier belongs to the family of probabilistic classifiers
•It applies Bayes' theorem
•It is called 'naïve' because it makes a rigid independence assumption between the features of the input sample
Bayes Decision Rule
Labelled training samples: for every training sample X, the output y is known.
For a test sample X, the output y is to be computed as the class that maximizes the posterior:
ŷ = argmax over y of P(y) · P(X | y)
Frequency and Likelihood Calculations
P(SUV | yes) = 1/4
P(SUV | no) = 3/4
Frequency and Likelihood Calculations
P(Domestic | yes) = 2/5
P(Domestic | no) = 3/5
Frequency and Likelihood Calculations
P(Imported | yes) = 3/5
P(Imported | no) = 2/5
Applying Naïve Bayes Classifier
Since 18 > 6 over the same denominator of 200, the posterior for 'no' is larger: given the features Red, SUV, and Domestic, the example is classified as 'no', i.e. the car is not stolen and must be parked somewhere else.
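The comparison above can be reproduced in a short script. A minimal sketch, assuming the slide's likelihood tables plus P(Red | yes) = 3/5, P(Red | no) = 2/5 and equal priors of 1/2 (those three values are not shown on the slide but are consistent with the stated totals of 6/200 and 18/200):

```python
# Naive Bayes scores for the 'Red SUV, Domestic' test sample.
# SUV/Domestic likelihoods come from the slide's tables; the Red
# likelihoods and equal priors are assumed from the totals 6/200, 18/200.
prior = {"yes": 1 / 2, "no": 1 / 2}
likelihood = {
    "yes": {"Red": 3 / 5, "SUV": 1 / 4, "Domestic": 2 / 5},
    "no":  {"Red": 2 / 5, "SUV": 3 / 4, "Domestic": 3 / 5},
}

features = ["Red", "SUV", "Domestic"]
score = {}
for cls in ("yes", "no"):
    s = prior[cls]
    for f in features:
        s *= likelihood[cls][f]  # naive independence assumption
    score[cls] = s

print(score["yes"], score["no"])  # 0.03 (= 6/200) vs 0.09 (= 18/200)
print(max(score, key=score.get))  # no
```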
2
Gaussian Naïve Bayes Classifier
ChatGPT Prompt
1. Generate Python code that implements a Naïve Bayes classifier on a benchmark dataset
2. Can the data be plotted with decision boundaries in 2-D?
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Load a benchmark dataset, split it, and fit Gaussian Naive Bayes
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = GaussianNB().fit(X_train, y_train)
print("Test accuracy:", model.score(X_test, y_test))

# Class priors estimated from the training label counts
classes, counts = np.unique(y_train, return_counts=True)
class_priors = {cls: count / len(y_train) for cls, count in zip(classes, counts)}
2
Example 3: Gaussian Naïve Bayes Classifier
•Distinguish children from adults based on
•Height
•Weight
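The children-versus-adults task can be sketched with scikit-learn's GaussianNB; the height/weight numbers below are hypothetical, invented only to illustrate the fit:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Hypothetical training data: [height (cm), weight (kg)]
X = np.array([
    [120, 25], [130, 30], [115, 22], [140, 35],   # children
    [170, 70], [180, 85], [165, 60], [175, 78],   # adults
], dtype=float)
y = np.array(["child"] * 4 + ["adult"] * 4)

# Gaussian NB fits a per-class normal distribution to each feature
model = GaussianNB().fit(X, y)
print(model.predict([[125, 28], [178, 80]]))  # ['child' 'adult']
```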
3
Class of Algorithms
Supervised Learning
[Diagram: Machine Learning divides into Unsupervised Learning and Supervised Learning; Supervised Learning comprises Regression and Classification]
4
k-Nearest Neighbors (k-NN) Classification
Supervised Learning
Robot Pick and Place Application
[Scatter plots: artifact samples plotted by Total Area (x-axis) against Area of the Holes (y-axis); labelled samples 1-5 mark the five artifact classes, and an unknown test sample x is placed among them]
Manufacturing Artifacts: Training Data
[Scatter plot: labelled training samples of classes 1-5 form clusters in the Total Area / Area of the Holes plane; a test sample x is classified by the cluster of its nearest neighbours]
k-NN Classifier
https://www.ibm.com/topics/knn
•k-NN is a non-parametric, supervised learning classifier
•It uses proximity to make classifications or predictions
k-NN Algorithm
•Points in the same class are usually "neighbours"
•Assign the class based on the majority of neighbours
•Need a distance measure
•Need to choose k, the number of neighbours
• Note: k must be odd for a simple majority in the two-class case
Example of kNN
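The steps above can be sketched from scratch; this is a minimal illustration with made-up 2-D points, not an optimized implementation:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    # Euclidean distance from the test point x to every training point
    dists = np.linalg.norm(X_train - x, axis=1)
    # Labels of the k nearest neighbours
    nearest = [y_train[i] for i in np.argsort(dists)[:k]]
    # Majority vote (odd k avoids ties in the two-class case)
    return Counter(nearest).most_common(1)[0][0]

# Toy training data: two well-separated classes
X_train = np.array([[1, 1], [1, 2], [2, 1], [6, 6], [7, 7], [6, 7]])
y_train = ["A", "A", "A", "B", "B", "B"]
print(knn_predict(X_train, y_train, np.array([2, 2])))  # A
```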
5
Training, Validation and Testing
Supervised Learning
Cross Validation
K-Fold Cross Validation
• Simple validation splits the data into 3 parts, namely Train, Validation, and Test sets
• This does not work well for small datasets
• In k-fold CV, the data is split into k folds (sets)
• Every fold gets the chance to appear in the training set k−1 times
• k is generally from 5 to 10
• If k is large (say k = n, the number of observations), the approach is called Leave-One-Out CV (LOOCV)
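The claim that every sample lands in the training set k−1 times can be verified directly; a sketch using scikit-learn's KFold on ten dummy points:

```python
import numpy as np
from sklearn.model_selection import KFold

X = np.arange(10).reshape(-1, 1)  # ten dummy samples
k = 5
train_count = np.zeros(len(X), dtype=int)
for train_idx, test_idx in KFold(n_splits=k).split(X):
    train_count[train_idx] += 1

# Each sample is held out exactly once and trained on k-1 = 4 times
print(train_count)  # [4 4 4 4 4 4 4 4 4 4]
```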
Final Model Selection
Average Performance: the model's performance across all k folds is assessed
•Compute average performance metrics:
•Accuracy
•Precision
•Recall, etc.
•The model with the best average performance is chosen
Final Model Selection
Generalization Ability
•A model that demonstrates good performance on the validation sets across all k folds,
•and doesn't show significant variance in performance between different folds, is preferred
•This indicates better generalization.
Final Model Selection
1.Average Performance
2.Generalization Ability
3.Model Complexity
4.Domain Expertise and Insights
5.Computational Resources and Constraints
Stratified k-Fold
Ensure the relative class frequencies in each fold reflect the relative class frequencies of the whole dataset
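This property can be demonstrated with scikit-learn's StratifiedKFold; the imbalanced toy labels below are made up for illustration:

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

# Imbalanced toy labels: 8 samples of class 0, 4 of class 1 (ratio 2:1)
y = np.array([0] * 8 + [1] * 4)
X = np.zeros((len(y), 1))  # features are irrelevant to the split itself

for _, test_idx in StratifiedKFold(n_splits=4).split(X, y):
    print(np.bincount(y[test_idx]))  # every fold prints [2 1], keeping 2:1
```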
ChatGPT
1. Generate Python code that demonstrates the k-NN classification technique
2. Can we also plot the points to demonstrate the algorithm?
3. Can we generate Python code that finds the optimal value of k using cross-validation?
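One possible answer to prompts 1 and 3, sketched with scikit-learn on the Iris benchmark (any labelled dataset would serve equally well):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Try odd k values only, so a two-class majority vote cannot tie
scores = {}
for k in range(1, 20, 2):
    knn = KNeighborsClassifier(n_neighbors=k)
    scores[k] = cross_val_score(knn, X, y, cv=5).mean()

best_k = max(scores, key=scores.get)
print("best k:", best_k, "accuracy:", round(scores[best_k], 3))
```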
6
Image Classification and Neural Networks: Historical Perspective
Supervised Learning
PASCAL Visual Object Challenge
https://www.research.ed.ac.uk/en/publications/the-pascal-visual-object-classes-voc-challenge
Narrow AI
Convolutional Neural Networks (CNN) have become an important tool for object recognition
Gradient-based learning applied to document recognition
https://ieeexplore.ieee.org/document/726791
6
Image Classification and Neural Networks: Challenges
Supervised Learning
Image Classification
Challenges: Viewpoint Variation
Challenges: Illumination
Challenges: Occlusion
Challenges: Deformation
Challenges: Background Clutter
Challenges: Intraclass Variation
https://www.kaggle.com/c/cifar-10/
Sample submission file format:
id,label
1,cat
2,cat
3,cat
4,cat
ChatGPT Prompt
7
Image Classification and Neural Networks: Parametric Approach
Supervised Learning
ChatGPT
•Can we generate Python code that uses one of the best developed models for classifying CIFAR-10 images?
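Training a state-of-the-art CIFAR-10 model is beyond a slide, but the building blocks such a CNN stacks (convolution, ReLU non-linearity, max pooling) can be sketched in plain NumPy; the 8×8 "image" and the edge-detecting kernel here are made up purely for illustration:

```python
import numpy as np

def conv2d(image, kernel):
    # 'Valid' 2-D cross-correlation, as used in CNN convolutional layers
    kh, kw = kernel.shape
    out = np.zeros((image.shape[0] - kh + 1, image.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0)          # non-linearity

def max_pool(x, size=2):
    # Keep the maximum of each size x size window
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

rng = np.random.default_rng(0)
image = rng.random((8, 8))           # tiny grayscale 'image'
kernel = np.array([[1.0, -1.0]])     # horizontal edge detector
feature_map = max_pool(relu(conv2d(image, kernel)))
print(feature_map.shape)  # (4, 3)
```

Real CIFAR-10 networks repeat this convolution → non-linearity → pooling pattern many times and learn the kernels from the labelled data instead of fixing them by hand.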