Lec 9 Supervised Learning Final

This document discusses topics related to supervised and deep learning covered in week 9 of a course. It includes: 1. An introduction to Dr. Andrew Ng and his work on AI and the future of work. 2. Examples of using a Naïve Bayes classifier for classification. 3. An overview of supervised learning algorithms and deep learning techniques, such as convolutional neural networks, that will be discussed.

AI Technology and

Applications
Supervised and Deep Learning
Week 9

Dr Shoab Khan
Outline Week 8
• GMM
• Generative AI
• Clustering Evaluation
• Supervised Learning
• Naïve Bayes Classifier
This week is dedicated to Dr Andrew Ng
Outline Week 9
1. Dr Andrew Ng: AI and Future of Work
2. Examples of Naïve Bayes Classifier
3. Supervised Learning: Class of Algorithms
4. k-Nearest Neighbours (k-NN) Classifier
5. Supervised Learning: Training, Validation and Testing
6. Image Classification and Neural Networks
7. Deep Learning: Convolutional Neural Networks (CNN)
1. Dr Andrew Ng: AI and Future of Work
2. Examples of Naïve Bayes Classifier
3. Supervised Learning: Class of Algorithms
4. k-Nearest Neighbours (k-NN) Classifier
5. Supervised Learning: Training, Validation and Testing
6. Image Classification and Neural Networks
7. Deep Learning: Convolutional Neural Networks (CNN)

1
AI and Future of Work
Dr Andrew Ng
Andrew Ng
• Computer scientist
• Entrepreneur, widely recognized as
one of the world's leading experts
in artificial intelligence.
• Founder of deeplearning.ai
• Co-founder of Coursera
• Former vice president and chief
scientist at Baidu
• Google Brain
• Landing AI
• AI Fund
1. Andrew Ng is Founder & CEO of Landing AI
2. Dr. Andrew Ng leads AI Fund as the Managing General Partner
3. Andrew Ng was chief scientist at the Chinese search giant Baidu
4. He was also the founding lead of the Google Brain team
Value from AI technologies: Current
[Chart: value generated today by Supervised learning (labeling things), Generative AI, Unsupervised learning, and Reinforcement Learning]

Value from AI technologies: Today → 3 Years
[Chart: value of the same categories projected from today out to three years]
Supervised and Unsupervised Learning
[Diagram: in supervised learning the model's output is compared against the actual (labelled) output; in unsupervised learning the model works without labelled outputs]
Prompting is Revolutionizing AI Application Development
• Supervised learning: Get labeled data (1 month) → Train AI model (4-5 months) → Deploy model (4-5 months)
• Prompt-based AI: Input prompt to ChatGPT (minutes/hours) → Deploy model (4-5 months)
Future of Work
[Chart: all potential AI projects, sorted in decreasing order of value; beyond the high-value head lies a long tail of customization projects that can be targeted with low/no-code tools]

1. Dr Andrew Ng: AI and Future of Work
2. Examples of Naïve Bayes Classifier
3. Supervised Learning: Class of Algorithms
4. k-Nearest Neighbours (k-NN) Classifier
5. Supervised Learning: Training, Validation and Testing
6. Image Classification and Neural Networks
7. Deep Learning: Convolutional Neural Networks (CNN)

2
Examples of Naïve Bayes Classifier
Supervised Learning
Naïve Bayes Classifier
• Supervised Learning technique
• The Naïve Bayes Classifier belongs to the family of probabilistic classifiers
• It uses Bayes' theorem
• It is called 'Naïve' because it makes a rigid independence assumption between the features of the input sample
Bayes Decision Rule
Labelled training samples (for every sample X, the output y is known)
For a test sample X, the output y is to be computed
[Figure: labelled training samples plotted together with the test sample]
Bayes Decision Rule
For a test sample X, the output y is computed from the labelled training samples (for every training sample X, the output y is known) using Bayes' theorem:

P(y|X) = P(X|y) P(y) / P(X)
Assumptions made by Naïve Bayes
•Contribution of each feature to the outcome
is:
•Independent
•equal
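Under these two assumptions the class-conditional likelihood factorizes over the individual features, which gives the standard naïve Bayes decision rule:

P(X|y) = P(x1|y) · P(x2|y) · … · P(xn|y)

ŷ = argmax over y of  P(y) · P(x1|y) · … · P(xn|y)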
Naïve Bayes Classifier Types
1. Multinomial Naïve Bayes Classifier
2. Gaussian Naïve Bayes Classifier
3. Bernoulli Naïve Bayes Classifier
Multinomial Naïve Bayes Classifier
• The posterior probability P(y|X) is calculated by first creating a frequency table for each attribute/feature against the label
• Then converting the frequency tables into likelihood tables
• Finally, using the naïve Bayes equation to calculate the posterior probability for each class
2
Example 1: Bernoulli/Multinomial Naïve Bayes Classifier
Naïve Bayes Classifier
Frequency Table Against the Label
[Table: 10 labelled samples; features (Color: red/yellow, Type: Sports/SUV, Origin: Domestic/Imported) tabulated against the label (car stolen: yes/no)]
P(yes) = 5/10 = 1/2
P(no) = 5/10 = 1/2
Freq and Likelihood Calculations
P(yes) = 5/10 = 1/2
P(no) = 5/10 = 1/2
P(red|yes) = 3/5
P(red|no) = 2/5
Freq and Likelihood Calculations
P(yes) = 5/10 = 1/2
P(no) = 5/10 = 1/2
P(red|yes) = 3/5
P(red|no) = 2/5
P(yellow|yes) = 2/5
P(yellow|no) = 3/5
Freq and Likelihood Calculations
P(Sports|yes) = 4/6
P(Sports|no) = 2/6
Freq and Likelihood Calculations
P(SUV|yes) = 1/4
P(SUV|no) = 3/4
Freq and Likelihood Calculations
P(Domestic|yes) = 2/5
P(Domestic|no) = 3/5
Freq and Likelihood Calculations
P(Imported|yes) = 3/5
P(Imported|no) = 2/5
Applying the Naïve Bayes Classifier

Numerator of P(yes|red, SUV, Domestic) = P(red|yes) × P(SUV|yes) × P(Domestic|yes) × P(yes)
= 3/5 × 1/4 × 2/5 × 1/2 = 6/200

Numerator of P(no|red, SUV, Domestic) = P(red|no) × P(SUV|no) × P(Domestic|no) × P(no)
= 2/5 × 3/4 × 3/5 × 1/2 = 18/200

Since 18 > 6 with the same denominator of 200 (and the same evidence P(X)), the sample with features red, SUV, Domestic is classified as 'no': the car is not stolen.
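The arithmetic on this slide can be checked with a few lines of Python; this is only a sanity check of the numbers above, using the likelihood-table values from the preceding slides:

# Sanity check of the worked example (numerators of the two posteriors)
p_yes = (3/5) * (1/4) * (2/5) * (1/2)   # P(red|yes) * P(SUV|yes) * P(Domestic|yes) * P(yes)
p_no  = (2/5) * (3/4) * (3/5) * (1/2)   # P(red|no)  * P(SUV|no)  * P(Domestic|no)  * P(no)
print(p_yes, p_no)                       # 0.03 (= 6/200) and 0.09 (= 18/200)
print('Predicted label:', 'yes' if p_yes > p_no else 'no')   # 'no' -> the car is not stolen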
1. Dr Andrew Ng: AI and Future of Work
2. Examples of Naïve Bayes Classifier
3. Supervised Learning: Class of Algorithms
4. k-Nearest Neighbours (k-NN) Classifier
5. Supervised Learning: Training, Validation and Testing
6. Image Classification and Neural Networks
7. Deep Learning: Convolutional Neural Networks (CNN)

2
Gaussian Naïve Bayes Classifier
Naïve Bayes Classifier
Gaussian Naïve Bayes Classifier
ChatGPT Prompt
1. Generate python code that implements a
Naive Bayes Classifier on a benchmark
data
2. Can the data be plotted with decision
boundaries in 2-D
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Load the Iris dataset


iris = load_iris()
X, y = iris.data[:, :2], iris.target  # considering only the first two features for visualization purposes

# Split the data into training and testing sets


X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
random_state=42)

# Initialize the Naive Bayes classifier (Gaussian Naive Bayes)


clf = GaussianNB()
# Train the classifier
clf.fit(X_train, y_train)
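# Predict on the test set and report accuracy and a per-class classification report
# (these steps are described in the explanation that follows the plot)
from sklearn.metrics import accuracy_score, classification_report
y_pred = clf.predict(X_test)
print("Accuracy:", accuracy_score(y_test, y_pred))
print(classification_report(y_test, y_pred, target_names=iris.target_names))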

# Plot decision boundaries


x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
xx, yy = np.meshgrid(np.arange(x_min, x_max, 0.02), np.arange(y_min, y_max, 0.02))
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()])

# Put the result into a color plot


Z = Z.reshape(xx.shape)
plt.contourf(xx, yy, Z, cmap=plt.cm.viridis, alpha=0.8)

# Plot training points


plt.scatter(X_train[:, 0], X_train[:, 1], c=y_train, cmap=plt.cm.Set1, edgecolor='k',
label='Training')
# Plot testing points
plt.scatter(X_test[:, 0], X_test[:, 1], c=y_test, cmap=plt.cm.Set1, edgecolor='k',
marker='^', label='Testing')
plt.xlabel('Sepal length')
plt.ylabel('Sepal width')
plt.title('Naive Bayes Classifier Decision Boundaries')
plt.legend()
plt.show()
Code Explanation by ChatGPT
• This code demonstrates how to:

• Load the Iris dataset using scikit-learn.


• Split the dataset into training and testing sets.
• Initialize the Gaussian Naive Bayes classifier.
• Train the classifier using the training data.
• Make predictions on the test set.
• Calculate the accuracy of the model.
• Generate a classification report showing precision, recall, F1-score, and support for
each class.
• You can use other datasets or modify parameters as needed for different
benchmark datasets.
ChatGPT Prompt
1. Generate python code that implements a
Naive Bayes Classifier on a benchmark
data
2. Can the data be plotted with decision
boundaries in 2-D
3. Can the data be reduced using PCA for
plotting?
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.decomposition import PCA

# Load the Iris dataset


iris = load_iris()
X, y = iris.data, iris.target

# Split the data into training and testing sets


X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
random_state=42)

# Apply PCA for dimensionality reduction


pca = PCA(n_components=2)
X_train_pca = pca.fit_transform(X_train)
X_test_pca = pca.transform(X_test)

# Initialize the Naive Bayes classifier (Gaussian Naive Bayes)


clf = GaussianNB()
# Train the classifier
clf.fit(X_train_pca, y_train)

# Plot decision boundaries


x_min, x_max = X_train_pca[:, 0].min() - 1, X_train_pca[:, 0].max() + 1
y_min, y_max = X_train_pca[:, 1].min() - 1, X_train_pca[:, 1].max() + 1
xx, yy = np.meshgrid(np.arange(x_min, x_max, 0.02), np.arange(y_min, y_max, 0.02))
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()])

# Put the result into a color plot


Z = Z.reshape(xx.shape)
plt.contourf(xx, yy, Z, cmap=plt.cm.viridis, alpha=0.8)

# Plot training points


plt.scatter(X_train_pca[:, 0], X_train_pca[:, 1], c=y_train, cmap=plt.cm.Set1,
edgecolor='k', label='Training')
# Plot testing points
plt.scatter(X_test_pca[:, 0], X_test_pca[:, 1], c=y_test, cmap=plt.cm.Set1,
edgecolor='k', marker='^', label='Testing')
plt.xlabel('Principal Component 1')
plt.ylabel('Principal Component 2')
plt.title('Naive Bayes Classifier Decision Boundaries with PCA')
plt.legend()
plt.show()
ChatGPT Prompt
1. Generate python code that implements a
Naive Bayes Classifier on a benchmark data
2. Can the data be plotted with decision
boundaries in 2-D
3. Can the data be reduced using PCA for
plotting?
4. Can code be generated that implements a
simple Naive Bayes classifier on self-generated
tables?
from collections import defaultdict

# Define sample training data (feature values and labels)


data = [
{'Outlook': 'Sunny', 'Temperature': 'Hot', 'Humidity': 'High', 'Windy': 'False', 'Play': 'No'},
{'Outlook': 'Sunny', 'Temperature': 'Hot', 'Humidity': 'High', 'Windy': 'True', 'Play': 'No'},
{'Outlook': 'Overcast', 'Temperature': 'Hot', 'Humidity': 'High', 'Windy': 'False', 'Play': 'Yes'},
{'Outlook': 'Rain', 'Temperature': 'Mild', 'Humidity': 'High', 'Windy': 'False', 'Play': 'Yes'},
{'Outlook': 'Rain', 'Temperature': 'Cool', 'Humidity': 'Normal', 'Windy': 'False', 'Play': 'Yes'},
{'Outlook': 'Rain', 'Temperature': 'Cool', 'Humidity': 'Normal', 'Windy': 'True', 'Play': 'No'},
{'Outlook': 'Overcast', 'Temperature': 'Cool', 'Humidity': 'Normal', 'Windy': 'True', 'Play': 'Yes'},
{'Outlook': 'Sunny', 'Temperature': 'Mild', 'Humidity': 'High', 'Windy': 'False', 'Play': 'No'},
{'Outlook': 'Sunny', 'Temperature': 'Cool', 'Humidity': 'Normal', 'Windy': 'False', 'Play': 'Yes'},
{'Outlook': 'Rain', 'Temperature': 'Mild', 'Humidity': 'Normal', 'Windy': 'False', 'Play': 'Yes'},
{'Outlook': 'Sunny', 'Temperature': 'Mild', 'Humidity': 'Normal', 'Windy': 'True', 'Play': 'Yes'},
{'Outlook': 'Overcast', 'Temperature': 'Mild', 'Humidity': 'High', 'Windy': 'True', 'Play': 'Yes'},
{'Outlook': 'Overcast', 'Temperature': 'Hot', 'Humidity': 'Normal', 'Windy': 'False', 'Play': 'Yes'},
{'Outlook': 'Rain', 'Temperature': 'Mild', 'Humidity': 'High', 'Windy': 'True', 'Play': 'No'}
]
# Calculate class priors
class_counts = defaultdict(int)
for instance in data:
    class_counts[instance['Play']] += 1

total_instances = len(data)
class_priors = {cls: count / total_instances for cls, count in class_counts.items()}

# Count feature-value occurrences per class
conditional_probs = defaultdict(lambda: defaultdict(lambda: defaultdict(int)))
for instance in data:
    for feature, value in instance.items():
        if feature != 'Play':
            conditional_probs[feature][value][instance['Play']] += 1

# Normalize the counts into conditional probabilities P(feature value | class)
# by dividing each count by the number of instances of that class
for feature in conditional_probs:
    for value in conditional_probs[feature]:
        for cls in conditional_probs[feature][value]:
            conditional_probs[feature][value][cls] /= class_counts[cls]
# Function to predict class based on input feature values
def predict(sample):
    probabilities = {}
    for cls in class_priors:
        probability = class_priors[cls]
        for feature, value in sample.items():
            if feature in conditional_probs and value in conditional_probs[feature]:
                probability *= conditional_probs[feature][value][cls]
        probabilities[cls] = probability
    return max(probabilities, key=probabilities.get)

# Test the classifier with a new instance


new_instance = {'Outlook': 'Sunny', 'Temperature': 'Cool', 'Humidity': 'High', 'Windy':
'False'}
prediction = predict(new_instance)
print(f"Predicted class for {new_instance}: {prediction}")
1. Dr Andrew Ng: AI and Future of Work
2. Examples of Naïve Bayes Classifier
3. Supervised Learning: Class of Algorithms
4. k-Nearest Neighbours (k-NN) Classifier
5. Supervised Learning: Training, Validation and Testing
6. Image Classification and Neural Networks
7. Deep Learning: Convolutional Neural Networks (CNN)

2
Example 3: Gaussian Naïve Bayes Classifier
Naïve Bayes Classifier
Example 3: Gaussian Naïve Bayes
Classifier
•Distinguish children from adults based on
•Height
•Weight
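The slide states only the task; as an illustration, a minimal Gaussian Naïve Bayes sketch on made-up height/weight values (the numbers below are invented for demonstration and are not the data used in the lecture):

import numpy as np
from sklearn.naive_bayes import GaussianNB

# Hypothetical training data: [height in cm, weight in kg]
X = np.array([[120, 25], [130, 30], [140, 35], [110, 22],    # children
              [170, 70], [180, 85], [165, 60], [175, 78]])   # adults
y = np.array(['child', 'child', 'child', 'child',
              'adult', 'adult', 'adult', 'adult'])

clf = GaussianNB()   # fits a Gaussian per feature and per class
clf.fit(X, y)

print(clf.predict([[150, 45]]))   # classify a new person from height and weight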
1. Dr Andrew Ng: AI and Future of Work
2. Examples of Naïve Bayes Classifier
3. Supervised Learning: Class of Algorithms
4. k-Nearest Neighbours (k-NN) Classifier
5. Supervised Learning: Training, Validation and Testing
6. Image Classification and Neural Networks
7. Deep Learning: Convolutional Neural Networks (CNN)

3
Class of Algorithms
Supervised Learning
Machine Learning

Unsupervised Learning

Supervised Learning
Regression and Classification
1. Dr Andrew Ng: AI and Future of Work
2. Examples of Naïve Bayes Classifier
3. Supervised Learning: Class of Algorithms
4. k-Nearest Neighbours (k-NN) Classifier
5. Supervised Learning: Training, Validation and Testing
6. Image Classification and Neural Networks
7. Deep Learning: Convolutional Neural Networks (CNN)

4
k-Nearest Neighbors (k-NN) Classification
Supervised Learning
Robot Pick and Place Application
[Figures: labelled training parts from five classes (1-5) plotted in a 2-D feature space with Total Area on the x-axis and Area of the Holes on the y-axis; an unknown part marked "x" is classified by the class of the nearby labelled points]
Manufacturing Artifacts: Training Data
[Scatter plot of the labelled training samples of the five classes in the same Total Area vs. Area of the Holes feature space, with the unknown sample x shown among them]
K-NN Classifier
https://www.ibm.com/topics/knn
• k-NN is a non-parametric, supervised learning classifier
• It uses proximity to make classifications or predictions
KNN Algorithm
• Points in the same class are usually "neighbors"
• Assign the class based on the majority of the neighbors
• Need a distance measure
• Need to choose k, the number of neighbors
• Note: k should be odd so that a simple majority exists (a minimal sketch follows below)
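As mentioned above, a minimal from-scratch sketch of the idea (Euclidean distance plus a majority vote over the k nearest labelled points; the toy data is invented for illustration):

import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    distances = np.linalg.norm(X_train - x, axis=1)          # Euclidean distance to every training point
    nearest = np.argsort(distances)[:k]                      # indices of the k closest neighbors
    return Counter(y_train[nearest]).most_common(1)[0][0]    # majority vote among their labels

# Toy labelled data: two classes in 2-D
X_train = np.array([[1, 1], [1, 2], [2, 1], [6, 6], [7, 6], [6, 7]])
y_train = np.array(['A', 'A', 'A', 'B', 'B', 'B'])

print(knn_predict(X_train, y_train, np.array([2, 2]), k=3))   # -> 'A'
print(knn_predict(X_train, y_train, np.array([6, 5]), k=3))   # -> 'B'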
Example of kNN

•what labels for A and B?


Assigning a Label for A
Assigning a Label for B
How to Choose k?
Example
Distance Metrics
Hamming Distance
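For the distance measures referred to on these slides, a short sketch of two common choices (Euclidean for numeric features, Hamming for categorical or binary features):

import numpy as np

def euclidean(a, b):
    # straight-line distance between numeric feature vectors
    return np.sqrt(np.sum((np.asarray(a) - np.asarray(b)) ** 2))

def hamming(a, b):
    # number of positions at which the two sequences differ
    return sum(x != y for x, y in zip(a, b))

print(euclidean([1, 2], [4, 6]))                    # 5.0
print(hamming(['red', 'SUV'], ['red', 'Sports']))   # 1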
K-NN: A Lazy Learning Model
• k-NN is a "lazy learning" model: it only stores the training dataset rather than undergoing a training stage
• Computation occurs when a classification or prediction is being made
• It relies heavily on memory to store all of its training data
K-NN
• As the dataset grows, k-NN becomes increasingly inefficient, compromising overall model performance
• It is commonly used for simple systems:
• recommendation systems
• pattern recognition
• data mining
• financial market predictions
• intrusion detection, etc.
Inventors
• Evelyn Fix and Joseph Hodges introduced the idea in a 1951 paper
• Thomas Cover expanded on their concept in his paper "Nearest Neighbor Pattern Classification"
Selecting Value of k
• The choice of k largely depends on the input data: data with more outliers or noise will likely perform better with higher values of k
• k should be an odd number (to avoid ties)
• Cross-validation can help in choosing the optimal k for a given dataset
ChatGPT
1. Generate python code that demonstrate k-
NN classification technique
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn import metrics

# Load the Iris dataset


iris = load_iris()
X = iris.data # Features
y = iris.target # Target variable

# Split the dataset into training and testing sets


X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
random_state=42)

# Initialize the k-NN classifier


k = 3 # Define the number of neighbors
knn = KNeighborsClassifier(n_neighbors=k)

# Train the classifier using the training data


knn.fit(X_train, y_train)

# Predict the labels for the test set


y_pred = knn.predict(X_test)
# Calculate accuracy
accuracy = metrics.accuracy_score(y_test, y_pred)
print("Accuracy:", accuracy)

# Predict a new sample


new_sample = [[5.1, 3.5, 1.4, 0.2]] # Example new sample (you can change this)
predicted_class = knn.predict(new_sample)
print("Predicted class for the new sample:", iris.target_names[predicted_class[0]])
ChatGPT
1. Generate python code that demonstrate
k-NN classification technique
2. Can we also plot the points to
demonstrate the algorithm?
K-NN Classifier with Decision Boundaries
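The decision-boundary plot referred to here is not reproduced as code in the slides; a minimal sketch in the same style as the earlier Naive Bayes plot (first two Iris features only, and k=3 as in the code above) might look like this:

import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.neighbors import KNeighborsClassifier

iris = load_iris()
X, y = iris.data[:, :2], iris.target   # first two features for a 2-D plot

knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)

# Evaluate the classifier on a grid to draw the decision regions
x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
xx, yy = np.meshgrid(np.arange(x_min, x_max, 0.02), np.arange(y_min, y_max, 0.02))
Z = knn.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

plt.contourf(xx, yy, Z, alpha=0.4)
plt.scatter(X[:, 0], X[:, 1], c=y, edgecolor='k')
plt.xlabel('Sepal length')
plt.ylabel('Sepal width')
plt.title('k-NN Decision Boundaries (k=3)')
plt.show()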
1. Dr Andrew Ng: AI and Future of Work
2. Examples of Naïve Bayes Classifier
3. Supervised Learning: Class of Algorithms
4. k-Nearest Neighbours (k-NN) Classifier
5. Supervised Learning: Training, Validation and Testing
6. Image Classification and Neural Networks
7. Deep Learning: Convolutional Neural Networks (CNN)

5
Training, Validation and Testing
Supervised Learning
Cross Validation
K-Fold Cross Validation
• Simply splitting the data into three parts (Train, Validation and Test sets) does not work well for small datasets
• Instead, the data is split into k folds (sets)
• Every fold gets a chance to appear in the training set (k-1 times)
• k is generally from 5 to 10
• If k is large (say k = n, the number of observations), the approach is called Leave One Out CV (LOOCV)
A minimal sketch follows below.
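As mentioned above, a minimal sketch of k-fold cross-validation with scikit-learn, reusing the Iris data and the k-NN classifier from earlier (cv=5 is just an example choice):

from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# 5-fold cross-validation: each fold serves once as the validation set
scores = cross_val_score(KNeighborsClassifier(n_neighbors=3), X, y, cv=5)
print("Fold accuracies:", scores)
print("Average accuracy:", scores.mean())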
Final Model Selection
Average Performance: The model's performance
across all k folds is assessed
•Compute average performance metrics
•Accuracy
•Precision
• Recall, etc.
• The model with the best average performance is chosen
Final Model Selection
Generalization Ability
• A model that demonstrates good performance on the validation sets across all k folds, and doesn't show significant variance in performance between different folds, is preferred
• This indicates better generalization
Final Model Selection
1.Average Performance
2.Generalization Ability
3.Model Complexity
4.Domain Expertise and Insights
5.Computational Resources and Constraints
Stratified k-Fold

Ensure relative class frequencies in each fold reflect the relative class frequencies in the whole dataset
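A short sketch of how stratified folds can be requested in scikit-learn (so each fold keeps roughly the class balance of the full dataset):

import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import StratifiedKFold

X, y = load_iris(return_X_y=True)

skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for fold, (train_idx, val_idx) in enumerate(skf.split(X, y)):
    # each validation fold mirrors the equal class frequencies of the Iris data
    print(f"Fold {fold}: validation class counts =", np.bincount(y[val_idx]))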
ChatGPT
1. Generate python code that demonstrate
k-NN classification technique
2. can we also plot the points to
demonstrate the algorithm
3. Can we generate python code that finds
the optimal value of k using cross
validation?
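The code for the third prompt is not reproduced in the slides; a minimal sketch of one possible answer (scanning odd values of k with 5-fold cross-validation; the range 1-29 is an arbitrary choice):

import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Evaluate odd k from 1 to 29 with 5-fold cross-validation
k_values = list(range(1, 30, 2))
mean_scores = [cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=5).mean()
               for k in k_values]

best_k = k_values[int(np.argmax(mean_scores))]
print("Best k:", best_k, "with mean accuracy:", max(mean_scores))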
1. Dr Andrew Ng: AI and Future of Work
2. Examples of Naïve Bayes Classifier
3. Supervised Learning: Class of Algorithms
4. k-Nearest Neighbours (k-NN) Classifier
5. Supervised Learning: Training, Validation and Testing
6. Image Classification and Neural Networks
7. Deep Learning: Convolutional Neural Networks (CNN)

6
Image Classification and Neural Networks: Historical Perspective
Supervised Learning
PASCAL Visual Object Challenge

https://www.research.ed.ac.uk/en/publications/the-pascal-visual-object-classes-voc-challenge
Narrow AI
Convolutional Neural Networks (CNN) have
become an important tool for object recognition
Gradient-based learning applied to document recognition
https://ieeexplore.ieee.org/document/726791
1. Dr Andrew Ng: AI and Future of Work
2. Examples of Naïve Bayes Classifier
3. Supervised Learning: Class of Algorithms
4. k-Nearest Neighbours (k-NN) Classifier
5. Supervised Learning: Training, Validation and Testing
6. Image Classification and Neural Networks
7. Deep Learning: Convolutional Neural Networks (CNN)

6
Image Classification and Neural Networks: Challenges
Supervised Learning
Image Classification
Challenges: Viewpoint Variation
Challenges: Illumination
Challenges: Occlusion
Challenges: Deformation
Challenges: Background Clutter
Challenges: Intraclass Variation
https://www.kaggle.com/c/cifar-10/

id,label
1,cat
2,cat
3,cat
4,cat
ChatGPT Prompt

•generate python code that randomly selects


10 images from cifar10 data set and display
them
ChatGPT Prompt

•generate python code that randomly selects


10 images from cifar10 data set and display
them
•can we also print their labels
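The code ChatGPT produced for these prompts appears only as screenshots in the original slides; a minimal equivalent sketch using Keras (the class-name list below is the standard CIFAR-10 ordering):

import numpy as np
import matplotlib.pyplot as plt
from tensorflow.keras.datasets import cifar10

# Load CIFAR-10 and pick 10 random training images
(x_train, y_train), _ = cifar10.load_data()
class_names = ['airplane', 'automobile', 'bird', 'cat', 'deer',
               'dog', 'frog', 'horse', 'ship', 'truck']

idx = np.random.choice(len(x_train), 10, replace=False)

plt.figure(figsize=(12, 3))
for i, j in enumerate(idx):
    plt.subplot(2, 5, i + 1)
    plt.imshow(x_train[j])
    plt.title(class_names[int(y_train[j][0])])   # print the label with each image
    plt.axis('off')
plt.show()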
1. Dr Andrew Ng: AI and Future of Work
2. Examples of Naïve Bayes Classifier
3. Supervised Learning: Class of Algorithms
4. k-Nearest Neighbours (k-NN) Classifier
5. Supervised Learning: Training, Validation and Testing
6. Image Classification and Neural Networks
7. Deep Learning: Convolutional Neural Networks (CNN)

7
Image Classification and Neural Networks: Parametric Approach
Supervised Learning
ChatGPT
•can we generate python code that uses one
of the best developed model for classifying
CIFAR-10 images

(CIFAR: Canadian Institute For Advanced Research)


ResNet50
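The ResNet50 code itself is not reproduced in the slides; a minimal transfer-learning sketch with Keras (the frozen backbone, single epoch and batch size are placeholder choices, not tuned settings from the lecture):

from tensorflow.keras import layers, models
from tensorflow.keras.applications.resnet50 import ResNet50, preprocess_input
from tensorflow.keras.datasets import cifar10

(x_train, y_train), (x_test, y_test) = cifar10.load_data()
x_train = preprocess_input(x_train.astype('float32'))
x_test = preprocess_input(x_test.astype('float32'))

# ResNet50 backbone pretrained on ImageNet, without its classification head
base = ResNet50(weights='imagenet', include_top=False, input_shape=(32, 32, 3))
base.trainable = False   # freeze the pretrained weights

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(10, activation='softmax')   # 10 CIFAR-10 classes
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

model.fit(x_train, y_train, epochs=1, batch_size=128,
          validation_data=(x_test, y_test))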
Parametric Approach
Parametric Approach: Linear Classifier
Parametric Approach
Linear Classifier
For a 32×32×3 input image flattened into a 3072×1 vector x, the weight matrix W is 10×3072, giving 10 class scores.
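In code, the linear score function f(x, W) = Wx + b for this setup is just a matrix-vector product (random numbers below serve only to illustrate the dimensions):

import numpy as np

x = np.random.rand(3072)        # a 32x32x3 image flattened into a 3072-dimensional vector
W = np.random.randn(10, 3072)   # one row of weights per class
b = np.random.randn(10)         # one bias per class

scores = W @ x + b              # 10 class scores
print(scores.shape)             # (10,)
print("Predicted class:", int(np.argmax(scores)))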
Example
Linear Classifier
Fully Connected Layer
Hard Cases for Linear Classifier
The first drawing of brain cells by Santiago Ramón y Cajal in 1899
Schematic version of neuron
Perceptron Forward Propagation
Perceptron Forward Propagation
