CONCEPT LEARNING
SWAPNA.C
Asst. Prof.
IT DEPT.
Sri Devi Women’s Engineering College
CONCEPT LEARNING
• CONCEPT LEARNING:
• Inferring a Boolean-valued function from training examples of its input and output.
• A CONCEPT LEARNING TASK:
• Let each hypothesis be a vector of six constraints, specifying the values of the six attributes Sky, AirTemp, Humidity, Wind, Water, and Forecast. For each attribute, the hypothesis will either indicate by a "?" that any value is acceptable for this attribute, specify a single required value (e.g., Warm) for the attribute, or indicate by a "0" that no value is acceptable.
• The most general hypothesis (that every day is a positive example) is represented by
• (?, ?, ?, ?, ?, ?)
• and the most specific possible hypothesis (that no day is a positive example) is represented by
• (0, 0, 0, 0, 0, 0)
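To make this representation concrete, here is a minimal Python sketch; the tuple encoding below is our illustrative assumption, not something given on the slides.

# Each hypothesis is a tuple of six constraints, in attribute order:
# (Sky, AirTemp, Humidity, Wind, Water, Forecast).
#   '?'  -> any value is acceptable for this attribute
#   '0'  -> no value is acceptable
#   else -> the single required value, e.g. 'Warm'
h_most_general  = ('?', '?', '?', '?', '?', '?')   # every day is positive
h_most_specific = ('0', '0', '0', '0', '0', '0')   # no day is positive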
• If some instance x satisfies all the constraints
of hypothesis h, then h classifies x as a
positive example (h(x) = 1).
• To illustrate, the hypothesis that Aldo enjoys
his favourite sport only on cold days with high
humidity (independent of the values of the
other attributes) is represented by the
expression
• (?, Cold, High, ?, ?, ?)
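A sketch of this classification rule, reusing the tuple encoding above (the function name satisfies is ours):

def satisfies(x, h):
    """Return 1 if instance x meets every constraint of hypothesis h, else 0."""
    return int(all(c == '?' or c == v for c, v in zip(h, x)))

h = ('?', 'Cold', 'High', '?', '?', '?')                   # Aldo's cold, humid days
x = ('Sunny', 'Cold', 'High', 'Strong', 'Cool', 'Change')
print(satisfies(x, h))                                     # 1 -> h classifies x as positive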
• NOTATION:
• The set of items over which the concept is defined is called the set
of instances, which we denote by X.
• The concept or function to be learned is called the target concept, which we denote by c.
• In general, c can be any Boolean-valued function defined over the instances X; that is, c : X → {0, 1}. In the current example, the target concept corresponds to the value of the attribute EnjoySport (i.e., c(x) = 1 if EnjoySport = Yes, and c(x) = 0 if EnjoySport = No).
• Given:
Instances X: Possible days, each described by the attributes
Sky (with possible values Sunny, Cloudy, and Rainy),
AirTemp (with values Warm and Cold),
Humidity (with values Normal and High),
Wind (with values Strong and Weak),
Water (with values Warm and Cool), and
Forecast (with values Same and Change).
• Hypotheses H: Each hypothesis is described by a conjunction of constraints on the attributes Sky, AirTemp, Humidity, Wind, Water, and Forecast. Each constraint may be "?" (any value is acceptable), "0" (no value is acceptable), or a specific required value.
• Target concept c: EnjoySport : X → {0, 1}
• Training examples D: Positive and negative
examples of the target function.
• Determine:
• A hypothesis h in H such that h(x) = c(x) for all x
in X.
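For the sketches that follow, the four EnjoySport training examples (Mitchell's Table 2.1, which the FIND-S trace later in this deck follows) can be encoded as (x, c(x)) pairs:

# D: (instance, c(x)) pairs; 1 = positive (EnjoySport = Yes), 0 = negative.
D = [
    (('Sunny', 'Warm', 'Normal', 'Strong', 'Warm', 'Same'),   1),
    (('Sunny', 'Warm', 'High',   'Strong', 'Warm', 'Same'),   1),
    (('Rainy', 'Cold', 'High',   'Strong', 'Warm', 'Change'), 0),
    (('Sunny', 'Warm', 'High',   'Strong', 'Cool', 'Change'), 1),
]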
• When learning the target concept, the learner is presented a set of training examples, each consisting of an instance x from X along with its target concept value c(x). Instances for which c(x) = 1 are called positive examples, or members of the target concept. Instances for which c(x) = 0 are called negative examples, or nonmembers of the target concept. We will often write the ordered pair (x, c(x)) to describe the training example consisting of the instance x and its target concept value c(x).
• We use the symbol H to denote the set of all possible hypotheses that the learner may consider regarding the identity of the target concept. In general, each hypothesis h in H represents a Boolean-valued function defined over X; that is, h : X → {0, 1}.
• DETERMINE:
• The goal of the learner is to find a hypothesis h such that h(x) = c(x) for all x in X.
The Inductive Learning
Hypothesis
• The inductive learning hypothesis. Any
hypothesis found to approximate the target
function well over a sufficiently large set of
training examples will also approximate the target
function well over other unobserved examples.
• Our assumption is that the best hypothesis regarding unseen instances is the hypothesis that best fits the observed training data. This is the fundamental assumption of inductive learning.
CONCEPT LEARNING AS SEARCH
• The goal of this search is to find the hypothesis that best fits
the training examples.
• Consider, for example, the instances X and hypotheses H in the EnjoySport learning task. Given that the attribute Sky has three possible values, and that AirTemp, Humidity, Wind, Water, and Forecast each have two possible values, the instance space X contains exactly
• 3 · 2 · 2 · 2 · 2 · 2 = 96 distinct instances. A similar calculation shows that there are 5 · 4 · 4 · 4 · 4 · 4 = 5120 syntactically distinct hypotheses within H. Notice, however, that every hypothesis containing one or more "0" symbols represents the empty set of instances; that is, it classifies every instance as negative. Therefore, the number of semantically distinct hypotheses is only 1 + (4 · 3 · 3 · 3 · 3 · 3) = 973.
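These counts can be checked with a few lines of Python; the values dictionary simply restates the attribute domains listed earlier.

from itertools import product

values = {
    'Sky':      ['Sunny', 'Cloudy', 'Rainy'],
    'AirTemp':  ['Warm', 'Cold'],
    'Humidity': ['Normal', 'High'],
    'Wind':     ['Strong', 'Weak'],
    'Water':    ['Warm', 'Cool'],
    'Forecast': ['Same', 'Change'],
}

instances = list(product(*values.values()))
print(len(instances))                      # 96 distinct instances

syntactic = 1
for vals in values.values():
    syntactic *= len(vals) + 2             # each value, plus '?' and '0'
print(syntactic)                           # 5120 syntactically distinct hypotheses

semantic = 1
for vals in values.values():
    semantic *= len(vals) + 1              # each value, plus '?' (no '0')
print(semantic + 1)                        # 973 = 972 + the one empty hypothesis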
General-to-Specific Ordering of
Hypotheses
• Definition: Let hj and hk be Boolean-valued functions defined over X. Then hj is more-general-than-or-equal-to hk (written hj ≥g hk) if and only if
• (∀x ∈ X) [(hk(x) = 1) → (hj(x) = 1)]
• To illustrate the general-to-specific ordering, consider the two
hypotheses
• h1 = (Sunny, ?, ?, Strong, ?, ?) h2 = (Sunny, ?, ?, ?, ?, ?)
• Now consider the sets of instances that are classified positive by h1 and by h2. Because h2 imposes fewer constraints on the instance, it classifies more instances as positive. In fact, any instance classified positive by h1 will also be classified positive by h2. Therefore, we say that h2 is more general than h1.
• First, for any instance x in X and hypothesis h in H, we say that x satisfies h if and only if h(x) = 1. We now define the more-general-than-or-equal-to relation in terms of the sets of instances that satisfy the two hypotheses: given hypotheses hj and hk, hj is more-general-than-or-equal-to hk if and only if any instance that satisfies hk also satisfies hj.
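A sketch of this ordering as a predicate over the tuple encoding (the helper name is ours; the test is exact for this conjunctive H):

def more_general_or_equal(hj, hk):
    """True iff hj >=_g hk, i.e. every instance satisfying hk also satisfies hj."""
    if any(c == '0' for c in hk):          # hk matches no instance at all,
        return True                        # so the implication holds vacuously
    return all(cj == '?' or cj == ck for cj, ck in zip(hj, hk))

h1 = ('Sunny', '?', '?', 'Strong', '?', '?')
h2 = ('Sunny', '?', '?', '?', '?', '?')
print(more_general_or_equal(h2, h1))       # True: h2 is more general than h1
print(more_general_or_equal(h1, h2))       # False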
FIND-S ALGORITHM :FINDING A
MAXIMALLY SPECIFIC HYPOTHESIS
• General hypothesis: G = ('?', '?', ..., '?'), with one '?' per attribute
• Specific hypothesis: S = ('0', '0', ..., '0'), with one '0' per attribute
• Most specific hypothesis: h
• FIND-S considers only the positive (+ve) examples or instances
FIND-S ALGORITHM :FINDING A MAXIMALLY
SPECIFIC HYPOTHESIS
1. Initialize h to the most specific hypothesis in H
2. For each positive training instance x
       For each attribute constraint ai in h
           If the constraint ai is satisfied by x
           Then do nothing
           Else replace ai in h by the next more general constraint that is satisfied by x
3. Output hypothesis h
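A direct Python rendering of these three steps, under the tuple encoding used above (a sketch, not the deck's own code):

def find_s(examples, n_attrs=6):
    h = ['0'] * n_attrs                        # 1. most specific hypothesis
    for x, label in examples:
        if label != 1:                         # FIND-S ignores negative examples
            continue
        for i in range(n_attrs):
            if h[i] == '0':                    # next more general constraint:
                h[i] = x[i]                    #    the observed value itself
            elif h[i] != '?' and h[i] != x[i]: # constraint not satisfied by x:
                h[i] = '?'                     #    generalize to '?'
    return tuple(h)                            # 3. output hypothesis h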
• h0 ← (0, 0, 0, 0, 0, 0)
• The first step of FIND-S is to initialize h to the most specific hypothesis in H. Observing the first positive training example then generalizes h to
• h1 ← (Sunny, Warm, Normal, Strong, Warm, Same)
• This h is still very specific; it asserts that all instances are negative except for the single positive training example we have observed.
• Next, the second training example (also positive in this case) forces the algorithm to further generalize h, this time substituting a "?" in place of any attribute value in h that is not satisfied by the new example. The refined hypothesis in this case is
• h2 ← (Sunny, Warm, ?, Strong, Warm, Same)
• The current hypothesis h is the most specific hypothesis in H consistent with the observed positive examples. Because the target concept c is also assumed to be in H and to be consistent with the positive training examples, c must be more-general-than-or-equal-to h.
• Upon encountering the third training example (in this case a negative example), the algorithm makes no change to h. In fact, the FIND-S algorithm simply ignores every negative example.
h3 ← (Sunny, Warm, ?, Strong, Warm, Same)
• To complete our trace of FIND-S, the fourth (positive) example leads to a further generalization of h:
h4 ← (Sunny, Warm, ?, Strong, ?, ?)
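Running the find_s sketch on the training data D encoded earlier reproduces the final hypothesis of this trace:

print(find_s(D))   # ('Sunny', 'Warm', '?', 'Strong', '?', '?')  -- h4 above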
• The key property of the FIND-S algorithm is
that for hypothesis spaces described by
conjunctions of attribute constraints (such as H
for the EnjoySport task), FIND-S is
guaranteed to output the most specific
hypothesis within H that is consistent with the
positive training examples.
• Its final hypothesis will also be consistent with
the negative examples provided the correct
target concept is contained in H, and provided
the training examples are correct.
Limitations of FIND-S
• Has the learner converged to the correct target concept?
Although FIND-S will find a hypothesis consistent with the
training data, it has no way to determine whether it has found
the only hypothesis in H consistent with the data (i.e., the
correct target concept), or whether there are many other
consistent hypotheses as well.
• Why prefer the most specific hypothesis?
• Are the training examples consistent? In most practical
learning problems there is some chance that the training
examples will contain at least some errors or noise. Such
inconsistent sets of training examples can severely mislead
FIND-S, given that it ignores negative examples.
• What if there are several maximally specific consistent
hypotheses?
VERSION SPACES AND THE CANDIDATE-
ELIMINATION ALGORITHM
• The CANDIDATE-ELIMINATION algorithm addresses several of the limitations of FIND-S.
• The key idea in the CANDIDATE-ELIMINATION
algorithm is to output a description of the set of
all hypotheses consistent with the training
examples.
• The CANDIDATE-ELIMINATION algorithm provides
a useful conceptual framework for introducing
several fundamental issues in machine learning.
• The CANDIDATE-ELIMINATION algorithm has
been applied to problems such as learning
regularities in chemical mass spectroscopy
(Mitchell 1979) and learning control rules for
heuristic search (Mitchell et al. 1983).
• The practical applications of the CANDIDATE-
ELIMINATION and FIND-S algorithms are
limited by the fact that they both perform
poorly when given noisy training data.
• Representation:
• Definition: A hypothesis h is consistent with a set of training examples D if and only if h(x) = c(x) for each example (x, c(x)) in D.
• Note the key difference between this definition of consistent and our earlier definition of satisfies: an example x is said to satisfy hypothesis h when h(x) = 1, regardless of whether x is a positive or negative example of the target concept. However, whether such an example is consistent with h depends on the target concept, and in particular, on whether h(x) = c(x).
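The distinction is easy to state in code, reusing satisfies() and the (x, c(x)) pairs from the earlier sketches (the name consistent simply mirrors the definition):

def consistent(h, D):
    """h is consistent with D iff h(x) = c(x) for every (x, c(x)) in D."""
    return all(satisfies(x, h) == cx for x, cx in D)

# satisfies() asks only whether h(x) = 1; consistent() also checks the label c(x).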
• The CANDIDATE-ELIMINATION algorithm represents the
set of all hypotheses consistent with the observed training
examples. This subset of all hypotheses is called the version
space with respect to the hypothesis space H and the training
examples D, because it contains all plausible versions of the
target concept.
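For a hypothesis space this small, the version space can be computed by brute force in the LIST-THEN-ELIMINATE style, reusing values, D, and consistent() from the sketches above. With the four EnjoySport examples, six hypotheses survive.

from itertools import product

def version_space(D, values):
    # Enumerate the 973 semantically distinct hypotheses: the single
    # empty hypothesis plus every '0'-free combination of constraints.
    candidates = [('0',) * len(values)]
    candidates += list(product(*[['?'] + vals for vals in values.values()]))
    return [h for h in candidates if consistent(h, D)]

for h in version_space(D, values):
    print(h)   # six hypotheses, from (Sunny, Warm, ?, Strong, ?, ?)
               # up to (Sunny, ?, ?, ?, ?, ?) and (?, Warm, ?, ?, ?, ?)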
(The following slides, which present the CANDIDATE-ELIMINATION algorithm and trace it on the positive (+VE) and negative (-VE) training examples, consist of figures that are not reproduced here.)
REMARKS ON VERSION SPACES AND
CANDIDATE-ELIMINATION
• Will the CANDIDATE-ELIMINATION Algorithm Converge
to the Correct Hypothesis?
• The version space learned by the CANDIDATE-ELIMINATION algorithm will converge toward the hypothesis that correctly describes the target concept, provided (1) there are no errors in the training examples, and (2) there is some hypothesis in H that correctly describes the target concept.
• What Training Example Should the Learner Request Next?
• How Can Partially Learned Concepts Be Used?