CE802_Lec_IBL_add_slides

The document discusses instance-based learning, focusing on assumptions, inductive bias, and similarity functions. It emphasizes the importance of encoding and distance measures in classification tasks, particularly in the context of nearest neighbour algorithms (1-NN and k-NN). The role of the parameter k in determining the effectiveness of the model is also highlighted.
Instance-based learning Assumptions and inductive bias

Smooth relative to the sampling of feature space

[Figure: a two-dimensional feature space with axes x1 and x2, illustrating smoothness relative to the sampling of the space.]

CE802 (CSEE) Instance-based learning 5 / 24
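The slides give no code here, but the following minimal sketch (in Python, with made-up toy points; not part of the original slides) shows why the assumption matters: a 1-NN classifier can only work if instances that are close in the (x1, x2) space tend to share a label, i.e. if the target concept is smooth relative to how densely the space is sampled.

# Minimal sketch (assumption: not from the original slides) of a 1-NN
# classifier over a two-dimensional feature space (x1, x2).
import numpy as np

def predict_1nn(X_train, y_train, x_query):
    """Return the label of the stored instance closest to x_query (Euclidean distance)."""
    distances = np.linalg.norm(X_train - x_query, axis=1)
    return y_train[np.argmin(distances)]

# Toy training data (made up for illustration): two loose clusters in (x1, x2).
X_train = np.array([[0.1, 0.2], [0.3, 0.1], [0.9, 0.8], [0.8, 1.0]])
y_train = np.array(["A", "A", "B", "B"])

print(predict_1nn(X_train, y_train, np.array([0.2, 0.15])))  # -> A
print(predict_1nn(X_train, y_train, np.array([0.85, 0.9])))  # -> B

The denser the training sample, the finer the regions in which the nearest neighbour (and hence the prediction) changes, which is why the smoothness assumption is stated relative to the sampling of the feature space.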
Instance-based learning Similarity functions
Choosing the right encoding/distance
Imagine we are designing an autopilot able to fly a broken plane.

We use a simple encoding:
  - 1 binary attribute: "left engine broken"
  - 1 binary attribute: "right engine broken"
and a simple distance: the number of attributes that are different (sketched below).
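As a rough illustration (not from the slides; the instance names and values are made up), here is what this encoding and distance look like in Python:

# Minimal sketch (assumption: not part of the original slides). Each plane
# state is encoded by two binary attributes,
# (left_engine_broken, right_engine_broken); the distance between two encoded
# states is the number of attributes on which they differ.

def num_different_attributes(a, b):
    """Count the attributes that are different between two encoded instances."""
    return sum(ai != bi for ai, bi in zip(a, b))

left_broken  = (1, 0)   # left engine broken
right_broken = (0, 1)   # right engine broken
both_broken  = (1, 1)   # both engines broken
none_broken  = (0, 0)   # nothing broken

print(num_different_attributes(both_broken, left_broken))   # 1
print(num_different_attributes(both_broken, right_broken))  # 1
print(num_different_attributes(left_broken, right_broken))  # 2
print(num_different_attributes(both_broken, none_broken))   # 2

Note that under this encoding "both engines broken" is at distance 1 from each single-engine failure, while the two single-engine failures are at distance 2 from each other; whether that ordering of similarities is the right one for the autopilot task is exactly the question of choosing the right encoding/distance.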