Consider Dataset of COVID-19 Infection and Implement ID3 Algorithm

The document describes an assignment to implement the ID3 algorithm on a COVID-19 dataset containing 14 records of symptoms and infection status. Students are tasked with calculating information gain of features, splitting the dataset based on maximum information gain, making decision tree nodes, and repeating until all features are used or all leaf nodes are created.


Assignment: 2

CS-466 Machine Learning

Date: 04-09-2020

Due Date: 06-09-2020

Marks: 05

Consider the following dataset of COVID-19 infection and implement the ID3 algorithm.
+----+-------+-------+------------------+----------+
| ID | Fever | Cough | Breathing issues | Infected |
+----+-------+-------+------------------+----------+
| 1 | NO | NO | NO | NO |
+----+-------+-------+------------------+----------+
| 2 | YES | YES | YES | YES |
+----+-------+-------+------------------+----------+
| 3 | YES | YES | NO | NO |
+----+-------+-------+------------------+----------+
| 4 | YES | NO | YES | YES |
+----+-------+-------+------------------+----------+
| 5 | YES | YES | YES | YES |
+----+-------+-------+------------------+----------+
| 6 | NO | YES | NO | NO |
+----+-------+-------+------------------+----------+
| 7 | YES | NO | YES | YES |
+----+-------+-------+------------------+----------+
| 8 | YES | NO | YES | YES |
+----+-------+-------+------------------+----------+
| 9 | NO | YES | YES | YES |
+----+-------+-------+------------------+----------+
| 10 | YES | YES | NO | YES |
+----+-------+-------+------------------+----------+
| 11 | NO | YES | NO | NO |
+----+-------+-------+------------------+----------+
| 12 | NO | YES | YES | NO |
+----+-------+-------+------------------+----------+
| 13 | NO | YES | YES | NO |
+----+-------+-------+------------------+----------+
| 14 | YES | YES | NO | NO |
+----+-------+-------+------------------+----------+
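As a quick sanity check before implementing: the table has 7 Infected = YES and 7 Infected = NO records out of 14, so the entropy of the full dataset should come out to exactly 1 bit.

```python
import math

# Class counts read off the table above: 7 YES, 7 NO out of 14 records.
p_yes, p_no = 7 / 14, 7 / 14
entropy_s = -(p_yes * math.log2(p_yes) + p_no * math.log2(p_no))
print(entropy_s)  # 1.0 (a perfectly balanced binary split)
```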

1. Calculate the Information Gain of each feature.

2. If all rows do not belong to the same class, split the dataset S into subsets using
the feature for which the Information Gain is maximum.

3. Make a decision tree node using the feature with the maximum Information Gain.

4. If all rows belong to the same class, make the current node a leaf node with the class as its
label.

5. Repeat for the remaining features until we run out of features, or every branch of the decision
tree ends in a leaf node.
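The steps above can be sketched in Python as follows. This is one possible solution outline, not the required one: the tuple encoding of the table, the function names, and the first-wins tie-break in `max` are all illustrative choices.

```python
import math
from collections import Counter

FEATURES = ["Fever", "Cough", "Breathing issues"]
RAW = [  # (Fever, Cough, Breathing issues, Infected), rows 1-14 of the table
    ("NO", "NO", "NO", "NO"),    ("YES", "YES", "YES", "YES"),
    ("YES", "YES", "NO", "NO"),  ("YES", "NO", "YES", "YES"),
    ("YES", "YES", "YES", "YES"), ("NO", "YES", "NO", "NO"),
    ("YES", "NO", "YES", "YES"), ("YES", "NO", "YES", "YES"),
    ("NO", "YES", "YES", "YES"), ("YES", "YES", "NO", "YES"),
    ("NO", "YES", "NO", "NO"),   ("NO", "YES", "YES", "NO"),
    ("NO", "YES", "YES", "NO"),  ("YES", "YES", "NO", "NO"),
]
DATA = [dict(zip(FEATURES + ["Infected"], row)) for row in RAW]

def entropy(rows, target="Infected"):
    """Shannon entropy of the class labels in `rows`."""
    counts = Counter(r[target] for r in rows)
    total = len(rows)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def info_gain(rows, feature, target="Infected"):
    """Step 1: entropy reduction from splitting `rows` on `feature`."""
    total = len(rows)
    remainder = 0.0
    for value in {r[feature] for r in rows}:
        subset = [r for r in rows if r[feature] == value]
        remainder += len(subset) / total * entropy(subset, target)
    return entropy(rows, target) - remainder

def id3(rows, features, target="Infected"):
    """Steps 2-5: recursively build the tree as nested dicts / leaf labels."""
    labels = {r[target] for r in rows}
    if len(labels) == 1:        # step 4: pure node becomes a leaf
        return labels.pop()
    if not features:            # out of features: fall back to majority vote
        return Counter(r[target] for r in rows).most_common(1)[0][0]
    # steps 2-3: pick the feature with maximum Information Gain
    best = max(features, key=lambda f: info_gain(rows, f, target))
    tree = {best: {}}
    for value in {r[best] for r in rows}:
        subset = [r for r in rows if r[best] == value]
        rest = [f for f in features if f != best]
        tree[best][value] = id3(subset, rest, target)  # step 5: recurse
    return tree

for f in FEATURES:
    print(f, round(info_gain(DATA, f), 3))  # Fever 0.258, Cough 0.075,
                                            # Breathing issues 0.258
print(id3(DATA, FEATURES))
```

Note that on this particular table Fever and Breathing issues tie for maximum Information Gain (both split the data 8/6 with identical class proportions), so either is a valid root; the sketch above simply takes whichever comes first in `FEATURES`.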
