Problem 1: What is the information entropy of the following multisets? a) {1,2}, b) {1,2,3}, c) {1,2,3,4}, d) {1,1,2,2}, e) {1,1,2,3}
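As a hint for working these by hand, the entropy of a multiset can be computed from the relative frequencies of its elements. The following is a minimal Python sketch, assuming the usual definition H = -Σ p_i log2(p_i) measured in bits; the function name `multiset_entropy` is ours, not taken from the chapter:

```python
import math
from collections import Counter

def multiset_entropy(items):
    """Entropy (in bits) of the empirical distribution of a multiset."""
    counts = Counter(items)
    total = len(items)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A multiset with two equally frequent elements has entropy 1 bit:
print(multiset_entropy([1, 2]))  # 1.0
```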
Problem 2: What is the information entropy of the probability space induced by a biased coin that shows heads with a probability of 10% and tails with a probability of 90%?
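When the distribution is given directly as probabilities rather than as element counts, the same formula applies. A minimal sketch, again assuming entropy in bits and using our own hypothetical helper name `entropy_from_probs`:

```python
import math

def entropy_from_probs(probs):
    """Entropy (in bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Sanity check on a fair coin, which should give exactly 1 bit:
print(entropy_from_probs([0.5, 0.5]))  # 1.0
```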
Problem 3: Let's take another example of playing chess from Chapter 2, Naive Bayes:
a) What is the information gain for each of the non-classifying attributes in the table? (A computation sketch follows the table.)
b) What is the decision tree constructed from the given table?
c) How would you classify the data sample (Warm, Strong, Spring, ?) according to the constructed decision tree?
| Temperature | Wind   | Season | Play |
|-------------|--------|--------|------|
| Cold        | Strong | Winter | No   |
| Warm        | Strong | Autumn | No   |
| Warm        | None   | Summer | Yes  |
| Hot         | None   | Spring | No   |
| Hot         | Breeze | Autumn | Yes  |
| Warm        | Breeze | Spring | Yes  |
| Cold        | Breeze | Winter | No   |
| Cold        | None   | Spring | Yes  |
| Hot         | Strong | Summer | Yes  |
| Warm        | None   | Autumn | Yes  |
| Warm        | Strong | Spring | ?    |
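For part a), the information gain of an attribute is the entropy of the Play column minus the weighted entropy of Play after splitting the rows on that attribute. The sketch below is an assumed ID3-style computation over the table above, not the chapter's own code; the helper names are ours:

```python
import math
from collections import Counter

# Rows of the table as (Temperature, Wind, Season, Play), excluding the unclassified sample.
data = [
    ("Cold", "Strong", "Winter", "No"),
    ("Warm", "Strong", "Autumn", "No"),
    ("Warm", "None",   "Summer", "Yes"),
    ("Hot",  "None",   "Spring", "No"),
    ("Hot",  "Breeze", "Autumn", "Yes"),
    ("Warm", "Breeze", "Spring", "Yes"),
    ("Cold", "Breeze", "Winter", "No"),
    ("Cold", "None",   "Spring", "Yes"),
    ("Hot",  "Strong", "Summer", "Yes"),
    ("Warm", "None",   "Autumn", "Yes"),
]

def entropy(labels):
    """Entropy (in bits) of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in Counter(labels).values())

def information_gain(rows, attr_index, class_index=-1):
    """Class entropy minus the weighted entropy after splitting on one attribute."""
    base = entropy([r[class_index] for r in rows])
    remainder = 0.0
    for value in set(r[attr_index] for r in rows):
        subset = [r[class_index] for r in rows if r[attr_index] == value]
        remainder += len(subset) / len(rows) * entropy(subset)
    return base - remainder

for i, name in enumerate(["Temperature", "Wind", "Season"]):
    print(name, information_gain(data, i))
```

Running the script prints the gain for Temperature, Wind, and Season, which you can compare against your hand-computed values for part a); the attribute with the highest gain is the natural choice for the root of the tree in part b).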
Problem 4: Mary and temperature preferences: Let's take the example from Chapter 1, Classification Using K Nearest Neighbors, regarding...