- What is the information entropy of the following multisets? a) {1,2}, b) {1,2,3}, c) {1,2,3,4}, d) {1,1,2,2}, e) {1,1,2,3}
- What is the information entropy of the probability space induced by a biased coin that shows heads with probability 10% and tails with probability 90%?
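To check entropy answers by hand, a small helper like the following can compute the empirical Shannon entropy of a multiset (an illustrative sketch; the function name `entropy` is ours, not from the book):

```python
from collections import Counter
from math import log2

def entropy(values):
    """Shannon entropy (in bits) of a multiset, using empirical frequencies."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(entropy([1, 2]))        # 1.0 bit
print(entropy([1, 1, 2, 3]))  # 1.5 bits
# Biased coin: H = -(0.1*log2(0.1) + 0.9*log2(0.9)) ≈ 0.469 bits
print(-(0.1 * log2(0.1) + 0.9 * log2(0.9)))
```

The same formula covers both cases: a multiset induces a probability space through its relative frequencies, so the coin is just the distribution (0.1, 0.9).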
- Let us revisit the example of playing chess from Chapter 2, Naive Bayes:
- a) What is the information gain for each of the non-classifying attributes in the table?
- b) What is the decision tree constructed from the given table?
- c) How would you classify the data sample (warm, strong, spring, ?) according to the constructed decision tree?
| Temperature | Wind   | Season    | Play |
|-------------|--------|-----------|------|
| Cold        | Strong | Winter    | No   |
| Warm        | Strong | Autumn    | No   |
| Warm        | None   | Summer    | Yes  |
| Hot         | None   | Spring    | No   |
| Hot         | Breeze | Autumn    | Yes  |
| Warm        | Breeze | Spring    | Yes  |
| Cold        | Breeze | Winter... |      |
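The per-attribute information gains in part a) can be verified with a short sketch. This is our own illustration, not the book's code, and it uses only the six complete rows shown above (the last row of the table is truncated, so it is omitted here); the gains over the full table may differ:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def info_gain(rows, attr_index, label_index=-1):
    """Information gain of splitting `rows` on the attribute at `attr_index`."""
    labels = [r[label_index] for r in rows]
    base = entropy(labels)
    groups = Counter(r[attr_index] for r in rows)
    remainder = sum(
        (count / len(rows))
        * entropy([r[label_index] for r in rows if r[attr_index] == value])
        for value, count in groups.items()
    )
    return base - remainder

# Only the six complete rows from the table; the truncated last row is omitted.
rows = [
    ("Cold", "Strong", "Winter", "No"),
    ("Warm", "Strong", "Autumn", "No"),
    ("Warm", "None",   "Summer", "Yes"),
    ("Hot",  "None",   "Spring", "No"),
    ("Hot",  "Breeze", "Autumn", "Yes"),
    ("Warm", "Breeze", "Spring", "Yes"),
]
for i, name in enumerate(["Temperature", "Wind", "Season"]):
    print(f"{name}: {info_gain(rows, i):.4f}")
```

On these six rows, Wind has the highest gain (2/3 bit), which is why a decision tree built on this subset would split on Wind at the root; the ID3 algorithm repeats this gain calculation recursively within each branch.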