
Intro to Data Mining

Chapter 3, exercises in Section 3.11.

5. Consider the following data set for a binary class problem.

   A  B  Class Label
   T  F  +
   T  T  +
   T  T  +
   T  F  −
   T  T  +
   F  F  −
   F  F  −
   F  F  −
   T  T  −
   T  F  −

   a. Calculate the information gain when splitting on A and B. Which attribute would the decision tree induction algorithm choose?
   b. Calculate the gain in the Gini index when splitting on A and B. Which attribute would the decision tree induction algorithm choose?
   c. Figure 3.11 shows that entropy and the Gini index are both monotonically increasing on the range [0, 0.5] and they are both monotonically decreasing on the range [0.5, 1]. Is it possible that information gain and the gain in the Gini index favor different attributes? Explain.

7. Consider the following set of training examples.

   X  Y  Z  No. of Class C1 Examples  No. of Class C2 Examples
   0  0  0   5                        40
   0  0  1   0                        15
   0  1  0  10                         5
   0  1  1  45                         0
   1  0  0  10                         5
   1  0  1  25                         0
   1  1  0   5                        20
   1  1  1   0                        15

   a. Compute a two-level decision tree using the greedy approach described in this chapter. Use the classification error rate as the criterion for splitting. What is the overall error rate of the induced tree?
   b. Repeat part (a) using X as the first splitting attribute and then choose the best remaining attribute for splitting at each of the two successor nodes. What is the error rate of the induced tree?
   c. Compare the results of parts (a) and (b). Comment on the suitability of the greedy heuristic used for splitting attribute selection.

8. The following table summarizes a data set with three attributes A, B, C and two class labels +, −. Build a two-level decision tree.

   A  B  C  No. of + Instances  No. of − Instances
   T  T  T   5                   0
   F  T  T   0                  20
   T  F  T  20                   0
   F  F  T   0                   5
   T  T  F   0                   0
   F  T  F  25                   0
   T  F  F   0                   0
   F  F  F   0                  25

   a. According to the classification error rate, which attribute would be chosen as the first splitting attribute? For each attribute, show the contingency table and the gains in classification error rate.
   b. Repeat for the two children of the root node.
   c. How many instances are misclassified by the resulting decision tree?
   d. Repeat parts (a), (b), and (c) using C as the splitting attribute.
   e. Use the results in parts (c) and (d) to conclude about the greedy nature of the decision tree induction algorithm.

Python sketches for working through exercises 5, 7, and 8 follow below.
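For exercise 5, a minimal Python sketch of the impurity-gain computations. The data list is transcribed from the table above; the helper names (entropy, gini, gain) are illustrative, not from the textbook.

    from math import log2

    # Exercise 5 records as (A, B, class) tuples.
    data = [
        ('T', 'F', '+'), ('T', 'T', '+'), ('T', 'T', '+'),
        ('T', 'F', '-'), ('T', 'T', '+'), ('F', 'F', '-'),
        ('F', 'F', '-'), ('F', 'F', '-'), ('T', 'T', '-'),
        ('T', 'F', '-'),
    ]

    def entropy(labels):
        # H = -sum(p * log2(p)) over the class proportions.
        n = len(labels)
        return -sum((labels.count(c) / n) * log2(labels.count(c) / n)
                    for c in set(labels))

    def gini(labels):
        # G = 1 - sum(p^2) over the class proportions.
        n = len(labels)
        return 1 - sum((labels.count(c) / n) ** 2 for c in set(labels))

    def gain(rows, attr, impurity):
        # Parent impurity minus the size-weighted impurity of the children.
        parent = impurity([r[-1] for r in rows])
        children = 0.0
        for v in set(r[attr] for r in rows):
            child = [r[-1] for r in rows if r[attr] == v]
            children += len(child) / len(rows) * impurity(child)
        return parent - children

    for attr, name in [(0, 'A'), (1, 'B')]:
        print(name, 'information gain =', round(gain(data, attr, entropy), 4),
              '| Gini gain =', round(gain(data, attr, gini), 4))

Running it shows that the two criteria need not favor the same attribute on this data, which is exactly the situation part (c) asks about.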

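For exercise 7, a sketch of the greedy two-level induction with classification error as the splitting criterion. The counts are transcribed from the table above; function names such as leaf_errors and split_errors are illustrative.

    from collections import Counter

    # Exercise 7 counts: (X, Y, Z) -> (number of C1, number of C2).
    counts = {
        (0, 0, 0): (5, 40), (0, 0, 1): (0, 15),
        (0, 1, 0): (10, 5), (0, 1, 1): (45, 0),
        (1, 0, 0): (10, 5), (1, 0, 1): (25, 0),
        (1, 1, 0): (5, 20), (1, 1, 1): (0, 15),
    }

    # Expand the summary counts into individual (x, y, z, class) records.
    records = []
    for combo, (c1, c2) in counts.items():
        records += [combo + ('C1',)] * c1 + [combo + ('C2',)] * c2

    def leaf_errors(rows):
        # A leaf predicts its majority class; everything else is an error.
        if not rows:
            return 0
        tally = Counter(r[-1] for r in rows)
        return len(rows) - max(tally.values())

    def split_errors(rows, attr):
        # Total errors over both children of a binary split on attr.
        return sum(leaf_errors([r for r in rows if r[attr] == v]) for v in (0, 1))

    NAMES = {0: 'X', 1: 'Y', 2: 'Z'}

    # Level 1: greedy choice of the root attribute.
    root = min(NAMES, key=lambda a: split_errors(records, a))
    total = 0
    for v in (0, 1):
        child = [r for r in records if r[root] == v]
        # Level 2: best remaining attribute at this child.
        best = min((a for a in NAMES if a != root),
                   key=lambda a: split_errors(child, a))
        err = split_errors(child, best)
        total += err
        print(f'{NAMES[root]}={v}: split child on {NAMES[best]}, errors={err}')
    print('overall error rate =', total / len(records))

Setting root = 0 (attribute X) by hand instead of taking the greedy choice reproduces part (b), and comparing the two overall error rates answers part (c).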
 
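For exercise 8, a sketch that prints each attribute's root-level contingency table and its gain in classification error. It uses the same majority-class error count as above; node_errors is an illustrative name.

    from collections import Counter

    # Exercise 8 counts: (A, B, C) -> (number of +, number of -).
    table = {
        ('T', 'T', 'T'): (5, 0),  ('F', 'T', 'T'): (0, 20),
        ('T', 'F', 'T'): (20, 0), ('F', 'F', 'T'): (0, 5),
        ('T', 'T', 'F'): (0, 0),  ('F', 'T', 'F'): (25, 0),
        ('T', 'F', 'F'): (0, 0),  ('F', 'F', 'F'): (0, 25),
    }

    # Expand the summary counts into individual (a, b, c, class) records.
    records = []
    for combo, (pos, neg) in table.items():
        records += [combo + ('+',)] * pos + [combo + ('-',)] * neg

    def node_errors(rows):
        # Errors made when a node predicts its majority class.
        if not rows:
            return 0
        tally = Counter(r[-1] for r in rows)
        return len(rows) - max(tally.values())

    # Part (a): contingency table and error-rate gain for each candidate root.
    parent = node_errors(records)
    for attr, name in enumerate('ABC'):
        print(f'Split on {name}:')
        child_errors = 0
        for v in ('T', 'F'):
            child = [r for r in records if r[attr] == v]
            tally = Counter(r[-1] for r in child)
            child_errors += node_errors(child)
            print(f'  {name}={v}: {tally["+"]} positive, {tally["-"]} negative')
        print(f'  gain in classification error = {parent - child_errors}')

Repeating the same loop on each child of the chosen root covers part (b), and fixing the root split to C instead of the greedy winner covers part (d).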
"Looking for a Similar Assignment? Order now and Get 10% Discount! Use Code "GET10" in your order"

If this is not the paper you were searching for, you can order your 100% plagiarism free, professional written paper now!

Order Now Just Browsing

All of our assignments are originally produced, unique, and free of plagiarism.

Free Revisions Plagiarism Free 24x7 Support