Want to share your content on R-bloggers? click here if you have a blog, or here if you don't.
This is a continuation of the exercise set Basic Tree 1.
Answers to the exercises are available here.
If you obtained a different (correct) answer than those listed on the solutions page, please feel free to post your answer as a comment on that page.
Exercise 1
Load the tree library. If it is not installed, install it first with the install.packages() command.
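A minimal sketch of the install-if-missing pattern:

```r
# Install the tree package only if it is not already available, then load it
if (!requireNamespace("tree", quietly = TRUE)) {
  install.packages("tree")
}
library(tree)
```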
Exercise 2
Convert all the features (columns) into factors, including the class column.
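One way to coerce every column at once, assuming the data from Basic Tree 1 is in a data frame called df (a hypothetical name; substitute your own):

```r
# Replace each column of df with its factor version, class column included
df[] <- lapply(df, factor)
str(df)  # each column should now be reported as a Factor
```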
Exercise 3
Use the sampling methods you learned in the sampling exercises to split the data into two sets with a SplitRatio of 0.7. Hint: use the caTools library and the sample.split() function. Store the results in Train and Test.
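The split might look like the following, again assuming a data frame called df with a class column (hypothetical names):

```r
library(caTools)

set.seed(42)  # seed chosen arbitrarily, for a reproducible split
split <- sample.split(df$class, SplitRatio = 0.7)

Train <- subset(df, split == TRUE)   # 70% of the rows
Test  <- subset(df, split == FALSE)  # remaining 30%
```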
Exercise 4
Use the tree() command to build the model, with class as the target variable and all other columns as predictors. Use Train as the data source, and store the model in a variable called model1.
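In formula notation this is a one-liner; `class ~ .` means "class modeled on every other column":

```r
# Fit a classification tree on the training rows only
model1 <- tree(class ~ ., data = Train)
```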
Exercise 5
Use the plot() command to plot the model, then use the text() command to add the labels.
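A sketch of the two-step plotting idiom:

```r
plot(model1)               # draw the tree structure
text(model1, pretty = 0)   # label the splits and leaf nodes
```

Calling text() without plot() first will fail, because it only annotates an existing plot.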
Exercise 6
Use the predict() command to predict the classes for the Test dataset. We want class labels, not probabilities. Store the result in the variable pred_class.
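With predict.tree(), the type argument controls the output; type = "class" returns predicted labels rather than class probabilities:

```r
# Predict class labels for the held-out rows
pred_class <- predict(model1, newdata = Test, type = "class")
```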
Exercise 7
Use the table() command to print the confusion matrix. Hint: you are comparing class from the Test set against the predicted vector. This shows where the model's predictions are right or wrong.
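Naming the dimensions makes the matrix easier to read:

```r
# Rows are the actual classes, columns are the model's predictions;
# counts on the diagonal are correct predictions
table(Actual = Test$class, Predicted = pred_class)
```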
Exercise 8
Use the summary() command to print a summary of the model, and note the misclassification error rate.
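For a classification tree, summary() reports the residual mean deviance and the misclassification error rate on the training data:

```r
# Training-set fit statistics, including the misclassification error rate
summary(model1)
```

Note that this error rate is measured on Train, which is why Exercise 9 computes a separate rate on Test.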
Exercise 9
Now find the misclassification error rate of the model on the Test data, using the formula mean(Test$class != pred_class).
Exercise 10
Compare the two misclassification error rates and determine which is worse, and why. How could the model be improved?