In this chapter, we learned how to select the features that best represent a set of data. We gained an understanding of the basic concepts of dimensionality reduction. We saw how to perform feature extraction for dimensionality reduction when a transformation of the variables is possible. We also explored stepwise regression and PCA.
We learned how to use the stepwiselm() function to create a linear model and automatically add or remove variables from it. We saw how to grow a model by starting from a constant model and adding terms, and how to prune a model by starting from one containing many terms and removing them. We also reviewed methods for removing missing values from a dataset.
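The forward direction of this procedure can be sketched in a few lines. The book's examples use MATLAB's stepwiselm(); the snippet below is a simplified Python/NumPy analogue, and the forward_stepwise helper and its stopping rule are illustrative assumptions, not the function's actual criterion (stepwiselm() uses statistical tests such as p-values of F-statistics to decide which terms to add or remove).

```python
import numpy as np

def forward_stepwise(X, y, tol=0.05):
    """Greedy forward selection (a simplified sketch of what
    stepwiselm() automates; this helper is hypothetical).
    Start from a constant model and repeatedly add the predictor
    that most reduces the residual sum of squares, stopping when
    the relative improvement falls below tol."""
    n, p = X.shape
    selected = []
    sse = float(((y - y.mean()) ** 2).sum())  # constant model
    while len(selected) < p:
        best_j, best_sse = None, sse
        for j in range(p):
            if j in selected:
                continue
            # Fit a least-squares model with an intercept plus the
            # currently selected columns and candidate column j
            A = np.column_stack([np.ones(n), X[:, selected + [j]]])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            cand = float(((y - A @ beta) ** 2).sum())
            if cand < best_sse:
                best_j, best_sse = j, cand
        if best_j is None or (sse - best_sse) / sse < tol:
            break  # no candidate improves the fit enough
        selected.append(best_j)
        sse = best_sse
    return selected

# Toy data: y depends only on columns 0 and 2 of X
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = 3 * X[:, 0] - 2 * X[:, 2] + 0.1 * rng.normal(size=200)
print(forward_stepwise(X, y))
```

Backward elimination works the same way in reverse: start from the full model and repeatedly drop the term whose removal hurts the fit least.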
Subsequently, we covered techniques for extracting features, analyzing PCA in particular. PCA is a quantitatively rigorous method for achieving this simplification: it generates a new set of variables, called principal components, each of which is a linear combination of the original variables. The principal components are orthogonal to one another, so they carry no redundant information.
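As a compact illustration of that last point, the following Python/NumPy sketch computes principal components via the SVD of the centered data matrix (an assumed toy example, not the chapter's MATLAB code, which uses functions such as pca()):

```python
import numpy as np

# Toy data: 100 samples of 3 variables, two of them correlated
# (hypothetical example)
rng = np.random.default_rng(0)
t = rng.normal(size=(100, 1))
X = np.hstack([t + 0.1 * rng.normal(size=(100, 1)),
               2 * t + 0.1 * rng.normal(size=(100, 1)),
               rng.normal(size=(100, 1))])

# PCA via SVD of the centered data matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Each row of Vt holds the weights of one linear combination of the
# original variables; projecting onto those directions gives the
# principal component scores.
scores = Xc @ Vt.T

# Fraction of the total variance explained by each component
var = S**2 / (len(X) - 1)
print(var / var.sum())
```

Because the two correlated columns share one underlying direction, the first component captures most of the variance, and the score columns are mutually uncorrelated.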