Understanding BO GP
Bayesian optimization Gaussian process (BOGP) is one of the variants of the BO hyperparameter tuning method, distinguished by its use of a Gaussian process (GP) as the surrogate model. It is well known for its ability to describe the objective function well. This variant is very popular because the surrogate model is analytically tractable and can produce a relatively accurate approximation even with only a few observed points.
However, BOGP has limitations. It only works on continuous hyperparameters, not on discrete or categorical hyperparameters. It is also not recommended when you need many iterations to reach the optimal set of hyperparameters, especially when you have a large number of samples. This is because BOGP has an O(n³) runtime, where n is the number of samples: fitting the GP requires inverting an n × n covariance matrix. Finally, if you have more than 10 hyperparameters to optimize, the common belief is that BOGP is not the right hyperparameter tuning method for you.
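To make the mechanics concrete, here is a minimal sketch of BOGP on a toy one-dimensional "loss" function, using only NumPy. The kernel, length scale, objective function, and grid-based acquisition search are all illustrative assumptions, not part of the original text; real tuning libraries use more robust implementations.

```python
import math
import numpy as np

def rbf_kernel(a, b, length=1.0):
    # Squared-exponential kernel between two 1-D point sets.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_obs, y_obs, x_query, noise=1e-6):
    # Closed-form GP posterior mean and std at the query points.
    # Note the n x n solve: this is the source of the O(n^3) cost.
    K = rbf_kernel(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = rbf_kernel(x_obs, x_query)
    alpha = np.linalg.solve(K, y_obs)
    v = np.linalg.solve(K, Ks)
    mean = Ks.T @ alpha
    var = np.diag(rbf_kernel(x_query, x_query) - Ks.T @ v)
    return mean, np.sqrt(np.clip(var, 1e-12, None))

def expected_improvement(mean, std, best):
    # EI acquisition function for minimization.
    z = (best - mean) / std
    cdf = np.array([0.5 * (1 + math.erf(v / math.sqrt(2))) for v in z])
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return (best - mean) * cdf + std * pdf

def objective(x):
    # Hypothetical objective standing in for a validation loss.
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 2.0, 200)
x_obs = rng.uniform(0.0, 2.0, 3)   # a few random initial observations
y_obs = objective(x_obs)

for _ in range(15):
    mean, std = gp_posterior(x_obs, y_obs, grid)
    ei = expected_improvement(mean, std, y_obs.min())
    x_next = grid[np.argmax(ei)]   # evaluate where EI is highest
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, objective(x_next))

best_x = x_obs[np.argmin(y_obs)]
print(f"best x = {best_x:.2f}, best loss = {y_obs.min():.3f}")
```

Each iteration refits the GP on all observations so far, which is why the cost grows cubically with the number of samples, and why the method is usually reserved for expensive objectives that need few evaluations.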
Having GP as the surrogate model means that we utilize GP as the prior for...