Implementing Particle Swarm Optimization
PSO is another variant in the Heuristic Search group of hyperparameter tuning methods (see Chapter 5) that can be implemented with the DEAP package. We'll reuse the same example as in the previous section to see how PSO can be implemented using DEAP.
The following code shows how to implement PSO with the DEAP package. You can find the full code in the GitHub repository mentioned in the Technical requirements section:
- Define the PSO parameters, along with the type classes created through the `creator.create()` module:

```python
N = 50           # swarm size
w = 0.5          # inertia weight coefficient
c1 = 0.3         # cognitive coefficient
c2 = 0.5         # social coefficient
num_trials = 15  # number of trials
```
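For intuition, these coefficients govern PSO's standard velocity update, which combines an inertia term (`w`) with cognitive attraction toward a particle's personal best (`c1`) and social attraction toward the swarm's global best (`c2`). The following function is an illustrative sketch, not part of the DEAP pipeline shown in this section; the names `update_velocity`, `personal_best`, and `global_best` are our own:

```python
import random

def update_velocity(velocity, position, personal_best, global_best,
                    w=0.5, c1=0.3, c2=0.5):
    """Standard PSO velocity update, component by component:
    inertia + cognitive (personal-best) + social (global-best) terms."""
    new_velocity = []
    for v, x, pb, gb in zip(velocity, position, personal_best, global_best):
        r1, r2 = random.random(), random.random()  # fresh random factors per component
        new_velocity.append(w * v + c1 * r1 * (pb - x) + c2 * r2 * (gb - x))
    return new_velocity
```

Each particle's position is then moved by its updated velocity, which is the rule DEAP's PSO example implements via its registered `update` operator.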
- Fix the seed for reproducibility:

```python
import random

random.seed(1)
```
- Define the type of our fitness function. Here, we are working with a maximization problem and a single objective function, which is why we set `weights=(1.0,)`:
```python
from deap import creator, base

creator.create("FitnessMax", base.Fitness, weights=(1.0,))  # class name assumed
```
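To see how the pieces above fit together, the pure-Python sketch below runs the whole PSO loop that the DEAP toolbox would otherwise orchestrate: initialize a swarm, update velocities and positions, and track personal and global bests. The toy objective `sphere_neg` and the function `pso` are our assumptions for illustration, not the book's example or DEAP's API:

```python
import random

def sphere_neg(x):
    # Toy maximization objective (assumption): -sum(x_i^2), optimum 0 at the origin
    return -sum(xi * xi for xi in x)

def pso(objective, dim=2, n=50, w=0.5, c1=0.3, c2=0.5, iters=60):
    # Random initial positions, zero initial velocities
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]                  # personal bests
    pbest_val = [objective(p) for p in pos]
    g = max(range(n), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + cognitive + social terms
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val > pbest_val[i]:               # maximization, as weights=(1.0,)
                pbest[i], pbest_val[i] = pos[i][:], val
                if val > gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

In the DEAP version, the same loop is expressed by registering particle generation and update operators on a `base.Toolbox` and iterating over the swarm, which is what the remaining steps of this section build up.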