Let's look at how changing the number of neurons in an RBM layer affects the test set's accuracy:
The following is the output of a DBN with 512 neurons in the RBM layer. The reconstruction loss has come down, but the test set's accuracy has come down as well:
Reconstruction loss: 0.128517: 100%|██████████| 5/5 [01:32<00:00, 19.25s/it]
Start deep belief net finetuning...
Tensorboard logs dir for this run is /home/ubuntu/.yadlt/logs/run55
Accuracy: 0.0758: 100%|██████████| 1/1 [00:06<00:00, 6.40s/it]
Test set accuracy: 0.0689999982715
Notice that both the fine-tuning accuracy and the test set accuracy have dropped. This shows that increasing the number of neurons does not necessarily improve accuracy; a larger hidden layer can make the model harder to train or prone to overfitting.
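You can reproduce this kind of comparison without the full DBN pipeline. The sketch below uses scikit-learn's BernoulliRBM (a single RBM layer, not the yadlt DBN used above) followed by a logistic regression classifier, and compares test accuracy for two hidden-layer sizes. The dataset, layer sizes, and hyperparameters here are illustrative assumptions, not the book's setup:

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline

# Illustrative dataset: 8x8 digits, scaled to [0, 1] for the Bernoulli RBM.
X, y = load_digits(return_X_y=True)
X = X / 16.0
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# Compare a small and a large RBM hidden layer (sizes chosen arbitrarily).
for n_hidden in (64, 512):
    model = Pipeline([
        ("rbm", BernoulliRBM(n_components=n_hidden, learning_rate=0.05,
                             n_iter=10, random_state=0)),
        ("clf", LogisticRegression(max_iter=1000)),
    ])
    model.fit(X_train, y_train)
    acc = model.score(X_test, y_test)
    print(f"{n_hidden} hidden units -> test accuracy: {acc:.3f}")
```

Running a loop like this over several layer sizes makes the trade-off visible directly: past a certain width, adding neurons stops helping and can hurt test accuracy.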