Drawing a histogram
A histogram is a graphical representation of the distribution of a variable: it shows how the data values are spread out, which in turn gives us an estimate of the underlying probability density. A histogram is built by dividing the full range of the variable's values into a fixed number of intervals (bins) and counting how many values fall into each interval.
If we apply this histogram concept to an image, it may seem complex at first, but it is really very simple. In a gray image, our variable can take any gray value from 0 to 255, and the density for each value is the number of pixels in the image that have that value. This means that we have to count the number of image pixels with the value 0, the number of pixels with the value 1, and so on.
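The counting described above can be sketched in a few lines of plain Python. This is only a conceptual illustration, not code from the book; the helper name `gray_histogram` is hypothetical:

```python
def gray_histogram(image):
    """Count how many pixels take each gray value 0..255.

    `image` is a 2-D list of ints (rows of pixel values).
    Returns a list `hist` where hist[v] is the number of
    pixels whose value is v.
    """
    hist = [0] * 256          # one bin per possible gray value
    for row in image:
        for value in row:
            hist[value] += 1  # count this pixel in its bin
    return hist

# A tiny 2x3 "image"; note that the values 0 and 128 each appear twice.
img = [[0, 128, 255],
       [128, 64, 0]]
hist = gray_histogram(img)
# hist[0] == 2, hist[64] == 1, hist[128] == 2, hist[255] == 1
```

The sum of all bins always equals the total number of pixels, which is a handy sanity check when implementing this yourself.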
The callback function that shows the histogram of the input image is called showHistoCallback. This function calculates the histogram of each image channel and shows the result for each channel in a new image.
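The two steps that a showHistoCallback-style function performs, computing one histogram per channel and normalizing the bins so they fit a drawing canvas, can be sketched without OpenCV as follows. In the book these steps are done with OpenCV calls; the names `channel_histograms` and `normalize_for_drawing` below are hypothetical helpers for illustration:

```python
def channel_histograms(image):
    """`image` is a nested list: rows x cols x channels (e.g. B, G, R).
    Returns one 256-bin histogram per channel."""
    num_channels = len(image[0][0])
    hists = [[0] * 256 for _ in range(num_channels)]
    for row in image:
        for pixel in row:
            for c, value in enumerate(pixel):
                hists[c][value] += 1  # count this channel value
    return hists

def normalize_for_drawing(hist, height=256):
    """Scale bin counts so the tallest bin maps to `height` pixels,
    which keeps the drawn histogram inside the output image."""
    peak = max(hist) or 1  # avoid division by zero on an empty image
    return [count * height // peak for count in hist]

# A 1x2 BGR "image": both pixels fully blue, one with some green.
img = [[[255, 0, 0], [255, 128, 0]]]
b_hist, g_hist, r_hist = channel_histograms(img)
# b_hist[255] == 2: both pixels have a blue value of 255.
```

After normalization, each bin can be drawn as a line or bar of the corresponding height in a new image, one image (or one color) per channel.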
Now, let's check...