When the objective function is smooth with a continuous second derivative, we know from calculus that at a local minimum the following conditions hold:
The gradient of the objective function f(x) is zero at the minimum x*, that is, ∇f(x*) = 0. This condition indicates that there is no slope at the minimum point, implying a "flat" spot or a turning point on the curve of the function.
The second derivative (the Hessian, ∇²f(x*)) is positive definite. This condition ensures that the point is indeed a minimum, as a positive definite Hessian indicates that the function curves upwards at x*.
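We can verify both conditions numerically. The following is a minimal sketch, assuming a hypothetical quadratic objective of our own choosing (it is not taken from the text): it uses NumPy to check that the gradient vanishes at the candidate point x* and that the eigenvalues of the Hessian are all positive.

```python
import numpy as np

# Hypothetical objective for illustration: f(x) = (x0 - 1)^2 + 2*(x1 + 0.5)^2
def f(x):
    return (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 0.5) ** 2

def gradient(x):
    # Analytical gradient of f
    return np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 0.5)])

def hessian(x):
    # Analytical Hessian of f (constant, since f is quadratic)
    return np.array([[2.0, 0.0],
                     [0.0, 4.0]])

# Candidate minimum obtained by setting the gradient to zero
x_star = np.array([1.0, -0.5])

# Condition 1: the gradient vanishes at x*
grad_is_zero = np.allclose(gradient(x_star), 0.0)

# Condition 2: the Hessian is positive definite (all eigenvalues > 0)
eigenvalues = np.linalg.eigvalsh(hessian(x_star))
hessian_is_pd = np.all(eigenvalues > 0)

print("gradient ~ 0 at x*:", grad_is_zero)          # True
print("Hessian eigenvalues:", eigenvalues)           # [2. 4.]
print("Hessian positive definite:", hessian_is_pd)   # True
```

Both checks pass for this simple function, so x* = (1, -0.5) is indeed a local (here also global) minimum.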
In such conditions, for some problems, it is possible to find the solution analytically by determining the zeros of the gradient and verifying the positive definiteness of the Hessian matrix at the zeros. When an analytical solution is not feasible, we can instead explore the search space iteratively for the minima of the objective function, as in the sketch that follows. There are various search methods; let’...
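As a simple illustration of such an iterative search, here is a minimal gradient descent sketch on the same assumed quadratic; the starting point, learning rate, and stopping tolerance are choices made for this example, not prescriptions from the text.

```python
import numpy as np

def gradient(x):
    # Gradient of the illustrative quadratic f(x) = (x0 - 1)^2 + 2*(x1 + 0.5)^2
    return np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 0.5)])

x = np.array([5.0, 5.0])   # arbitrary starting point
learning_rate = 0.1        # assumed step size

# Repeatedly step against the gradient until it (almost) vanishes
for step in range(200):
    g = gradient(x)
    if np.linalg.norm(g) < 1e-8:   # first-order optimality condition met
        break
    x = x - learning_rate * g

print("approximate minimum:", x)   # close to [1.0, -0.5]
```

Each iteration moves the current point a small step in the direction of steepest descent, so the search converges toward the point where the gradient is zero, the same first-order condition described above.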