In this chapter, we learned about regression analysis. We can think of variables as depending on each other in a functional way: the dependent variable y is a function of the independent variable x, written y = f(x), where f has constant parameters. If y depends on x linearly, then f(x) = a*x + b, where a and b are the constant parameters of f.
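As a quick illustration, here is a minimal sketch of such a linear model in Python; the parameter values a = 2.0 and b = 1.0 are arbitrary choices for the example, not values from the chapter:

```python
# A linear model f(x) = a*x + b with illustrative, hand-picked parameters.
a, b = 2.0, 1.0

def f(x):
    """Return the predicted y for a given x under the linear model."""
    return a * x + b

print(f(3.0))  # 2.0 * 3.0 + 1.0 = 7.0
```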
We saw that regression is a method for estimating these constant parameters so that the estimated f(x) follows the observed y values as closely as possible. The quality of the fit is formally measured by the squared error between f(x) and y, summed over the data samples.
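To make the error measure concrete, the following sketch computes the sum of squared errors for a candidate parameter pair on a small data set; the data values are illustrative assumptions, not data from the chapter:

```python
import numpy as np

# Illustrative data samples (assumed for this example).
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.8])

def squared_error(a, b):
    """Sum of squared differences between the model a*x + b and the observed y."""
    residuals = a * x + b - y
    return np.sum(residuals ** 2)

print(squared_error(2.0, 1.0))  # total squared error for the candidate a=2, b=1
```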
We also covered the gradient descent method, which minimizes this error by repeatedly updating the constant parameters in the direction of steepest descent (that is, along the negative partial derivatives of the error with respect to the parameters), so that the parameters converge as quickly as possible to the values that minimize the error.
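The sketch below shows one way gradient descent can be written for the linear model; the learning rate, iteration count, and data values are arbitrary assumptions made for the example:

```python
import numpy as np

# Illustrative data samples (assumed for this example).
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.8])

a, b = 0.0, 0.0       # initial parameter guesses
learning_rate = 0.01  # arbitrary step size for the example

for _ in range(5000):
    residuals = a * x + b - y
    grad_a = 2.0 * np.sum(residuals * x)  # partial derivative of the error w.r.t. a
    grad_b = 2.0 * np.sum(residuals)      # partial derivative of the error w.r.t. b
    a -= learning_rate * grad_a           # step in the direction of steepest descent
    b -= learning_rate * grad_b

print(a, b)  # should approach the least-squares estimates of the slope and intercept
```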
Finally, we learned about the scipy.linalg Python library, which supports the estimation of linear regression using the least squares method.
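As a closing sketch, here is one way scipy.linalg can be used for this: the scipy.linalg.lstsq function solves the least squares problem directly. The data values are again illustrative assumptions:

```python
import numpy as np
from scipy.linalg import lstsq

# Illustrative data samples (assumed for this example).
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.8])

# Design matrix with a column of x values and a column of ones,
# so the least-squares solution holds the parameters [a, b] of a*x + b.
A = np.column_stack([x, np.ones_like(x)])
(a, b), residues, rank, singular_values = lstsq(A, y)

print(a, b)  # least-squares estimates of the slope and intercept
```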