Numerically, the approximation problem in the least squares norm is relatively simple to state; it is the topic of this section.

In the context of linear least squares approximation, it is always possible to reduce the problem to solving a system of linear equations, as the following example shows:

Consider the sine function *f(x) = sin(x)* on the interval from 0 to 1. We choose as approximants the polynomials of degree two: *{a*_{0}* + a*_{1}*x + a*_{2}*x*^{2}*}*. To compute the values *[a*_{0}*, a*_{1}*, a*_{2}*]* that minimize the least squares error, we first form a 3 × 3 matrix containing the pairwise dot products (the integral of the product of two functions) of the basis functions *{1, x, x*^{2}*}* over the given interval. Since *< x*^{i}*, x*^{j}* > = 1/(i + j + 1)* on this interval, we obtain the Hilbert matrix of order 3:

[ < 1, 1 >    < 1, x >    < 1, x^2 >   ]   [  1    1/2  1/3 ]
[ < x, 1 >    < x, x >    < x, x^2 >   ] = [ 1/2   1/3  1/4 ]
[ < x^2, 1 >  < x^2, x >  < x^2, x^2 > ]   [ 1/3   1/4  1/5 ]
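The construction above can be sketched in code: build the Gram matrix of the basis {1, x, x^2} by numerical quadrature, form the right-hand side of inner products against sin, and solve the resulting 3 × 3 linear system. This is a minimal sketch, assuming composite Simpson quadrature for the integrals and a hand-rolled Gaussian elimination; the helper names `inner` and `solve` are illustrative, not from the text.

```python
import math

def inner(f, g, n=2000):
    # Composite Simpson quadrature for <f, g> = integral of f(x)*g(x) over [0, 1].
    h = 1.0 / n
    s = f(0.0) * g(0.0) + f(1.0) * g(1.0)
    for i in range(1, n):
        x = i * h
        s += (4 if i % 2 else 2) * f(x) * g(x)
    return s * h / 3

basis = [lambda x: 1.0, lambda x: x, lambda x: x * x]

# Gram matrix of the basis {1, x, x^2}: the order-3 Hilbert matrix.
G = [[inner(p, q) for q in basis] for p in basis]

# Right-hand side: inner products of the basis functions with sin.
b = [inner(p, math.sin) for p in basis]

def solve(A, y):
    # Gaussian elimination with partial pivoting on the augmented matrix.
    n = len(A)
    M = [row[:] + [yi] for row, yi in zip(A, y)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            m = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= m * M[k][c]
    a = [0.0] * n
    for k in range(n - 1, -1, -1):
        a[k] = (M[k][n] - sum(M[k][c] * a[c] for c in range(k + 1, n))) / M[k][k]
    return a

a0, a1, a2 = solve(G, b)
print([round(v, 4) for v in (a0, a1, a2)])  # approximately [-0.0075, 1.0913, -0.2355]
```

The computed quadratic *a*_{0}* + a*_{1}*x + a*_{2}*x*^{2} agrees with *sin(x)* to within about 10^{-3} across [0, 1], which is as good as a degree-two polynomial can do in this norm.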