Comment by tim-kt

6 months ago

No, the integral cannot be "expressed" as a sum over weights and function evaluations (with a "="); it can only be approximated with this idea. If you fix any n+1 nodes, interpolate your function, and integrate your polynomial, you will get such a sum, where the weights are integrals of the (Lagrange) basis polynomials. By construction, you can compute the integral of polynomials up to degree n exactly. Now, if you choose the nodes in a particular way (namely, as the zeros of certain orthogonal polynomials, the Legendre polynomials in the standard case), you can raise this exactness degree to 2n+1. What I'm getting at is that Gaussian integration is not estimating the integrals of polynomials of degree up to 2n+1; it is evaluating them exactly.
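
For concreteness, here is a minimal numpy sketch of the construction described above (the helper names `interpolatory_weights` and `apply_rule` are just illustrative, and the equispaced node choice is arbitrary): the weights come out as integrals of the Lagrange basis polynomials, an arbitrary set of n+1 nodes gives exactness up to degree n, and Gauss-Legendre nodes push that to 2n+1.

```python
import numpy as np
from numpy.polynomial.legendre import leggauss

def interpolatory_weights(nodes):
    """w_i = integral over [-1, 1] of the i-th Lagrange basis polynomial."""
    m = len(nodes)
    w = np.zeros(m)
    for i in range(m):
        y = np.zeros(m)
        y[i] = 1.0                               # l_i is 1 at nodes[i], 0 at the other nodes
        li = np.polyfit(nodes, y, m - 1)         # coefficients of l_i
        antider = np.polyint(li)
        w[i] = np.polyval(antider, 1.0) - np.polyval(antider, -1.0)
    return w

def apply_rule(nodes, weights, f):
    return float(np.dot(weights, f(np.asarray(nodes))))

n = 3                                            # n + 1 = 4 nodes
equi = np.linspace(-1.0, 1.0, n + 1)             # an arbitrary (equispaced) choice of nodes
gauss_nodes, gauss_weights = leggauss(n + 1)     # Gauss-Legendre nodes and weights

w_equi = interpolatory_weights(equi)
w_gauss = interpolatory_weights(gauss_nodes)
print(np.allclose(w_gauss, gauss_weights))       # True: the Gauss weights are exactly these integrals

# Integrate monomials x^k over [-1, 1]; the exact value is 2/(k+1) for even k and 0 for odd k.
for k in range(2 * n + 3):
    exact = 2.0 / (k + 1) if k % 2 == 0 else 0.0
    err_e = abs(apply_rule(equi, w_equi, lambda x: x**k) - exact)
    err_g = abs(apply_rule(gauss_nodes, w_gauss, lambda x: x**k) - exact)
    print(f"degree {k}: equispaced error {err_e:.1e}, Gauss error {err_g:.1e}")
# Equispaced nodes are exact up to degree n = 3 (odd degrees also vanish by symmetry);
# the Gauss nodes stay exact up to degree 2n + 1 = 7 and fail only at degree 8.
```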

You are right. Almost all methods of numerical integration (at least all the useful ones I am aware of) are equivalent to approximating the target function with another, simpler function, which is then integrated using an exact formula.

So the estimation error is introduced at the step where the function is approximated by another function, usually a polynomial or a polynomial spline (composed of straight line segments for the simplest trapezoidal integration), not at the actual integration.
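
As a small concrete illustration of the trapezoidal case (a sketch assuming numpy; the test function sin and the nine sample points are arbitrary choices): integrating each straight-line segment exactly via its antiderivative reproduces the textbook trapezoidal sum.

```python
import numpy as np

f = np.sin
x = np.linspace(0.0, np.pi, 9)          # 9 samples of the target function
y = f(x)

def segment_integral(x0, x1, y0, y1):
    """Exact integral over [x0, x1] of the straight line through (x0, y0) and (x1, y1)."""
    slope = (y1 - y0) / (x1 - x0)
    # antiderivative of y0 + slope * (t - x0), evaluated from x0 to x1
    return y0 * (x1 - x0) + slope * (x1 - x0) ** 2 / 2.0

# Integral of the piecewise-linear approximation: sum of the exact segment integrals
spline_integral = sum(segment_integral(x[i], x[i + 1], y[i], y[i + 1]) for i in range(len(x) - 1))

# Textbook composite trapezoidal rule on the same samples
trapezoid = np.sum((y[:-1] + y[1:]) / 2.0 * np.diff(x))

print(spline_integral, trapezoid)       # identical up to rounding, about 1.9742
print(2.0)                              # exact integral of sin over [0, pi]; the gap is the approximation error
```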

Fortunately, for well-behaved functions approximated by a suitable simpler function, the error this approximation introduces into the value of the integral is smaller than the error in the interpolated values of the function, which in turn is smaller than the error in values estimated for the derivative of the function at some point (all using the same approximating function).
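
This is easy to check numerically. The sketch below (assuming numpy; the function sin, the interval, and the seven nodes are arbitrary choices) interpolates the function with a single polynomial and compares the three errors; the ordering it prints is typical for smooth functions, though the magnitudes depend on the function and the nodes.

```python
import numpy as np

f, fprime = np.sin, np.cos
x = np.linspace(0.0, np.pi, 7)                    # 7 nodes -> degree-6 interpolating polynomial
p = np.polyfit(x, f(x), len(x) - 1)               # coefficients of the interpolant

xx = np.linspace(0.0, np.pi, 2001)                # fine grid for measuring worst-case errors
antider = np.polyint(p)

integral_error = abs((np.polyval(antider, np.pi) - np.polyval(antider, 0.0)) - 2.0)  # exact: 2
value_error = np.max(np.abs(np.polyval(p, xx) - f(xx)))
derivative_error = np.max(np.abs(np.polyval(np.polyder(p), xx) - fprime(xx)))

print(f"integral error:   {integral_error:.1e}")
print(f"max value error:  {value_error:.1e}")
print(f"max deriv error:  {derivative_error:.1e}")
# For this example the integral error comes out smallest and the derivative error largest.
```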

  • > [...] Almost all methods of numerical integration (at least all the useful ones I am aware of) are equivalent to approximating the target function with another, simpler function, which is then integrated using an exact formula.

    The key property of quadrature formulas (i.e. numerical integration formulas) is the degree of exactness, which just says up to which degree we can integrate polynomials exactly. The (convergence of the) error of the quadrature depends on this exactness degree.

    If you approximate the integral by a sum of n+1 weighted function evaluations, then any quadrature with exactness degree n or better is in fact an interpolatory quadrature; that is, it is equivalent to interpolating your function on the n+1 nodes and integrating that polynomial. You can check this by (exactly) integrating the Lagrange basis polynomials, in terms of which the interpolation polynomial can be expressed.

    • Something interesting here is that although Clenshaw-Curtis rules with N nodes only integrate polynomials of degree N-1 exactly, while Gauss rules with N nodes manage degree 2N-1, they are frequently competitive with Gauss rules when integrating general analytic functions. So while the degree of polynomials that can be integrated exactly is important, there are nevertheless many possible interpolatory quadrature rules, and how they behave is not completely straightforward.
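
      A rough empirical check of this, assuming numpy (the helper `clencurt` is my own port of the standard Clenshaw-Curtis node/weight formula, and the integrand 1/(1 + 16x^2) is just one convenient analytic test case): build both rules with the same number of nodes and print their errors.

      ```python
      import numpy as np
      from numpy.polynomial.legendre import leggauss

      def clencurt(n):
          """Clenshaw-Curtis nodes and weights on [-1, 1] (n + 1 points, n >= 2)."""
          theta = np.pi * np.arange(n + 1) / n
          x = np.cos(theta)
          w = np.zeros(n + 1)
          ii = np.arange(1, n)
          v = np.ones(n - 1)
          if n % 2 == 0:
              w[0] = w[n] = 1.0 / (n ** 2 - 1)
              for k in range(1, n // 2):
                  v -= 2.0 * np.cos(2 * k * theta[ii]) / (4 * k ** 2 - 1)
              v -= np.cos(n * theta[ii]) / (n ** 2 - 1)
          else:
              w[0] = w[n] = 1.0 / n ** 2
              for k in range(1, (n - 1) // 2 + 1):
                  v -= 2.0 * np.cos(2 * k * theta[ii]) / (4 * k ** 2 - 1)
          w[ii] = 2.0 * v / n
          return x, w

      f = lambda x: 1.0 / (1.0 + 16.0 * x ** 2)    # analytic on [-1, 1], poles at +/- i/4
      exact = 0.5 * np.arctan(4.0)                 # closed-form value of the integral

      for npts in (4, 8, 12, 16):
          xg, wg = leggauss(npts)                  # Gauss-Legendre with npts nodes
          xc, wc = clencurt(npts - 1)              # Clenshaw-Curtis, also npts nodes
          err_gauss = abs(np.dot(wg, f(xg)) - exact)
          err_cc = abs(np.dot(wc, f(xc)) - exact)
          print(f"{npts:2d} nodes: Gauss error {err_gauss:.1e}, Clenshaw-Curtis error {err_cc:.1e}")
      ```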

You're totally correct; my writeup in that section definitely was not clear. I've updated the blog, and hopefully it's better now. I've also given you a shoutout at the end of the post in my edit log, if that's cool with you.

  • Sure, thanks. I also saw this sentence, which was not there in the last version (I think):

    > That is to say, with n nodes, gaussian integration will approximate your function's integral with a higher order polynomial than a basic technique would - resulting in more accuracy.

    This is not really the case: Gaussian integration is still just interpolation on n nodes, but the way the nodes are chosen increases the rule's exactness degree to 2n-1. What is actually more interesting is that Gaussian integration does not require any more work in terms of interpolation; we just choose our nodes better. (Computing the nodes is sometimes more work, but we can do that once and reuse them forever.)
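
    To make that concrete, here is a small numpy sketch (the choice of n = 5 and the test integrand x^8 are arbitrary): the underlying interpolant has only degree n - 1 = 4 and visibly differs from x^8 pointwise, yet its integral, which is exactly what the quadrature sum computes, matches the true value because the exactness degree is 2n - 1 = 9. The nodes and weights from leggauss are computed once and can be reused for any integrand on [-1, 1].

    ```python
    import numpy as np
    from numpy.polynomial.legendre import leggauss

    n = 5
    nodes, weights = leggauss(n)                    # computed once, reusable for any integrand

    f = lambda x: x ** 8                            # degree 8 <= 2n - 1 = 9
    exact = 2.0 / 9.0                               # integral of x^8 over [-1, 1]

    # The rule is still just interpolation on n nodes: build the degree-(n - 1) interpolant.
    p = np.polyfit(nodes, f(nodes), n - 1)
    antider = np.polyint(p)
    interp_integral = np.polyval(antider, 1.0) - np.polyval(antider, -1.0)

    xx = np.linspace(-1.0, 1.0, 501)
    print(abs(np.dot(weights, f(nodes)) - exact))     # essentially zero: the quadrature sum is exact
    print(abs(interp_integral - exact))               # also essentially zero: the sum is the interpolant's integral
    print(np.max(np.abs(np.polyval(p, xx) - f(xx))))  # yet the interpolant itself is visibly off pointwise
    ```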