Comment by sfpotter
1 day ago
No, I am not confused. :-) I am just trying to help you understand some basics of numerical analysis.
> What you do not understand is that this is the same thing.
It is not the same thing.
You can express an analytic function f(x) in a convergent (on [-1, 1]) Chebyshev series: f(x) = \sum_{n=0}^\infty a_n T_n(x). You can then truncate it keeping N+1 terms, giving a degree N polynomial. Call it f_N.
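(For reference, the coefficients here are a_n = (2/\pi) \int_{-1}^1 f(x) T_n(x) / \sqrt{1 - x^2} dx for n >= 1, and half that for n = 0; these weighted integrals are exactly what makes this route expensive, as I discuss below.)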
Alternatively, you can interpolate f at N+1 Chebyshev nodes and use a DCT to compute the corresponding Chebyshev series coefficients. Call the resulting polynomial p_N.
In general, f_N and p_N are not the same polynomial.
Furthermore, computing the coefficients of f_N is much more expensive than computing the coefficients of p_N. For f_N, you need to evaluate N+1 integrals, which may be quite expensive indeed if you want to get digits. For p_N, you simply evaluate f at N+1 nodes, compute a DCT in O(N log N) time, and the result is the coefficients of p_N up to rounding error.
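If it helps, here is a minimal sketch of the p_N route. This is just an illustration, not code from any particular library: it assumes SciPy's DCT and Type-I Chebyshev nodes, and the choice of np.exp and degree 20 is arbitrary.

```python
import numpy as np
from scipy.fft import dct

def cheb_interp_coeffs(f, N):
    """Chebyshev coefficients of the degree-N interpolant p_N of f on [-1, 1],
    built from samples at the N+1 Chebyshev (Type-I) nodes via a DCT."""
    k = np.arange(N + 1)
    x = np.cos(np.pi * (k + 0.5) / (N + 1))  # Type-I Chebyshev nodes
    fx = f(x)                                # N+1 function evaluations, one per coefficient
    c = dct(fx, type=2) / (N + 1)            # O(N log N) transform
    c[0] /= 2.0                              # DCT-II scaling for the n = 0 term
    return c                                 # p_N(x) = sum_n c[n] * T_n(x)

# Illustrative use: approximate exp on [-1, 1] with a degree-20 interpolant.
c = cheb_interp_coeffs(np.exp, 20)
xs = np.linspace(-1.0, 1.0, 1001)
print(np.max(np.abs(np.polynomial.chebyshev.chebval(xs, c) - np.exp(xs))))
```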
In practice, people do not compute the coefficients of f_N; they compute the coefficients of p_N. Nevertheless, f_N and p_N are essentially as good as each other when it comes to approximation.
> At this point I really hate to ask. Do you know what "orthogonal subspace" means and what a projection is?
Ah, shucks. I can see you're getting upset and defensive. Sorry... Yes, it should be clear from everything I've written that I'm quite clear on the definition of these.
If you would like to read what I'm saying but from a more authoritative reference that you feel you can trust, you can just take a look at Trefethen's "Approximation Theory and Approximation Practice". I'm just quoting contents of Chapter 4 at you.
Again, like I said in my first response to you, what you're saying isn't wrong, it just misses the mark a bit. If you want to compute the L2 projection of a function onto the subspace spanned by the Chebyshev polynomials up to degree N, you would need to evaluate a rather expensive integral for each coefficient. It's expensive because it requires the use of adaptive integration... many function evaluations per coefficient! Bad!
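For contrast, here is a rough sketch of that projection route, again just illustrative: it uses scipy.integrate.quad, and the substitution x = cos t removes the endpoint singularity of the Chebyshev weight. Note that each call to quad evaluates f at many points.

```python
import numpy as np
from scipy.integrate import quad

def cheb_projection_coeffs(f, N):
    """Coefficients a_0..a_N of the truncated Chebyshev series f_N,
    i.e. the weighted-L2 projection: one adaptive quadrature per coefficient."""
    a = np.empty(N + 1)
    for n in range(N + 1):
        # a_n = (2/pi) * int_0^pi f(cos t) cos(n t) dt   (and half that for n = 0)
        val, _ = quad(lambda t: f(np.cos(t)) * np.cos(n * t), 0.0, np.pi, limit=200)
        a[n] = 2.0 * val / np.pi
    a[0] /= 2.0
    return a
```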
On the other hand, you could just do polynomial interpolation at either set of Chebyshev nodes (Type-I or Type-II) for degree N. This requires only N+1 function evaluations. Only one function evaluation per coefficient. Good!
And, again, since the polynomial so constructed is not the same polynomial as the one obtained via the L2 projection mentioned in paragraph 3 above, this interpolation procedure cannot be regarded as that projection! I guess you could call it an "approximate projection". It agrees quite closely with the L2 projection, and has essentially the same approximation power. This is why Chebyshev polynomials are so useful in practice for approximation, and why e.g. Legendre polynomials are much less useful (they do not have a convenient fast transform).
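Putting the two sketches above together (same caveats: the helpers and the choice of f and N are purely illustrative) shows both claims numerically:

```python
c = cheb_interp_coeffs(np.exp, 20)       # interpolant p_N
a = cheb_projection_coeffs(np.exp, 20)   # truncated series / L2 projection f_N
print(np.max(np.abs(c - a)))             # small but nonzero: p_N != f_N
xs = np.linspace(-1.0, 1.0, 1001)
for coeffs in (c, a):                    # essentially the same max error
    print(np.max(np.abs(np.polynomial.chebyshev.chebval(xs, coeffs) - np.exp(xs))))
```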
Anyway, I hope this helps! It's a beautiful subject and a lot of fun to work on.