Comment by imtringued
9 hours ago
Look, I'm not trying to tear you down here, but your list of keywords is off, and I know it because I explored that same list last month for a completely different application.
The Jacobian is the first-order derivative of a function that takes a vector as input and produces a vector as output, hence it must be a matrix.
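Concretely (standard definition, just for reference): for f: R^n -> R^m, the Jacobian is the m-by-n matrix of partials,

```latex
J_{ij} = \frac{\partial f_i}{\partial x_j}, \qquad J \in \mathbb{R}^{m \times n}.
```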
Newton-Raphson is an algorithm for finding the roots (= zeroes) of a function. Since the derivative of a function is zero at a minimum, applying Newton-Raphson to the derivative lets you solve convex optimization problems.
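To make that concrete, here's a minimal sketch (the toy quadratic and the function names are mine, not from the parent):

```python
# Minimal sketch: Newton-Raphson applied to f'(x) finds a stationary
# point of f(x), i.e. the minimizer when f is convex.

def newton_minimize(f_prime, f_double_prime, x0, tol=1e-10, max_iter=50):
    """Find a zero of f' (a stationary point of f) by Newton-Raphson."""
    x = x0
    for _ in range(max_iter):
        step = f_prime(x) / f_double_prime(x)  # Newton step on the derivative
        x -= step
        if abs(step) < tol:
            break
    return x

# Toy example: f(x) = (x - 3)**2 + 1, so f'(x) = 2*(x - 3) and f''(x) = 2.
print(newton_minimize(lambda x: 2 * (x - 3), lambda x: 2.0, x0=0.0))  # ~3.0
```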
Levenberg-Marquardt is another way to solve optimization problems, specifically nonlinear least-squares problems (see the SciPy sketch below).
The Powell dog leg method is new to me, but it is just an extension of Gauss-Newton, which you can think of as a special casing of Newton-Raphson where the objective function is a sum of squares (useful for objectives built from vector norms, aka distances between positions).
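For a feel of how Levenberg-Marquardt and the dogleg family get used in practice, here's a hedged sketch with SciPy's least_squares (the two-circle residual is a toy I made up, standing in for real sketch constraints):

```python
# Sketch only: scipy.optimize.least_squares exposes Levenberg-Marquardt
# ('lm') and a dogleg-flavoured trust-region method ('dogbox').
import numpy as np
from scipy.optimize import least_squares

def residuals(p):
    x, y = p
    # Toy stand-in for geometric constraints: "point lies on two circles".
    return np.array([
        x**2 + y**2 - 25.0,            # circle of radius 5 at the origin
        (x - 6.0)**2 + y**2 - 25.0,    # circle of radius 5 at (6, 0)
    ])

sol_lm = least_squares(residuals, x0=[1.0, 1.0], method="lm")
sol_dog = least_squares(residuals, x0=[1.0, 1.0], method="dogbox")
print(sol_lm.x, sol_dog.x)
# Each run converges to one of the two intersection points;
# which one depends on the starting guess.
```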
Most of these algorithms require solving a linear system to find the zero of the derivative. The Schur complement is a way to factor that linear system into a bunch of smaller linear systems, and sparse QR/Cholesky factorizations are an implementation detail of solving linear systems.
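On the Schur complement point, a hedged NumPy sketch of the block elimination (the 2x2 block split and the sizes are arbitrary toys; in real solvers the split follows the problem's sparsity structure):

```python
# Sketch: solving [[A, B], [C, D]] [x; y] = [u; v] via the Schur complement
# S = A - B D^{-1} C, so one big solve becomes two smaller ones.
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 2
A = rng.standard_normal((n, n)) + 5 * np.eye(n)   # toy, well-conditioned blocks
B = rng.standard_normal((n, m))
C = rng.standard_normal((m, n))
D = rng.standard_normal((m, m)) + 5 * np.eye(m)
u = rng.standard_normal(n)
v = rng.standard_normal(m)

S = A - B @ np.linalg.solve(D, C)                 # Schur complement of D
x = np.linalg.solve(S, u - B @ np.linalg.solve(D, v))
y = np.linalg.solve(D, v - C @ x)

# Check against solving the full system directly.
K = np.block([[A, B], [C, D]])
assert np.allclose(np.concatenate([x, y]),
                   np.linalg.solve(K, np.concatenate([u, v])))
```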
Now that we've got the buzzwords out of the way, I will tell you the problem with them: constraint-solving algorithms are SAT- or SMT-based, and generally not optimization-based.
Consider the humble circle constraint: a^2 + b^2 = c^2. If you have two circles with differing centers and radii, they may intersect, and if they do, they generically intersect at two points. This is readily apparent in the equations, since solving a^2 + b^2 = c^2 for b gives b = ±sqrt(c^2 - a^2): two solutions. This means you will need some sort of branching inside your algorithm, and the optimization algorithms you listed are terrible at this.
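Here's a sketch of that branch made explicit (textbook two-circle intersection; the example circles are made up):

```python
# Sketch: intersecting two circles has a built-in ± branch that a local
# optimizer cannot explore -- it just falls into one of the two roots.
import math

def circle_intersections(c0, r0, c1, r1):
    """Return the 0, 1, or 2 intersection points of two circles."""
    (x0, y0), (x1, y1) = c0, c1
    d = math.hypot(x1 - x0, y1 - y0)
    if d > r0 + r1 or d < abs(r0 - r1) or d == 0:
        return []                                  # none (or coincident circles)
    a = (r0**2 - r1**2 + d**2) / (2 * d)           # distance along the center line
    h = math.sqrt(max(r0**2 - a**2, 0.0))          # half-chord; this is the ± branch
    mx, my = x0 + a * (x1 - x0) / d, y0 + a * (y1 - y0) / d
    p_plus = (mx + h * (y1 - y0) / d, my - h * (x1 - x0) / d)
    p_minus = (mx - h * (y1 - y0) / d, my + h * (x1 - x0) / d)
    return [p_plus] if h == 0 else [p_plus, p_minus]

print(circle_intersections((0, 0), 5, (6, 0), 5))  # [(3.0, -4.0), (3.0, 4.0)]
```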
Optimization can work well for interactive CAD usage, because geometric sketches tend to be close to the intended solution, and it's fast. It also has the nice property of stability (i.e., with optimization, small changes in parameters tend not to cause large perturbations in the solution). Doing more gets into what you mentioned, which is called chirality or root identification in the literature. That's much more intense computationally, but it could be useful, especially when cheaper approaches like optimization fail.