Comment by mindB

7 years ago

I thought the solution set had to be convex or concave for branch and bound to work as an optimization algorithm, and this one clearly isn't?

EDIT: Nope. It's just the objective function that needs to be convex, not the constraints.

Technically, you need neither. Branch and bound works by solving a series of relaxations, progressively fixing integer variables, in order to search the discrete space more efficiently. Imagine we're solving a minimization problem. Finding a feasible solution may be difficult, but if we find one, discrete variables and all, we get an upper bound on the optimal value: it's feasible but not necessarily optimal, so its objective value is at least as high as the optimum. Now, since integer variables are hard to work with, we can relax them into continuous ones. For example, instead of a binary variable restricted to {0,1}, we can use a continuous variable bounded between 0 and 1, i.e. [0,1]. If we solve this relaxation to global optimality, we get a lower bound on how good any solution in that branch can be. In branch and bound, if the lower bound for a branch is no better than the best upper bound we've found so far, we don't have to explore that branch.
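To make the bounding and pruning concrete, here's a minimal sketch in Python. It uses scipy.optimize.linprog to solve the continuous relaxations of a made-up 0/1 minimization problem (the data c, A_ub, b_ub is purely illustrative), keeps the best integer-feasible solution as the upper bound, and prunes any branch whose relaxation lower bound can't beat it:

```python
import math
from scipy.optimize import linprog

# Toy data, purely for illustration: minimize c @ x over x in {0,1}^3
# subject to A_ub @ x <= b_ub.
c = [-5.0, -4.0, -3.0]
A_ub = [[2.0, 3.0, 1.0],
        [4.0, 1.0, 2.0]]
b_ub = [5.0, 7.0]

best_obj = math.inf   # incumbent: best integer-feasible objective so far (upper bound)
best_x = None

def branch_and_bound(fixed):
    """`fixed` maps a variable index to 0.0 or 1.0; all others are relaxed to [0, 1]."""
    global best_obj, best_x
    bounds = [(fixed.get(i, 0.0), fixed.get(i, 1.0)) for i in range(len(c))]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    if not res.success:
        return                          # relaxation infeasible: prune this branch
    if res.fun >= best_obj:
        return                          # lower bound can't beat the incumbent: prune
    fractional = [i for i, v in enumerate(res.x)
                  if i not in fixed and abs(v - round(v)) > 1e-6]
    if not fractional:
        # Relaxation solution is already integral: new incumbent (tighter upper bound).
        best_obj, best_x = res.fun, [round(v) for v in res.x]
        return
    i = fractional[0]                   # branch on the first fractional variable
    branch_and_bound({**fixed, i: 0.0})
    branch_and_bound({**fixed, i: 1.0})

branch_and_bound({})
print(best_obj, best_x)   # -> -9.0 [1, 1, 0] for this toy instance
```

The key point is the pruning test: a branch is discarded only when its relaxation's optimal value (the lower bound) can't improve on the incumbent (the upper bound).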

Now, convexity comes into play because it guarantees those lower bounds are valid: a convex relaxation can be solved to global optimality. Without convexity, finding (or certifying) such globally optimal solutions is hard. Does that mean branch and bound requires convexity? No. You can still run branch and bound using locally optimal solutions of the relaxations; you lose the guaranteed global bound, but the local solutions still give reasonably good guidance about where to search next. This can lead to good, but not provably globally optimal, solutions.

  • But the article was making a strong claim that this was the provably optimal solution. In this case it is, because the objective function (in the way I'm assuming they formulated the problem) is convex.