Comment by loudmax

4 days ago

The fact that you have the option of running Flux locally might be enough to tip the balance in some cases. For example, if you've built a workflow around Google's API and Google jacks up the price or changes the API, you have no choice but to go along. If BFL does the same, you at least have the option of running locally.

Those cases imply commercial workflows that are prohibited with the open-weights model without purchasing a license.

I am curious to see how the Apache 2.0 distilled variant performs, but it's still unlikely that the economics will favor it unless you have a specific niche use case: the engineering effort needed to scale up image inference for these large models isn't zero cost.

Their testing was for the Pro model, which you cannot host locally and which is already not price-competitive with Google's offering for the capabilities.

You can run Alibaba's Qwen(Edit) locally too, and the company isn't as weird with its license, weights, or training set.

I personally prefer Qwen's performance here. I'm waiting to see other folks' takes.

The Qwen folks are also a lot more transparent, spend time community building, and iterate on releases much more rapidly, in the open rather than behind closed doors.

I don't like how secretive BFL is.