Comment by acc_297

10 months ago

If so then what a colossal waste of our planetary carbon budget. I don't know what the backend on this app is (maybe it was done properly, I really hope so), but surely there is a solution that maintains or surpasses the accuracy of an LLM while using <1% of the compute resources.

Like a small vision model, combined with the size/measurement data from the AR sensors modern phones come with and an open-source caloric-values database, should achieve the 90% accuracy they are claiming.
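Roughly the kind of pipeline I mean (a minimal sketch in Python; the food classes, the density/kcal numbers, and the volume figures standing in for AR depth output are all made-up placeholders, not anything from this app or a real database):

    from dataclasses import dataclass

    # Placeholder "database": kcal per gram and rough density (g/cm^3) per food class.
    # A real open-source equivalent would be something like a USDA FoodData Central dump.
    FOOD_DB = {
        "banana":         {"kcal_per_g": 0.89, "density_g_per_cm3": 0.94},
        "white_rice":     {"kcal_per_g": 1.30, "density_g_per_cm3": 0.85},
        "chicken_breast": {"kcal_per_g": 1.65, "density_g_per_cm3": 1.05},
    }

    @dataclass
    class Detection:
        label: str         # class predicted by a small on-device vision model
        volume_cm3: float  # portion volume estimated from the phone's AR depth data

    def estimate_calories(detections: list[Detection]) -> float:
        """Combine per-item labels with volume estimates and a lookup table."""
        total = 0.0
        for d in detections:
            entry = FOOD_DB.get(d.label)
            if entry is None:
                continue  # unknown food: a real app would ask the user or fall back
            grams = d.volume_cm3 * entry["density_g_per_cm3"]
            total += grams * entry["kcal_per_g"]
        return total

    # Example: one banana (~120 cm^3) and a serving of rice (~200 cm^3).
    meal = [Detection("banana", 120.0), Detection("white_rice", 200.0)]
    print(f"~{estimate_calories(meal):.0f} kcal")

The expensive part there is a small image classifier plus a table lookup, not a giant model.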

Ronald Wright writes about "progress traps" in A Short History of Progress. It's been a while since I read it, but I think about it more and more these days with AI products on the rise.

No, this is a completely unsolvable problem with just a camera.

You cannot differentiate a high calorie meal from a low calorie meal on sight alone.

The waste is in selling a lie, enabled by AI bullshit artists and the public's seeming inability to understand that the US has no legal (and, most of the time, no market) requirement to be truthful, upfront, or honest in marketing.

Like, people just take this shit at face value, and I don't understand how you can live in the US for more than a few years and not recognize that marketing is just lies, not even smart or clever lies.