
Comment by lurkingllama

1 year ago

Anyone who has tracked calories for an extended period can tell you that the visual appearance of food is a highly misleading guide to how many calories are in it.

I'm all for making it easier for people to lose weight but this app may honestly have the reverse effect. If the app estimates calories too low (and therefore the individual eats more), many people will get frustrated with the lack of progress and give up. If the app estimates too high, the individual will lose weight, but diet fatigue and other negative side effects of being at a >500 calorie deficit may make the diet too difficult to maintain.
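To make the first failure mode concrete, here's a back-of-envelope sketch (all numbers are my own assumptions, not from the app): if the app consistently under-estimates and the dieter eats until the app's running total hits their target, even a modest bias can erase the intended deficit entirely.

```python
# Sketch with assumed numbers: how a consistent estimation bias
# turns an intended calorie deficit into no deficit at all.

MAINTENANCE = 2500      # assumed daily maintenance calories
TARGET_DEFICIT = 500    # dieter aims to eat 2000/day per the app

def actual_deficit(bias: float) -> float:
    """bias = fractional under-estimate by the app (0.2 = 20% low).

    The dieter stops eating when the app's displayed total reaches
    the target, so true intake is target / (1 - bias).
    """
    displayed_target = MAINTENANCE - TARGET_DEFICIT
    true_intake = displayed_target / (1 - bias)
    return MAINTENANCE - true_intake

print(actual_deficit(0.0))   # 500.0 -- accurate app, full deficit
print(actual_deficit(0.2))   # 0.0   -- a 20%-low estimate erases it
```

With these assumed numbers, a 20% under-estimate means "eating at a 500-calorie deficit" is actually eating at maintenance, which matches the frustration-and-give-up scenario above.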

I actually went over this recently with someone who wanted to build something similar. The conclusion was that this is a very difficult problem, probably intractable to some extent. You can't see the complete composition of the food with a standard camera. E.g. I make a salad that's maybe 300 calories. Then I sprinkle some croutons and bacon on top, which will mostly end up in the middle. Then I add dressing, which is hard to estimate; it hides the bacon and croutons and, since it contains a lot of oil, could seriously skew the estimate one way or the other. Now I mix it all around and the AI can't tell how much dressing was used at all.

I pick this example because I've seen exactly this cause problems for people trying to lose weight. They think they're eating a salad, not realizing they've thrown an extra 500 calories on top.

Another case: I sit down to breakfast, having made myself eggs and toast. One of the largest contributors to my calorie intake, if not the largest, will be the amount of butter on my toast. If I use four pats, that will probably exceed my calorie intake from the eggs. If I use one, not as much. I sincerely doubt it's realistic to tell the difference with any sort of precision.
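Rough numbers back this up (my own ballpark figures, roughly in line with common nutrition-database values, not measured from any specific meal):

```python
# Back-of-envelope: butter on toast vs. the eggs themselves.
# All constants are assumptions in the typical ballpark.
KCAL_PER_PAT = 36        # one ~5 g pat of butter
KCAL_PER_LARGE_EGG = 72  # one large egg

eggs = 2 * KCAL_PER_LARGE_EGG   # two eggs -> 144 kcal
four_pats = 4 * KCAL_PER_PAT    # heavily buttered -> 144 kcal
one_pat = 1 * KCAL_PER_PAT      # lightly buttered -> 36 kcal

print(eggs, four_pats, one_pat)
```

So the butter alone can swing the meal by ~100 kcal, roughly the calorie content of the eggs, and the camera sees nearly the same plate either way.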

  • I'm in the calorie identification app business, and one benchmark I like to use is the old milkshake salad. It LOOKS like a salad with some mystery white sauce, maybe a ranch dressing or some such, but it's actually a milkshake. Hah! Gotcha. Humans: 3, AI: 1.

    • How can that benchmark be useful unless real people are actually eating some lettuce covered in milkshake?

      Wait, ARE people eating that? Have I been out of the game too long?

A lot of the testimonial photos for this show the AI literally reading the text of the Nutrition Information label.

Any other interpretive situation based solely on a camera has so many inherent flaws as to render this almost useless.

  • I think even if you factor in volume with photogrammetry/lidar, the inside of the food could still be something else entirely, like a chunk of meat hidden inside potatoes.

  • The nutritional information label that lists the calories in the food? What is the point of that?

    • Steelmanning this: You take your phone out and snap a picture and it gets added to a running total without typing into a tiny keyboard or doing any math. You consult your total next time you're thinking of eating.

      Critiquing: You still need to figure out serving sizes - it's going to need to ask how many servings. And nutrition labels simply aren't available for plenty of foods.

Anecdote of one... about 8 years ago I put a lot of effort into losing weight. When I was eating a 1500 calorie/day diet, I wasn't losing weight. A nutritionist friend who looked at what I was eating suggested closer to 2800 (I was 6'1" and 370 lb at the time), and when I did that, I started losing weight. Note this was combined with CrossFit 3-4 days a week at both the lower and higher calorie intakes.

If you have a really dysregulated metabolism, your body can definitely work against you when consuming too little.