Comment by Zealotux
7 days ago
Congrats on the launch! Looks slick and reactive. Now I wish there were an app to analyse my meals from pictures and estimate the calories/macros. I guess one exists somewhere, but I doubt the accuracy; honestly, that's the only thing I really want from so-called AIs.
There are a few made for people with diabetes that offer macro estimates based on a photo, e.g. https://www.snaq.ai, https://gluroo.com. I'm sure others exist that are for a more general or fitness-minded audience.
If you find an app like this, try an experiment: estimate the calories by taking a picture, then add 30 g of olive oil (about 270 kcal) to the same dish and see whether the app detects it.
Heh, and the same goes for using a traditional nutrient tracker for a meal you ordered at a restaurant.
Nutrient/calorie tracking really only works if you measure the raw inputs or use a packaged product that gives you the info, and I imagine those are also the two cases that the AI can estimate visually.
The OpenNutrition app does that :)
Logging foods by image is a great way to get started being accountable with eating, and I'll use it if I'm out and don't want to manually figure out all the different components of something, but it's impossible for even the most well-trained human eye to judge food composition visually. A lot of AI-focused diet apps have made this their primary input method because it removes the need for a database, but the marketing claim that it's in any way accurate as a primary input mechanism borders, to me, on abject dishonesty and sets users up for long-term failure. Just because an ingredient is invisible when prepared doesn't mean it's not there.
Any idea how the web app is written? E.g., what libraries or frameworks are used?
Pretty standard Next.js app on Vercel. Some pre-fetching. Food pages are cached post-generation. Custom Go-based search server behind a Cloudflare cache. Any other questions?