Comment by Terr_

16 days ago

> Has anyone else noticed that the AI industry can’t take “no” for an answer? AI is being force-fed into every corner of tech. It’s unfathomable to them that some of us aren’t interested. The entire AI industry is built upon a common principle of non-consent.

I can't help but see the spam as more circumstantial evidence of a bubble, where top-down "pump those numbers" priorities override regular process.

The really strange thing is that so much of it doesn't work. Like, I get that the SOTA models perform some tasks quite well and have some real value. But AI being implemented in every corner creates a lot of really bad results. The Shopify code assistant will completely wreck your site and gets basically nothing correct; it will write 100 lines to change the color of a single DIV. The Amazon product Q&A will give you wrong information more often than not.

In what frame of mind is it logical or necessary to put these extremely poorly functioning products into the wild?

  • It's a desperate attempt at staying relevant, even if most of those companies don't realize it yet. Because of its general-purpose nature, AI subsumes products. Most software products that try to "implement AI in every corner" would, from the user's POV, be more useful if they became tools for ChatGPT/Claude/Gemini.

    People's goals are rarely limited to just one software product, and products are basically a bag of tools glued together with UI: they work together but don't interoperate much with anything else. That boundary drawn around a bunch of software utilities is given a name and a fancy logo, then sold or used to charge people rent. That's what software products are. But LLMs want to flip that around - they're good at gluing things, so embedding one within a product is just a waste of model capabilities, and actually makes the product boundary more apparent and annoying.

    Or in short: consider Copilot in Microsoft Word vs. a "Generate Word Document" plugin/tool for a general LLM interface (whether the Gemini webapp, Claude Code, or something like TypingMind). The former is just an LLM locked in a box, barely able to output some text without refusing or claiming it can't do it. The latter is a general-purpose tool that can search the web for you, scrape some sites and run data analysis on the results (writing its own code for this), talk the results over with you, cross-reference them with other sources, and then generate a pretty Word document with formatting and images.

    This is, btw, a real example. I used a Word document generator with TypingMind and GPT-4 via API, and it was more usable over a year ago than Copilot is even now. Partly because Copilot is just broken, but mostly because the LLM can do lots of things other than writing text in Word.
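
    Not the actual plugin, just a minimal sketch of the tool-calling pattern it relies on, assuming the OpenAI Python SDK and python-docx; the tool name and schema here are invented for illustration:

      import json
      from docx import Document  # python-docx
      from openai import OpenAI

      # Hypothetical tool the model can call instead of "writing in Word".
      TOOLS = [{
          "type": "function",
          "function": {
              "name": "generate_word_document",
              "description": "Write a titled .docx file from a list of paragraphs.",
              "parameters": {
                  "type": "object",
                  "properties": {
                      "filename":   {"type": "string"},
                      "title":      {"type": "string"},
                      "paragraphs": {"type": "array", "items": {"type": "string"}},
                  },
                  "required": ["filename", "title", "paragraphs"],
              },
          },
      }]

      def generate_word_document(filename, title, paragraphs):
          doc = Document()
          doc.add_heading(title, level=1)
          for text in paragraphs:
              doc.add_paragraph(text)
          doc.save(filename)

      client = OpenAI()
      resp = client.chat.completions.create(
          model="gpt-4",
          messages=[{"role": "user", "content": "Write up these notes as a Word doc: ..."}],
          tools=TOOLS,
      )
      # If the model chose to call the tool, run it with the model's arguments.
      for call in resp.choices[0].message.tool_calls or []:
          if call.function.name == "generate_word_document":
              generate_word_document(**json.loads(call.function.arguments))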

    Point being, AI is eroding the notion of the software product as something you sell/rent, which threatens just about the entire software industry :).

    • I have been enjoying reading this thread, but with some irony: sure, the spam emails pushing their Lumo LLM private chatbot were a mistake, and I bet they'll stop doing that fast.

      The irony is that Lumo is a separate product, not really tied to the rest of their products except for a common login. Lumo works fine for simple quality-of-life search and question-answering stuff.

      Off topic, but have you tried avoiding the big corporate LLM providers and running local models? The small models just keep getting better, and I find it fun and satisfying to do as much as I can locally.
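
      A minimal sketch of what "local" looks like in practice, assuming an Ollama install serving its default localhost endpoint (the model name is just an example of one you might have pulled):

        import requests

        # Ollama exposes a simple HTTP API on localhost by default.
        resp = requests.post(
            "http://localhost:11434/api/generate",
            json={
                "model": "llama3.2",  # any model pulled locally
                "prompt": "Why is the sky blue?",
                "stream": False,      # return one JSON object, not a stream
            },
            timeout=120,
        )
        print(resp.json()["response"])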

  • "It's difficult to get a man to understand something when his salary depends upon his not understanding it."

    In this case, the thing that's difficult to understand is "AI in everything is shit and nobody wants it."

  • Saw an AI-generated product feature list on Walmart's site that listed a stainless steel rack as microwaveable. If someone can sue McDonald's over hot coffee, I imagine someone burning their house down while microwaving steel could probably sue too. Intelligence of the plaintiff notwithstanding.

    • > while microwaving steel

      There actually are microwave-safe steel objects, it depends on their shapes and conductive paths.

      After all, the whole inner box is already a metal surface being blasted by the microwaves that come in through a small hole...

Agreed. The number of services I use whose apps continually add new marketing preferences defaulted to ‘enabled’, despite the fact that all other preferences are disabled, is disgusting; it's clearly a way for some companies to ignore people's actual preferences.

LinkedIn is one of the worst offenders.

  • Whenever I log in to LinkedIn I get "emails aren't getting through to your main email address".

    1. That's by design, because you spammed the shit out of it. 2. Given that all I do is send them to /dev/null, HOW DO YOU KNOW?

    • They're checking to see whether any of the links they put in the emails are being fetched from their servers. It's stupid, but it works for most people.

      I had a similar situation with SMS messages that were being sent to me with links informing me of status updates. These texts were useful, and I would go over to my real computer to check the web site. Then after a few days the text messages said "It looks like these messages aren't getting through to you, so we'll stop sending them." Which is also stupid, but it works for most people that load the web site on their phone from the SMS link. God help you if you have a dumb-phone.
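
      For the curious, the sender's side of this is trivially simple. A rough sketch of the usual mechanics, using Flask; the routes and token scheme are invented for illustration:

        from flask import Flask, redirect

        app = Flask(__name__)
        seen = set()  # in reality, a database keyed by recipient

        # Every link in the email is rewritten to pass through the sender,
        # tagged with a per-recipient token:
        #   https://mail.example.com/c/abc123
        @app.route("/c/<token>")
        def click(token):
            seen.add(token)  # the address is "live": someone clicked
            return redirect("https://example.com/real-destination")

        # Same idea with an invisible 1x1 image ("tracking pixel"):
        #   <img src="https://mail.example.com/o/abc123" width="1" height="1">
        @app.route("/o/<token>")
        def open_pixel(token):
            seen.add(token)  # the address is "live": the mail was rendered
            gif = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00"
                   b"\xff\xff\xff!\xf9\x04\x01\x00\x00\x00\x00"
                   b",\x00\x00\x00\x00\x01\x00\x01\x00\x00\x02\x02D\x01\x00;")
            return gif, 200, {"Content-Type": "image/gif"}

        # If a recipient's token never shows up in `seen`, the sender
        # concludes the mail "isn't getting through".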

  • Have you noticed certain financial providers sending blatant marketing emails with no unsubscribe option and a comment along the lines of "these emails are not marketing"?

    • The trick is to create a filter to weed out such junk. If a company sends me marketing fluff without an unsubscribe option, it goes in the junk/spam folder, and I may eventually discontinue my account with that service provider altogether.

      That's safe enough, because I periodically check my spam/junk folder to see if legitimate emails got dumped there, so I eventually learn who's a spammer and who's not.
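
      A scripted version of that filter, as a rough sketch using Python's imaplib; the server, credentials, sender list, and "Junk" folder name are all placeholders:

        import imaplib

        OFFENDERS = ["noreply@example-bank.com"]  # known no-unsubscribe senders

        M = imaplib.IMAP4_SSL("imap.example.com")
        M.login("me@example.com", "app-password")
        M.select("INBOX")

        for sender in OFFENDERS:
            # Find everything from this sender still sitting in the inbox.
            _, data = M.search(None, "FROM", f'"{sender}"')
            for num in data[0].split():
                M.copy(num, "Junk")                  # keep it for periodic review
                M.store(num, "+FLAGS", "\\Deleted")  # then clear it from INBOX

        M.expunge()
        M.logout()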