Comment by echelon

8 hours ago

No. This is a CEO expressing righteous indignation about a company that provides (seemingly) little value and has almost no competition.

Slack won't open up their data moat to AI, which is shameful. And Slack costs way too much. If there were any competitors, the price would drop significantly. It's not like chat is a hard problem. And Slack's app is an absolute bear.

>> "almost no competition"

>> "costs way too much"

>> "It's not like chat is a hard problem"

Surely these statements can't all be true. Since Slack is expensive and has little competition, I think chat is a harder problem than you think.

  • It's not hard. It's capital intensive with a low profit margin. So it doesn't attract a lot of competition, because you can make more money in other ways that have moats. There are at least a dozen other chat apps, some of which are decades old.

    To have a successful chat business, you need the network effect of lots of users (big marketing spend), you need lots of capital for operations (big spend on disks and compute), and after all that you get only a few dollars per user. It's just not a great business on the balance sheet. Notice that quality software doesn't even get a mention in this niche.

    • > It's just not a great business on the balance sheet.

      I think that's probably what makes it hard.

    • You can offload the cost of operations to the end user if you’re B2B. Sell the software as licenses the old-school way and offload the cost by allowing users to run their own instances, either on-prem or in the cloud.

You’re saying it’s an easy problem with an expensive solution and yet there’s no competition? Seems there must be more to it because that makes little sense to me.

> Slack won't open up their data moat to AI, which is shameful.

Ah yes. It's shameful that Slack won't open up its data moat to AI. You know, those millions of chats (including private data) by people who didn't give consent to this.

  • > You know, those millions of chats (including private data) by people who didn't give consent to this

    I'm pretty sure the company you work for owns your work chat, and that what you say on company slack constitutes business information.

    There are a lot of things people don't consent to. Being born. Breathing in the air molecules that come from other people's bodies. Looking at ugly things. Hearing annoying sounds. It'll be okay.

    • > It'll be okay.

      Could there ever exist anything that wouldn't be okay? What's the difference between something that will be okay and something that won't? I'm guessing the things that won't be okay are the things that might pose an obstacle for AI "progress".

    • In general, the companies are the ones showing reluctance, much more than their employees. There's still a morass of unanswered security, privacy, and legal questions about LLM use in general, not to mention the huge unknown of total lifecycle costs.

    • > I'm pretty sure the company you work for owns your work chat, and that what you say on company slack constitutes business information.

      That’s not a valid argument. The company itself would still need to consent.

    • It's amazing how every reply failed to realize that you (and the post) were talking about (a) enterprise Slack usage and (b) AI use by the company itself.

    • > I'm pretty sure the company you work for owns your work chat, and that what you say on company slack constitutes business information.

      It does. And a lot of this information is highly sensitive. Imagine my company's surprise if Slack stopped being "shameful" and just opened up its data moat to AI.

      > There are a lot of things people don't consent to. Being born.

      Demagoguery and non sequiturs are not arguments.

      But I guess that's what passes for "arguments" for AI maximalists.