Comment by daxfohl
18 hours ago
I wonder how the internet would have been different if claws had existed beforehand.
I keep thinking something simpler like Gopher (an early-90s internet protocol) might have been sufficient, even optimal, with little need to evolve into HTML or REST: agents might navigate step-by-step menus and questionnaires more reliably than RPCs built to back GUIs and apps, especially LLMs with smaller contexts that couldn't reliably parse a whole API doc. I wonder if things will start heading in that direction as user-side agents become the more common way to interact with things.
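For anyone who never used Gopher: the whole protocol is basically "send a selector, read back tab-separated menu lines" (RFC 1436). A rough sketch of a client in Python, just to show how little there is to it (the host here is made up):

```python
import socket

def gopher_fetch(host: str, selector: str = "", port: int = 70) -> str:
    """RFC 1436: send a selector terminated by CRLF, read until the server closes."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(selector.encode() + b"\r\n")
        chunks = []
        while data := sock.recv(4096):
            chunks.append(data)
    return b"".join(chunks).decode(errors="replace")

# Each menu line is: <type><display string> TAB <selector> TAB <host> TAB <port>
# e.g. "1Today's news\t/news\tgopher.example.org\t70"  (type 1 = submenu)
for line in gopher_fetch("gopher.example.org").splitlines():  # hypothetical host
    if line == ".":        # a lone dot terminates the listing
        break
    if not line:
        continue
    item_type, fields = line[0], line[1:].split("\t")
    print(item_type, fields[0])  # type code + human-readable label
```

An agent with a tiny context window could walk a tree like that one menu at a time, with nothing to parse but tabs and no API doc required.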
This is the future we need to make happen.
I would love to subscribe to / pay for services that are just APIs. Then have my agent organize them how I want.
Imagine youtube, gmail, hacker news, chase bank, whatsapp, the electric company all being just apis.
You can interact how you want. The agent can display the content the way you choose.
Incumbent companies will fight tooth and nail to avoid this future. Because it's a future without monopoly power. Users could more easily switch between services.
Tech would be less profitable but more valuable.
It's the future we can choose right now by making products that compete with this mindset.
Biggest question I have is maybe... just maybe... LLMs would have had sufficient intelligence to handle micropayments. Maybe we might not have gone down the mass-advertising "you are the product" path?
Like, somehow I could tell my agent that I have a $20-a-month budget for entertainment and a $50-a-month budget for news, and it would just figure out how to negotiate with the nytimes and netflix and spotify (or what would have been their equivalents), which is fine. But it would also be able to negotiate with an individual band that wants to sell its music directly, or an indie game that doesn't want to pay the Steam tax.
I don't know, just a "histories that might have been" thought.
Maybe we needed to go through this dark age to appreciate that sort of future.
This sort of thing is more attractive now that people know the alternative.
Back then, people didn't want to pay for anything on the internet. Or at least I didn't.
Now we can kill the beasts as we outprice and outcompete.
Feels like the 90s.
too easy to skip/strip the ads that way...
Premium accounts?
I don't exactly mean APIs (we largely have that with REST). I mean a Gopher-like protocol that's more menu-based and question-response-based than API-based.
Interesting
Why wouldn't there be monopoly power? Popular API providers would still have a lot of power.
If I can get videos from YouTube or Rumble or FloxyFlib or your mom's personal server in her closet, I can search them all at once, and the front-end interface is my LLM or some personalized interface that excels in its transparency. That would definitely hurt Google's brand.
What is in it _for them_?
Where and how do they make money?
Yesterday IMG-tag history came up, prompting a memory-lane wander. It reminded me that in 1992-ish, pre-`www.foo` convention, I'd create DNS pairs, foo-www and foo-http: one for humans, and one to sling sexps.
I remember seeing the CGI (serve a URL from a script) proposal posted and thinking it was so bad (e.g. the 256-ish character URL limit) that no one would use it, so I didn't need to worry about it. Oops. "Oh, here's a spec. Don't see another one. We'll implement the spec," says everyone. And "no one is serving long URLs, so our browser needn't support them". So no big query URLs during that flexible early period where practices were gelling. Regret.
sexps?
> sexps?
Not the person you're responding to, but I think they mean sexps as in S-expressions [1]: nested lists written with parentheses, like (a (b c) d). They're used in all kinds of programming, and they've been used inside wire protocols too, as in the email protocol IMAP.
[1] https://en.wikipedia.org/wiki/S-expression
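To make the shape concrete, here's a toy sexp reader in Python (the data in the example is made up); the whole thing fits in a dozen lines, which is much of the appeal:

```python
# Toy S-expression reader: turns "(a (b c) d)" into nested Python lists.
def parse(tokens):
    token = tokens.pop(0)
    if token == "(":
        lst = []
        while tokens[0] != ")":
            lst.append(parse(tokens))
        tokens.pop(0)  # drop the closing ")"
        return lst
    return token  # a bare atom

def read(text):
    # Pad parens with spaces so split() tokenizes them.
    return parse(text.replace("(", " ( ").replace(")", " ) ").split())

print(read("(user (name daxfohl) (karma 1234))"))
# ['user', ['name', 'daxfohl'], ['karma', '1234']]
```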
Presumably https://en.wikipedia.org/wiki/S-expression
This sounds very plausible. Arguably MCP (Model Context Protocol) servers are already a step in that direction: give the LLMs a way to use services that is text-based and easy for them. Agents that look at your screen and click on menus are a cool but clumsy and very expensive intermediate step.
When I use Telegram to talk to the OpenClaw instance on my spare Mac, I am already choosing a new interface over whatever was built by the designers of the apps it is using. Why keep the human-facing version as is? Why not make an agent-first interface (which will not involve having to "see" windows), plus a validation interface for the human minder?
Any website could in theory provide API access. But websites do not want this in general: remember the Google Search API? Agents will run into similar restrictions in some cases, just as APIs did. It is not a technical problem IMO, but an incentives one.
The rules have changed though. They blocked API access because it helped competitors more than end users. With claws, end users are going to be the ones demanding it.
I think it means the front end will be a dead end in a year or two.
My point is that the underlying incentives are exactly the same; I don't think the rules have changed at all. If you are Expedia, you could always offer an API to search for hotels, but why commoditize yourself? Same with agents.
Ryanair recently had a court case with a meta-travel website because it was reselling Ryanair flights. Ryanair wants to sell you the insurance and extras, and it can only do that by controlling the experience.
My prediction is that, as with APIs, there will be some years of extra access for agents, followed by a lock-down into moats around their own experience.
"End users" currently being people spending hundreds or thousands of dollars to set up custom, brittle workflows: a grand total of a few thousand people globally. Let's not make this into something it's not. Personally, I lost all trust in Karpathy with his hyping of Clawdbot as some sci-fi future when all it was was people prompting LLMs to go write Reddit posts.
Can you explain how Google Search API fits into your point? I don't know enough about it
If I want to use Google Search in an automated way, Google does not want that; they prefer to show me ads. This applies to APIs and agents alike. If Google doesn't want it, they will add friction by removing API access or making agents difficult to use (fingerprinting, 2FA, captchas, etc.).
> if claws had existed beforehand.
My take is that that's literally not possible. But of course that's just intuition.
The dataset used to train LLMs was scraped from the internet. The data was there mainly because of the user expansion driven by the www, and because of the telco infrastructure laid during and after the dot-com boom, which enabled those users to access the web in the first place.
The data labeling that underpins the actual training, done by masses of laborers on websites, could not have scaled as massively and cheaply without the www having been deployed globally over affordable telecom infrastructure.