Comment by arjunchint
19 hours ago
The majority of sites don't even expose accessibility features, and for WebMCP you have to expose and maintain internal APIs per page. This opens the site up to abuse/scraping/etc.
That's why I don't see this standard taking off.
Google put it out there to gauge uptake. It's really fun to talk about, but my hot take is it will be forgotten by the end of the year.
Rather, what I think the future looks like is each website having its own web agent that conversationally gets tasks done on the site, without you having to figure out how the site works. This is the thesis for Rover (rover.rtrvr.ai), our embeddable web agent: any site can add an agent that types/clicks/fills just by adding a script tag.
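Concretely, the integration is on the order of injecting one script tag; the URL and filename here are illustrative, not the actual snippet:

    // Illustrative only: the real Rover embed URL/attributes may differ.
    const s = document.createElement("script");
    s.src = "https://rover.rtrvr.ai/embed.js"; // hypothetical embed URL
    s.async = true;
    document.head.appendChild(s);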
> for WebMCP you have to expose and maintain internal APIs per page
Perhaps. I think an API for the session is probably the root concern. Page-specific is nice to have.
You say it like it's a bad thing. But ideally this brings clarity & purpose to your own API design too! Ideally there is joint purpose! And perhaps shared mechanism!
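To make that concrete, here's a minimal sketch of a page exposing a tool, assuming a provideContext-style registration roughly along the lines of the WebMCP explainer — the exact surface and field names are not settled, so treat them as assumptions:

    // Sketch only: WebMCP is a draft, so `agent.provideContext` and
    // these field names are assumptions, not a stable API.
    declare const agent: {
      provideContext(ctx: {
        tools: {
          name: string;
          description: string;
          inputSchema: object; // JSON Schema for the tool's arguments
          execute(args: Record<string, unknown>): Promise<string>;
        }[];
      }): void;
    };

    agent.provideContext({
      tools: [
        {
          name: "add-to-cart",
          description: "Add a product to the current cart",
          inputSchema: {
            type: "object",
            properties: { sku: { type: "string" }, qty: { type: "number" } },
            required: ["sku"],
          },
          // The tool wraps the same internal endpoint the page's own UI
          // already calls -- that's where the shared mechanism comes from.
          async execute({ sku, qty }) {
            const res = await fetch("/api/cart", {
              method: "POST",
              headers: { "Content-Type": "application/json" },
              body: JSON.stringify({ sku, qty: qty ?? 1 }),
            });
            return res.ok ? "added" : `failed: ${res.status}`;
          },
        },
      ],
    });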
> This opens the site up to abuse/scraping/etc.
In general it bothers me that this is regarded as a problem at all. Sites that try to hijack right-click and prevent people from downloading images, or whatever, have been with us for decades. Trying to keep users from seeing the data they want is, generally, not something I favor.
I'd like to see some positive reward cycles begin, where sites let users do more, enable them to get what they want more quickly, in ways that work better for them.
The web is so unique in that users can often reject being corralled and cajoled. That they have some choice. A lot of businesses bring the old app-centric "we determine the user experience" ego to the web, but, imo, there's such a symbiosis to be won by both parties in actually enhancing user agency, rather than waging this war against your most engaged users.
This also could be a great way to avoid scraping and abuse, by offering a better system of access so people don't feel like they need to scrape your site to get what they want.
> Rather what I think will be the future is that each website will have its own web agent to conversationally get tasks done on the site without you having to figure out how the site works
For someone who was just talking about abuse, this seems like a surprising idea. Your site running its own agent is going to take a lot of resources!! Ensuring those resources go to what is mutually beneficial to you both seems... difficult.
It also, imo, misses the idea of what MCP is. MCP is a tool calling system, and usually, it's not just one tool involved! If an agent is using webmcp to send contacts from one MCP system into a party planning webmcp, that whole flow is interesting and compelling because the agent can orchestrate across multiple systems.
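E.g., in sketch form — the tool and server names here are invented for illustration:

    // Invented names; the point is one agent chaining tools that live in
    // two unrelated systems (a contacts MCP server + a WebMCP page).
    interface McpAgent {
      callTool(name: string, args: object): Promise<any>;
    }

    async function planParty(agent: McpAgent) {
      // Tool from a contacts MCP server:
      const contacts = await agent.callTool("contacts.list", { group: "friends" });
      // Tool the party-planning site exposes via WebMCP:
      for (const c of contacts) {
        await agent.callTool("party.invite", { name: c.name, email: c.email });
      }
    }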
Trying to build your own agent is, broadly, imo, a terrible idea that will never let the user wield the connected agency they would want to bring. What's so exciting and interesting about the agent age is that the walls and borders of software are crumbling down, and software is intertwingularizing, is soft & malleable again. You need to meet users & agents where they are, if you want to participate in this new age of software.
> You say it like it's a bad thing. But ideally this also brings clarity & purpose to your own API design too! Ideally there is conjunct purpose! And perhaps shared mechanism!
I update my website multiple times a day. I want to have as much decoupling as possible. Every time I update an internal API, I don't want to have to think about also updating this WebMCP config.
Basically I have to put in work setting up WebMCP, so that Google can have a better agent that disintermediates my site.
> Trying to keep users from seeing what data they want is, generally, not something I favor.
This is literally the whole cat-and-mouse game of scraping and web automation: sites clearly want to protect their moats and differentiators. LinkedIn/X/Google literally sue people for scraping; I don't think they are going to package all this data as a WebMCP endpoint for easy scraping.
Regardless of your preferences/ideals, the ecosystem is not going to change overnight due to hype about agents.
> Your site running its own agent is going to take a lot of resources
A lot of sites already expose chatbots; it's trivial to rate limit and serve a captcha on abuse detection.
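E.g. a per-client token bucket is a dozen lines; the thresholds here are made up:

    // Minimal per-client token bucket; RATE and BURST are made-up numbers.
    const buckets = new Map<string, { tokens: number; last: number }>();
    const RATE = 1;   // tokens refilled per second
    const BURST = 10; // bucket capacity

    function allow(clientId: string): boolean {
      const now = Date.now() / 1000;
      const b = buckets.get(clientId) ?? { tokens: BURST, last: now };
      b.tokens = Math.min(BURST, b.tokens + (now - b.last) * RATE);
      b.last = now;
      if (b.tokens < 1) {
        buckets.set(clientId, b);
        return false; // over the limit: serve a captcha instead
      }
      b.tokens -= 1;
      buckets.set(clientId, b);
      return true;
    }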
But we have OpenAPI at home
OpenAPI is a replacement for web browsing, mostly for businesses. WebMCP nicely supplements your web browsing.
Sadly I do see this slop taking off purely because something something AI, investors, shareholders, hype. I mean even the Chrome devtools now push AI in my face at least once a week, so the slop has saturated all the layers.
They don't give a fuck about accessibility unless it results in fines. Otherwise it's totally invisible to them. AI on the other hand is everywhere at the moment.
This isn’t even MCP, it’s just tools. If it were real MCP, I’d definitely have fun using the “sampling” feature of MCP with people who visit my site…
IYKYK