CDP is a good example of why the ecosystem has converged around Chrome and not Firefox. CDP has:
- Full documentation
- A stable API
- Tooling like this
Firefox has none of that: implementing the Firefox DevTools protocol means reverse engineering it, and even then it sometimes breaks when Firefox updates!
Chrome's/Google's attack on ad blockers is why 'the ecosystem' is now shifting away from Chrome to Firefox.
I haven't used Firefox's devtools protocol, but CDP is one of the worst protocols I've had to work with: everything is marked "experimental", there are inconsistencies between domains, multiple ways to do the same thing, and it leaks internal implementation details.
A recent attempt to bring up an element in Firefox's inspector, by right-clicking the element and selecting the appropriate menu item, sent the fans on my machine spinning and ended with the process needing to be killed. It hung while I was trying to get a relatively simple, interactive view of the DOM tree on screen, and it responded by chewing through RAM.
When I tried it again, I observed a 130MB increase in RAM just to bring up the initial window/view, along with noticeable lag before its contents were on screen and the controls interactive. When I then collapsed all the nodes, so that the only nodes toggled open were the HTML body element and its ancestors, it consumed 400MB more: to collapse tree nodes and show fewer things on the screen.
That's half a gigabyte to bring up a less usable tool than the original DOM Inspector that Joe Hewitt checked in to the mozilla.org CVS server back in 2001.
The fact that Firefox's devtools team has ignored the readily available information and guidance in Firefox's own repo about how to build large JS codebases, in favor of wholesale importing all of the bad practices of the NodeJS/NPM world, is a serious problem unto itself.
FYI it's behind a feature flag (aka experiment). Just in case you rarely use experiments in DevTools:
DevTools -> Settings (cog, top right) -> Experiments -> Search for "Protocol Monitor" -> Check the checkbox
Can anyone tell me some use cases for raw CDP commands?
In which situations is it preferable to use CDP commands over Puppeteer?
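For context: a CDP command is just a JSON message with an `id`, a `method` such as `Runtime.evaluate`, and `params`, sent over a WebSocket; the browser's reply echoes the same `id`. Puppeteer wraps this framing for you, so raw CDP is mainly useful when you need a domain Puppeteer doesn't expose. A minimal sketch of the framing itself, which is what the Protocol Monitor displays (the WebSocket transport is elided here):

```python
import itertools
import json

# Monotonically increasing message ids, as CDP requires for matching
# replies to commands.
_ids = itertools.count(1)

def cdp_command(method, **params):
    """Frame a CDP command as the JSON that would go over the wire."""
    return json.dumps({"id": next(_ids), "method": method, "params": params})

# Runtime.evaluate is a real CDP method; this is exactly the kind of
# frame the Protocol Monitor shows in its request column.
msg = cdp_command("Runtime.evaluate", expression="1 + 1", returnByValue=True)
print(msg)
```

The reply would be a JSON object carrying the same `id` and a `result` payload; events pushed by the browser use the same shape but have no `id`.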
very nice. there are things that won't show up in the network panel, and you have to resort to a proxy for debugging, especially when you are using puppeteer/etc for testing.
The absolute safest way to grab all traffic is capturing a netlog (https://www.chromium.org/developers/design-documents/network...). Unfortunately there have been a lot of quirks with network captures through devtools, and even with some of the recent bugfixes (https://issues.chromium.org/issues/40254754) they can still be lossy.
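In case it's useful: a netlog is captured by launching Chrome with `--log-net-log=/path/to/netlog.json`. In the captures I've seen, the dump is a JSON object with a `constants` table and an `events` array whose entries carry a numeric `type` field, though the format is versioned, so treat that layout as an assumption and check your own dump. A quick sketch for counting events in such a capture:

```python
import json
from collections import Counter

def summarize_netlog(path):
    """Count netlog events by numeric event type.

    Assumes the capture is a JSON object with an "events" array whose
    entries have a numeric "type" field -- an assumption about the
    netlog layout, not a guarantee across Chrome versions.
    """
    with open(path) as f:
        log = json.load(f)
    return Counter(event["type"] for event in log.get("events", []))
```

Note this loads the whole capture into memory; netlogs from long sessions get large, and a capture from a browser that didn't shut down cleanly may be truncated JSON, so a streaming parser is a better fit for serious use.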
Is this supported through the chrome dev tools mcp server?
So many people building AI browsers definitely had this as an internal tool already lol, nice to see Chrome leaning in here; CDP is a huge pain to write and debug
I think they released this sometime in 2023; it's been hidden behind an experiment in the devtools settings.
Very useful. Does anything similar exist for Firefox?
I would look at the Browser Console if I were trying to do something similar
https://firefox-source-docs.mozilla.org/devtools-user/browse...
in fact this is similar to something I wanted to do recently that I was calling JIT automation (using FF's Browser Console), but when I wanted to write an article about it, they weren't too interested because it was FF-specific.
Servo has a third-party script to dump the traffic: https://book.servo.org/hacking/developing-devtools.html#capt... But there's nothing built-in.
Awesome, debugging is gonna be way easier from here on out.