eBay is hyper-aggressive about fingerprinting; they will catch things like this trivially. Browsers leak all sorts of information, like which sockets are open on localhost, so making yourself look like an actual person is very challenging against someone motivated to detect you.
LLMs don't need browser automation though. Multimodal models with vision input can operate a real computer with "real" user inputs over USB, where the computer itself returns a real, plausible browser fingerprint because it is a real browser being operated by something that behaves humanly.
But will they behave like the same user did in the past? I would guess there is a lot of difference between how a bot accesses pages and how a real user has historically accessed them. Like opening multiple tabs at a time, how long it takes to go through the next set of results, how they navigate, and so on.

There might be a lot of modelling that could be done simply based on the words used in searches and the behaviour of opening pages. All trivially tracked to the user's logged-in session.
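As a toy illustration of that kind of modelling (purely hypothetical, not anything eBay is known to run): even something as crude as the variance of inter-event timing separates a metronomic script from bursty human input.

```python
import statistics

def timing_anomaly_score(gaps_ms):
    """Coefficient of variation of inter-event gaps.
    Scripted clients tend to fire events at near-constant
    intervals (score near 0); real human input is bursty
    (score closer to or above 1)."""
    mean = statistics.mean(gaps_ms)
    return statistics.stdev(gaps_ms) / mean

bot_gaps = [200, 201, 199, 200, 202, 198]      # metronomic clicks
human_gaps = [130, 900, 240, 3100, 180, 650]   # bursty, with long pauses

timing_anomaly_score(bot_gaps)    # ~0.007
timing_anomaly_score(human_gaps)  # ~1.3
```

A real system would fold in dozens of signals (search vocabulary, tab patterns, session history), but the shape is the same: score deviation from the account's past behaviour.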
Sure, the cost of that goes way up though, especially if it has to emulate real-world inputs like a mouse, type in a way that's plausible, and browse a website in a way that's not always the direct happy path.
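For a sense of what "plausible" inputs involve, here's a rough sketch (illustrative only, not any particular tool's technique): a curved mouse path with jitter, and keystroke gaps drawn from a lognormal so most are near the typist's pace with occasional long pauses.

```python
import math
import random

def human_mouse_path(start, end, steps=30):
    """Quadratic Bezier from start to end with a random
    off-axis control point plus per-step jitter, so the
    pointer arcs and wobbles instead of moving in a line."""
    (x0, y0), (x1, y1) = start, end
    cx = (x0 + x1) / 2 + random.uniform(-100, 100)
    cy = (y0 + y1) / 2 + random.uniform(-100, 100)
    path = []
    for i in range(steps + 1):
        t = i / steps
        x = (1 - t) ** 2 * x0 + 2 * (1 - t) * t * cx + t ** 2 * x1
        y = (1 - t) ** 2 * y0 + 2 * (1 - t) * t * cy + t ** 2 * y1
        path.append((x + random.gauss(0, 1.5), y + random.gauss(0, 1.5)))
    return path

def typing_delays(text, wpm=70):
    """One delay (in seconds) per keystroke, lognormally
    distributed around the typist's average pace."""
    base = 60.0 / (wpm * 5)  # seconds per character at that WPM
    return [random.lognormvariate(math.log(base), 0.5) for _ in text]

path = human_mouse_path((0, 0), (500, 300))
delays = typing_delays("hello world")
```

Even this is only the cheap part; matching a specific account's historical rhythms is the expensive bit.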
Probably less about direct enforcement, more about liability after the fact. eBay doesn't want to deal with chargebacks for hallucinated purchases.
Yeah, they're hedging against "AI purchases". eBay has already been dealing with automated/bots for years.
> eBay doesn't want to deal with chargebacks for hallucinated purchases
A chargeback doesn't mean the buyer always wins. Imagine if credit card companies also passed a rule: "LLM or AI purchases are non-refundable."
On a different note: once I tried to cancel an eBay order within a minute, and both eBay and the seller declined. It's so fked up with them.
If Amazon has not defeated Perplexity yet, eBay is not going to stop anyone.
This. These kinds of "rules" are basically useless because they are not enforceable. It's exactly like having speed limits but no cops.
> Impossible to enforce
Maybe, but a policy's or law's validity or importance are not contingent on them being enforceable.
No, but when instituting bullshit policies or trying to regulate natural, normal behavior for selfish gain, it helps you if you can enforce the policy; otherwise people will just ignore it.