Comment by nurumaik

1 day ago

Just manually review top K websites and ban such garbage?

Sometimes a dumb, brute-force, biased solution can work way better than any automation you can come up with

I think that'd be a good approach. There was an idea at the time that everything had to be algorithmic, that hand-ranking or singling out individual sites was a bad idea. I.e., tweak a generic algorithm and then test what it does to the top garbage sites; don't just penalize a site directly. I think that's not a bad principle, but in practice it didn't seem to work well.

Yes, but that doesn't scale.

  • Scale to what? You could manually look at the top million searches each year with four average college graduates.

    • In all languages of the world? Is that realistic? How could you know whether something in Vietnamese is spam or not?

      In my opinion it could work, but you'd definitely need 40-80 people, not 4 (see the back-of-envelope sketch below).

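      A quick back-of-envelope check of these headcounts (a sketch in Python; the searches-per-year figure is from the thread, but the working days, hours, and team sizes are illustrative assumptions):

        # Back-of-envelope: how many seconds each reviewer gets per search.
        # Only SEARCHES_PER_YEAR comes from the thread; the rest are assumptions.
        SEARCHES_PER_YEAR = 1_000_000  # "top million searches each year"
        WORK_DAYS = 250                # assumed working days per year
        HOURS_PER_DAY = 8              # assumed working hours per day

        def seconds_per_review(reviewers: int) -> float:
            """Seconds one reviewer can spend per search, given the team size."""
            searches_each = SEARCHES_PER_YEAR / reviewers
            seconds_available = WORK_DAYS * HOURS_PER_DAY * 3600
            return seconds_available / searches_each

        for team in (4, 40, 80):
            print(f"{team:>2} reviewers: ~{seconds_per_review(team):.0f}s per search")
        # 4 reviewers: ~29s per search (a glance, at best)
        # 40 reviewers: ~288s (~5 min); 80 reviewers: ~576s (~10 min)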

  • Not everything needs to be computerized and completely automated. You're allowed to just... use human labor.

    Sure, if every phone call requires an operator, that doesn't scale. Okay, so now we have automatic routing. But having a receptionist to route company requests, which may be complex and may not map neatly, still makes sense. It's a human operator, and you pay them. So what? It's a small expense.