Comment by utopiah

9 hours ago

> make the cost of the signal match what it claims to represent.

Well, that's the WHOLE problem of trust. There is so much work on blockchains (proof of work, proof of stake, etc.) precisely to protect against such attacks, e.g. https://en.wikipedia.org/wiki/Sybil_attack

If you do find a way, it would apply to a lot more than "just" GitHub stars for VCs.

> Well that's the WHOLE problem of trust.

Great point. It all comes down to trust.

Some people are masters at setting attention traps: they manipulate the cheap metrics that ordinary people naturally rely on, confusing potential deal parties and serving their own interests while increasing risk and harming the other side.

> It would apply to a lot more than "just" GitHub star for VCs

Yes. It would apply to a lot more than "just" GitHub stars for VCs — even more so if the 'interactions' are naturally deal-based.

Imagine a proof-of-work metric named DUKI-ALM: if you give away $100 to the world, you gain 100 DUKI-ALM, exactly equal to the cost.

Think of it as tips contributed jointly by the taker and maker of a deal, paid out to the world. The DUKI-ALM signal is the sum of all tips: if they tipped $10, the metric value increases by 10.

Products that can close more deals and generate more surplus will naturally contribute more, and so gain a stronger trust signal. Sybil attacks are prevented by design: what's the point of attacking if you still have to tip the world 100 USDT to gain 100 units of signal?
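The mechanics above can be sketched in a few lines. This is only an illustrative toy, not a real implementation; the class and field names (`DukiAlmLedger`, `pool`, `signal`) are my own assumptions, and the key invariant is just that signal accrued always equals money actually tipped away:

```python
from collections import defaultdict

class DukiAlmLedger:
    """Toy ledger: 1 unit of DUKI-ALM signal per $1 tipped to the world.

    Hypothetical sketch; names and semantics are assumptions, not a spec.
    """

    def __init__(self):
        self.signal = defaultdict(float)  # product -> accumulated trust signal
        self.pool = 0.0                   # total money tipped out to the world

    def record_deal(self, product, taker_tip, maker_tip):
        """Both deal parties tip; signal gained equals the combined tip."""
        if taker_tip < 0 or maker_tip < 0:
            raise ValueError("tips must be non-negative")
        tip = taker_tip + maker_tip
        self.pool += tip              # money actually leaves the parties
        self.signal[product] += tip   # signal equals cost, by construction
        return self.signal[product]

ledger = DukiAlmLedger()
ledger.record_deal("acme-app", taker_tip=6, maker_tip=4)
print(ledger.signal["acme-app"])  # 10.0 — the $10 tipped
print(ledger.pool)                # 10.0 — same amount, paid to the world
```

Note how a Sybil attacker gains nothing here: faking 100 deals still requires paying out 100 real dollars to the pool, so the "attack" is indistinguishable from honest contribution.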

I'd love to hear if you see a hole in this: the cost of the signal matches exactly what it claims to represent.