
Comment by aleph_minus_one

8 days ago

Couldn't you simply increase the temperature of the model to somewhat mitigate this effect?

I kind of think of that as just increasing the standard deviation. It's been a while since I experimented with this, but I remember trying a temperature of 1 and getting gibberish, like base64 gibberish. So something like 0.5 doesn't necessarily solve the problem: relative to a lower temperature it just flattens the distribution, making the output less coherent and surfacing rarer tokens, while the underlying ranking of tokens stays the same.
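For concreteness: sampling temperature divides the logits by T before the softmax, so T > 1 flattens the distribution toward uniform and T < 1 sharpens it, but the relative ordering of tokens never changes. A minimal sketch with made-up logits (the values are hypothetical, just to show the effect):

    import math

    def softmax(logits, temperature=1.0):
        # Divide logits by T, then normalize. T > 1 flattens the
        # distribution; T < 1 sharpens it. The ranking of tokens
        # is unchanged either way.
        scaled = [l / temperature for l in logits]
        m = max(scaled)  # subtract the max for numerical stability
        exps = [math.exp(s - m) for s in scaled]
        total = sum(exps)
        return [e / total for e in exps]

    logits = [4.0, 2.0, 0.5]  # hypothetical next-token logits
    for t in (0.5, 1.0, 2.0):
        print(t, [round(p, 3) for p in softmax(logits, t)])

Running it shows T = 2 shifting probability mass toward the rarer tokens while the top token stays on top, which is the "flatter, but same underlying distribution" point above.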

When applied to insightful writing, that is much more likely to dull the point than to preserve or sharpen it.