Comment by IncreasePosts

8 hours ago

Wouldn't an LLM that just tokenized by character be good at it?

I asked this in another thread, and the answer was that it would only be better with unlimited compute and memory.

Without those, the LLM has to spend far more parameters learning patterns a subword tokenizer would give it for free, and the same text eats up a much larger share of the context window.
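The context-window point is easy to see with a quick sketch: the same sentence needs several times more tokens under character-level tokenization than under any word-or-subword scheme. (Toy comparison; whitespace splitting stands in for a real subword tokenizer, which typically averages around 4 characters per token.)

```python
# Compare token counts for the same text under character-level
# tokenization vs. a crude word-level stand-in for subword tokens.

text = "Wouldn't an LLM that just tokenized by character be good at it?"

char_tokens = list(text)     # character-level: one token per character
word_tokens = text.split()   # rough proxy for a subword tokenizer

print(len(char_tokens))  # 63
print(len(word_tokens))  # 12
```

With a fixed budget of, say, 8192 tokens, that ~5x blowup means a character-level model fits about a fifth as much text into its context.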

So in theory it would be better, but in practice maybe not by much.