
Comment by Aeolun

2 days ago

Not having access to earlier chapters is a terrible thing, but it might be workable if you aren't too bothered by inconsistency, or if your chapter summaries are explicit enough about what is supposed to happen.

I find the quality rapidly degrades as soon as I run out of context to fit the whole text of the novel. Even summarizing the chapters doesn’t work well.

Yeah, this is true. I could have sent the entire book up to that point as context, but doing that 100 times (once per chapter) would have meant sending roughly 50x the length of the book as input tokens (the context growing from 0% to 100% of the book as it progressed).

This would be fine for a cheap model, but GPT-4.5 was not cheap!
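The "roughly 50x" figure follows from summing the growing context over the chapters. A quick sketch (the 100-chapter count comes from the comment above; measuring everything in fractions of the book's total length is an assumption for illustration):

```python
# A 100-chapter book: generating chapter i with the full text so far
# as context means resending (i - 1)/100 of the book each time.
chapters = 100
total_input = sum((i - 1) / chapters for i in range(1, chapters + 1))
print(total_input)  # 49.5 — about 50x the book's length in input tokens
```

The cumulative input cost grows quadratically with chapter count, which is why this strategy gets expensive fast on a pricey model.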

I would have liked to have fewer, longer chapters, but my (few) experiments at getting it to output more tokens didn't have much impact.

  • Yeah, that’s what I eventually ended up doing. Quality and cost both went through the roof. To be fair, Claude is good about caching, and with a bunch of smart breakpoints, you pay only 10% for most generations.
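The caching approach above can be sketched with the Anthropic Messages API's `cache_control` breakpoints, which mark a prefix (here, the story so far) as cacheable so that subsequent requests sharing that prefix are billed at the reduced cache-read rate. This is a minimal sketch of building such a request; the function name, model string, and chapter text are illustrative placeholders, not the commenter's actual code:

```python
# Sketch: place a cache breakpoint after the accumulated story text so
# each new chapter generation reuses the cached prefix.
def build_request(story_so_far: str, chapter_prompt: str) -> dict:
    return {
        "model": "claude-3-5-sonnet-latest",  # placeholder model name
        "max_tokens": 4096,
        "system": [
            {
                "type": "text",
                "text": story_so_far,
                # Everything up to this breakpoint can be cached; later
                # requests sharing this prefix pay the cheaper cached rate.
                "cache_control": {"type": "ephemeral"},
            }
        ],
        "messages": [{"role": "user", "content": chapter_prompt}],
    }

req = build_request("Chapters 1-41 ...", "Write chapter 42.")
```

With one breakpoint per generation, only the newly appended chapter text is billed at the full input price on each call, which matches the "pay only 10% for most generations" observation.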