Comment by Jimmc414

1 year ago

This is nice. I created something similar, https://github.com/jimmc414/1filellm

It converts papers, repositories, PRs, YouTube transcripts, and web docs into a single text file, copied to the clipboard, for LLM ingestion.
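
A minimal sketch of the basic idea for the repository case (not the actual 1filellm code; file extensions and the pyperclip dependency are assumptions): walk a local checkout, concatenate the text files into one blob, and put it on the clipboard.

```python
# Rough sketch, not the real 1filellm implementation: dump a local repo
# into one text blob and copy it to the clipboard for pasting into an LLM.
from pathlib import Path

import pyperclip  # pip install pyperclip

TEXT_SUFFIXES = {".py", ".md", ".txt", ".rst", ".toml", ".cfg"}  # illustrative subset

def repo_to_text(repo_dir: str) -> str:
    parts = []
    for path in sorted(Path(repo_dir).rglob("*")):
        if path.is_file() and path.suffix in TEXT_SUFFIXES:
            parts.append(f"\n--- {path} ---\n")          # header so the model knows file boundaries
            parts.append(path.read_text(errors="ignore"))
    return "".join(parts)

if __name__ == "__main__":
    blob = repo_to_text(".")
    pyperclip.copy(blob)  # one paste-able chunk of context
    print(f"Copied {len(blob):,} characters to the clipboard")
```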

This stuff sounds cool, but doesn't it quickly run into token/context limits on the models?

  • Not so much anymore with the subscription LLM offerings. Claude seems to allow 70k tokens or more in its paid UI, ChatGPT seems to allow about half of that, and custom GPTs allow well over 100k.
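
    If you want to check whether a dump actually fits before pasting, a quick token count works; a hedged sketch using tiktoken (the `cl100k_base` encoding and the 100k budget are assumptions, since encodings and limits vary by model):

    ```python
    # Sanity-check a concatenated dump against an assumed context budget.
    import tiktoken

    def fits_in_context(text: str, budget: int = 100_000) -> bool:
        enc = tiktoken.get_encoding("cl100k_base")   # encoding choice is an assumption
        n_tokens = len(enc.encode(text))
        print(f"{n_tokens:,} tokens vs budget of {budget:,}")
        return n_tokens <= budget
    ```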