Slacker News

Comment by nixon_why69

1 month ago

I think the intuition is that the first N layers decode the input into a "thought language," the middle layers think in it, and the last N layers encode that back into the desired output language. So if there are well-defined points where the model transitions between decoding/understanding, thinking, and rendering back to language, those two transition points should sit in the same vector space of "LLM magic thinking language."
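A toy sketch of how you might look for those transition points: if consecutive layers in the same "phase" produce similar hidden states, then the boundaries between decode / think / render should show up as dips in consecutive-layer cosine similarity. Everything here is illustrative and assumed, not from any real model: `layer_transitions` is a made-up helper, and the "hidden states" are synthetic three-phase data standing in for a residual stream.

```python
import numpy as np

def layer_transitions(hidden, k=2):
    """Return the k layer indices where consecutive-layer cosine
    similarity drops the most -- a crude proxy for the boundaries
    between the hypothesized decode/think/render phases.

    hidden: array of shape (num_layers, hidden_dim), one vector
    per layer (e.g. the residual stream at a fixed token position).
    """
    # Normalize each layer's hidden state to unit length.
    norms = hidden / np.linalg.norm(hidden, axis=1, keepdims=True)
    # Cosine similarity between layer i and layer i+1.
    sims = np.sum(norms[:-1] * norms[1:], axis=1)
    # The k sharpest similarity drops are candidate transition points.
    return sorted(np.argsort(sims)[:k].tolist())

# Synthetic "model": 12 layers in three phases, each phase clustered
# around its own random direction in 64 dimensions.
rng = np.random.default_rng(0)
base = rng.normal(size=(3, 64))
hidden = np.vstack([base[0] + 0.01 * rng.normal(size=(4, 64)),
                    base[1] + 0.01 * rng.normal(size=(4, 64)),
                    base[2] + 0.01 * rng.normal(size=(4, 64))])

print(layer_transitions(hidden))  # -> [3, 7], the two phase boundaries
```

On real models people probe something similar with techniques like the "logit lens" (projecting intermediate layers through the unembedding), which is one way to test whether the middle layers really sit in a shared non-output representation space.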

