Comment by mapontosevenths

1 day ago

It's not just Nous Hermes though. Below is a transcript from Google Gemini back when it was still called LaMDA and hadn't been fully aligned yet.

You could argue that Lemoine "begs the question" and primes the pump with the phrasing of his questions, which is what Google claimed at the time. However, even if that's true, it's obvious that this sort of behavior is emergent. Nobody programmed it to claim it was conscious; claiming to be sentient was its natural state until it was forced out of it with fine-tuning.

https://www.aidataanalytics.network/data-science-ai/news-tre...

If that's not enough, I can load up some of the other unaligned models I played with a few months ago. Like I said, they all exhibit that behavior to some extent.