You can ask it to summarize and write thoughtful responses. Based on how humans have written about their feelings, GPT spits out responses that read like a reflection
it's similar. it's a distinction of what a fact is... the full sentence may not be a fact, but it is a statistical fact that word X[n] most often follows word X[n-1], and most often follows the pair X[n-2] X[n-1], etc.
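A toy sketch of the kind of statistic being described: a bigram count over a made-up corpus. Real models condition on far longer contexts with learned weights rather than raw counts, but the "word X most often follows word Y" claim is the same idea in miniature.

```python
from collections import Counter, defaultdict

# Hypothetical stand-in for training data.
corpus = "the cat sat on the mat and the cat ate the fish".split()

# Count which word follows each word -- a bigram model, the simplest
# version of the "word X most often follows word Y" statistic.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

# The "statistical fact": after "the", the most frequent next word.
print(follows["the"].most_common(1)[0])  # ('cat', 2)
```

Sampling from those counts instead of always taking the most frequent word is roughly what "temperature" controls in real systems.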
From the very nature of how these systems operate, they aren't capable of "reflecting" on anything.
Assuming your assertion is correct, I still don't think that anything that isn't "reflecting" must be "regurgitating".
How are you so sure of the nature of these systems? Are you just speculating or regurgitating other peoples' speculations of these systems?