Comment by pezgrande, 3 months ago (8 comments):
One year ago it was taboo to say you were using an LLM to help you code; today it is the other way around...

  Aurornis, 3 months ago:
  > today is the other way around...
  It is definitely not taboo to say you're writing your own code.

    bdangubic, 3 months ago:
    Could get you fired in more and more places though… :)

      Aurornis, 3 months ago:
      If you only see the world through crazy headlines, this probably seems true.

        lawlessone, 3 months ago:
        Where?

  lazide, 3 months ago:
  This isn't to code. It's to summarize - something LLMs are usually good at, since they're essentially lossy text/knowledge compression at their root.

    int3trap, 3 months ago:
    Yeah... no, it's not.

      AndrewKemendo, 3 months ago:
      This is testable: can you link to another one?