Comment by pyman

8 months ago

Initially, it had the same effect on people until they got used to it. In the near future, whether the text is yours or not won't matter. What will matter is the message or idea you're communicating. Just like today: it doesn't matter if the code is yours, only the product you're shipping and the problem it's solving.

Code is either fit for a given purpose or not. Communicating through an LLM instead of directly with the desired recipient may be considered fit for purpose by the receiving party, but it's not for the LLM user to say what the goals of the writer are, nor is it for the LLM user to say what the goals of the writer ought to be. LLMs for communication are inherently unfit for purpose for anything beyond basic yes/no and basic autocomplete. Otherwise I'm not even interacting with a human in the loop except before they hit send, which doesn't inspire confidence.

I think just looking at information transfer misses the picture. What's going to happen is that my Siri is going to talk to your Cortana, and that our digital secretaries will either think we're fit to meet or we're not. Like real secretaries do.

You largely won't know such conversations are happening.

Similar-looking effects are not the "same" effect.

"Change always triggers backlash" does not imply "all backlash is unwarranted."

> What will matter is the message or idea you're communicating. Just like today: it doesn't matter if the code is yours, only the product you're shipping and the problem it's solving.

But like the article explains about why it's rude: the less thought you put into it, the less chance the message is well communicated. The less thought you put into the code you ship, the less chance it will solve the problem reliably and consistently.

You aren't replying to "don't use LLM tools"; you're replying to "don't just trust and forward their slop blindly."

Doesn't matter today? What are you even talking about? It completely matters if the code you write is yours. The only people saying otherwise have fallen prey to the cult of slop.

  • Why does it matter where the code came from if it is correct?

    • I really hope you're not a software engineer saying this. But here's just a lightning round of issues:

      1. code can be correct but non-performant, be it in time or space. A lot of my domain is fixing "correct" code so it's actually of value.

      2. code can be correct, but unmaintainable. If you ever need to update that code, you are adding immense tech debt with code you do not understand.

      3. code can be correct, but not fit standards. Non-standard code can be anywhere from harder to read, to subtly buggy with some gnarly effects farther down the line.

      4. code can be correct, but insecure. I really hope cryptographers and netsec aren't using AI for any more than generating keys.

      5. code can be correct, but not correct in the larger scheme of the legacy code.

      6. code can be correct, but legally vulnerable. A rare, but expensive, edge case that may come up as courts catch up to LLMs.

      7. and lastly (though certainly not finally), code can be correct, but people can be incorrect, change their whims and requirements, or otherwise add layers to navigate through while making the product. This loops back to #2, but it's important to remember that as engineers we are working with imperfect actors under non-optimal conditions. Our job isn't just to "make correct code"; it's to navigate the business and keep everyone aligned on the mission from a technical perspective.
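      For what it's worth, point 1 is trivial to demonstrate. Both functions below are "correct" (a toy Python sketch; the names and scenario are mine, not from the thread), but the first does quadratic work where linear suffices, which is exactly the kind of fix the parent is describing:

```python
# Both functions return the right answer for every input,
# so both are "correct" -- but they are not equally fit for purpose.

def has_duplicates_slow(items):
    # O(n^2): compares every pair of elements.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_fast(items):
    # O(n): a set gives average O(1) membership checks.
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

      A reviewer who only asks "does it return the right answer?" accepts either version; a reviewer who owns the code asks which one they want running at scale.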

    • Why does it matter where the paint came from if it looks pretty?

      Why does it matter where the legal claims came from if a judge accepts them?

      Why does it matter where the sound waves came from if it sounds catchy?

      Why does it matter?

      Why does anything matter?

      Sorry, I normally love debating epistemology but not here on Hacker News. :)
