Comment by lesostep
12 hours ago
Had a friend in a similar situation. She got a clearly LLM-generated ticket that didn't make any sense, and was directed to question anything about that ticket.
Apparently, asking "why doesn't it make any sense?" wasn't polite.
If I remember correctly, she came up with ~200 questions for a 2-page ticket. I helped write some of them, because for parts of the word salad you had to come up with a meaning first and then question that meaning.
You know what happened after she presented it? The ticket got rewritten as a job requirement, and now they're seeking some poor sod to make it make sense lol
You'd have to be very unqualified to even get through the interview for that job without asking questions about it, I feel. Truly, an AI-generated job for anyone who is new to the field
The first question should have been "Was this ticket AI-generated?".
Oh, it was! But the guy that generated it insisted that he triple-checked the prose after, and it should be treated as typed by hand
I'm pretty sure it would have been okay to stop at 5-10 questions, because it was clear he couldn't answer any. But my friend has a spiteful streak, so she went for the humiliation angle of asking for as much clarification as the ticket itself allowed
I'm in a very similar situation. Except it isn't even a ticket, just an export of a very long "conversation" with ChatGPT, with a vague indication that this is what needs to be implemented. When questioned about it, the person insists they completely understood it before but just forgot after a few days. Sometimes the prompts are removed. There's a lot of contradictory material in it, and some of it doesn't make sense even in context. It's very difficult to figure out what is wanted.