Comment by LeafItAlone
12 hours ago
That’s a great example, and I understand it was intentionally simple, but it highlighted how LLMs need care in use. Note that this example is very relevant to NLP:
My prompt: `<<I want a flight from portland to cuba after easter>>`
The response:

```
{
  "origin": ["PDX"],
  "destination": ["HAV"],
  "date": "2025-04-01",
  "departure_time": null,
  "preferences": null
}
```
Of course I meant Portland, Maine (PWM); Cuba has more airport options than just HAV; and it got the date wrong, since Easter is April 20 this year.
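The ambiguity issues can also be caught deterministically after the model responds, rather than in the prompt. A rough sketch (the airport table and function name are illustrative, not a complete dataset):

```python
import json

# Hypothetical post-processing for the bot's extracted slots. The lookup
# table below is an assumption -- a real system would use a full IATA dataset.
CITY_AIRPORTS = {
    "portland": ["PDX", "PWM"],            # Oregon vs. Maine
    "cuba": ["HAV", "VRA", "SNU", "HOG"],  # Havana is not the only option
}

def ambiguous_slots(response_json: str) -> dict:
    """Flag fields where the model picked one airport for a multi-airport city."""
    slots = json.loads(response_json)
    flagged = {}
    for field in ("origin", "destination"):
        codes = set(slots.get(field) or [])
        if not codes:
            continue
        for city, airports in CITY_AIRPORTS.items():
            # A strict subset means the model silently dropped alternatives,
            # so the bot should ask the user to confirm before booking.
            if codes < set(airports):
                flagged[field] = airports
    return flagged
```

Any flagged field then becomes a clarifying question back to the user instead of a silent default.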
If the business stakeholders came back with that scenario, I would modify the prompt like this. You would know the user's address if they had an account.
https://chatgpt.com/share/678c1708-639c-8010-a6be-9ce1055703...
OK, but that only fixed one of the three issues.
The first one is easy: you could give it a list of holidays and dates. For the rest, you would just ask the user to confirm the information ("Is this correct?"). If they say "No," ask them which part isn't correct and let them fix it.
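Rather than hardcoding a holiday list into the prompt, you could compute the dates server-side and inject only the relevant ones. Easter, for example, is pure arithmetic; a sketch using the standard anonymous Gregorian computus algorithm:

```python
def easter_date(year: int) -> tuple[int, int]:
    """Return (month, day) of Easter Sunday via the anonymous Gregorian computus."""
    a = year % 19                 # position in the 19-year Metonic cycle
    b, c = divmod(year, 100)      # century and year-within-century
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30   # epact-based offset
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451
    month = (h + l - 7 * m + 114) // 31
    day = (h + l - 7 * m + 114) % 31 + 1
    return month, day

# April 20 for 2025, matching the date the model got wrong above.
print(easter_date(2025))  # -> (4, 20)
```

That keeps the prompt short while still giving the model (or the post-processing code) the ground-truth date.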
I would definitely assume someone wanted to leave from an airport close by if they didn’t say anything.
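That default is straightforward once the account address is geocoded: pick the candidate airport nearest to the user. A minimal sketch (coordinates and the two-airport table are illustrative):

```python
import math

# Hypothetical lookup: approximate coordinates for the two Portlands' airports.
AIRPORTS = {
    "PDX": (45.589, -122.597),   # Portland, OR
    "PWM": (43.646, -70.309),    # Portland, ME
}

def haversine_km(a: tuple, b: tuple) -> float:
    """Great-circle distance in kilometers between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(h))

def nearest_airport(user_coords: tuple) -> str:
    """Default origin: the candidate airport closest to the user's address."""
    return min(AIRPORTS, key=lambda code: haversine_km(user_coords, AIRPORTS[code]))
```

A user with a Maine address would then default to PWM, and the bot only needs to ask when no address is on file.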
You don’t want the prompt to grow too much, but you do have analytics you can use to improve it.
In the case of Connect, you define your logic using a GUI flowchart builder called a contact flow.
BTW: with my new prompt, it did assume the correct airport for `<<I want to go to Cuba after Easter>>`.