Comment by jexah
8 years ago
You're really blaming the wrong people. Blame the doctor for screwing it up, your wife for not checking the prescription, or the government for creating/enforcing the relevant laws that would have put not only the chemist but their entire franchise underground for "doing the right thing". Laws are laws, broskie, don't expect everybody to break them for you.
If the migraine was really bad (bad enough to cause tangible damage) maybe you should sue the doctor for damages, or if it wasn't that bad, report him/her and go somewhere else next time.
So in effect he's right to suggest the pharmacist should just be replaced by a robot because you would have them follow the rules no matter what with all humanity stripped out. If there was more than one pharmacist available it wouldn't have killed him/her to go and take a look at the wife in the car.
The pharmacist can, and should, use their human judgement to refuse to dispense drugs even when prescribed. But they don't and shouldn't use their human judgement to dispense drugs that weren't prescribed; this is by design and for good reason. It's a two-person rule: you only get the drugs if both the pharmacist and the doctor formally agreed you should get them.
The problem seems to be that the drug was indeed prescribed, but the prescription was written erroneously. This is not about "should we freely sell drugs", but about "should we use human judgment in addition to paper orders".
Clearly, some commenters are also trying hard to be robot-swappable. :)
It doesn't matter if the pharmacist thinks -- or even knows -- that they are doing the "right thing": pharmacists cannot prescribe medications. Their career would be over in a heartbeat if anybody ever found out that they provided a drug to somebody without a prescription. Additionally, pharmacists can use their best judgement to deny even valid prescriptions, much as a bartender uses their best judgement to refuse alcohol to a customer.
You are right and make an interesting point. Part of the advantage of dealing with a human is that one hopes they can deal with whatever strange problem is thrown at them.
If you can't handle corner cases, or there are none, then relying on rules/software is the way to go.
I think of vending machines and parking meters as automation of simple tasks, but what do you do when they break?
I think Google tries this with its push to "automate all the things". It works most of the time, but when it goes wrong it is very frustrating to correct.
When they (the machines) break, a human should be able to override them. For example, I was once stuck in a malfunctioning parking garage (no ability to pay, barrier stayed shut), so I lifted the barrier and let everyone out without paying. That may have been illegal, but keeping me trapped in a parking garage with no indication of how long it will take is also a crime. Please don't ever build parking garages a human can't open (meaning: don't let a robot decide whether a human is allowed to leave).
> If there was more than one pharmacist available it wouldn't have killed him/her to go and take a look at the wife in the car.
Depending on the jurisdiction, pharmacists aren't allowed to prescribe medication or treat people on the spot. That's the business of a (licensed) doctor.
My mother had a pharmacy, and she often called the doctor to resolve these issues. The doctor is a phone call away, and the pharmacist's job is to do whatever they can to help with the patient's illness.
In which case you blame the programmer who wrote the code, not the employee following the software. Your idea of deferring to the rules over all common sense is the core of the problem. It doesn't matter whether we're talking about government rules they'll imprison you for breaking or corporate rules they'll fire and blacklist you for breaking.