Artificial Intelligence, English

Addiction Prevention Algorithm, Ethical Issues

Wired Magazine, in what is hands down the most compelling article of this month, exposes a disturbing facet of medical sexism: a woman suffering from advanced endometriosis is denied pain medication because of algorithmic prescreening for addiction risk.

The Pain Was Unbearable, So Why Did the Doctors Turn Her Away?

The NarxCare algorithm screens pain patients based on their health data and prescription history to determine who is entitled to pain management and who is supposed to tough it out (for their own good).

In this case, the patient had a long history of legally prescribed Percocet, which prompted the algorithm to flag her as being at risk of developing addiction. As a consequence, after three days of receiving intravenous pain meds at the hospital, she was taken off the meds on the fourth day and thrown out.

I am shocked, but not in the least surprised. This is a perfect example of algorithmic bias perpetuating deeply rooted medical sexism and racism. African Americans and Canadians are particular targets of such algorithms.

Let me be a broken record and remind you of the dangers of data sharing. NarxCare will scan your health data and prescription history, other algorithms will scan your Facebook and Instagram, and together they will profile the hell out of you. With fewer healthcare workers, algorithms will be ever more present in hospitals. Absurd consequences are to be expected.

It is important to remember that you have a certain degree of control over what you disclose. Even in Quebec, where the government tries to centralize all data (and nobody ever prescribes Percocet; we are a clean province), if you need to see a doctor it will rarely be the same one. Even though your prescription history is centralized in a provincial database, you have to recite it to every new doctor you see, be it in private practice or the emergency room; otherwise they have no access to it. I was surprised to find out that even the CNESST has no access to workers' prescription history. So it depends entirely on you and what data you decide to hand over. If you forget to report a prescription or a health supplement, algorithms won't know about it and won't mine it.

In the ancient (prepandemic) times, some people used to lie about how many ounces of alcohol they drank per week. In Quebec, the Société des alcools is a government entity that holds a monopoly over all alcohol sales in the province and taxes alcohol as heavily as possible to make sure people drink less. To trace sales of specific products to specific customers, the government partners with a third party that issues a rewards card called Inspire (I realize how comical this paragraph is becoming), which grants you points based on your alcohol purchases. If you have and use that card, you have handed your entire drinking history over to the government. So the same government that has your prescription history also has your drinking habits at hand, and all of it is fed into AI for analytics. Just saying.

Lastly, if you care about the planet, you should stick to health supplements as much as possible, since prescription drugs are all tested on animals. I love bringing up health supplements with medical doctors and watching their faces …