Do You Trust AI With Your Health Care?


I’ve only been in an ICU once as a patient. I developed sudden chest pain and went to the ER. The funny thing about chest pain is that it gets you right to the front of the line at the ER. On a previous visit, for a deep cut on my hand, I had to sit with it over a garbage can so I wouldn’t get blood all over the floor. As it later turned out, the chest pain was actually caused by an ulcer that was creating reflux. The ulcer was caused by my job, but that’s another story. I did spend a night in the ICU, though. My favorite memory was of my wife, who worked on that very unit, telling all her co-workers, “If he starts to snore, just close the door.” That’s what’s known as medical humor. I learned one thing during my night in the ICU: unless you are unconscious, you don’t get any rest. I was checked on constantly, monitored, and even given a shot at one point. I was glad to be discharged so I could go home and get some rest. But I was still impressed. My nurses were on top of their game the entire time.



In addition to earning a degree, nurses go through regular training and take continuing education courses. And after five or six years, they develop a certain savvy when it comes to spotting problems with their patients. It is something that only comes with experience. It may be something obvious like pale skin, or a gradual increase or decrease in heart rate or respirations. Even a moment of confusion in a patient can tell a nurse volumes. With that in mind, how willing are you to turn your care over to artificial intelligence? Maybe an algorithm can help build a car or write a term paper, but can it properly detect a developing and potentially life-threatening problem?

That’s the issue facing hospitals and caregivers today. Writing in the Wall Street Journal, Lisa Bannon talks about the increasing role of AI in monitoring patients:

Melissa Beebe, an oncology nurse, relies on her observation skills to make life-or-death decisions. A sleepy patient with dilated pupils could have had a hemorrhagic stroke. An elderly patient with foul-smelling breath could have an abdominal obstruction.

So when an alert said her patient in the oncology unit of UC Davis Medical Center had sepsis, she was sure it was wrong. “I’ve been working with cancer patients for 15 years so I know a septic patient when I see one,” she said. “I knew this patient wasn’t septic.”

Nurses can override the algorithm with the attending doctor’s permission, and a set protocol kicks in whenever the algorithm determines that a patient is septic. The algorithm does not provide a rationale for its decision; it simply generates an alert when it sees objective data similar to that of past patients with sepsis. Beebe could tell that sepsis was not the problem, but she still had to draw blood. As Bannon noted, that created a risk of infection while adding to the patient’s medical bills. So AI is good at flagging a potential issue, but not so good at pinpointing it. The piece quotes Kenrick Cato, a professor of nursing at the University of Pennsylvania and nurse scientist at the Children’s Hospital of Philadelphia: “AI should be used as clinical decision support and not to replace the expert. Hospital administrators need to understand there are lots of things an algorithm can’t see in a clinical setting.”

Remember, AI has data and a program but no practical experience. It essentially follows a flow chart: if this, then that.


Another nurse I talked to said that the danger comes when a hospital puts a higher value on the abilities of the machine than on a nurse’s experience and critical thinking. She added that quite a bit goes into clinical judgment: even after five to ten years in nursing, experience provides more value than an algorithm that lacks a human component. The other danger is that younger nurses may learn to lean on AI instead of evaluating the situation themselves. As with any tool, she noted, it is great to have, but AI cannot take priority over clinical judgment. She has used it herself and once received an alert that a patient’s respiratory rate had increased. She explained that a rising respiratory rate can be a sign of infection as the body tries to rid itself of lactic acid. But the algorithm does not take into account conditions such as asthma or COPD that can raise the respiratory rate without sepsis being a concern.
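To make that concrete, here is a minimal, hypothetical sketch of the kind of threshold rule she is describing. It is not the actual algorithm used at UC Davis or any other hospital; the vital-sign cutoffs loosely follow the familiar SIRS-style screening criteria, and the patient numbers are invented. The point is simply that a rule like this fires on the numbers alone, with no field for asthma, COPD, or anything else a nurse would weigh.

```python
# Hypothetical rule-based sepsis alert -- an illustration, not any hospital's real model.
# Thresholds loosely follow SIRS-style screening criteria; patient data are invented.

def sepsis_alert(vitals: dict) -> bool:
    """Return True if two or more screening criteria are met ("if this, then that")."""
    criteria = [
        vitals["respiratory_rate"] > 20,                   # breaths per minute
        vitals["heart_rate"] > 90,                         # beats per minute
        vitals["temp_c"] > 38.0 or vitals["temp_c"] < 36.0,
        vitals["wbc"] > 12.0 or vitals["wbc"] < 4.0,       # white cells, thousands/uL
    ]
    return sum(criteria) >= 2

# A COPD patient can run an elevated respiratory rate at baseline, and a mildly
# elevated heart rate is enough to push the rule over its two-criteria threshold.
copd_patient = {"respiratory_rate": 24, "heart_rate": 95, "temp_c": 37.2, "wbc": 8.5}
print(sepsis_alert(copd_patient))  # True -- the alert fires; there is no field for COPD or asthma
```

A nurse looking at the same patient would factor in the COPD history before ordering a blood draw; a rule like this, by design, cannot.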

Will AI replace skilled bedside nursing? At some point, it may replace every job, including jobs in health care. But when the time comes that human heads and hearts are no longer needed in medicine, will quality health care even be an option?
