
WASHINGTON — Artificial intelligence (AI) in medicine certainly has advantages, but it’s not “magic,” experts warned, so providers need to understand both its upsides and its downsides.

“For the bedside clinicians, I think it’s important for them to … recognize that artificial intelligence is not some type of magic that just happens in ‘the cloud,’” said Gregg Springan, MSN, BSN, vice president of clinical services and a nurse executive for Diligent Robotics, during a panel at the American Academy of Nursing 2023 Health Policy Conference.

Even “the cloud” is a data center in a physical building, he said. “I think in that spirit … the idea that predictive models [in medicine] are just working on … ‘magical things’ … that’s not the case,” he said.

Maura Grossman, JD, PhD, a research professor at the School of Computer Science and School of Public Health Sciences at the University of Waterloo in Ontario, Canada, said she sees three main parts to AI: algorithms, machine learning, and natural language processing (NLP).

An algorithm is like a cake recipe. It is essentially “a set of instructions that tells a computer what to do,” she said. Machine learning involves training the computer to look for patterns. For example, a computer can be given labeled pictures of malignant cancers and benign tumors, and then be shown a picture it has never seen; it can then say whether the new image looks more like the malignant examples or the benign ones.
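
To make that train-then-classify idea concrete, here is a minimal, purely illustrative Python sketch. The numeric features and the nearest-neighbor classifier are assumptions for the example, not any system discussed at the panel:

```python
# Illustrative only: made-up numeric features stand in for tumor images,
# and a k-nearest-neighbors classifier stands in for a real imaging model.
# Requires scikit-learn.
from sklearn.neighbors import KNeighborsClassifier

# Each training example: [lesion size in mm, border irregularity score 0-1]
X_train = [
    [25.0, 0.9],   # labeled malignant
    [22.0, 0.8],   # labeled malignant
    [8.0, 0.2],    # labeled benign
    [6.0, 0.1],    # labeled benign
]
y_train = ["malignant", "malignant", "benign", "benign"]

# "Training" here means learning from labeled examples; prediction asks
# which of those examples a new, never-before-seen case most resembles.
model = KNeighborsClassifier(n_neighbors=3)
model.fit(X_train, y_train)

new_case = [[21.0, 0.7]]        # a case the model has never seen
print(model.predict(new_case))  # ['malignant'] -- closer to the malignant examples
```

A production imaging model would learn from thousands of labeled images rather than two hand-picked numbers, but the same train-then-predict loop sits underneath it.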

NLP looks to “construct or understand” language “so that it can predict the next word or answer questions,” she said.
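
A toy version of that next-word prediction task can be built from nothing more than word counts over a tiny, invented corpus. Real NLP systems are trained on vastly more data, but the underlying task is the same:

```python
# Toy next-word predictor: count which word most often follows each word
# in a tiny, made-up corpus, then predict from those counts.
from collections import Counter, defaultdict

corpus = "the patient reports pain the patient reports nausea the patient rests".split()

following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(predict_next("patient"))  # 'reports' (seen twice, vs. 'rests' once)
```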

Grossman also reached for a hardware analogy: “Nobody’s buying a drill because they want a drill,” she said. “They’re buying a drill because they want a hole in the wall.” She credited the analogy to Richard Susskind, LLB, DPhil, an author and “futurist” who consults with international companies on legal services.

In healthcare, that “hole” is better diagnosis and treatment, and in the future it will be disease prevention, Grossman said. So “AI is a tool. Electricity is a tool. Fire is a tool. This is a general purpose tool and what’s going to make the difference is how we use it, and the controls and guardrails we put around it.”

But Will a Robot Take My Job?

Concerns have been expressed about AI replacing healthcare workers. Grossman and Springan said healthcare providers shouldn’t worry about that.

“We are at a moment where you either start to use AI in your clinical practice, or you’re going to get left behind,” Springan said. Diligent Robotics develops “robots to support and empower patient care teams,” according to the company website, but “AI is not going to replace nurses,” Springan emphasized. “There is nothing that can replace the human touch.”

Grossman agreed: “If I’m in hospice or facing the end of my life, I don’t want to hold hands with a bot.”

That doesn’t mean providers can ignore the AI revolution. Grossman noted that there may be some who want to “put their head in the sand” when it comes to AI, maybe saying they’ll wait until they retire to learn about it. But “unless you’re going to retire in the next 3 to 5 years, you’re going to have a problem,” Grossman said. “You need to embrace this stuff.”

AI the Ally

Springan noted that AI can be used to address challenges in the healthcare system, such as staffing issues, patient safety, and readmissions. For example, AI is being used in risk scores for fall management in older adults, he said.

In nursing, Grossman said she believes the “most immediate impact” will be in the areas of data collection and analysis. “AI systems will be able to provide nurses with highly accurate and actionable recommendations, such as providing assistance with triage to identify patient risk or deterioration. In this sense, AI will act as ‘a second pair of eyes,’ with acute vision,” Grossman told MedPage Today, in an email following the session.

She also wrote that generative AI tools will help prepare patient notes, reports, and routine forms, thereby reducing the time it takes for nurses to manage these administrative tasks. Such tools will enable nurses to work to their full level of training and experience, Grossman said.

“AI will likely also help nurses with patient education and the promotion of healthy lifestyles,” she added.

However, Grossman was careful to separate AI from robotics, which she views more in the realm of “automation.”

“Social or companion robots have already been introduced into healthcare delivery systems,” she said. “In addition to interacting with patients, they can deliver and administer medication and provide ambulation support; some can even take vital signs and implement infectious disease protocols, for example, by handling or removing contaminated materials.”

And “ambulatory robots” can be used in combination with telehealth to help patients with chronic illnesses in remote areas where an in-person visit might be difficult.

“The robot takes the tests or otherwise assesses the patient’s health status and conveys that data to the nurse who provides advice and coaching to the patient based on the results,” she stated.

Helpful, but Not Ideal

Grossman did convey some reservations about AI, such as the lack of regulation (currently, companies are not required to tell an individual whether they’re dealing with a human or a bot) and the danger of not being able to differentiate between “deep fakes and real pictures,” she said.

Another concern is misinformation. Grossman reported that a mother whose baby had colic asked a chatbot for advice on her crying child and was instructed to “grind up porcelain and put it in the breast milk.” “Now, how is somebody who’s not sophisticated and can’t afford to go to a local emergency room … supposed to know whether that’s correct or not?” Grossman asked.

Grossman also expressed concern about AI biases in healthcare, such as the exclusion of certain populations or using current health spending to predict how much to invest in a population’s health. People of color may spend less on healthcare because they have more difficulty accessing it, but that doesn’t mean they have less need, she said.
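
The spending-as-a-proxy problem can be shown with a few lines of arithmetic; the numbers below are invented purely for illustration:

```python
# Invented numbers, for illustration only: two groups with identical underlying
# health need, but one faces barriers to accessing care, so its observed
# spending is lower.
true_need = {"Group A": 100, "Group B": 100}   # arbitrary units of need
access = {"Group A": 1.0, "Group B": 0.6}      # share of needed care actually obtained

observed_spending = {g: true_need[g] * access[g] for g in true_need}

# A model trained to predict spending would rank Group B as "lower need"
# and steer fewer resources its way, despite equal underlying need.
for group in true_need:
    print(f"{group}: true need = {true_need[group]}, observed spending = {observed_spending[group]:.0f}")
```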

Then there is “human bias,” or how much trust people have in the AI results. Grossman noted that there are programs that can read imaging scans better than a radiologist, but the trust in the AI read isn’t there. On the other hand, there are people “who drive right off a cliff or into the water because GPS told them to,” Grossman said.

Preventing or managing an algorithm’s bias is not easy, she pointed out, as there is no clear agreement on the right standards to use. For instance, one person may think that every population should be treated equally, while another may think that the algorithm has to “correct for past inequities,” Grossman said.

“Often, the fairer I make my algorithm, the less accurate it is,” she said.
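
One way to see that tension is with a small made-up example: a single risk threshold can classify every case correctly yet flag different shares of two groups, and forcing equal selection rates (one common fairness criterion among several) then costs some accuracy.

```python
# Made-up scores and labels, for illustration only.
group_a = [(0.9, 1), (0.8, 1), (0.7, 1), (0.3, 0), (0.2, 0), (0.1, 0)]  # (risk score, true label)
group_b = [(0.6, 1), (0.5, 1), (0.4, 0), (0.3, 0), (0.2, 0), (0.1, 0)]

def accuracy_at_threshold(data, threshold):
    """Fraction of cases where flagging scores >= threshold matches the true label."""
    return sum((score >= threshold) == bool(label) for score, label in data) / len(data)

# A single threshold of 0.45 classifies every case in both groups correctly...
print(accuracy_at_threshold(group_a, 0.45), accuracy_at_threshold(group_b, 0.45))  # 1.0 1.0

# ...but it flags 3 of 6 in Group A and only 2 of 6 in Group B. Forcing equal
# selection rates (flag the top 3 in each group) pulls in Group B's 0.4 case,
# which is a true negative, so accuracy drops.
def accuracy_top_k(data, k):
    """Flag the k highest-scoring cases as positive, the rest as negative."""
    ranked = sorted(data, reverse=True)
    predictions = [1] * k + [0] * (len(ranked) - k)
    return sum(pred == label for pred, (_, label) in zip(predictions, ranked)) / len(ranked)

print(accuracy_top_k(group_a, 3), accuracy_top_k(group_b, 3))  # 1.0 0.833...
```

Which definition of fairness to enforce, and how much accuracy to trade for it, is exactly the kind of judgment Grossman says has no agreed-upon standard.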

The bottom line for healthcare providers: evaluate and “interrogate” the AI tools put in front of you. “What was the data this was trained on, and is it similar to the data I’m making predictions on?” Grossman said.

“What are the assumptions, background assumptions that went into this algorithm?” she said. “Is a finger being put on the scale … by somebody behind the scenes that is never made explicit?”

  • Shannon Firth has been reporting on health policy as MedPage Today’s Washington correspondent since 2014. She is also a member of the site’s Enterprise & Investigative Reporting team.

Disclosures

The panel discussion was sponsored by the University of Pittsburgh.


