Union leaders and technology experts say health systems should be open with nurses about how they plan to use artificial intelligence and educate them on such tools in light of staffing and other concerns.

Hundreds of nurses at Kaiser Permanente and HCA Healthcare protested last month, worried about the systems’ use of AI to measure the severity of patients’ illnesses and perform other clinical tasks. The nurses cited concerns about the technology’s potential to put patient safety at risk and cause job losses. Healthcare unions have increasingly pushed for contract language that sets up guardrails for AI use and asked to be included in hospitals’ decision-making processes around such technology.

AI is widely used by health systems to generate diagnostic recommendations and personalized treatment plans, monitor patients’ vital signs, analyze X-rays and alert healthcare workers to declines in patients’ condition.

While AI can be useful, input from nurses is crucial before health systems roll out such technology to make sure it won’t be detrimental to clinical practice, said Cathy Kennedy, president of the California Nurses Association/National Nurses Organizing Committee.

At an April 22 protest at Kaiser Permanente’s San Francisco Medical Center, members of the California Nurses Association raised alarms about the health system’s use of algorithms to help assess patients and set staffing levels, as well as the potential use of robotic patient sitters. Kennedy said the system’s use of AI is largely reliant on nurse charting and patient data, which may be incomplete due to staffing shortages that leave nurses unable to document all patient care in real time. Using incomplete data could in turn lead to inadequate nurse-to-patient ratios and limit clinicians’ ability to care for patients, she said.

“We need [Kaiser Permanente] to pause and really take a look at what they’re doing,” said Kennedy, a nurse at Kaiser Permanente’s Roseville Medical Center in Roseville, California. “Is it important to spend millions of dollars on technology, gadgets and devices, or is it better to utilize some of the money for staff nurses in the hospital and clinics? The artificial intelligence that they’re utilizing — is it going to harm patients?”

The potential use of remote monitoring technologies as patient sitters could further complicate staffing. Remote monitoring technologies are often unable to detect subtle changes in patients’ condition and signs of delirium or skin breakdown that certified nursing assistants, respiratory therapists, nurses or other bedside caregivers can more easily pick up on, said Jessica Early, patient advocacy coordinator for the National Union of Healthcare Workers.

“If you don’t have someone there laying eyes on the patient to provide that data, these algorithms could be making erroneous determinations about a patient’s status and spit out staffing decisions that are inappropriate and would compromise care,” Early said.
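To make that concern concrete, here is a minimal, hypothetical sketch of an acuity-based staffing calculation; the acuity weights, ratio thresholds and patient records are invented for illustration and are not drawn from Kaiser Permanente’s actual model. It shows how care that goes undocumented, and is therefore treated as absent, can pull a staffing estimate down.

```python
# Hypothetical illustration: an acuity-based staffing estimate that treats
# undocumented care items as absent. All weights and thresholds are
# invented for demonstration purposes.

def acuity_score(charted_flags):
    """Sum simple acuity points from whatever has been charted so far."""
    weights = {"iv_medication": 2, "wound_care": 1, "fall_risk": 1, "ventilator": 3}
    return sum(weights[item] for item in charted_flags if item in weights)

def nurses_needed(patients):
    """Assign one nurse per 2 high-acuity or 4 low-acuity patients."""
    high = sum(1 for p in patients if acuity_score(p) >= 3)
    low = len(patients) - high
    return -(-high // 2) + -(-low // 4)  # ceiling division

# Fully charted unit: each record reflects all care actually delivered.
complete = [{"ventilator", "iv_medication"}, {"wound_care", "fall_risk"}, {"iv_medication"}]

# The same unit when charting lags and some items are missing.
incomplete = [{"iv_medication"}, {"fall_risk"}, set()]

print(nurses_needed(complete))    # higher staffing estimate
print(nurses_needed(incomplete))  # lower estimate for the same patients
```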

Kaiser said clinicians are still at the center of patient care decisions. The health system is working with unions to monitor how emerging technologies might impact jobs and avoid employee displacement, Kaiser Permanente said in a statement.

“We have consistently invested in and embraced technology that enables nurses to work more effectively, resulting in improved patient outcomes and nurse satisfaction,” Kaiser Permanente said. “We ensure the results from AI tools are correct and unbiased.”

The system said its Advance Alert Monitor program, which analyzes electronic health record data at 21 hospitals across Northern California to identify at-risk patients in need of clinical intervention, saves an estimated 500 patient lives per year. In December, Kaiser Permanente awarded grants of up to $750,000 to five medical centers to fund projects on artificial intelligence and machine learning algorithms’ potential to improve diagnostic decision-making.
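Kaiser Permanente has not published the program’s internals here; as a rough sense of how EHR-based early-warning tools flag at-risk patients in general, the sketch below scores a few vital signs against thresholds. The thresholds, weights and alert cutoff are illustrative assumptions, not the Advance Alert Monitor’s actual logic.

```python
# Minimal sketch of a generic early-warning score computed from vital signs.
# Thresholds and weights are illustrative only and are not drawn from
# Kaiser Permanente's Advance Alert Monitor program.

def warning_score(vitals):
    score = 0
    if vitals["respiratory_rate"] >= 25 or vitals["respiratory_rate"] <= 8:
        score += 3
    if vitals["heart_rate"] >= 130:
        score += 3
    elif vitals["heart_rate"] >= 110:
        score += 2
    if vitals["systolic_bp"] <= 90:
        score += 3
    if vitals["temperature_c"] >= 39.0 or vitals["temperature_c"] <= 35.0:
        score += 2
    return score

patient = {"respiratory_rate": 26, "heart_rate": 118, "systolic_bp": 88, "temperature_c": 38.2}

# A score above a chosen cutoff would prompt a clinician to review the patient.
if warning_score(patient) >= 5:
    print("Flag patient for clinical review")
```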

HCA Healthcare, where nurses also protested AI use, launched a pilot program in early 2023 giving emergency room physicians at four hospitals access to speech-to-text technology intended to make it easier to document patient visits.
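HCA has not detailed how its tool is built; as a rough sketch of the general speech-to-text pattern such documentation aids rely on, the example below uses the open-source SpeechRecognition library. The file name and transcription backend are assumptions for illustration, not HCA Healthcare’s system.

```python
# Generic speech-to-text sketch using the open-source SpeechRecognition
# library; this is not HCA Healthcare's tool, and "visit_note.wav" is a
# placeholder file name.

import speech_recognition as sr

recognizer = sr.Recognizer()

with sr.AudioFile("visit_note.wav") as source:
    audio = recognizer.record(source)  # read the whole dictated note

try:
    draft_note = recognizer.recognize_google(audio)  # send audio to a transcription backend
    print("Draft documentation:", draft_note)
except sr.UnknownValueError:
    print("Audio could not be transcribed; fall back to manual charting")
```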

HCA Healthcare did not respond to questions about how it is using AI.

Ahead of additional AI rollouts, members of the National Union of Healthcare Workers at various health systems have been bargaining for contract language that ensures job protections for clinicians and prevents technology initiatives from being implemented as a means of cost-cutting, Early said.

At Maimonides Medical Center in Brooklyn, strong contract language has limited untested and unregulated AI use, as the nurses’ agreement says they must be consulted before the center fully introduces new technology to the workplace.

In one instance, nurses’ trial run of an AI-powered thermometer revealed faulty technology when the devices listed 30 intensive care unit patients as having the same temperature, said Nancy Hagans, president of the New York State Nurses Association and registered nurse at Maimonides Medical Center. Following these erroneous readings, the nurses returned to using old-fashioned thermometers and found that several patients actually had high fevers, Hagans said.

By allowing nurses to assess the technology’s impact on patient care, the facility was able to avoid terrible outcomes, she said.
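The kind of check the nurses’ trial run effectively performed can be automated as a simple plausibility rule: a batch of near-identical readings across many patients is a sign of a device fault rather than a clinical pattern. The sketch below is a hypothetical illustration of that idea; the thresholds are invented, not Maimonides’ procedure.

```python
# Hypothetical plausibility check for a fleet of connected thermometers:
# flag a batch of readings that is suspiciously uniform, as in the trial
# where 30 ICU patients showed the same temperature. Thresholds are
# invented for illustration.

from statistics import pstdev

def suspiciously_uniform(readings_c, min_patients=10, min_spread=0.05):
    """Return True if many patients share near-identical readings."""
    if len(readings_c) < min_patients:
        return False
    return pstdev(readings_c) < min_spread

icu_readings = [37.0] * 30  # every patient reported at exactly 37.0 C

if suspiciously_uniform(icu_readings):
    print("Device readings look implausible; re-check with manual thermometers")
```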

More than half of nurses are not comfortable with integrating AI technology into their practice, according to a 2024 survey of 1,100 nursing professionals and students conducted by Florida Atlantic University’s Christine E. Lynn College of Nursing and technology and workforce advisory firm Cross Country Healthcare.

Although some caution is warranted around AI use, most nurses’ concerns can be mitigated by a better understanding of how the technology works and how it will be applied, said Richard Kenny, healthcare executive advisor at SAS Institute, an AI software developer based in Cary, North Carolina.

Health systems should take the time to establish a culture of trust and transparency, ask clinicians for their feedback and involve them in AI implementation decisions, Kenny said. Organizations should also avoid using terms like “AI doctor” or “AI nurse” that cause confusion and exaggerate the use of technology in clinical care, he said.

For the most part, health systems are trying to think responsibly about AI, and the best systems are applying AI toward optimizing operational processes and relieving administrative burden, Kenny said. Kaiser Permanente has developed its own frameworks and guidelines to dictate best practices with the technology.

“The truth of the matter is, AI cannot replace the nurse,” he said. “More than anything, it gives us an opportunity to go back to practicing at top of license, which is what every nurse wants.”

To educate clinicians on the benefits and uses of AI in healthcare, Kenny said he holds training sessions with frontline staff at different facilities, starting conversations about digital literacy and discussing concerns.

The Christine E. Lynn College of Nursing in Boca Raton, Florida, makes it a point to teach students and faculty about the various applications of AI and ChatGPT through workshops and curricula that incorporate the technology and the telehealth platforms used in hospitals and clinical centers.

“As educators, we need to make sure that nursing students at least understand how AI can be integrated into healthcare and have exposure to it,” said Dr. Safiya George, dean of the college. “Usually when people aren’t comfortable with something, it’s often because [they don’t] have enough information about or experience with it.”

As part of the school’s combined degree program, students can complete their Bachelor of Science in Nursing and spend an additional year working toward a Master of Science degree with a focus on either AI or biomedical engineering.

The program helps prepare nurses to give input on the development and implementation of AI solutions and make sure technology is helpful for clinicians’ day-to-day practice, George said.

The more AI is used to improve efficiency in clinical documentation and research, the less fearful nurses will be about the technology, said John Martins, president and CEO of Cross Country Healthcare.

“Over time, I believe clinicians and nurses in particular will come to embrace the technology because they’ll see that the outcomes are the same if not better,” Martins said. “If you program the algorithms right, you may actually take away biases that actually happen with human nature.”
