When Doctors Use Artificial Intelligence in Medicine, Are They Increasing the Risk of Misdiagnosis?
September 11th, 2018
NEW ORLEANS, Louisiana. Misdiagnosis is one of the costliest forms of medical malpractice for both doctors and patients. According to Radiology Business, 80 percent of radiology malpractice claims stem from a failure to diagnose, and the cost of these failures to patients is alarmingly high: 80 percent of missed diagnoses can lead to permanent injury or death in the patients affected.
When humans and technology interact to reach a diagnosis, the potential for error can be higher, and the consequences of error can be severe. While radiology is an established diagnostic tool, doctors are increasingly turning to artificial intelligence to help them make medical determinations and diagnoses. As doctors increase their reliance on technology, the risk that some patients will fall through the cracks grows.
For example, reports describe a study in which artificial intelligence was used to diagnose patients with Alzheimer’s. Unfortunately, the system proved effective only when the speakers were native English speakers. When non-native English speakers were tested, there was a greater risk of a false positive: the artificial intelligence system was more likely to interpret mispronunciations and long pauses as signs of dementia.
Another issue with artificial intelligence is that these systems are only as good as the data fed into them. If the data is flawed, distorted, or biased, the system is more likely to produce distorted results. Research has found that most medical data in the U.S. concerns the health of white men. This means that women and minorities, who may face different health risks, might be treated using a data set that does not fully represent them or their health needs. QZ reports that African Americans die of cancer at higher rates than whites in part because cancer data doesn’t reflect studies involving African Americans; only 2% of current cancer studies include enough minority participants to produce statistically significant results.
If doctors plan to use artificial intelligence going forward, they will need to make sure that their research reflects a diverse population. For example, when researchers look at genetics, QZ reports that 80% of the genomes studied come from people of European descent. In one artificial intelligence detection system, researchers found that the system could detect malignant melanoma at a better rate than doctors could, but the program only worked when patients had lighter skin.
We need to watch closely how artificial intelligence changes medicine and diagnosis, and how these diagnostic tools impact minorities and women. There is already a gap in the kind of treatment minorities and women receive. If artificial intelligence becomes standard in diagnostic settings but relies on data that primarily reflects white males, minorities and women could fall further behind in getting the quality treatment they need and deserve. The Bowling Christiansen Law Firm are medical malpractice lawyers in New Orleans, Louisiana who are closely watching how new technologies impact medicine. If you believe a misdiagnosis led to a loved one’s death or to your own injury, you may have the right to sue. The Bowling Christiansen Law Firm helps victims and families seek damages after medical malpractice.
The Bowling Christiansen Law Firm, A Professional Law Corporation
1615 Poydras Street, Suite 1050
New Orleans, Louisiana, 70112
Phone: (504) 586-5200
Toll Free: (504) 586-5200