Hadassah-Microsoft Team Diagnoses Parkinson's, Raising Ethical Dilemmas

January 7, 2020

In a recently published article in the prestigious journal Annals of Clinical and Translational Neurology, Hadassah Medical Organization doctors working with Microsoft researchers reported they were often able to diagnose Parkinson’s disease simply from the way people seeking information about the disease typed on their computer and moved their mouse.

Now that the researchers have the information, what should they do with it?

The team included researchers from Microsoft and Hadassah neurologist Dr. David Arkadir. Guidance on the ethics of detecting disease without a person's permission or knowledge came from Prof. Ora Paltiel, head of the Hadassah Center for Research in Clinical Epidemiology.

The researchers began their experiment by tracing the way two groups used the Bing search engine. In one group were Parkinson’s sufferers; the other group was composed of those searching for information on behalf of a spouse with the disease. The groups’ search behavior was analyzed by monitoring the actual search strings, determining mouse and keyboard response time, and recording any misspellings or other mistakes.

The researchers then developed a machine-learned algorithm to test whether the disease could be detected purely on the basis of online behavior. It was applied to the searches of 1.5 million Americans.
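To make the approach concrete, here is a minimal sketch of how behavioral signals like those described above (keystroke timing, mouse hesitation, misspelling rate) could feed a classifier. The feature names, the synthetic data, and the simple logistic-regression model are illustrative assumptions only; the study's actual features and algorithm are not described in this article.

```python
# Hypothetical sketch -- NOT the study's actual code, features, or data.
# Synthetic sessions are classified with a plain logistic regression
# trained by batch gradient descent, using only the standard library.
import math
import random

random.seed(0)

def make_session(parkinsons: bool):
    """One synthetic session: [keystroke_latency_ms, mouse_pause_s, typo_rate].
    The distributions are invented for illustration."""
    if parkinsons:
        return [random.gauss(260, 30), random.gauss(1.8, 0.4), random.gauss(0.08, 0.02)]
    return [random.gauss(180, 30), random.gauss(0.9, 0.4), random.gauss(0.03, 0.02)]

# Synthetic training set: 200 sessions, alternating between the two groups.
X = [make_session(i % 2 == 1) for i in range(200)]
y = [i % 2 for i in range(200)]

# Standardize each feature so gradient descent behaves well.
means = [sum(col) / len(col) for col in zip(*X)]
stds = [max(1e-9, (sum((v - m) ** 2 for v in col) / len(col)) ** 0.5)
        for col, m in zip(zip(*X), means)]
Xs = [[(v - m) / s for v, m, s in zip(row, means, stds)] for row in X]

# Logistic regression: w . x + b -> sigmoid -> probability of the label 1.
w, b, lr = [0.0, 0.0, 0.0], 0.0, 0.1
for _ in range(500):
    gw, gb = [0.0, 0.0, 0.0], 0.0
    for row, label in zip(Xs, y):
        p = 1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(w, row)) + b)))
        err = p - label
        for j in range(3):
            gw[j] += err * row[j]
        gb += err
    w = [wi - lr * gi / len(Xs) for wi, gi in zip(w, gw)]
    b -= lr * gb / len(Xs)

def predict(session):
    """Probability (0..1) that a session belongs to the Parkinson's group."""
    row = [(v - m) / s for v, m, s in zip(session, means, stds)]
    z = sum(wi * xi for wi, xi in zip(w, row)) + b
    return 1.0 / (1.0 + math.exp(-z))

train_accuracy = sum((predict(x) > 0.5) == (label == 1)
                     for x, label in zip(X, y)) / len(X)
```

On these well-separated synthetic distributions the classifier separates the groups easily; the real research problem, as the article notes, is that such diagnoses are far from watertight, which is precisely what raises the ethical questions discussed below.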

“We detected those we suspect have Parkinson’s disease, even though they didn’t openly declare it,” says Dr. Arkadir. “They may have been looking for The New York Times or anything else, but their behavior led to possible detection of the disease.”

The exercise raised awareness of the possibilities of disease detection via machine-learned algorithms. In this experiment, the results were not provided to the members of the public whose data were scrutinized.

“There’s a debate about the ethics of unsolicited diagnosis,” says Dr. Arkadir. “Physicians have a professional ethical code, but private companies don’t have such restrictions. There needs to be an ethical framework for what to do with such data to make sure it’s used in the right way.”

Dr. Arkadir asked Prof. Paltiel to add her opinion ahead of the publication of a paper on the research. 

 “Lots of things are technically possible, but should they be done?” asks Prof. Paltiel. “Something we doctors always do is spot diagnosis. When we go to a shopping center or an airport, we’re constantly diagnosing people. But we keep it to ourselves.” It’s very unusual for a doctor to spot-diagnose someone and actually inform the person that he or she has a specific illness. 

“People have a right to know, but they also have a right not to know,” says Prof. Paltiel, who adds that this is even important when the diagnosis is not watertight, which is the case in the Bing Parkinson’s research.

The paper concludes with this warning: “Our findings highlight the urgency in the need to establish ethical guidelines for technology companies and researchers involved in unsolicited web-derived diagnoses. For obvious ethical reasons, we did not attempt identifying subjects in this study. Our objective focused on demonstrating that with the accelerating development of remote, unsolicited web-based diagnosis, ethical dilemmas move outside of the sole area of responsibility of the medical profession to encompass technology companies that develop capabilities to collect and analyze user information on a massive scale. 

“The absence of an ethical framework dealing with this pertinent issue could harm both users and commercial companies, and has far-reaching implications for the current practice of medicine. Collaboration between the medical community, the public, and the leading technology companies is required to develop an ethical framework and guidelines for the use of web-based diagnostic tools, and for informing users of their results. Such collaboration could improve users’ wellbeing while maintaining their rights to privacy, their ability to receive clinically useful information, their autonomy to choose between different possible courses of action, and, most notably, their right not to know.”
