In the race to develop the first true artificial intelligence, Google will be using eye-related medical data to help develop and train one of its works-in-progress.
Moorfields Eye Hospital is providing one million (anonymous) eye scans for DeepMind to churn through, looking for medical issues like macular degeneration, diabetic retinopathy, or other causes of blindness.
This isn’t the first time Google’s prototype AI has used data provided by hospitals. In fact, three London hospitals (Royal Free, Barnet, and Chase Farm) were embroiled in controversy when it came to light that patient records had been shared without notifying the patients.
In Google’s defense, the records, which were used to research kidney disorders, were stripped of any personally identifying information, and used to develop an app that would help doctors diagnose potential kidney issues in patients.
There appear to be two independent but cooperating goals at play here. Google’s main priority is the development of an artificial intelligence, and it sees complex sets of medical data as a way to stress-test its efforts. The hospitals contributing the data, on the other hand, are far more interested in the medical breakthroughs that may emerge from the experimentation. Their number one priority is saving lives, with the development of AI falling to a distant second place.
Even Sam Smith, a coordinator at patient data campaign group MedConfidential, is uncertain of where this project might lead. “How it plays out over time remains to be seen,” he said. “But you do have organizations involved that aren’t principally concerned with DeepMind: they care about blindness in the case of RNIB, and long-term medical research in the case of the National Institute for Health Research.”
Although the data is anonymised, not everyone whose records are being shared is pleased with the notion. Personal privacy is a quickly vanishing commodity these days, and while the law allows for these records to be disseminated once all personally identifying data has been stripped, the fact that the patients weren’t individually consulted has upset many.
It is still possible for patients to opt out of this type of data sharing; however, it seems very few of them are even aware they are involved in the first place.