A U.S. science magazine produced by Massachusetts General Hospital has published an article on the “trash island” created by fraudulent computer-generated papers in the scientific literature (emphasis is mine):
Bad science can do lasting damage … [some studies] go beyond bad data and are wholly fraudulent from their inception. A curious subset of these are scientific papers that are generated by computer programs. Custom algorithms can contribute sections or even spin up whole papers by splicing together likely-sounding phrases or linguistically twisting already-published results. Such papers get printed, even by reputable publishers, and while many of these papers have been retracted, some are still being cited in the scientific literature years after their retraction.
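The "splicing together likely-sounding phrases" technique the quote describes can be illustrated with a toy sketch. This is a hypothetical illustration, not the code of any real paper generator: the phrase lists and function names are invented for demonstration, and real tools work from far larger grammars.

```python
import random

# Toy phrase-splicing "paper" generator (illustrative only).
# Gibberish that superficially resembles academic prose can be produced
# by randomly combining fragments from hand-written phrase banks.
SUBJECTS = ["Our novel framework", "The proposed methodology", "This heuristic"]
VERBS = ["demonstrates", "synthesizes", "obviates the need for"]
OBJECTS = ["scalable epistemologies", "robust symmetries", "empathic archetypes"]

def fake_sentence(rng: random.Random) -> str:
    """Splice one subject, verb, and object into a plausible-sounding sentence."""
    return f"{rng.choice(SUBJECTS)} {rng.choice(VERBS)} {rng.choice(OBJECTS)}."

def fake_abstract(n_sentences: int = 3, seed: int = 0) -> str:
    """Generate a short block of grammatical nonsense."""
    rng = random.Random(seed)
    return " ".join(fake_sentence(rng) for _ in range(n_sentences))

print(fake_abstract())
```

Output from even a sketch this small reads like boilerplate academic filler, which is precisely why such papers can slip past inattentive reviewers.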
The problem of academics and medical staff successfully publishing peer-reviewed gibberish is one we’ve written about previously at the Daily Sceptic. Although it’s good that the issue is being surfaced in pop-science magazines, the report itself adds little new information, being based entirely on the work of the tiny handful of volunteer bloggers and researchers who investigate fake papers. Instead, it’s worth reading in full for the responses from science insiders. It appears the seeds are being sown for a collapse of confidence in science that could be even greater than the fallout from Covid.
To understand what might happen we need to briefly review the latest breakthroughs in the field of artificial intelligence.
Generative transformer models
Twentieth-century science fiction usually depicts artificial intelligence (AI) as cold, analytical calculating machines driven by pure logic. Twenty-first-century AI is shaping up quite differently. The invention of generative transformer models has opened the era of creative, imaginative AI that can invent original material difficult to distinguish from that created by humans. A fun demonstration posted in recent days is the output of the DALL-E 2 model when asked to draw pictures of Kermit the Frog.
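For readers curious about the mechanics, the core generative loop can be conveyed with a deliberately crude stand-in. The sketch below is a bigram model, nothing like a transformer’s actual architecture, but it shows the basic idea these systems share: repeatedly sample a plausible next token given what came before. All names and the tiny corpus are invented for illustration.

```python
import random
from collections import defaultdict

# Toy bigram "language model" (illustrative only -- real transformers use
# learned neural networks over vast corpora, not a lookup table).
corpus = "the model writes text and the model samples words and the loop repeats".split()

def train_bigrams(tokens):
    """Record, for each token, every token that follows it in the corpus."""
    table = defaultdict(list)
    for prev, nxt in zip(tokens, tokens[1:]):
        table[prev].append(nxt)
    return table

def generate(table, start: str, length: int, seed: int = 0) -> str:
    """Sample a plausible next token repeatedly -- the generative loop."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        choices = table.get(out[-1])
        if not choices:  # dead end: no observed successor
            break
        out.append(rng.choice(choices))
    return " ".join(out)

print(generate(train_bigrams(corpus), "the", 8))
```

Transformers replace the lookup table with attention layers conditioned on the whole preceding context, which is what lets them produce long passages that are hard to tell apart from human writing.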