Nature wrote a story about the weird terms popping up in computer-science papers. Dubbed ‘tortured phrases’, they likely appear when writers run automated paraphrasing software to hide plagiarism, and they don’t make much contextual sense:
In April 2021, a series of strange phrases in journal articles piqued the interest of a group of computer scientists. They could not understand why authors would use the terms ‘counterfeit consciousness’, ‘profound neural organization’ and ‘colossal information’ in place of the more widely recognized terms ‘artificial intelligence’, ‘deep neural network’ and ‘big data’.
Research-integrity sleuths say that Guillaume Cabanac and his colleagues have uncovered a new type of fabricated research paper, and that their work, posted in a preprint on arXiv on 12 July, might expose only the tip of the iceberg when it comes to the literature affected.
To get a sense of how many papers are affected, the researchers ran a search for several tortured phrases in journal articles indexed in the citation database Dimensions. They found more than 860 publications that included at least one of the phrases, 31 of which were published in a single journal: Microprocessors and Microsystems.
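The search the researchers describe can be mimicked in a few lines: look for known tortured phrases in a body of text and map each one back to the standard term it likely replaced. This is only an illustrative sketch, not the actual Dimensions query the team ran; the phrase pairs come from the article above, while the function name and sample text are my own.

```python
# Known tortured-phrase -> likely original term pairs, taken from the article.
TORTURED_PHRASES = {
    "counterfeit consciousness": "artificial intelligence",
    "profound neural organization": "deep neural network",
    "colossal information": "big data",
}

def find_tortured_phrases(text):
    """Return (tortured phrase, likely original term) pairs found in text."""
    lowered = text.lower()
    return [(phrase, original)
            for phrase, original in TORTURED_PHRASES.items()
            if phrase in lowered]

sample = ("We train a profound neural organization on "
          "colossal information collected from sensors.")
print(find_tortured_phrases(sample))
# → [('profound neural organization', 'deep neural network'),
#    ('colossal information', 'big data')]
```

A simple substring scan like this is enough to flag suspect papers for human review, which is essentially how the sleuths used it: the phrases are so unnatural that false positives are rare.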
This reminds me of the time in Friends when Joey used a thesaurus to write an adoption recommendation letter for Monica and Chandler. Except this is real, and more is at stake. It also reminds me that things like GPT-3 aren’t as amazing as people make them out to be. The technology could be useful, even groundbreaking, but too many people are messing with it, feeding it garbage data, and using it for nefarious ends or plain nonsense.