Reading Comprehension

"Hallucinate" — the Word of 2023

Cambridge Dictionary has named "hallucinate" as the word of the year for 2023 — while giving it an added new meaning relating to AI (artificial intelligence) technology.

The added Cambridge Dictionary definition reads: "When an AI hallucinates, it produces false information, which can vary from suggestions that seem perfectly believable to ones that are clearly nonsense."

Wendalyn Nichols, Cambridge Dictionary's publishing manager, said: "The fact that AIs can 'hallucinate' reminds us that humans still need to bring their critical thinking skills to the use of these tools. AIs can draw out the specific information we need from huge amounts of data and piece it together. That's amazing. But that is where they stop. The more original you ask them to be, the likelier they are to go wrong."

Actually, at their best, AIs can only be as dependable as the information they are trained on. Human professional knowledge is more important than ever for creating the truthful, up-to-date information that AIs can be trained on.

AIs can hallucinate in a confident and thus more misleading manner, and the consequences have already appeared in real-world cases. In Google's advertisement for its chatbot Bard, the AI tool made a factual error about the James Webb Space Telescope. A US law firm cited cases invented by an AI in court after using ChatGPT for legal research.

"The widespread use of the word 'hallucinate' to refer to mistakes by AIs offers us a quick look at how we're treating them as our equals," said Dr Henry, an AI ethicist at Cambridge University. "'Hallucinate' is originally a verb describing a person experiencing a disconnect from reality," he continued. "It mirrors an unnoticeable change in perception: the AI, not the user, is the one 'hallucinating'." It seems that as time progresses, psychological vocabulary will be further enlarged to describe the strange abilities of the new intelligences we're creating.
