
ChatGPT (Feb 13 Version) is a Chinese Room


Citation: Ling, MHT. 2023. ChatGPT (Feb 13 Version) is a Chinese Room. Novel Research in Sciences 14(2): NRS.000832.

Link to [PDF].

Here is the permanent [PDF] link to my archive.

ChatGPT has gained both positive and negative publicity after reports suggesting that it is able to pass various professional and licensing examinations. This suggests that ChatGPT may pass the Turing Test in the near future. However, a computer program that passes the Turing Test may be either a Chinese Room or artificially conscious. Hence, the question remains whether the current state of ChatGPT is more of a Chinese Room or is approaching artificial consciousness. Here, I demonstrate that the current version of ChatGPT (Feb 13 version) is a Chinese Room. Despite potential evidence of cognitive connections, ChatGPT exhibits critical errors in causal reasoning. At the same time, I demonstrate that ChatGPT can generate all possible categorical responses to the same question and respond with erroneous examples, thus calling into question its utility as a learning tool. I also show that ChatGPT is capable of artificial hallucination, defined as generating confidently wrong replies; it is likely that errors in causal reasoning lead to such hallucinations. More critically, ChatGPT generates false references that mimic real publications. Therefore, its use should be treated with caution.
