LLM will never be AGI: The Proof

The author argues, from a necessary condition in complexity theory, that an LLM's runtime is bounded by $O(n)$. If an LLM were AGI, it would have to solve every problem a human can solve. For instance, deciding whether a string is a palindrome is easy for a human but requires at least $\Omega(n)$ time, since every character must be examined; an LLM, by contrast, must answer in $O(1)$ time because its output has constant length. From this contradiction the author concludes that an LLM cannot solve such problems the way a human does, disputing its claim to AGI. This perspective challenges common assumptions about LLM intelligence.
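
To make the $\Omega(n)$ lower bound concrete: any correct palindrome checker must read roughly every character of its input, as the standard two-pointer scan below does. This sketch is our own illustration and does not appear in the linked post.

```python
def is_palindrome(s: str) -> bool:
    """Two-pointer palindrome check: O(n) time, O(1) extra space."""
    i, j = 0, len(s) - 1
    while i < j:
        if s[i] != s[j]:
            return False  # mismatch found: not a palindrome
        i += 1
        j -= 1
    # Every symmetric pair was compared, so the whole input was read;
    # no correct checker can inspect fewer than ~n/2 characters.
    return True


if __name__ == "__main__":
    print(is_palindrome("racecar"))   # True
    print(is_palindrome("racecars"))  # False
```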

https://ycao.net/posts/llm-agi/
