
May 23, 2023

ChatGPT is not “true AI.” A computer scientist explains why

Posted in categories: innovation, robotics/AI

AI is one of humanity’s oldest dreams. It goes back at least to classical Greece and the myth of Hephaestus, blacksmith to the gods, who had the power to bring metal creatures to life. Variations on the theme have appeared in myth and fiction ever since then. But it was only with the invention of the computer in the late 1940s that AI began to seem plausible.

Computers are machines that follow instructions. The programs that we give them are nothing more than finely detailed instructions — recipes that the computer dutifully follows. Your web browser, your email client, and your word processor all boil down to these incredibly detailed lists of instructions. So, if “true AI” is possible — the dream of having computers that are as capable as humans — then it too will amount to such a recipe. All we need to do to make AI a reality is find the right recipe. But what might such a recipe look like? And given recent excitement about ChatGPT, GPT-4, and Bard — large language models (LLMs), to give them their proper name — have we now finally found the recipe for true AI?
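To make the “recipe” metaphor concrete, here is a minimal, purely illustrative sketch in Python (the function name and the numbers are invented for illustration, not taken from the article). Every program, however sophisticated, ultimately reduces to a sequence of precise steps like these:

```python
# A toy "recipe": a program is just a precise list of instructions
# that the computer follows one step at a time.

def average(numbers):
    """Compute an average by following the recipe step by step."""
    total = 0                        # step 1: start the running total at zero
    for n in numbers:                # step 2: add each number in turn
        total += n
    return total / len(numbers)      # step 3: divide by how many numbers there were

print(average([3, 5, 7]))            # prints 5.0
```

The open question the article raises is whether human-level intelligence can likewise be captured as some (vastly larger) recipe of this kind.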
