

Sapiens are often confused with chatbots, a natural mistake, since chatbots are often described as “understanding” language. They don’t. Chatbots are programs that parse incoming text and map key words and phrases to pre-programmed responses or functions. This is nothing like what goes on when humans communicate. “One of the biggest misconceptions is that chatbots are able to have conversations with humans in the same way that a human converses with another human.”

Think about the process you go through to convey a thought, “taking your medicine” for instance. You imagine or remember performing the action, but to communicate the thought you have to break it down into its component ideas: the actor (yourself), the action (swallowing), and the object of the action (the medicine). Then you have to pick words that correspond to each and arrange them into a grammatically valid message: “I took my medicine.” Next you communicate the message to someone else in speech or writing. For communication to take place, the other person has to first have corresponding ideas in their mind (person, swallow, and medicine), and they have to have the same words referring to those ideas. If all these are there, they can process your message and comprehend what you said. Now they have the thought of you taking your medicine in their mind.

Chatbots and other natural language processing applications, machine learning included, don’t have any ideas to map words onto, or to serve as components for constructing new thoughts. We are developing a model of the everyday world that consists of a carefully integrated matrix of “concepts” in computers. So far that model consists of about 3000 concepts.

Not coincidentally, human children at the age of 4 or 5 are thought to have a vocabulary of about 2500 words and a corresponding knowledge of the everyday world. Some of these are “thoughts about thinking,” or meta-knowledge, but most are about everyday things like people and what they do.
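The keyword-mapping behavior attributed to chatbots above can be made concrete with a minimal sketch. Everything here is invented for illustration (the rules, the phrases, the fallback reply); the point is only that the program matches surface strings, with no ideas behind them.

```python
# A minimal keyword-matching "chatbot" of the kind described above.
# The keywords and canned responses are hypothetical examples.
RULES = {
    "medicine": "Good. Remember to take it with food.",
    "hello": "Hi there! How can I help?",
}

FALLBACK = "Sorry, I don't understand."

def reply(message: str) -> str:
    """Scan the incoming text for known key words and return the
    pre-programmed response mapped to the first match."""
    text = message.lower()
    for keyword, response in RULES.items():
        if keyword in text:
            return response
    return FALLBACK

print(reply("I took my medicine."))  # matches the "medicine" keyword
print(reply("What day is it?"))      # no keyword matches
```

Note that the program never represents *who* took *what*; it only recognizes that the string "medicine" occurred somewhere in the input.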

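By contrast, the communication process described above (decomposing a thought into component concepts, mapping each to a shared word, and having the listener map the words back onto the same concepts) can be sketched as a small data structure. All class and variable names here are hypothetical, not the actual Sapiens model.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch: a thought broken into its component ideas,
# as the passage describes: an actor, an action, and an object.
@dataclass(frozen=True)
class Concept:
    name: str

@dataclass(frozen=True)
class Thought:
    actor: Concept
    action: Concept
    obj: Concept

# Speaker and listener must share the same word-to-concept mapping.
LEXICON = {
    "I": Concept("person"),
    "took": Concept("swallow"),
    "medicine": Concept("medicine"),
}

def comprehend(words: list) -> Optional[Thought]:
    """Map each word back onto a concept in the listener's mind.
    Comprehension fails if any word has no corresponding concept."""
    try:
        actor, action, obj = (LEXICON[w] for w in words)
    except KeyError:
        return None  # a word with no idea behind it
    return Thought(actor, action, obj)

print(comprehend(["I", "took", "medicine"]))  # a reconstructed thought
print(comprehend(["I", "took", "aspirin"]))   # no concept for "aspirin"
```

The chatbot in the earlier sketch has only the `RULES` table; it lacks anything like `LEXICON` or `Thought`, which is the gap the concept matrix is meant to fill.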