Reading Comprehension
A machine can now not only beat you at chess, it can also outperform you in debate. Last week, in a public debate in San Francisco, a software program called Project Debater beat its human opponents, including Noa Ovadia, Israel's former national debating champion.
Brilliant though it is, Project Debater has some weaknesses. It takes sentences from its library of documents and prebuilt arguments and strings them together. This can lead to the kinds of errors no human would make. Such wrinkles will no doubt be ironed out, yet they also point to a fundamental problem. As Kristian Hammond, professor of electrical engineering and computer science at Northwestern University, put it: "There's never a stage at which the system knows what it's talking about."
What Hammond is referring to is the question of meaning, and meaning is central to what distinguishes the least intelligent of humans from the most intelligent of machines. A computer works with symbols. Its program specifies a set of rules to transform one string of symbols into another. But it does not specify what those symbols mean. Indeed, to a computer, meaning is irrelevant. Humans, in thinking, talking, reading and writing, also work with symbols. But for humans, meaning is everything. When we communicate, we communicate meaning. What matters is not just the outside of a string of symbols, but the inside too, not just how they are arranged but what they mean.
Meaning emerges through a process of social interaction, not of computation, an interaction that shapes the content of the symbols in our heads. The rules that assign meaning lie not just inside our heads, but also outside, in society, in social memory, social conventions and social relations. It is this that distinguishes humans from machines. And that's why, however astonishing Project Debater may seem, the tradition that began with Socrates and Confucius will not end with artificial intelligence.