Computers accept data in the form of ones and zeroes. Why can’t they understand normal language?
Bill
Ones and zeroes are the most basic language that we use to write instructions to tell computers how to do things we want them to do. The more complicated the task is, the harder it is to write instructions that get the results we want.
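To make the idea concrete, here is a small sketch in Python showing how even ordinary text is stored as ones and zeroes. (This uses the common 8-bit ASCII encoding as an illustration; it is just one of many encoding schemes.)

```python
# Convert each letter of a word into the 8-bit pattern a computer stores.
word = "hi"
bits = " ".join(format(byte, "08b") for byte in word.encode("ascii"))
print(bits)  # → 01101000 01101001  (one 8-bit pattern per letter)
```

Every letter, pixel, and sound a computer handles is eventually reduced to patterns like these.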
As it turns out, human language is one of the hardest challenges in computer science. People pronounce the same words in different ways. Words that sound the same have different meanings. The same sentence can have a very different meaning depending on how you say it. It’s very hard to use ones and zeroes to convey all the subtle differences in language that are so easy for people to understand.
The good news is that we are making great progress in this area today. Soon, I think we’ll see computer programs that really understand what you say and answer in ways that sound very human.