Artificial Intelligence
The Hagiwara Lab in the Department of Information and Computer Science of Keio University's Faculty of Science and Technology is trying to realize a robotic brain that can carry on a conversation, or in other words, a robotic brain that can understand images and words and can carry on thoughtful communication with humans.
"Even now, significant progress is being made with robots, and tremendous advancements are being made with the control parts," says Masafumi Hagiwara, director of the research. "However, we feel like R&D with regards to the brain has been significantly delayed. When we think about what types of functions are necessary for the brain, the first thing that we as humans do is visual information processing. In other words, the brain needs to be able to process what is seen."
Hagiwara continues, "The next thing is the language information processing that we humans carry out. By using language, humans are able to perform extremely advanced intellectual processing. However, even if a robotic brain can process what it sees and use words, it is still lacking one thing: feelings and emotions. Therefore, as a third pillar, we're conducting research on what is called Kansei Engineering, or affective information processing."
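To make the three-pillar idea concrete, here is a minimal sketch of how visual, language, and affective (Kansei) processing might be chained into a single reply. The function names and their canned outputs are hypothetical illustrations for this article, not the Hagiwara Lab's actual system.

```python
# A minimal sketch of the three pillars described above: vision, language and
# Kansei (affective) processing composed into one pipeline.
# All names and return values are hypothetical, not the lab's code.

def vision(image) -> str:
    """Stand-in for visual information processing: image -> symbol."""
    return "robot"                       # placeholder recognition result


def language(symbol: str) -> str:
    """Stand-in for language information processing: symbol -> description."""
    return f"a {symbol} with a human form, arms and legs"


def kansei(symbol: str) -> str:
    """Stand-in for affective information processing: symbol -> impression."""
    return "mechanical, but somewhat cute"


def robotic_brain(image) -> str:
    """Chain the three pillars into a single spoken-style reply."""
    symbol = vision(image)
    return f"I see {language(symbol)}. It feels {kansei(symbol)}."


print(robotic_brain(image=None))
# I see a robot with a human form, arms and legs. It feels mechanical, but somewhat cute.
```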
"Within the next 10 years, and perhaps even sooner, I believe that robots will be steadily introduced into the home. And when that happens, the interface with humans, which are the users, will be extremely important." |
"In the conventional object recognition field, patterns from the recognized results are merely converted to symbols. However, by adding language processing to those recognized results, we can comprehensively utilize knowledge to get a better visual image," he continues. "For example, even if an object is recognized as being a robot, knowledge such as the robot has a human form, or it has arms and legs can also be used. Next will be language information processing because processing of language functions is becoming extremely important. For example, even as a robot, the next step would be for it to recognize something as being cute, not cute, mechanical, or some other type of characteristic."
The robotic brain targeted by the Hagiwara Lab is not one that is merely smart. Instead, the lab is aiming for a robotic brain with emotions, feelings, and spirit that will enable it to interact skillfully with humans and its environment. To achieve this, the lab is conducting a broad range of research, from the fundamentals of Kansei Engineering to applications in fields such as entertainment, design, and healing.
"Most of the robots thus far move exactly as they are programmed to do. However, within the next 10 years, and perhaps even sooner, I believe that robots will be steadily introduced into the home. And when that happens, the interface with humans, which are the users, will be extremely important," says Hagiwara. "For example, if you have a robot that can undergo a variety of movements rather than being a robot like this that doesn't move, and if among those various movements, there is movement that looks like fluctuation, then communication is occurring among that movement, or if the contact time with the robot becomes longer, then of course the robot will be able to understand even the user's feelings and personality, and it can then respond and act accordingly. We're trying to build a robot that is capable of that type of attentiveness."
Source: DIGInfo TV
By 33rd Square