
AI人工智能讲义.ppt

The interaction between humans and robots
People already trust machines too much. The bias: if a machine says something, it must be correct. People need to trust machines less.

Automation extends far beyond the battlefield: self-driving cars, explosive-disposal robots, the Roomba. One of Google's self-driving cars hit a public bus while in autonomous mode on a Mountain View, California, road.

Deceive human beings? Getting humans to do something they don't want to do (for example, getting elderly people to take needed medications). There is a long way to go: robots need much better capabilities for deciphering and manipulating human desires and incentives. But there are already sorts of nefarious computers and robots that manipulate people, such as in-app purchases. For instance, a robot's recommendation may be due to a commercial agreement between the robot manufacturer and the carpet stain remover manufacturer.

Summary: People need to remember what distinguishes them from robots. We have to be on guard not to become automatons ourselves.

The Air Force Wants You to Trust Robots --Should You?
Research and development in human-robot trust is the cutting edge of artificial intelligence, but faith in machines can often be misplaced. A British fighter was misidentified and shot down by a US Patriot anti-missile system in the 2003 Iraq war; the Patriot made the same mistake again, shooting down a US aircraft. Ten years later, the fundamental problem with this kind of artificial intelligence has not disappeared. Militaries increasingly rely on automation and spend heavily on artificial intelligence research and development. Artificial intelligence will remain controversial.

Heather Roff says those friendly-fire incidents highlight what experts call automation bias: the human operators took no action in the small window of time they had in which to prevent the weapon from firing. In effect, the automation, not the human operator, chose to kill the target. Wright-Patterson Air Force Base is studying human and machine interaction to develop new forms of surveillance.