http://www.boingboing.net/2009/01/07/dod-wants-parent-bot.html

"Idea OSD09-H03? Develop an AI that fools young children into thinking they are talking to Daddy or Mommy when Daddy or Mommy are off on their 3rd deployment to Iraq and can't come to the webcam."
"The child should be able to have a simulated conversation with a parent about generic, everyday topics. For instance, a child may get a response from saying 'I love you', or 'I miss you', or 'Good night mommy/daddy.' ... The application should incorporate an AI that allows for flexibility in language comprehension to give the illusion of a natural (but simple) interaction."
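(Aside: note how low a bar that spec actually sets. Here's a toy Python sketch, purely hypothetical and not from the solicitation, of how shallow "flexibility in language comprehension" could be while still technically producing a response to those phrases: canned replies keyed on fuzzy phrase matching. The CANNED table and respond function are my own made-up names.)

    # Hypothetical toy sketch (NOT from the solicitation): canned replies
    # selected by fuzzy string matching against a few expected utterances.
    import difflib

    # Canned "parent" responses keyed by the child utterances the spec anticipates.
    CANNED = {
        "i love you": "I love you too, sweetie.",
        "i miss you": "I miss you too. I think about you every day.",
        "good night": "Good night. Sleep tight, I'll see you soon.",
    }

    def respond(utterance: str) -> str:
        """Return the canned reply whose key best matches the utterance."""
        text = utterance.lower().strip().rstrip(".!?")
        # Fuzzy match so near-misses ("good nite mommy") still hit a key.
        match = difflib.get_close_matches(text, CANNED.keys(), n=1, cutoff=0.5)
        # Anything off-script falls through to a generic deflection.
        return CANNED[match[0]] if match else "Tell me more about your day."

    if __name__ == "__main__":
        for line in ["I love you!", "good nite mommy", "Why aren't you home?"]:
            print(f"child: {line!r} -> bot: {respond(line)}")

(difflib.get_close_matches is standard library; the 0.5 cutoff tolerates a toddler's spelling, but any off-script question falls through to a stock deflection. No understanding anywhere in sight.)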
What are the odds that this system would end up interacting with a kid after their parent's been killed or severely wounded? (Especially given the assumption that the parent can't come to the phone/webcam?)
(Personally, I'm betting that it turns out to be pretty hard to fool even a three-year-old into thinking that they're talking to their own parent. This is, IMO, arguably _harder_ than a Turing Test: a standard Turing Test only asks the machine to pass as some generic human, while here you have to convince the kid that they are talking to a _specific_ person they know intimately, rather than a computer composite of previous recordings.)
gah.