* U.S. funding research into AI that can predict how people will behave
* Software recognises activities and predicts what might happen next
* Intended for use in both military and civilian contexts
By DAMIEN GAYLE
23 November 2012
An artificial intelligence system that connects to surveillance cameras to predict when people are about to commit a crime is under development, funded by the U.S. military.
The software, dubbed Mind's Eye, recognises human activities seen on CCTV and uses algorithms to predict what the targets might do next - then notifies the authorities.
The technology has echoes of the Hollywood film Minority Report, in which people are punished for crimes they are predicted to commit, rather than after committing them.
Scientists from Carnegie Mellon University in Pittsburgh, Pennsylvania, have presented a paper demonstrating how such so-called 'activity forecasting' would work.
Their study, funded by the U.S. Army Research Laboratory, focuses on the 'automatic detection of anomalous and threatening behaviour' by simulating the ways humans filter and generalise information from the senses.
The system works using a high-level artificial intelligence infrastructure the researchers call a 'cognitive engine', which can learn to link relevant signals with background knowledge and tie them together.
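The researchers' actual system is far more sophisticated, but the core idea of predicting a likely next activity from a stream of observed activity labels can be illustrated with a toy first-order Markov model. Everything here is invented for illustration: the activity labels, the training sequences, and the function names are assumptions, not part of Mind's Eye.

```python
# Toy illustration only -- NOT the Mind's Eye system. It predicts a
# person's next activity from observed activity labels using a
# first-order Markov chain; all labels and data are hypothetical.
from collections import defaultdict

def train_transitions(sequences):
    """Count how often each activity is followed by each other activity."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for current, nxt in zip(seq, seq[1:]):
            counts[current][nxt] += 1
    return counts

def predict_next(counts, current):
    """Return the most frequently observed activity after `current`."""
    followers = counts.get(current)
    if not followers:
        return None  # no data for this activity
    return max(followers, key=followers.get)

# Hypothetical sequences of activities observed on camera
history = [
    ["walking", "loitering", "approaching_vehicle"],
    ["walking", "loitering", "leaving"],
    ["walking", "loitering", "approaching_vehicle"],
]
model = train_transitions(history)
print(predict_next(model, "loitering"))  # most frequent follower of 'loitering'
```

A real activity-forecasting system would work from raw video rather than clean labels, and would weigh context and background knowledge rather than simple frequency counts; the sketch only shows the prediction step in miniature.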