Can criminals really be convicted of crimes that they ‘might’ commit, but haven’t actually perpetrated? This sounds like a scenario snatched straight from science fiction; certainly when Philip K. Dick wrote "The Minority Report" back in 1956 the concept seemed fanciful, but advances in technology and data analysis are turning this fictional idea into fact.

Robert Nash Parker, a professor of sociology at the University of California, has been developing a computer program in conjunction with the Indio Police Department which can predict how and where criminals will strike next. The dragnet system, which reportedly provides ‘pre-crime’ estimates that are up to 93 per cent accurate, was implemented earlier this year, and initial results have been extremely positive, with an eight per cent decline in theft already reported.

Unlike other computer models, which monitor incidents over time or across locations but not both, the new system analyzes crime data and other significant factors, such as truancy records, to identify patterns of crime in both time and space.
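The general idea behind this kind of space-time analysis can be sketched in a few lines of code: bin each incident into a grid cell and a time window, then flag cells where incidents cluster. This is a minimal illustration only, not the actual Indio model; the grid size, time bins, threshold, and sample data are all invented for the example.

```python
from collections import Counter

# Hypothetical incident records: (x, y, hour_of_week).
# These values are illustrative, not real crime data.
incidents = [
    (12.1, 4.3, 20), (12.4, 4.1, 21), (12.2, 4.5, 20),
    (3.0, 9.9, 5), (12.3, 4.2, 22), (7.7, 1.1, 50),
]

def hot_spots(incidents, cell_size=1.0, time_bin=4, min_count=3):
    """Bin incidents into space-time cells and flag the dense ones."""
    counts = Counter()
    for x, y, hour in incidents:
        cell = (int(x // cell_size), int(y // cell_size), hour // time_bin)
        counts[cell] += 1
    # Cells with at least min_count incidents are treated as hot spots.
    return {cell: n for cell, n in counts.items() if n >= min_count}

print(hot_spots(incidents))  # → {(12, 4, 5): 4}
```

Four of the six sample incidents fall in the same grid cell during the same four-hour window, so that single space-time cell is returned as a hot spot; officers would then be directed there.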

“This is still cutting-edge and experimental,” Parker explained. “Big data gives you statistical power to make these kinds of predictions. It makes it possible for us to anticipate crime patterns, especially hot spots of crime, which allows law enforcement agencies to engage in targeted prevention activities that could disrupt the cause of crime before the crime happens.”

This type of program is not restricted to the US: a pilot scheme has also been implemented across UK police forces using a computer algorithm which predicts where crimes are most likely to occur. Police officers are then deployed to those areas, providing a deterrent and sometimes catching perpetrators in the act, "Minority Report"-style. The results there have been even more impressive than in the US, with some areas reporting a 26 per cent decrease in offences.

Scotland Yard is working with Professor Shane Johnson, from the department of crime science at University College London. Johnson said that the activities of burglars resembled the behaviour of wild animals searching for food, which return to areas where they have previously been successful.
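Johnson's "foraging" observation underpins what criminologists call near-repeat models: the risk at a location rises after a nearby burglary and then decays with distance and elapsed time. A minimal sketch of that idea follows; the decay scales, coordinates, and data are invented for illustration and are not taken from any real police system.

```python
import math

def near_repeat_risk(location, past_burglaries, dist_scale=200.0, time_scale=7.0):
    """Sum risk contributions that decay exponentially with distance
    (metres) and elapsed time (days) from each past burglary."""
    x, y = location
    risk = 0.0
    for bx, by, days_ago in past_burglaries:
        d = math.hypot(x - bx, y - by)
        risk += math.exp(-d / dist_scale) * math.exp(-days_ago / time_scale)
    return risk

# Two recent burglaries close together, one far away (hypothetical data).
past = [(100.0, 100.0, 1), (120.0, 90.0, 3), (5000.0, 5000.0, 2)]

# A spot next to the recent cluster scores far higher than a distant one,
# so patrols would be steered toward the cluster.
near = near_repeat_risk((110.0, 95.0), past)
far = near_repeat_risk((3000.0, 3000.0), past)
```

The exponential decay terms encode the "return to previously successful areas" behaviour: a burglary yesterday next door contributes heavily to local risk, while one far away or long ago contributes almost nothing.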

Commander Simon Letchford, one of the officers implementing the new initiative, named "PredPol", told the Sunday Times: "Predictive policing places officers in the right place at the right time using the most up-to-date information."

All 43 forces in England and Wales are now being encouraged to adopt this approach to allocating their resources by the College of Policing, the new professional body that sets police standards. Rachel Tuffin, head of research at the College, said she expected most forces to be using such schemes within five years.

"It’s the classic Minority Report, trying to prevent crime before it happens," she commented.

This type of system appears to be the tip of the intelligence iceberg, however. Many other similar programs are already in existence and undoubtedly provided the basis for those now being utilised by law enforcement. One of these, known as FAST (Future Attribute Screening Technology), was created by the Department of Homeland Security (DHS), under its Homeland Security Advanced Research Projects Agency and Science & Technology Human Factors Behavioral Sciences Division, and was originally called "Project Hostile Intent". It was developed to screen individuals for various psychological and physiological indicators in order to predict crime, but was intended to work in "real time" rather than after a crime had been committed. John Verrico of the DHS stated back in September 2008 that initial tests had demonstrated 78% accuracy in predicting motive and intent, and 80% on deception.
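FAST's actual sensors, features, and model have never been made public, but screening systems of this general kind are often described as combining individual indicators into a single risk score and flagging anyone above a threshold. The sketch below is purely illustrative: the indicator names, weights, and threshold are invented, not FAST's.

```python
# Hypothetical indicator weights — FAST's real features are not public.
WEIGHTS = {
    "heart_rate_elevated": 0.4,
    "gaze_aversion": 0.3,
    "thermal_anomaly": 0.3,
}

def screening_score(indicators, weights=WEIGHTS):
    """Combine binary physiological/behavioral indicators into a 0-1 score."""
    return sum(weights[k] for k, present in indicators.items()
               if present and k in weights)

def flag(indicators, threshold=0.5):
    """Flag a subject whose combined score crosses the threshold."""
    return screening_score(indicators) >= threshold
```

Even this toy version makes the civil-liberties problem concrete: two perfectly innocent physiological readings, taken together, are enough to cross the flagging threshold.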

It doesn’t end there. It seems that the potential to predict and control our behavior is being exploited to its outer limits by every government department in existence. Agencies and organisations involved in the data collection initiative include the National Institutes of Health, the Department of Defense, the National Science Foundation, the Department of Energy and the U.S. Geological Survey, along with DARPA, and the government has allocated $200 million through the Office of Science and Technology Policy to "access, organize and glean discoveries" from the huge amount of digital information collated.

New methods of collecting data are being explored constantly: cameras are being installed in the San Francisco subway system and other locations, including tourist attractions, government buildings and military bases, to watch for “suspicious behavior”. Surveillance cameras are nothing new, but these are special "pre-crime" cameras programmed with a list of behaviors considered to be “normal”; any deviation from these activities will be classified as irregular and law enforcement will be notified automatically. Even more chilling is the fact that the cameras can almost "think", as they are programmed to build up "memories" of suspicious behavior for future reference, and if necessary, they can track up to 150 suspects at a time.
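The camera behaviour described above, checking each observed action against a whitelist of "normal" activities, remembering deviations, and tracking a capped number of suspects, can be sketched as follows. The behaviour labels and class design are invented for illustration; only the 150-suspect cap comes from the reporting.

```python
# Hypothetical whitelist — the real systems' behaviour categories are unknown.
NORMAL = {"walking", "standing", "queueing", "sitting"}
MAX_TRACKED = 150  # the reported per-camera tracking limit

class PreCrimeCamera:
    def __init__(self):
        self.memory = []      # accumulated "memories" of suspicious events
        self.tracked = set()  # subject IDs currently being followed

    def observe(self, subject_id, behavior):
        """Flag any behavior outside the 'normal' list, remember it,
        and start tracking the subject if capacity allows."""
        if behavior in NORMAL:
            return False
        self.memory.append((subject_id, behavior))
        if len(self.tracked) < MAX_TRACKED:
            self.tracked.add(subject_id)
        return True  # in a real system: notify law enforcement

cam = PreCrimeCamera()
cam.observe("p1", "walking")    # on the whitelist → not flagged
cam.observe("p2", "loitering")  # not on the whitelist → flagged and remembered
```

Note that in this scheme anything absent from the whitelist is automatically "irregular", which is exactly why such systems raise concerns: the burden falls on the observed person to behave in a pre-approved way.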

The most concerning fact of all is that the government is not content with merely predicting or analyzing our actions but has now moved on to our private thoughts. The National Security Agency (NSA) has apparently developed an artificial intelligence system which can actually determine the thought processes of targeted individuals. Drawing on the Internet and thousands of databases including phone records, credit card transactions, cell phone locations, Internet searches, online purchasing history and social networks such as Facebook, Twitter and MySpace, the program can potentially calculate not just what people are going to do, but how they actually think.

The program is called Aquaint (Advanced QUestion Answering for INTelligence) and was created by the Intelligence Advanced Research Projects Activity (IARPA), part of the new M Square Research Park in College Park, Maryland. The system is very controversial, and there are rumors that at least one member of the development team resigned over worries that it was too intrusive and could be a lethal weapon if it fell into the wrong hands.

The collation of such data may well be a valuable tool for crime prevention, but in the wrong hands it could also become a threatening and intrusive mechanism, accusing targeted individuals of malintent based solely on the fact that they turned up in a "sensitive" location at the wrong time. Where will the line be drawn in the future between actions and intent? These measures and methods of data collection have been insidiously implemented into our society, ostensibly for our benefit, but at what cost? Eventually the concept of "free will", that most fundamental notion that defines humanity and allows us the opportunity for self-actualization, may be slowly eroded without our knowledge or consent.

There is a fine line between an over-protective "nanny state" and a sinister, totalitarian society where even our thoughts could be controlled.
