By Sharon Weinberger
Can technology predict bad intentions? The Department of Homeland Security is hoping it can, though even experts working on the technology are dubious that it will be a silver bullet. The New Scientist reports this week on how a “battery of lasers, cameras, eye trackers and microphones begin secretly compiling a dossier of information about your body.” The goal of all this technology is to predict your future intentions.
It sounds far-fetched, but this is the aim of Project Hostile Intent (PHI), the latest anti-terrorism idea from the US Department of Homeland Security. According to DHS spokesman Larry Orluskie, the DHS wants to develop systems that can analyse behaviour remotely to predict which of the 400 million people who enter the US every year have “current or future hostile intentions”.
PHI aims to identify facial expressions, gait, blood pressure, pulse and perspiration rates that are characteristic of hostility or the desire to deceive. Then the idea is to develop “real-time, culturally independent, non-invasive sensors” and software that can detect those behaviours, says Orluskie. The DHS’s Advanced Research Projects Agency (HSARPA) suggests that these sensors could include heart rate and breathing sensors, infrared light, laser, video, audio and eye tracking.
PHI got quietly under way on 9 July, when HSARPA issued a “request for information” in which it asked security companies and US government labs to suggest technologies that could be used to achieve the project’s aims. It hopes to test them at a handful of airports, borders and ports as early as 2010 and to deploy the system at all points of entry to the US by 2012.
A lot of people are asking: how can this technology tell whether your “hostile intent” stems from a plan to blow up a plane, or merely from the fact that your toothpaste was confiscated because it came in a 3.5 ounce container instead of a 3 ounce one? Thermal imaging, for example, has proved a less-than-reliable indicator of intention. And as the article notes, even researchers who work on deception detection have expressed doubts about Project Hostile Intent:
“I can’t imagine they will have any reasonable rates of success with such a system,” says Kerstin Dautenhahn of the University of Hertfordshire, UK, who specialises in teaching robots to understand human intentions. “I have serious doubts that it will be successful,” adds psychologist Paul Ekman of the University of California, San Francisco, an expert in detecting hidden emotions and intentions from human facial expressions.