BY: Kevin Fogarty
Searching for an edge on terrorists, investigators keep insisting on better light to look in the wrong places.
According to confidential documents obtained by the WSJ and the Electronic Privacy Information Center, the intelligence and criminal investigations divisions of the U.S. government have lost their minds.
On one hand is the increasing number and outrageousness of the secret demands for data that courts pass out in support of investigations based on hunches, biases and prejudgments by law-enforcement officers. All assume the rules against having the police strip-search you on the street because they suspect you might have been thinking about something related to a crime either don't exist, don't apply to anything on the Internet, or don't apply to them.
On the other hand is a set of documents from the Dept. of Homeland Security that show it is trying to develop a program that can accurately predict whether one specific person is likely to commit a crime.
It's testing the app now on volunteers who are either confident the software won't work at all, or who expect never to be in a position in their lives in which they would covet anything the Ten Commandments say they shouldn't covet, or harbor any intent, conspiracy or deep, dark desires in the depths of their souls.
If the app worked and DHS knew about the intent to have a conspiracy to covet, the volunteer might be putting in a little more time than expected, in Guantanamo rather than the DC area.
The app, called the Future Attribute Screening Technology (FAST), sounds more like a fantasy some agent had after leaving the Tom Cruise movie "Minority Report" than it does a real investigative tool.
FAST reaches its conclusions based first on the behavior of the volunteer, using video, audio and sensors to take "psychophysiological measurements" (like the ones proven to be unreliable in polygraph tests even when everything is done correctly and the reactions are as isolated and calibrated as possible).
Don't worry, it doesn't work yet. Yet.

FAST is purely a research project at this stage, being tested on a few employees who volunteered to try out a "noninvasive" means of determining their intent in the future.
The Electronic Privacy Information Center statement said it believed using the system would be "very problematic" for legal reasons.
It doesn't mention how mind-bendingly wrong the conclusions would be from a piece of software developed using observations of human behavior, analyzed and codified by other humans, to try to isolate the specific behaviors present in the one part of human behavior no human has ever shown he or she is all that good at figuring out: when someone else isn't telling the truth, the whole truth, or something in addition to the truth.
Software is a lot more objective than human perceptions, because it's more stupid. The algorithms that allow analytical software to reach conclusions are written by humans codifying data that are entirely subjective, then trying to verify those subjective judgments and reduce the error rate by piling on more tests built on more series of subjective judgments.
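To make that concrete, here is a toy sketch (entirely hypothetical, not the real FAST code or anything resembling DHS's actual system) of what a "threat score" built this way looks like. Every threshold and weight below is a number a person made up, which is exactly the problem: the arithmetic is objective, but the judgments it encodes are not.

```python
# Hypothetical illustration only. The feature names, cutoffs and weights
# are invented for this sketch; none come from any real screening system.

def threat_score(heart_rate, blink_rate, voice_pitch_shift):
    """Combine sensor readings into one score using hand-picked weights."""
    score = 0.0
    # Each cutoff and weight was chosen by a human analyst, not derived
    # from any ground truth about what a person actually intends.
    if heart_rate > 90:           # beats/min deemed "elevated" -- says who?
        score += 0.4
    if blink_rate > 20:           # blinks/min someone decided means "nervous"
        score += 0.3
    if voice_pitch_shift > 0.05:  # fractional pitch rise labeled "stress"
        score += 0.3
    # Piling on more tests like these doesn't remove the subjectivity;
    # it just stacks more subjective judgments on top of the first ones.
    return score

# A nervous but innocent traveler trips two of the three arbitrary cutoffs:
print(threat_score(heart_rate=95, blink_rate=25, voice_pitch_shift=0.02))
```

Add as many of these tests as you like: the error rate you measure is still an error rate against labels other humans assigned, not against what anyone was actually thinking.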
However learned the explanation of theory or practice sounded in some smoke-filled underground bunker at DHS when this idea was pitched, no matter how many graphs were involved or how many white coats the scientists wore while swearing they could do this, no one can do this.