Wednesday, November 07, 2007

Thought Police: How Brain Scans Could Invade Your Private Life

From Popular Mechanics, snippets from a three-page article. Important because of the human rights angle, especially since executives are not always ethical in their use of technology.

[...]“Our ability to guess what a person is thinking about binary decisions is not super dramatic,” he says. “But we’re doing it with really crude image resolution of samples from the brain. If we could access every neuron, and spent long enough analyzing the data, we could figure out in great detail what a person is seeing or thinking.”
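To give a rough sense of what "guessing binary decisions" from brain data involves, here is a minimal sketch in Python. It is purely illustrative: the article does not describe the researchers' actual methods, so the synthetic data, the 500-voxel "crude resolution," and the plain logistic-regression classifier are my own assumptions about how this kind of pattern decoding is commonly done.

```python
# Illustrative sketch only: simulates decoding a binary decision ("yes" vs. "no")
# from coarse brain-activity patterns with a linear classifier. The data are
# synthetic; nothing here comes from the study quoted above.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials_per_class = 100   # scan samples per decision
n_voxels = 500             # crude spatial resolution: one value per voxel

# Simulated voxel activity: mostly noise, with a weak signal in a handful of
# voxels whose mean activity differs between the two decisions.
signal = np.zeros(n_voxels)
signal[:20] = 0.5          # only 20 voxels carry any information

X_yes = rng.normal(0.0, 1.0, (n_trials_per_class, n_voxels)) + signal
X_no = rng.normal(0.0, 1.0, (n_trials_per_class, n_voxels))
X = np.vstack([X_yes, X_no])
y = np.array([1] * n_trials_per_class + [0] * n_trials_per_class)

# Cross-validated accuracy of a plain linear classifier: anything well above
# 50% means the binary decision can be "guessed" from the overall pattern,
# even though no single voxel is very informative on its own.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Decoding accuracy: {scores.mean():.2f} (chance = 0.50)")
```

Accuracy reliably above the 50 percent chance level is all that "guessing a binary decision" amounts to here, which is why the researcher calls it "not super dramatic" at today's resolution.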

[...]

If they’re right, then there may come a day when others—the government, employers, even your spouse—might turn to technology to determine whether you are a law-abiding citizen, a promising new hire or a faithful partner. But skeptics say that talk of mind-reading machines is nothing more than hype.

“They’re marketing snake oil,” says Yale University psychiatry professor Andy Morgan. “We’ve been really skeptical of the science. But even if it works, it raises interesting questions about Fourth and Fifth Amendment rights. Is [an involuntary fMRI scan] illegal search and seizure since something was taken from you without your permission? And how do you protect your right not to incriminate yourself if people have a way of asking your brain questions, and you can’t say no or refuse to answer? These are some serious questions we have to begin to ask.”

[...]

In the wake of Sept. 11, the potential for fMRI to distinguish liars from truth-tellers generated particular interest as the U.S. government sought more reliable ways to extract information from detainees in the global war on terror. The Pentagon’s Defense Academy for Credibility Assessment at Fort Jackson, S.C., formerly the Polygraph Institute, has financed over 20 projects aimed at developing improved lie detectors. DARPA, the Pentagon’s high-tech research arm, also jumped into fMRI work. “Researchers, funded by the Department of Defense,” a recent article in the Cornell Law Review noted, “have developed technologies that may render the ‘dark art’ of interrogation unnecessary.”

[...]

Even some of fMRI’s most enthusiastic supporters recognize that using the technology in this way could pose gigantic risks to civil liberties. Joel Huizenga, chief executive officer of No Lie MRI, says he anticipates a potential backlash against his firm—and welcomes it. “There should be controversy,” he says. “If I were the next Joe Stalin, I could use this technology to figure out who my friends and enemies are very simply, so I’d know who to shoot.”


[...]
