The massive investment in neuroscience undertaken by the U.S. BRAIN Initiative and its European counterpart, the Human Brain Project, is increasingly taking a turn toward the examination of mental health. In fact, hundreds of European scientists working on the Human Brain Project are threatening a boycott because of this direction. In their view, the initial directive was to focus on repairing organic injuries and disorders such as Parkinson's, Alzheimer's and physical brain damage sustained in accidents. Post-Traumatic Stress Disorder is one area that might involve the military.
However, there is a disturbing trend developing in law enforcement and medicine to use what has been learned about the human brain in order to adopt pre-crime systems and predictive behavior technology.
But could a brain scan become standard procedure to see which troops might be inclined to commit insider attacks?
Troops overseas have been working alongside Iraqi and Afghan troops for years, but a new interest is being taken in evaluating potential extremists who are infiltrating to kill from within.
The number of these incidents is statistically low, as reported by Defense One, which cites the inside killing of "several troops in recent years." But a former Army counterintelligence agent sees the opportunity to apply new technology that presumably can screen people for malicious intent. The system is called HandShake:
Here’s how the HandShake system works: A U.S. soldier would take, say, an Iraqi officer and outfit the subject with a special helmet that can both pick up electromagnetic signals (EEG) and perform functional near-infrared spectroscopy (fNIRS), which images blood-flow changes in the brain. The soldier would put the subject through a battery of tests including image recognition. Most of the pictures in the tests would be benign, but a few would contain scenes that a potential insider threat would remember, possibly including faces, locations or even bomb parts. The key is to select these images very, very carefully to cut down on the potential false positives.
When you recognize a picture that’s of emotional significance to you, your brain experiences a 200- to 500-millisecond hiccup, during which the electromagnetic activity changes measurably via EEG. The reaction, referred to as the P300 response, happens too fast for the test subject to control, so the subject can’t game the system.
The fNIR readings back up the EEG numbers. Together, they speak not only to whether a subject is a traitor but to how likely an individual is to act on potentially criminal or treasonous impulses. The system then runs all the data through what Veritas calls a Friend or Foe Algorithm. The output: the ability to pinpoint an insider’s threat potential with 80 to 90 percent accuracy, according to the company. (Source) [emphasis added]
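The concealed-information test described in the quote above can be sketched in a few lines. To be clear, this is a toy illustration of the general P300 "oddball" idea, not Veritas's actual Friend or Foe Algorithm, which is proprietary; the function names, the sample window, and the threshold below are all hypothetical.

```python
# Toy sketch of a P300-style concealed-information test, as described in
# the quoted passage: compare averaged EEG responses to "probe" images
# (faces, locations, bomb parts a threat would recognize) against
# responses to benign images. Everything here is illustrative, not
# Veritas's real system.

def mean(xs):
    return sum(xs) / len(xs)

def p300_score(epochs, window=(25, 50)):
    """Average EEG amplitude in a post-stimulus sample window.

    `epochs` is a list of equal-length sample sequences, one per image
    presentation. `window` marks the samples where a P300-like response
    would appear (the indices assume a hypothetical sample rate).
    """
    lo, hi = window
    return mean([mean(epoch[lo:hi]) for epoch in epochs])

def flag_recognition(probe_epochs, benign_epochs, threshold=2.0):
    """Flag a subject if the averaged response to probe images differs
    from the response to benign images by more than `threshold`
    (an arbitrary illustrative cutoff). Returns (flagged, difference).
    """
    diff = p300_score(probe_epochs) - p300_score(benign_epochs)
    return diff > threshold, diff

# Synthetic example: probe epochs show a response in the window,
# benign epochs are flat.
probe = [[0] * 25 + [5] * 25 + [0] * 50 for _ in range(10)]
benign = [[0] * 100 for _ in range(10)]
flagged, diff = flag_recognition(probe, benign)
```

Note that a real system would average over many trials precisely because single-trial EEG is noisy; the claimed 80 to 90 percent accuracy implies a substantial false-positive rate at this scale of screening, which is exactly the concern raised about selecting test images "very, very carefully."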
The company, Veritas, has released the following promotional video for its system:
It’s obviously ironic that this system is intended to be used on people who never should have encountered the U.S. military in the first place, since the U.S. military arrived based on lies. Moreover, those flagged by such a system are clearly open to being tortured under the policies that have been established in the War on Terror world in which we live.
This system comes at an expense in excess of $1 million to deploy and $500,000 per month thereafter, per site, according to the company’s founder. Both the monetary and ethical costs should ensure that this technology never sees the light of day. However, the military-industrial complex has a provable track record of caring very little about either.
Note: The article linked below demonstrates how the biometric identification system in Afghanistan has already trickled down to the streets of America. If brain-scanning technology is successful overseas, it is guaranteed to show up inside the United States. It has already been proposed for air travel and other applications under the FAST system (Future Attribute Screening Technology). Additionally, with the escalating war on whistleblowers, this would be a wonderful tool for employers to weed out those whose desire is not to undermine, but simply to expose criminality.
Real-Time Facial Recognition Offered to Police in New Program
Nicholas West’s article appears courtesy of Activist Post.