As a member of the military – or anyone, for that matter – wouldn't it be nice to simply wear a pair of headphones and see yourself become smarter, faster and stronger?
Exactly where the science lands on this topic should be left to rigorous research, validated in peer-reviewed journals. That is all the more important in the social sciences, which, though relatively new in the context of national-security applications, stand to receive significant funding in the name of boosting human performance.
While the military sees itself as a partner and customer of scientific breakthroughs, government leaders should ensure that adequate safeguards exist to separate the good science from the bad. For example, one-off experimental results, however striking and seemingly revolutionary, or studies whose sample sizes are too small to support meaningful conclusions, should be recognized as such and treated with appropriate caution.
Brain research is tricky business. What works in mice may not have the same effect on humans. In addition, effects observed in clinical applications may not necessarily transfer to healthy brains.
In the case of the TDCS headgear, a trio of Canadian researchers observed in a 2014 article in the journal Neuron a "rising tide" of mentions of the technology in the media and academic literature. At the same time, they wrote, some of its effects remain "poorly understood," to say nothing of the chance of inducing "prolonged neurological changes that are as of yet unknown."
Others in the community are more blunt. Vincent Walsh, of the Institute of Cognitive Neuroscience at University College London, suggested in a 2013 article in the journal Brain Stimulation that the field of TDCS for the purpose of human-performance improvements is riddled with overpromises and academic shortcuts motivated by financial prospects.
"As a community we articulate concerns about garage mechanics making their own stimulators, companies [peddling] 'enhancements' and cures, people giving themselves brain damage, and unregulated use in areas such as gambling, ethical decision making and military training," Walsh wrote.
As the military looks to the field of social and behavioral sciences to see what might work for national-security applications, the answers may not be as clear-cut as in other fields. And because the Pentagon has a tendency to focus on results first and figure out the finer details later, there is a risk of spending money on projects that yield little return on investment.
The Army has learned that technology cannot lift the fog of war, as was believed during the heyday of the Future Combat Systems program. That lesson came at great cost to taxpayers, as the program was eventually terminated.
The service's more recent focus on the "human domain" appears to pivot away from traditional technology employment, focusing instead on the characteristics of the individuals doing the fighting. At the same time, there is a similar allure to investing in social-behavioral techniques that promise to boost human functioning in one way or another.
It will fall to Pentagon leadership, including the new DIUx offices, to serve as the gatekeeper for legitimate scientific investments, lest precious defense resources be spent on the wrong things.