In many ways, information can be seen as the essential catalyst, or lifeblood, of change and readjustment. Because of this, as problems and obstacles mount, the need to disseminate accurate and useful information in a timely manner becomes increasingly important. And it is clear that the stakes in the next fifty years are rising: climate change is projected to cause massive meteorological, agricultural, and biological damage; oil depletion is poised to disrupt agricultural and energy production; and weapons proliferation threatens to trigger intense struggles over these dwindling resources while prolonging old ideological conflicts. In light of these facts, what is the most effective way to transmit pertinent ideas to large numbers of people? And what, if any, are the contentious aspects of these communication methods?
Generally, when people think about transmitting pertinent information, they gravitate towards what we can call the 18th century consensus – a view that upholds rational argumentation, clear analysis, and complete explanation as the optimal means of conveying data and ideas. The goal of this view is to win over one's opponent with empirical data, and to approach an objective truth through a continual process of readjustment: personal beliefs are formed and revised by proven facts. The work of Bertrand Russell certainly comes to mind when thinking of this approach, and Noam Chomsky still swears by the 18th century consensus. But does it work?
Unfortunately, not in all instances, mostly because human social and personal interactions rarely permit this mode of data transmission to occur unadulterated. There are two core issues with the consensus view (both consequences of how our synaptic connections form). First, the same piece of information is analyzed differently depending on who the messenger is. If a renowned doctor were to give advice to a patient, it would more than likely be accepted as trustworthy; but if that same advice were given by, say, a street musician, the patient would be less likely to accept it as valid. Prejudices and other preconceptions are remarkably strong, and can easily push their subscribers to simply analyze the messenger rather than submit to the 'tedious' task of analyzing the actual data.
Second, persons with strongly established worldviews will rarely analyze information that even remotely seems to contradict their ideological foundations. A study out of Emory University offers a striking example of this phenomenon. In 2004, clinical psychologists from the university studied a sample of steadfast Democrats and Republicans during the months prior to that year's U.S. presidential election. The subjects were given a reasoning task in which they had to evaluate threatening information about their preferred candidate, and during the task they underwent functional magnetic resonance imaging (fMRI) scans to see which parts of their brains were active.
To admirers of the 18th century consensus, the results are unsettling. As Drew Westen, the study's lead author, recalled, "we did not see any increased activation of the parts of the brain normally engaged during reasoning." Even more troubling, the researchers found that areas associated with conflict resolution were activated during the task, hinting that rather than actually analyzing the information, the subjects were simply ignoring it, or were even suppressing their urge to act out emotionally in protest.
To be sure, there are instances where data can be analyzed without hindrance. In the first case, which we can call 'reaffirmation,' new information is readily received when a subject's previously established neurological associations allow incoming data to be analyzed without outright rejection. When incoming data is seen as integrating seamlessly with the individual's pre-existing ideology, the new data is in effect granted access: access to the areas of the brain where reasoning takes place, to long-term memory, and to the individual's overall worldview.
In the second case, which we will call 'critical self-analysis,' new information is readily received when a subject truly adheres to an ideology that values critical data analysis, even if the new data will lead to an adjustment of the subject's current worldview. Unfortunately, this way of approaching reality is probably underused, as it requires one to see one's current worldview as a work in progress and demands an unwavering cognitive commitment: the belief that one's worldview is dynamic in light of verifiable incoming data, and that all incoming data deserves fair analysis rather than prima facie dismissal. After all, it is much simpler (not to mention ideologically safer) to revert to old habits than to continually analyze incoming data and construct new 'realities' based on interpretations of that data.
In the end, it seems that without a catastrophic event acting as the agent of ideological and political change (think Russia during the Great War, Germany after that same war, or America after September 11th), people rarely give ideologically foreign information honest analysis or waver from their established worldviews. And herein lies the problem: in the next fifty years there is a high probability that the stakes will continue to rise, demanding a commensurate amount of information gathering and sharing in order to create the ideas and courses of action needed to counter these growing threats.
So, my final question is this: as a people, are we capable of overcoming our bias against ideologically foreign information, or, as Edward Bernays advocated in his 1928 treatise, are we stuck with professional propagandists embedded in the upper echelons of government and business?