Excellent Analysis. Very Sobering. Put here in full minus a couple of the links. I have linked to the article as well.
Here is why it is the way it is!
All I would add is: YOU have turned your back on God and rejected Him, so He has rejected you! Truth hurts sometimes.
Seven Reasons Why the “Bad Guys” Keep Winning
How Come They Keep Getting Away With It?
How come the bad guys keep getting away with it … even after getting caught again and again?
Reason Number 1: Falling for the Big Fib
As Adolf Hitler wrote in Mein Kampf:
All this was inspired by the principle–which is quite true in itself–that in the big lie there is always a certain force of credibility; because the broad masses of a nation are always more easily corrupted in the deeper strata of their emotional nature than consciously or voluntarily; and thus in the primitive simplicity of their minds they more readily fall victims to the big lie than the small lie, since they themselves often tell small lies in little matters but would be ashamed to resort to large-scale falsehoods. It would never come into their heads to fabricate colossal untruths, and they would not believe that others could have the impudence to distort the truth so infamously. Even though the facts which prove this to be so may be brought clearly to their minds, they will still doubt and waver and will continue to think that there may be some other explanation. For the grossly impudent lie always leaves traces behind it, even after it has been nailed down, a fact which is known to all expert liars in this world and to all who conspire together in the art of lying.
Similarly, Hitler’s propaganda minister, Joseph Goebbels, wrote:
That is of course rather painful for those involved. One should not as a rule reveal one’s secrets, since one does not know if and when one may need them again. The essential English leadership secret does not depend on particular intelligence. Rather, it depends on a remarkably stupid thick-headedness. The English follow the principle that when one lies, one should lie big, and stick to it. They keep up their lies, even at the risk of looking ridiculous.
Science has now helped to explain why the big lie is effective.
As I’ve previously pointed out in another context:
Psychologists and sociologists show us that people will rationalize what their leaders are doing, even when it makes no sense ….
Sociologists from four major research institutions investigated why so many Americans believed that Saddam Hussein was behind 9/11, years after it became obvious that Iraq had nothing to do with 9/11.
The researchers found, as described in an article in the journal Sociological Inquiry (and re-printed by Newsweek):
- Many Americans felt an urgent need to seek justification for a war already in progress
- Rather than search rationally for information that either confirms or disconfirms a particular belief, people actually seek out information that confirms what they already believe.
- “For the most part people completely ignore contrary information.”
- “The study demonstrates voters’ ability to develop elaborate rationalizations based on faulty information”
- People get deeply attached to their beliefs, and form emotional attachments that get wrapped up in their personal identity and sense of morality, irrespective of the facts of the matter.
- “We refer to this as ‘inferred justification,’ because for these voters, the sheer fact that we were engaged in war led to a post-hoc search for a justification for that war.”
- “People were basically making up justifications for the fact that we were at war”
- “They wanted to believe in the link [between 9/11 and Iraq] because it helped them make sense of a current reality. So voters’ ability to develop elaborate rationalizations based on faulty information, whether we think that is good or bad for democratic practice, does at least demonstrate an impressive form of creativity.”
An article yesterday in Alternet discussing the Sociological Inquiry article helps us understand that fear is the key to people’s active participation in searching for excuses for the actions of the big boys:
Subjects were presented during one-on-one interviews with a newspaper clip of this Bush quote: “This administration never said that the 9/11 attacks were orchestrated between Saddam and al-Qaeda.” The Sept. 11 Commission, too, found no such link, the subjects were told.

“Well, I bet they say that the commission didn’t have any proof of it,” one subject responded, “but I guess we still can have our opinions and feel that way even though they say that.”

Reasoned another: “Saddam, I can’t judge if he did what he’s being accused of, but if Bush thinks he did it, then he did it.”

Others declined to engage the information at all. Most curious to the researchers were the respondents who reasoned that Saddam must have been connected to Sept. 11, because why else would the Bush Administration have gone to war in Iraq?

The desire to believe this was more powerful, according to the researchers, than any active campaign to plant the idea. Such a campaign did exist in the run-up to the war … He won’t credit [politicians spouting misinformation] alone for the phenomenon, though.

“That kind of puts the idea out there, but what people then do with the idea … ” he said. “Our argument is that people aren’t just empty vessels. You don’t just sort of open up their brains and dump false information in and they regurgitate it. They’re actually active processing cognitive agents” …

The alternate explanation raises queasy questions for the rest of society.

“I think we’d all like to believe that when people come across disconfirming evidence, what they tend to do is to update their opinions,” said Andrew Perrin, an associate professor at UNC and another author of the study …

“The implications for how democracy works are quite profound, there’s no question in my mind about that,” Perrin said. “What it means is that we have to think about the emotional states in which citizens find themselves that then lead them to reason and deliberate in particular ways.”

Evidence suggests people are more likely to pay attention to facts within certain emotional states and social situations. Some may never change their minds. For others, policy-makers could better identify those states, for example minimizing the fear that often clouds a person’s ability to assess facts …
The Alternet article links to a must-read interview with psychology professor Sheldon Solomon, who explains:
A large body of evidence shows that momentarily [raising fear of death], typically by asking people to think about themselves dying, intensifies people’s strivings to protect and bolster aspects of their worldviews, and to bolster their self-esteem. The most common finding is that [fear of death] increases positive reactions to those who share cherished aspects of one’s cultural worldview, and negative reactions toward those who violate cherished cultural values or are merely different.
And what about torture? Even after the Senate Intelligence report said that torture didn’t do anything helpful – confirmed by America’s top interrogation experts and 1,700 years of history – the American public still believes the big lie.
And I would argue that the fact that the governments of the world have given trillions to the giant banks has invoked the same mental process – and susceptibility to propaganda – as the war in Iraq.
Specifically, many people assume that because the government has launched a war to prop up the giant banks, it must have a good reason for doing so.
Why else would trillions in taxpayer dollars be thrown at the giant banks? Why else would the government say that saving the big boys is vital?
And I would argue that the fear of another Great Depression (an economic death, if you will) is analogous to the fear of death triggered in many Americans by 9/11.
This creates a regression towards old-fashioned thinking about such things as banks and the financial system, even though the giant banks actually do very little traditional banking these days.
In other words, the big lie appears to be as effective in financial as in military warfare.
Reason Number 2: The Urge to Defend Bad Systems
Psychiatrist Peter Zafirides, M.D., sent us an excellent article explaining why good people defend bad systems:
From the bust of the housing bubble and mortgage meltdown to Bernie Madoff and Jerry Sandusky, to political candidates and campaigns, it seems not a week goes by before another story of corruption and scandal breaks. And very predictably, the following questions always seem to follow:

“How could they get away with this?”

- or -

“Why didn’t someone say or do anything about it?”

In trying to answer these questions, we have to first understand a bit about both individual and group psychology. The answers may potentially surprise or frighten you, but it is through this understanding that any real (and lasting) change can occur. Beyond these obvious questions lies another stark reality: good people tend to continue to defend bad systems.

Why does this happen? What is going on here? Why do we stick up for a system or institution we live in—a government, company, or marriage—even when anyone else can see it is failing miserably? Why do we resist change even when the system is corrupt or unjust? A new article in Current Directions in Psychological Science reveals the conditions under which we’re motivated to defend the status quo—a psychological process called “system justification.”

The Power of the Status Quo

In system justification theory, people are motivated to defend the status quo. There is a need to see it as being good, just and/or legitimate. People not only want to hold a favorable view of themselves and the groups they associate with, but they also hold favorable views of an entire, overarching social system. There is a lot at stake here on an individual psychological level that may not have anything to do with the particular candidate, or government or social issue.

There are consequences for trying to buck the system. What will happen if you try to introduce a different type of political or economic system? You tend to be mocked, marginalized or completely ignored. People need to believe that the systems they believe in are legitimate. But this can cause bias and very dangerous blind spots when it comes to the issue of corruption in these systems.

“Now this is not the same as acquiescence,” says Aaron C. Kay, a psychologist at Duke University, who co-authored the paper with University of Waterloo graduate student Justin Friesen. “It’s pro-active. When someone comes to justify the status quo, they also come to see it as what should be.”

According to the research, four particular situations significantly increased the likelihood that system justification would occur:

1. When a threat to the system occurred.
2. When one is dependent on the system.
3. When there is no potential escape from the system.
4. When one has low personal control of their lives.

Threat

When we’re threatened we defend ourselves—and our systems. Before 9/11, for instance, President George W. Bush was sinking in the polls. But as soon as the planes hit the World Trade Center, the president’s approval ratings soared. So did support for Congress and the police. During Hurricane Katrina, America witnessed FEMA’s spectacular failure to rescue the hurricane’s victims. Yet many people blamed those victims for their fate rather than admitting the agency flunked and supporting ideas for fixing it. In times of crisis, say the authors, we want to believe the system works. This bias is real. The problem is, it may not even be consciously in our awareness.

Dependency

We also defend systems we rely on.
In one experiment, students made to feel dependent on their university defended a school funding policy—but disapproved of the same policy if it came from the government, which they didn’t perceive as affecting them closely. However, if they felt dependent on the government, they liked the policy originating from it, but not from the school.

Inescapability & Loss of Control

When we feel we can’t escape a system, we adapt. That includes feeling okay about things we might otherwise consider undesirable. The authors note one study in which participants were told that men’s salaries in their country are 20% higher than women’s. Rather than implicate an unfair system, those who felt they couldn’t emigrate chalked up the wage gap to innate differences between the sexes. “You’d think that when people are stuck with a system, they’d want to change it more,” says Kay. But in fact, the more stuck they are, the more likely they are to explain away its shortcomings.

Finally, a related phenomenon: the less control people feel over their own lives, the more they endorse systems and leaders that offer a sense of order.

Change Is Possible!

The research on system justification should not be overwhelming or demoralizing. If anything, it can really help to enlighten those who are frustrated when people don’t rise up in what would seem their own best interests. The awareness of this psychological tendency in all of us is the first step in trying to minimize its impact. Awareness is critical if one hopes to meaningfully change systems.

According to Dr. Kay, “If you want to understand how to get social change to happen, you need to understand the conditions that make people resist change and what makes them open to acknowledging that change might be a necessity.” This is true whether the change one desires is individual or societal.

But do not despair! Whether on an individual or societal level, change absolutely can happen. Awareness and knowledge are the first part of the process.

Never give up the fight.

Never doubt how truly powerful you are.
Reason Number 3: Assuming that the Super-Elite Are “Like Us”
The super-elites are not like us:
Vanderbilt researchers have found that the brains of psychopaths have a dopamine abnormality which creates a drive for rewards at any cost, and causes them to ignore risks.
As PhysOrg writes:
[Image caption: Abnormalities in how the nucleus accumbens processes dopamine have been found in individuals with psychopathic traits and may be linked to violent, criminal behavior. Credit: Gregory R. Samanez-Larkin and Joshua W. Buckholtz]

The brains of psychopaths appear to be wired to keep seeking a reward at any cost, new research from Vanderbilt University finds. The research uncovers the role of the brain’s reward system in psychopathy and opens a new area of study for understanding what drives these individuals.

“This study underscores the importance of neurological research as it relates to behavior,” Dr. Francis S. Collins, director of the National Institutes of Health, said. “The findings may help us find new ways to intervene before a personality trait becomes antisocial behavior.”

The results were published March 14, 2010, in Nature Neuroscience.

“Psychopaths are often thought of as cold-blooded criminals who take what they want without thinking about consequences,” Joshua Buckholtz, a graduate student in the Department of Psychology and lead author of the new study, said. “We found that a hyper-reactive dopamine reward system may be the foundation for some of the most problematic behaviors associated with psychopathy, such as violent crime, recidivism and substance abuse.”

Previous research on psychopathy has focused on what these individuals lack—fear, empathy and interpersonal skills. The new research, however, examines what they have in abundance—impulsivity, heightened attraction to rewards and risk taking. Importantly, it is these latter traits that are most closely linked with the violent and criminal aspects of psychopathy.

“There has been a long tradition of research on psychopathy that has focused on the lack of sensitivity to punishment and a lack of fear, but those traits are not particularly good predictors of violence or criminal behavior,” David Zald, associate professor of psychology and of psychiatry and co-author of the study, said. “Our data is suggesting that something might be happening on the other side of things. These individuals appear to have such a strong draw to reward—to the carrot—that it overwhelms the sense of risk or concern about the stick.”

To examine the relationship between dopamine and psychopathy, the researchers used positron emission tomography, or PET, imaging of the brain to measure dopamine release, in concert with a functional magnetic resonance imaging, or fMRI, probe of the brain’s reward system.

“The really striking thing is with these two very different techniques we saw a very similar pattern—both were heightened in individuals with psychopathic traits,” Zald said.

Study volunteers were given a personality test to determine their level of psychopathic traits. These traits exist on a spectrum, with violent criminals falling at the extreme end of the spectrum. However, a normally functioning person can also have the traits, which include manipulativeness, egocentricity, aggression and risk taking.

In the first portion of the experiment, the researchers gave the volunteers a dose of amphetamine, or speed, and then scanned their brains using PET to view dopamine release in response to the stimulant. Substance abuse has been shown in the past to be associated with alterations in dopamine responses. Psychopathy is strongly associated with substance abuse.

“Our hypothesis was that psychopathic traits are also linked to dysfunction in dopamine reward circuitry,” Buckholtz said. “Consistent with what we thought, we found people with high levels of psychopathic traits had almost four times the amount of dopamine released in response to amphetamine.”

In the second portion of the experiment, the research subjects were told they would receive a monetary reward for completing a simple task. Their brains were scanned with fMRI while they were performing the task. The researchers found in those individuals with elevated psychopathic traits the dopamine reward area of the brain, the nucleus accumbens, was much more active while they were anticipating the monetary reward than in the other volunteers.

“It may be that because of these exaggerated dopamine responses, once they focus on the chance to get a reward, psychopaths are unable to alter their attention until they get what they’re after,” Buckholtz said. Added Zald, “It’s not just that they don’t appreciate the potential threat, but that the anticipation or motivation for reward overwhelms those concerns.”
Has anyone tested the heads of the too-big-to-fail banks for this dopamine abnormality?
What are the odds that they have it? And if they have it, what are the odds that they will voluntarily start acting responsibly, especially given the broken incentive system?
Experts also tell us that many politicians share traits with serial killers. Specifically, the Los Angeles Times noted in 2009:
Using his law enforcement experience and data drawn from the FBI’s behavioral analysis unit, Jim Kouri has collected a series of personality traits common to a couple of professions.

Kouri, who’s a vice president of the National Assn. of Chiefs of Police, has assembled traits such as superficial charm, an exaggerated sense of self-worth, glibness, lying, lack of remorse and manipulation of others.

These traits, Kouri points out in his analysis, are common to psychopathic serial killers.

But — and here’s the part that may spark some controversy and defensive discussion — these traits are also common to American politicians. (Maybe you already suspected.)

Yup. Violent homicide aside, our elected officials often show many of the exact same character traits as criminal nut-jobs, who run from police but not for office.

Kouri notes that these criminals are psychologically capable of committing their dirty deeds free of any concern for social, moral or legal consequences and with absolutely no remorse.

“This allows them to do what they want, whenever they want,” he wrote. “Ironically, these same traits exist in men and women who are drawn to high-profile and powerful positions in society including political officeholders.”

***

“While many political leaders will deny the assessment regarding their similarities with serial killers and other career criminals, it is part of a psychopathic profile that may be used in assessing the behaviors of many officials and lawmakers at all levels of government.”
As Jim Quinn notes:
When their bets came up craps, they had the gall to hold the American people hostage for trillions in bailouts. Their fellow psychopaths in Congress gladly forked over the money. Rather than mend their ways, these evil men have returned to their excessive risk taking and continue to pay themselves billions in compensation, while the American middle class is smothered to death under mountains of debt. These evil Wall Street geniuses have shown no remorse as seven million people have lost their jobs and millions more have lost their homes due to the greed and avarice displayed on an epic scale.

Wall Street bankers exhibit the epitome of psychopathic behavior, showing lack of empathy and remorse, shallow emotions, egocentricity, and deceptiveness. Psychopaths are highly prone to antisocial behavior and abusive treatment of others. Though lacking empathy and emotional depth, they often manage to pass themselves off as average individuals by feigning emotions. These Wall Street bankers will never willingly accept responsibility for their actions. They continue to use their wealth and power to control the politicians in Washington DC and the misinformation propagated by the corporate media they control. They own and control the Federal Reserve and will print money until the whole system collapses in a spectacular implosion that destroys our financial system. They only care about their own wealth, influence and status. They have no shame.
Studies also show that the wealthy are less empathic than those with more modest wealth, and so:
“The idea of noblesse oblige or trickle-down economics, certain versions of it, is bull,” Keltner added. “Our data say you cannot rely on the wealthy to give back. The ‘thousand points of light’—this rise of compassion in the wealthy to fix all the problems of society—is improbable, psychologically.”

Those in the upper class tend to hoard resources and be less generous than they could be.
Given that many in Congress and top government posts are multi-millionaires, the study might help explain why politicians seem only to work to make themselves wealthier and to help their wealthy buddies.
We will remain disempowered if we assume that the super-elites are “like us”. Unless we learn to spot “wolves in sheep’s clothing”, we will continue to fall prey to their scams.
This is not to say that all rich or powerful people are psychopaths. There are some great men and women who are affluent or who serve in Washington, D.C. But many do have psychopathic tendencies.
Reason Number 4: The Life-Or-Death Struggle to Defend Our Beliefs
Alternet points out:
When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.

***

In 2006, Brendan Nyhan and Jason Reifler at The University of Michigan and Georgia State University created fake newspaper articles about polarizing political issues. The articles were written in a way which would confirm a widespread misconception about certain ideas in American politics. As soon as a person read a fake article, researchers then handed over a true article which corrected the first. For instance, one article suggested the United States found weapons of mass destruction in Iraq. The next said the U.S. never found them, which was the truth. Those opposed to the war or who had strong liberal leanings tended to disagree with the original article and accept the second. Those who supported the war and leaned more toward the conservative camp tended to agree with the first article and strongly disagree with the second. These reactions shouldn’t surprise you. What should give you pause though is how conservatives felt about the correction. After reading that there were no WMDs, they reported being even more certain than before there actually were WMDs and their original beliefs were correct.

They repeated the experiment with other wedge issues like stem cell research and tax reform, and once again, they found corrections tended to increase the strength of the participants’ misconceptions if those corrections contradicted their ideologies. People on opposing sides of the political spectrum read the same articles and then the same corrections, and when new evidence was interpreted as threatening to their beliefs, they doubled down. The corrections backfired.

Once something is added to your collection of beliefs, you protect it from harm. You do it instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens them instead. Over time, the backfire effect helps make you less skeptical of those things which allow you to continue seeing your beliefs and attitudes as true and proper.

***

Psychologists call stories like these narrative scripts, stories that tell you what you want to hear, stories which confirm your beliefs and give you permission to continue feeling as you already do.

***

As the psychologist Thomas Gilovich said, “When examining evidence relevant to a given belief, people are inclined to see what they expect to see, and conclude what they expect to conclude … for desired conclusions, we ask ourselves, ‘Can I believe this?,’ but for unpalatable conclusions we ask, ‘Must I believe this?’”

***

What should be evident from the studies on the backfire effect is you can never win an argument online. When you start to pull out facts and figures, hyperlinks and quotes, you are actually making the opponent feel as though they are even more sure of their position than before you started the debate. As they match your fervor, the same thing happens in your skull. The backfire effect pushes both of you deeper into your original beliefs.

***

The backfire effect is constantly shaping your beliefs and memory, keeping you consistently leaning one way or the other through a process psychologists call biased assimilation.
Decades of research into a variety of cognitive biases shows you tend to see the world through thick, horn-rimmed glasses forged of belief and smudged with attitudes and ideologies.

***

Flash forward to 2011, and you have Fox News and MSNBC battling for cable journalism territory, both promising a viewpoint which will never challenge the beliefs of a certain portion of the audience. Biased assimilation guaranteed.

***

The human understanding when it has once adopted an opinion draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects, in order that by this great and pernicious predetermination the authority of its former conclusion may remain inviolate. – Francis Bacon
It is very difficult for anyone to really listen to evidence which contradicts our beliefs. But unless we learn how to grit our teeth and do so, we will forever be victims of the divide-and-conquer game which ensures that we have politicians who ignore our demands, we will stay so wedded to one investment strategy that we forever lose money on our investments, and we will generally remain weak and disempowered people.
Reason Number 5: Forgetting that We Don’t Live in Tribes
Our brains are wired for tribal relationships:
Biologists and sociologists tell us that our brains evolved in small groups or tribes.
As one example of how profoundly the small-group environment affected our brains, Daily Galaxy points out:
Research shows that one of the most powerful ways to stimulate more buying is celebrity endorsement. Neurologists at Erasmus University in Rotterdam report that our ability to weigh desirability and value doesn’t function normally if an item is endorsed by a well-known face. This lights up the brain’s dorsal caudate nucleus, which is involved in trust and learning. Areas linked to longer-term memory storage also fire up. Our minds overidentify with celebrities because we evolved in small tribes. If you knew someone, then they knew you. If you didn’t attack each other, you were probably pals.

Our minds still work this way, giving us the idea that the celebs we keep seeing are our acquaintances. And we want to be like them, because we’ve evolved to hate being out of the in-crowd. Brain scans show that social rejection activates brain areas that generate physical pain, probably because in prehistory tribal exclusion was tantamount to a death sentence. And scans by the National Institute of Mental Health show that when we feel socially inferior, two brain regions become more active: the insula and the ventral striatum. The insula is involved with the gut-sinking sensation you get when you feel that small. The ventral striatum is linked to motivation and reward.

In small groups, we knew everyone extremely well. No one could really fool us about what type of person they were, because we had grown up interacting with them for our whole lives. If a tribe member dressed up and pretended he was from another tribe, we would see it in a heartbeat. It would be like seeing your father in a costume: you would recognize him pretty quickly, wouldn’t you?

As the celebrity example shows, our brains can easily be fooled by people in our large modern society when we incorrectly ascribe to them the role of being someone we should trust. The opposite is true as well. The parts of our brain that are hard-wired to quickly recognize “outside enemies” can be fooled in our huge modern society, when it is really people we know dressed up like the “other team”.

***

Our brains assume that we can tell truth from fiction, because they evolved in very small groups where we knew everyone extremely well, and usually could see for ourselves what was true. On the other side of the coin, a tribal leader who talked a good game but constantly stole from and abused his group would immediately be kicked out or killed. No matter how nicely he talked, the members of the tribe would immediately see what he was doing.

But in a country of hundreds of millions of people, where the political class is shielded from the rest of the country, people don’t really know what our leaders are doing with most of their time. We only see them for a couple of minutes when they are giving speeches, or appearing in photo ops, or being interviewed. It is therefore much easier for a wolf in sheep’s clothing to succeed than in a small group setting. Indeed, sociopaths would have been discovered very quickly in a small group.
But in huge societies like ours, they can rise to positions of power and influence.

As with the celebrity endorsement example, our brains are running programs which were developed for an environment (a small group) we no longer live in, and so lead us astray. Like the blind spot in our rear view mirror, we have to learn to compensate and adapt for our imperfections, or we may get clobbered.

Grow Up

The good news is that we can evolve. While our brains have many built-in hardwired ways of thinking and processing information, they are also amazingly “plastic”. We can learn and evolve and overcome our hardwiring – or at least compensate for our blind spots. We are not condemned to being led astray by [banksters and power-hungry sociopaths]. We can choose to grow up as a species and reclaim our power to decide our own future.
Reason Number 6: Pretending We Know
People who don’t know much about a subject tend to over-estimate their understanding. Ironically, experts in any subject tend to underestimate their abilities (because the more you know, the more you realize that you don’t know.)
(This is like learning a sport or a musical instrument. When you get decent at it, it becomes fun … and learning how to improve is pleasurable. On the other hand, if you make nails-on-chalkboard noises while learning how to play electric guitar or fall a lot while you’re learning how to ski, it isn’t as fun … and it is tempting to give up and avoid it if your friends try to “drag you along”. The same dynamic might apply to learning new information as well.)
If we realize that we are resisting learning new information – either because we assume we already know it all, or because we want to avoid the embarrassment of being a beginner – we will remain stuck where we are, and we will never grow wiser or more powerful. If your mind is already “full”, you can’t fill it any more. Indeed, one of the secrets of really smart people is to adopt a “beginner mind”, so that they are open to learning new information.
Reason Number 7: Apathy
The CIA notes that public apathy allows government officials to ignore their citizens. While it is easy to slip into apathy, we will as a people be ignored by our politicians unless we remain involved.
Reason Number 8: The CIA and Other Government Agencies Control Media, Movies, TV and Video Games
Famed Watergate reporter Carl Bernstein says the CIA has already bought and paid for many successful journalists.
A CIA operative allegedly told Washington Post editor Philip Graham … in a conversation about the willingness of journalists to peddle CIA propaganda and cover stories:
You could get a journalist cheaper than a good call girl, for a couple hundred dollars a month.
The Church Committee found that the CIA submitted stories to the American press.
The New York Times discusses in a matter-of-fact way the use of mainstream writers by the CIA to spread messages.
The government is paying off reporters to spread disinformation.
A 4-part BBC documentary called the “Century of the Self” shows that an American – Freud’s nephew, Edward Bernays – created the modern field of manipulation of public perceptions, and the U.S. government has extensively used his techniques.
The Independent discusses allegations of American propaganda.
One of the premier writers on journalism says the U.S. has used widespread propaganda.
Indeed, an expert on propaganda testified under oath during trial that the CIA employs THOUSANDS of reporters and OWNS its own media organizations (the expert has an impressive background).
Of course, the Web has become a huge media force, and the Pentagon and other government agencies have their hand in that as well. Indeed, documents released by Snowden show that spies manipulate polls, website popularity and pageview counts, censor videos they don’t like and amplify messages they do.
The CIA and other government agencies also put enormous energy into pushing propaganda through movies, TV and video games.
We intentionally listed propaganda last, because we only fall for propaganda to the extent we fail to learn the first 7 lessons … i.e. to wake up and think for ourselves.
As Michael Rivero notes:
Most propaganda is not designed to fool the critical thinker but only to give moral cowards an excuse not to think at all.
Moral cowards … or people too lazy to learn how their own minds – and those of the bad guys – work.