Posted by christinkeck on July 24, 2013 at 12:25 AM
“You can’t assume without making an ASS of U and ME.” That’s an old adage, and it seems more and more these days to be true.
Assumptions drive our most egregious thought processes. We assume even when facts are present—we assume even when it can be proven that our assumptions are ridiculous, unfounded or based on skewed logic. Why do humans assume so much? What is it that keeps us from thinking critically, from basing our ideas on something other than unfounded ideals or faulty reasoning?
Well, I suppose (I assume!) that this is a trait we aren’t ever going to give up, since it’s so ingrained. But that assumption is just as faulty as any other. And I can tell you exactly why I feel this way: I’m not an optimist when it comes to human reasoning. Over my 60+ years I’ve seen us grow in our ability to communicate, but not in our ability to discern truth, find validity and think critically. It would be easy to blame the rise of the internet for this, but it really started a long time ago—longer ago than most of us have been alive. Placing blame isn’t helpful anyway. Assuming is a basic fault of human communication, and it has been with us for as long as we have been communicating.
Let’s take a basic assumption about physical appearance as an example.
You see a man run a footrace. He is fit, he is healthy and he has two legs. He wins the race. We assume he wins that race because he has all the physical characteristics he needs in order to do so. He has two legs, he has trained to run, and he is healthy. And he has a desire to win, based on whatever competitive urges he has nurtured in himself, or which have been nurtured in him. It does not go against any of our assumptions when the man wins the race.
But the man is the victim of an accident and he loses a leg. Now he does not have the physical characteristics he needs to win that race—not like he had before. What is the most common assumption then? That the man’s racing days are over. Logically, it seems sound. A man with two legs is a better runner than a man with one leg—he’s a more complete human specimen. No more racing for him!
But this man has other ideas—and they aren’t based on assumptions. He gets a prosthesis, he continues to train—and he runs a race with only ONE leg. And he still wins. Now where are our assumptions? There was nothing preventing him from running before his accident, and apparently there is nothing preventing him from running after it either!
We accept that a change in our physical characteristics will not necessarily preclude our ability to do what we love. It’s easy, right? We are proven wrong—and we cheer him on at the finish line—and we are not unhappy to give up our assumptions.
Or are we?
Will we, after watching the man win on his prosthesis, treat others who have lost legs in the same way we treated them prior to their loss? Will we believe that just because a physical characteristic has been altered, changed, lost or otherwise compromised, that we are dealing with the same person as before? The sad fact here is that we probably will not.
We see this man as an exception—not as the norm. We still make assumptions based on physical characteristics that may or may not be valid. We will not dump those assumptions no matter how many amputees win footraces.
Now project that same sort of thinking onto the color of someone’s skin. Or their religious beliefs. Or their country of origin. Or their gender. Or their sexual orientation.
Do we change our minds because we have been proven to be erroneous in our assumptions about those characteristics? Or do we continue to hold these assumptions regardless of how often they are shown to be erroneous?
Mostly, we do the latter.
No matter how many peace-loving, kind, moral or upstanding Muslims we meet, we still find ourselves angry at the entire religion because of the attacks of 9/11. No matter how many gay couples in committed, long-term relationships we get to know, we still think there is something “wrong” with them getting married. No matter how many women we see become CEOs, elected officials, company founders and executives, we still think they should make less money than men in those same positions. No matter how many black Africans with college degrees, high IQs and erudite, articulate vocabularies we encounter, we still think they should have remained slaves.
Mostly, these assumptions are based on one incident, one stupid or insensitive action or one example. Yet they persist in spreading to the entire culture, race, gender or ethnicity.
In 1994 a book by a psychologist and a political scientist, Richard J. Herrnstein and Charles Murray, was published: The Bell Curve: Intelligence and Class Structure in American Life (Free Press, 1994). The book argued that the level of human intelligence in Americans was determined by inherited and social factors, and it drew a boat-load of controversy over its assumptions, especially where race was concerned. It argued that black Americans were genetically inferior in intelligence, and that this genetic difference caused the racial gaps in IQ scores that exist. It went on to make “suggestions” for social and cultural changes, such as encouraging the “right” women to have more babies, severely curtailing or eliminating immigration and eliminating programs like Affirmative Action.
To be completely fair, the book did not name genetic differences as the ONLY reason for the difference in IQ scores; it stated that the question was not a proven or resolved one. However, the rest of the text seemed to support the idea even with that disclaimer.
Many people would read this book and believe its “scientific” credentials—but the problem is that it was based on several erroneous assumptions:
1. Human cognitive ability is a single general entity, depictable as a single number.
2. Cognitive ability has a heritability of between 40 and 80 percent and is therefore primarily genetically based.
3. IQ is essentially immutable, fixed over the course of a life span.
4. IQ tests measure how "smart" or "intelligent" people are and are capable of rank ordering people in a linear order.
5. IQ tests can measure this accurately.
6. IQ tests are not biased with regard to race, ethnic group or socioeconomic status.
As someone who was for many years a member of the social group Mensa, I can tell you that not a single one of these assumptions is either factual or correct. Human cognitive ability is not a single, general entity by any stretch of imagination or fact. And a number cannot give you any idea of the value of someone’s ability, any more than losing a leg can guarantee you will never run again. IQs are not fixed and do not remain the same over a life span. IQ tests do not measure how “smart” someone is—no more than a drawing test measures artistic ability. “Smart” is a completely subjective adjective. It cannot be quantified or measured. Not by an IQ test, not by any test. And IQ tests are highly biased, especially when it comes to race or ethnicity.
Case in point: when I applied for entry to Mensa, I took two IQ tests. One was the Cattell test—before it was redesigned to eliminate its cultural bias. The other was the California Test of Mental Maturity. This second test was designed in a way that was more “non-verbal”. It relied on pictures instead of words, on listening and memory retention rather than reasoning by reading.
In one section, the test gave you five pictures and you were to pick out the one that “didn’t belong” in the group. Each picture was rated according to its “rightness”—there were no “wrong” answers. I remember one question that really struck me as difficult:
The question showed five women: one from Holland, one from Japan, one from India, one from Eastern Europe and one in a simple dress without any ethnic association. You were to pick out the one that did not belong—and the answer seemed obvious: the one in the dress without any ethnicity. But I puzzled over it for a while. What if you were Japanese? Would you see it differently? What if you were East Indian? Russian? Dutch? If you were Asian, wouldn’t that one seem to be the one that did belong in the group—your group? And why was the lack of “ethnic” dress something that did not belong in a group? Was there something “wrong” with ethnic dress that caused it to be the anomaly? Or something wrong with non-ethnic dress? I understood what was wanted on this question, but I found myself very put off by the asking! If you were at an International Fair or the UN, or a multi-cultural summit meeting—would you want to think of yourself as not belonging there because you didn’t have an “ethnic” costume? The whole thing smacked of assumption and racist inequality, even though I realized in my logical brain that this question was only about the costuming—not the social implications. Still, it brought up many questions in my mind.
Even when tests are designed to be less culturally biased, bias sneaks in, in ways we might not consider. And our assumptions sneak in right along behind it.
The Bell Curve was pretty much roundly dismissed as “junk science”. Still, quite a few people are willing to believe what it claims, even though there is more than enough proof and evidence to the contrary. And this type of erroneous assumption is everywhere, exacerbated by the internet and our inability to pick out truth from falsehood.
Take the controversy over vaccinations. Jenny McCarthy, self-appointed spokeswoman for the anti-vax movement, has claimed that vaccinations cause autism. This assumption is based on her own personal experience. It’s not based on fact. It’s not based on science or evidence. Yet she has drawn a huge audience and numerous adherents to her cause. Her celebrity and her outspokenness have given her a much wider reach than she might have had before the internet and television made it so easy to gain an audience.
Back in the 1800s a man named Sylvester Graham had some views about diet that were a fad for a while: he believed that bread and other foods made without chemical additives were healthier for you. Sound good? Sure—and it’s mostly been found to be correct. But Graham started a movement—and actually opened centers where people were “treated” to correct their dietary ills. And it wasn’t only this one tenet his followers observed. They also believed that diet controlled our sexual urges, that masturbation would make you insane, and that your diet could control your “impure” thoughts and cure alcoholism. Graham was probably a driving force behind the modern vegetarian movement, though his name is largely forgotten now, unless you buy a by-product of his ideas: graham crackers, though the crackers we know are nothing like the ones he made or ate.
From one example—one erroneous assumption—to an entire thought process, a movement, a cultural shift. The truth becomes lost in the morass of garbage thinking, junk science and fad-loving zealots. Graham’s movement died out eventually, but McCarthy’s has not, despite overwhelming science that refutes it.
And our assumptions have not died either. We assume cultural differences and potentiality based on race, gender and sexual orientation; we do not assume that these things are mutable, changeable, fixable or amendable. We continue to hold our beliefs as sacrosanct, despite their being proven biased, or just plain wrong. Then, when challenged, we shift the argument to a question of “rights”—the right, mostly, to believe anything we choose, even when it is junk. We hold faster to our right to be stupid and unthinking than to almost anything else. It has become a symbol of freedom of thought, but it is in reality the worst kind of mental prison: it keeps us from growing, expanding and opening our minds to better ways of living and understanding, and it prevents us from being better human beings.
If we are ever going to eliminate bad thinking, we must begin by showing that assumptions are not fact—and belief is not proof. We must realize that ASSUME just makes an ASS of U and ME.