Social change

Discussing more relevant ways of thinking.

Opinion Slows Down Progress

Posted on June 20 2016 by Theofilos in Scientific Method

Foreword

“If today, in social psychology, I submit a paper to a social psychology journal showing that people think rationally, logically, act in their self-interest in situations, I won't only be rejected; they will send the men to my house to kill me. They will be like, 'You are not a social psychologist, you are a fool.'” - Robb Willer, Stanford University Professor.

This post is an attempt to show how unreliable our opinions are, given our cognitive limitations and the limited relevant information we have about the physical phenomena we observe. I will explore how people form their opinions and solidify them through reinforcement from their environment. Even though it may at times read like one, this post is not an academic paper on social psychology, hence the limited number of biases discussed. In my attempt to explain the insufficiencies of our opinions, I will briefly discuss the hypotheses formed during the "Age of Enlightenment" to show that even though people began discussing controversial subjects with the limited cognitive tools of the time, these hypotheses continue to hold back society's progress as a whole. Of course, I do not disregard the fact that there had to be a slow process of social evolution, but resisting new information to protect outdated established practices is detrimental to humanity. We will look at cognitive biases and errors to an extent relevant to this post; I will conclude by explaining how the scientific method can serve as a tool for arriving at decisions without relying on our subjective opinions.

Age of Enlightenment

In the 17th and 18th centuries, during the "Age of Enlightenment," the standard model of rationality held that human reasoning is, and should be, logical: the universe works by logical and consistent principles, and we can understand the laws of nature and of human beings deductively. René Descartes and other philosophers of his era viewed human reasoning as essentially mathematical. This view placed the blame for a person's behavior on the individual rather than on the environmental conditions he was immersed in. The approach is inadequate today since, according to modern social psychology, we do not always think logically and rationally. It seems to me that the philosophers' arrival at the standard model of rationality through deduction alone illustrates the inadequacy of their methods.

Unfortunately, centuries later, people still hold the opinion that we are conscious, cold, reasonable and logical beings. We punish people for their behavior; we execute some; we also get angry at people for being perfectly 'normal,' that is, normal according to their background. This behavior is reinforced not by the latest information we receive but by our very own social system, supported by our families, famous leaders, and even friends. The punishing behavior is further reinforced by our media and by the personal satisfaction we get when a person causing trouble in our life has been 'removed,' neglecting the fact that removing someone does not get rid of the value system which shaped that person's behavior in the first place.

It seems to me that economic inferences about human behavior dominate the justice system as well, in particular 'Rational choice theory,' which assumes that individuals will always make logical and prudent decisions that yield the most benefit for themselves, and that they are naturally driven by personal desires and motivated by personal goals or selfishness. Of course, we don't see that in people who volunteer in programs such as 'Doctors Without Borders.' Within the monetary system, this is evident: we live in perceived scarcity, and this generates a certain pattern of behavior. It is the basis of most economic theories and is also considered by some a general theory of public policy. One of the assumptions made by the rational choice model is that one has full or perfect information about the alternatives; in other words, ranking two alternatives involves no uncertainty. This would be interesting to observe in practice but detrimental to competition. The 'survival of the fittest' attitude which businesses have to adopt to stay in business will not allow perfect information to be advertised. In the 'real' world, a company that spends more money on advertising will make its product more attractive than the product of competitors who spend less. So you might see one product marketed constantly on television, radio, newspapers, magazines, and billboards by very convincing actors with a well-written script, and a competing product confined to an easily missed advertisement in a local newspaper.

The same seems to happen in politics: the candidate with more sponsorship for their campaign will be seen and heard more often in the media. If we have far more information about one candidate than the other, how can we expect voting to be an adequate method of selection when we lack most of the relevant information about both campaigns? How can we know that the candidates are being honest in what they are saying? If their sponsor is in the war industry, the candidate will attempt to convince their electorate that war is necessary. Next time you vote, consider whether you are voting for the person you think is the better candidate or for the candidate who had the more generous sponsors.

“So far, we have opinions from politicians that know nothing about ecology, safety, engineering, increasing the agricultural yield. They're totally incapable.” - Jacque Fresco

Politicians are not a separate breed of people or aliens from outer space. They are our mothers, fathers, uncles, cousins or our next-door neighbors. Jacque Fresco's statement above applies to most people today; they feel comfortable voicing their uninformed opinions. Imagine sitting in an airplane waiting to go on your much-anticipated holiday. The captain comes over the passenger address system and announces, "Ladies and gentlemen, this is your captain speaking. Apologies for the delay in departure; we seem to be experiencing a small technical issue with one of our engines, and we are awaiting the assessment of our local baker to tell us whether we can continue with our flight or whether further investigation is required before we take off." Some might unfasten their seat belts and run out of the airplane; others might protest; a few might think it's a joke and laugh it off. If they find it amusing, it will be because they know that a qualified aeronautical engineer specialized in that particular engine model will examine it by running the necessary tests and will only release the aircraft as fully functional if it meets the manufacturer's guidelines and all legal requirements. It is devastating that most people do not think it's a joke when lawyers and businesspeople are making decisions on issues such as health, environment, education and nutrition, areas for which they are extremely under-qualified.

“A cow asks no questions as to how it happens to have a dry stall and a supply of hay. The kitten laps its warm milk from a china saucer, without knowing anything about porcelain; the dog nestles in the corner of a divan with no sense of obligation to the inventors of upholstery and the manufacturers of down pillows. So we humans accept our breakfasts, our trains, and telephones and orchestras and movies, our national Constitution, our moral code and standards of manners, with the simplicity and innocence of a pet rabbit.” - The Mind in the Making

“What profession do all of these senators and congressmen have? Law, law, law, law, business man, law, law, law. Where are the scientists? Where are the engineers? Where's the rest of life represented?” - Neil deGrasse Tyson

Cognitive Errors and Biases

“It's not that people are good or bad. They're raised in an aberrated or twisted environment.” - Jacque Fresco

Attribution is the process through which people ‘infer’ the causes of human behavior. According to Fritz Heider, there are two main types of attributions: Dispositional and Situational.

“You are taught in school that everyone should have a right to their own opinion, is that right? Suppose you lived across the way from me, and I see ten guys coming out of your apartment, and I have a right to my own opinion. She could be a ballet instructor, a language instructor. Never give people the right to their own opinion. If their own opinion is sane, and I ask 'what is going on there?' they should say 'I honestly don't know.'” - Jacque Fresco

The above quote describes Dispositional Attributions. This is when we say that "the headhunters of the Amazon are bad and evil" because of their cultural custom of using shrunken human heads as ornaments; in other words, when we attribute another's behavior to something we consider inborn (Internal Attribution), as is done in law and religion. Situational Attribution, on the other hand, is when we say that the environment shapes human behavior (External Attribution); in other words, when we attribute another's behavior to the environment. It is much easier for the inefficient observer to turn to internal or dispositional attributions about another person's behavior.

Admittedly, we cannot see the events that lead a person to act a certain way (the invisibility problem), so it is easier to say, "What a jerk that guy is." It is hard to attain the level of awareness that may mitigate the problem of 'invisibility' in everyday life, because you don't want to make excuses for a person who is causing you pain. In reality, you wouldn't be making excuses; you would be making a 'saner' situational attribution. We are not informed of this method in our current culture, and we are not taught or trained to move away from the dispositional type of thinking and learn to say "I don't know why the person acted this way," thus achieving a saner way of speaking. Instead, we fall victim to our cognitive errors and biases.

“If a person with red spots always beat you up, next time you see a person with red spots you would cross the road” - Jacque Fresco

We navigate our world by generalizing, and these generalizations carry associations which help us survive. If a stone of a certain size hits my head and it hurts, any object after that flying toward my head will be perceived as a threat. If I burn my hand on a sun-baked piece of metal, the next time I feel heat coming from an iron, I will most likely perceive it as a threat. Evolutionarily, this mechanism had its uses: it created associations from the past and the present and projected them into the future, improving predictability and general situational awareness. But in the meantime we may make inferences while ignoring environmental pressures which act as constraints, present in people's minds but not apparent to us, such as societal pressures and parental conditioning. Jealousy and envy can be added to the list when speaking of ignoring these pressures and this conditioning. Even just a photo may trigger such a reaction. We may look at a picture of our friend with his new car and say, "What a show-off!" or at a sexually provocative picture of a woman and say, "What a tramp!" We get that from present-day culture, labeling people based on our conditioned emotional reactions.


We just discussed what is known as the 'Correspondence Bias.' Something that overlaps with this bias is the Fundamental Attribution Error, which is the reinforced tendency to draw conclusions by overestimating dispositional factors even when a logical analysis suggests such conclusions should not be drawn. Examples are the supporting studies known as 'Ross et al., 1977' and the 'Jones and Harris' Castro essay evaluation. In both studies, the answer to "Who is the smartest?" or "What did the writer think of Castro?" should have been "I don't know." Instead, participants made attributions even though they were informed about the experiment's details, showing that they committed the Fundamental Attribution Error.

"The three most difficult words for people to say are I don't know!" - Jacque Fresco

When you get angry at a flight attendant because the food on board the airplane isn't to your taste, or when you scream at the cashier because you are upset that your bank has increased the interest rate on your loan, you are a victim of the correspondence bias. It is not the fault of the employees, yet you scream at them. If you have been on the receiving end, you might have said, "It's not your fault; you are just doing your job." Most people do not have the capacity to stop themselves from making this attribution error. Getting angry at someone who informs you that your loan has not been approved, or thanking the one who gives you the positive news of a loan approval, is another irrational behavior resulting from the correspondence bias. Caucasian people in certain ex-colonial countries might be viewed as smarter than the local indigenous population. All these are examples of the correspondence bias.

Actor-observer bias

This bias explains our tendency to give a thorough explanation of why 'we' behaved a certain way in a situation, in other words, to treat our own behavior as a product of the situation. Nevertheless, we will quickly label other people as if they were responsible for creating the situation they are in. The mechanism behind this is known as 'Focalism': we observe other people's behavior but not their situation; we see 'them' doing what we are observing, as if they are the most likely cause of their own behavior. When we sit back and analyze our own behavior, we might be able to observe the mechanisms that led us to behave that way.

When we think carefully about our own behavior, we see the situation. This mechanism is strong enough that we may even attribute the behavior of objects to their 'personalities.' This is well demonstrated in the Heider and Simmel experiment: if the reader watches the short experiment video, they might not be able to help providing irrational reasons for the behavior of the inanimate objects. This is our tendency to anthropomorphize; we project human values, or our own values, onto other things or beings.

We should be teaching people to say "I don't know what part of his environment generated his behavior," or "I don't have enough information to draw a conclusion on the subject." Being aware of the actor-observer bias may help you look at the evidence before making a declarative statement about something or someone. By the evidence, I mean the environmental factors that caused certain combinations of genes to be expressed, forming certain proteins, which in turn generated a reaction pattern that, once reinforced by the environment, became a feature of that person's personality.

Your environment, social standing, and the situations you are exposed to are the overriding causes of your behavior. Some argue that your personality plays a significant role in the situations you find yourself in, but this hypothesis does not acknowledge that your personality was itself shaped by the environment you were raised in. Those who think this is a 'which came first, the chicken or the egg' argument need to become more familiar with the effects of the environment on behavior.

All these are ideas that threaten the established system. In most universities, professors usually do not condemn the outdated values of our current system even though they might be teaching ideas similar to the ones presented above. Jacque Fresco describes this process in his argument about academia. If the justice system had to make Situational Attributions, it would have to close down all prisons, because its entire philosophy is based on making 'unsane,' convenient Dispositional Attributions. It incarcerates people for long periods, and this stays on their records, so it's difficult for them to find a job once they get released, forcing many of them back into crime. Next time, they might end up doing something worse and get the death penalty. All this is due to outdated information based on the 'unsane' values deduced by philosophers in the 17th and 18th centuries.

Confirmation Bias

Confirmation Bias is the reason you hear people saying, "I did some research online, and I have confirmed that vaccines are dangerous for children." It is the tendency to search for or interpret information in a way that confirms one's preconceptions, leading to statistical errors. This type of bias reveals itself in metaphysical explanations and in political and cultural views. It is a phenomenon wherein decision makers such as individuals, pilots, doctors, businesspeople or political leaders have been shown to actively seek out and assign more weight to evidence that confirms their presuppositions, while ignoring or under-weighing evidence that would otherwise disprove them.

Pilots attempt to mitigate this bias by asking open questions after they have thought of a possible solution. Instead of telling their colleagues the solution, they ask them what they think. For example, "Look at the weather ahead; what do you think we should do?" The colleague might say, "I suggest we climb to avoid most of the stratocumulus cloud. We should also deviate up to 40 nm left of our track, since the wind is coming from the left, so we don't get the turbulence from the cumulonimbus cloud. But just in case, let's put the seat belt sign on and warn the cabin crew." If the other colleague doesn't have anything to add, they set aside their need for participation and plainly say, "That's a good idea!" while performing the appropriate action. In other words, they don't seek confirmation of their thoughts; they ask open questions in order to hear something they hadn't thought of or weren't aware of, such as the direction of the wind.

A person who is pro-communism will seek out all the evidence as to why communism is good and will ignore most other factors. A BMW fan will look for all the reasons why BMW is an amazing car compared to Mercedes and place less weight on facts that contradict this view. If this is true, how can we trust opinions when making decisions?

Self-serving bias

Can we trust our uninformed opinions to help us make efficient decisions when we know that they are the result of so many biases, errors, and influences? The answer, of course, is that we can only do so to a limited extent. A pitot-static system on an airplane measures total and static pressure down to the last millibar; a computer then corrects these readings for instrument, position and pressure error, giving a remarkably accurate Calibrated Airspeed. It further corrects for compressibility and density error to give 'True Airspeed' and performs an additional correction for the wind, providing a 'Ground Speed.' Before such systems were introduced, many pilots stalled and died because they relied on estimates of their airspeed. Similarly today, sensors can collect data and computers can process that data to help us arrive at decisions more efficiently than our opinions can.
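To make this chain of corrections concrete, here is a minimal Python sketch of the idea. The calibration offset, the air density and the wind component are hypothetical example values, and the density correction uses the simple low-speed approximation (TAS = CAS x sqrt(rho0/rho)) rather than the full compressibility model a certified air data computer would apply.

    # Minimal sketch of the airspeed correction chain (illustrative values only).
    import math

    RHO_SEA_LEVEL = 1.225  # ISA sea-level air density, kg/m^3

    def calibrated_airspeed(indicated_kt: float, position_error_kt: float = 2.0) -> float:
        """Correct indicated airspeed for an assumed instrument/position error offset."""
        return indicated_kt - position_error_kt

    def true_airspeed(cas_kt: float, air_density: float) -> float:
        """Low-speed approximation: TAS = CAS * sqrt(rho0 / rho), ignoring compressibility."""
        return cas_kt * math.sqrt(RHO_SEA_LEVEL / air_density)

    def ground_speed(tas_kt: float, wind_component_kt: float) -> float:
        """Add the along-track wind component (tailwind positive, headwind negative)."""
        return tas_kt + wind_component_kt

    # Example with made-up numbers: 250 kt indicated, air density 0.55 kg/m^3
    # (roughly 25,000 ft), and a 40 kt headwind component.
    cas = calibrated_airspeed(250.0)
    tas = true_airspeed(cas, air_density=0.55)
    gs = ground_speed(tas, wind_component_kt=-40.0)
    print(f"CAS {cas:.0f} kt, TAS {tas:.0f} kt, GS {gs:.0f} kt")

Each step removes one known source of error, which is exactly the point of the analogy: layered, systematic corrections beat a pilot's raw estimate.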

When exploring the self-serving bias, we see how we can fail miserably at arriving at decisions without utilizing the scientific method with human and environmental concern. These are some examples of how we can fail:

  1. Motivated Reasoning: the tendency to come to conclusions that make you feel good;
  2. Motivated Handling: the tendency for people to attach more significance to facts that match their views and argue that these facts are more important - extensively practiced in political election campaigns;
  3. Motivated group affiliation: the tendency for an individual or a group to support the success of another individual or a group purely based on the fact that they have succeeded;
  4. Motivated recall: the tendency for people to remember facts that may reflect well on them more than those that do not. People tend to recall facts and evidence that endorse their assumptions more than those that don’t.
  5. Self-handicapping: people tend to invoke either real or feigned self-handicaps. Self-handicapping is a cognitive strategy by which people avoid investing effort in the hopes of keeping potential failure from hurting self-esteem.
  6. Above-average effect: people who think that extensive learning about the proposals of The Venus Project and detailed examination of the methodologies used to arrive at a resource-based economy are not necessary fall victim to the tendency to believe that they have an above-average understanding.
  7. The Dunning–Kruger effect: relatively unskilled persons suffer from illusory superiority, mistakenly assessing their ability to be much higher than it is.
  8. And last but not least, the ‘Holier than thou' effect: the tendency for people to believe they are better than others based on a moralistic perspective.

These tendencies are not inborn; they are possibly coping strategies to protect the ego and mask low self-sufficiency. The environment we are immersed in, along with the other errors and biases we might not be aware of, is what leads us to fall victim to the self-serving bias. Motivated reasoning will have us looking at a picture of a person dying in war and saying things like "I am so lucky that I have peace in my country." Going beyond this would be asking, "How can I create a system that does away with the need for war?"

"Don't ask kids what they want to be when they grow up but what problems do they want to solve." - Jaime Casap

The moment you do as Jaime Casap suggests, you move away from "How can I serve myself?" toward "How can I make the world a better place for as many people as possible?" Is there a gain from such an attitude? Yes! A saner next generation and the potential for the development of a world without politics, poverty and war.

The Scientific Method

"Science is really an attempt to predict the next most probable" - Jacque Fresco

We keep hearing about the scientific method, and people tend to resist it, thinking it's just another belief system, not understanding that the scientific method is a process which, if applied, helps avoid most cognitive errors and biases. One of the ways it does this is through the 'double-blind procedure': an experimental procedure in which neither the subjects of the experiment nor the persons administering it know the critical aspects of the experiment. Double-blinded research is used in many fields such as medicine, psychology, the social and natural sciences, and forensic research. It is even used in academic publishing, where neither the person reviewing a paper nor its author is informed of the other's identity; this is known as a double-blind review. The point of this method during peer review is that the reviewer is not swayed negatively or positively by the author's reputation, personality, and character, allowing them to look at the paper with a relatively unbiased eye. Of course, the reviewer might have a vested interest in the matter and might favor the content of the article, or the opposite might be true; to account for this, multiple people should review the article. Is it a perfect system? No, but it's better than not having a self-checking mechanism at all, and better than guessing, assuming, or using metaphysical references or philosophy to reach a conclusion that might fail in a double-blinded procedure.

"When we talk about science we talk about a method of looking at a situation, a method of evaluation that differs from the opinionated system, "If you ask me, I'll tell you!" The scientific method has no real connection to truth; it merely has a better way at looking at things than the earlier systems, in which everything was attributed to gods or demons." - Jacque Fresco

When researchers design a particular experiment, they break the participants into two groups: the control group and the experimental group. The control group does not receive the experimental treatment, whereas the experimental group does. Both groups are vital to the experiment because without the control group the experimenters won't be able to compare results, which means they won't know whether the experimental treatment had any significant effect. In other words, if I give you an aspirin for a headache, but at the same time I make you drink lots of water and give you a candy (a placebo) that I tell you is aspirin, I won't know whether it was the water, the placebo or the aspirin that got rid of your headache.
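As an illustration, here is a toy Python simulation of that aspirin example with made-up relief scores: both groups get the water and placebo effects, and only the experimental group gets the treatment effect, so only the difference between the group means can reveal what the treatment itself did.

    # Toy simulation of a controlled experiment (all numbers are hypothetical).
    import random

    random.seed(0)  # reproducible made-up data

    def relief_score(got_treatment: bool) -> float:
        """Hypothetical headache-relief score: water and the placebo help a little,
        and the real treatment (if received) adds a further effect."""
        placebo_and_water = random.gauss(3.0, 1.0)
        treatment_effect = random.gauss(2.0, 1.0) if got_treatment else 0.0
        return placebo_and_water + treatment_effect

    control = [relief_score(False) for _ in range(100)]       # water + placebo only
    experimental = [relief_score(True) for _ in range(100)]   # water + placebo + aspirin

    def mean(xs):
        return sum(xs) / len(xs)

    print(f"control group mean relief:      {mean(control):.2f}")
    print(f"experimental group mean relief: {mean(experimental):.2f}")
    # Without the control group there is no baseline, so we could not tell
    # whether the water, the placebo or the aspirin produced the improvement.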

What is important is for the two groups of individuals not to know which group they are in. If they do, they tend to display different behavior than they normally would, which invalidates the experiment. By displaying different behavior, I mean that if I told you that you are in the experimental group and you are drinking an energy drink, you could subconsciously convince yourself that you are feeling more energetic. This is known as participant bias: behaving in the ways you believe the experimenter wishes you to behave.

If we want to know whether that energy drink has any effect on people, we will give it to the experimental group while the control group drinks normal tap water. In this case, neither group would be aware of which one is receiving the energy drink. Experimenters also apply blinding to themselves to tackle experimenter bias, which is when the experimenter influences the results by administering the experiment or collecting data in a biased way, typically because the experimenter has a vested interest in the results. An extension of the double-blind is the triple-blind procedure: the monitoring committee responsible for evaluating the response variable is not told which group is which. They simply observe group A and group B, again to avoid any biases arising from vested interests or personal beliefs.
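Below is a minimal sketch, using hypothetical participant IDs, of how such blinding can be organized: a coordinator's sealed key maps coded group labels to the real contents, while participants, administrators and (in a triple-blind) the analysts only ever see the coded labels.

    # Minimal blinded-assignment sketch (participant IDs and groups are hypothetical).
    import random

    random.seed(42)  # for a reproducible example

    participants = [f"P{i:03d}" for i in range(1, 21)]  # hypothetical participant IDs
    random.shuffle(participants)

    half = len(participants) // 2
    # Only the study coordinator holds this key; everyone else sees "group_A"/"group_B".
    sealed_key = {"group_A": "energy drink", "group_B": "look-alike placebo"}

    assignments = {pid: "group_A" for pid in participants[:half]}
    assignments.update({pid: "group_B" for pid in participants[half:]})

    # Staff handing out drinks and the committee analyzing results work only
    # from these coded labels, so neither they nor the participants know who
    # received the real energy drink until the key is unsealed.
    for pid, group in sorted(assignments.items()):
        print(pid, group)

    # The key is revealed only after data collection and analysis are complete.
    print("Unsealed key:", sealed_key)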

These processes have their weaknesses and this should always prompt us to look for a better system that produces less biased results.

Summary

We looked at the age of enlightenment to understand how human beings have been viewed in the past and how that past is still haunting our present. We analyzed a small fraction of the cognitive biases and errors we are influenced by. Even though some of these biases and errors had their evolutionary purposes, we have to become aware of them and reinforce saner ways of thinking and acting.

Finally, we looked at the scientific method and how experiments are coordinated so as to avoid errors and biases. One might ask, "We are not all scientists; how does this apply to us?"

"Be the change that you wish to see in the world." - Mahatma Gandhi

It seems to me that most people want to change the world, but they have difficulty with two things, revealed in the following questions: "Why should I change? I am a 'good' person; I don't harm anyone!" and "What do I change into?"

The belief that you are a 'good' person and that your ways do not cause harm is evidence of the 'holier than thou' effect. We should be exploring the 'magic of reality' using the scientific method, while outgrowing the need for, and debunking, the metaphysical delusions which are causing a global neural lag. We should have self-check mechanisms in place to ensure that we are not subjecting the world to actions based on our cognitive biases and errors. Understanding how human behavior is determined by environmental influences debunks the myth of 'free will,' which has been haunting us for thousands of years. We should outgrow conditioned reflexes such as resistance to change, rethink our values and traditions, align them with the physical world, and raise our children to take care of the Earth and everyone on it.
