Three Kinds of Concluding: Logic, Intuition, Authority


Three Ways of Deciding

Three ways of deciding what to believe are Logic, Intuition, and Authority. All three have their place.

Using Logic, I take data and theory and reason from them. I see that there are dark clouds in the sky, and theory tells me that this portends rain, so I decide it will rain and I take my umbrella with me. We might call this "science".

Using Intuition, I look outside and I think, "It will rain" without any conscious reasoning, based on my feelings and reaction to what I see, hear, and smell. I decide it will rain and I take my umbrella with me, but if anyone asks me why, all I can say is, "I just had this gut feeling it would rain." I am basing it on the data, but I am not conscious of that. We might call this "art".

Using Authority, I look at a weather app on my phone and it says there is an 80% chance of rain today, so I take my umbrella.

I emphasize taking the umbrella because that is the best way to define "belief": as an opinion that actually makes me change my behavior, rather than one held, say, for virtue-signalling or to avoid persecution.

Everyone uses all three of these ways of forming beliefs, but in wildly different proportions. All three have their place in forming the most accurate beliefs. As Schumpeter said (find the quote), most beliefs are formed by Authority. But these methods are not independent. One reason we rely on Authority is that the beliefs asserted by authorities are consistent with each other, satisfying Logic and Intuition. And we choose our authorities by Logic and Intuition, as well as by other authorities such as teachers and parents.

Pride and Defensiveness

The intuitive person is flying blind. He can't explain, even to himself, why he believes what he does. This is scary. He has to rely on his sense of self-worth, because he has no way to check a given belief. Logic and Authority are different. Logic is self-correcting: you can check over your reasoning, accept input from other people, and see whether you are correct. Even if you are not very good at Logic, if you are patient you can work things out. Authority is simple: you need only verify that the authority has actually asserted the opinion in question. But Intuition is self-reliant.

Thus, if we rate your ability on a scale from 0 to 10 and you rate yourself a 3, you are very timid if you rely on Intuition, because you know you mostly get things wrong. If you are a 3 but you rely on Logic, you know you get things wrong a lot, but if it's important, you can get help from other people, not as authorities but as logic-checkers who can persuade you, and you can see where you went wrong. If you are a 3 but you rely on Authority, all you have to do is choose the right authority, which can be a problem but is not nearly as hard as choosing the right opinion on a given issue.

What happens when two Intuitive people disagree? They cannot use Logic to convince each other. They cannot immediately use Authority to bully each other. They have to fall back on something like a combination of the two, in the form of trying to destroy the other person's self-esteem. The way to win is to persuade the other person that you have better intuition than they do, that you are a 10 and they are a 3, so their intuition must bow before yours. You do this by speaking loudly and confidently and by saying the same thing over and over, more times than they can. It is a matter of Will.

I don't mean to invoke Hitler just to intimidate, but he is relevant here. He believed in Intuition, mixed with Authority, and disliked Logic. He believed in the power of Will. The person with the best Intuition and best Will would make the best decisions and persuade other people to follow them. He thus would create his own authority. For Hitler, Authority did not come from its normal sources of tradition and rank. He had no respect for the past, for age, for experience, or for noble birth. He did not want to restore the Kaiser and he hated and despised professors and aristocrats. Rather, Authority came from talent, from force of Will combined with genuinely good Intuition. He was an artist, remember.


Bundling of Political Opinions

For their political views, most people rely on Authority. This explains why only a few bundles of opinions are in circulation instead of a continuum. Rather than choosing their opinions on foreign policy, taxes, homosexuality, disease control, drugs, and race relations independently, which with N positions on each of those six dimensions would allow N^6 possible combinations, people choose to be Conservative or Liberal: just two combinations.
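To make the combinatorics concrete, here is a minimal sketch in Python. The choice of three positions per issue is an assumption made purely for illustration; the point is only how fast independent choices multiply compared with the two bundles we actually observe.

 # Illustrative sketch: possible opinion bundles if each of six political
 # issues were chosen independently, versus the two bundles we observe.
 issues = ["foreign policy", "taxes", "homosexuality",
           "disease control", "drugs", "race relations"]
 positions_per_issue = 3   # assumed for illustration, e.g. left / centrist / right
 
 independent_bundles = positions_per_issue ** len(issues)
 print(independent_bundles)   # 3**6 = 729 possible combinations
 print(2)                     # versus the two observed bundles: Conservative, Liberal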


Notes

  • Deciding by going against what another person or group thinks is still a form of deciding by Authority, except that it is Anti-Authority: that person believes X, so I can conclude that X is wrong. This is not necessarily irrational.


Bryan Caplan

"Being Normal," Bryan Caplan, blog (2021).

The Principle of Normality: A normal person says what others say, but does what others do.

Notice that this principle captures two distinct features of normality.

First, conformism. People dislike expressing views or taking actions unless other people express the same views and take the same actions.

Second, the chasm between words and actions. Normal people lack integrity. They feel little need to bring their actions in harmony with their words – or their words in harmony with their actions.

Example: A normal person will say, “We should do everything possible to fight global warming” – yet donate zero to environmental charities. How can they cope with the cognitive dissonance? Because this psychological experience is alien to them. They speak environmentalist words to echo the environmentalist words they hear other people say. They donate zero to environmental charities to mimic what they see other people do. What is this “dissonance” of which you speak, weird one?

For normal people, Social Desirability Bias is far more than a bias; it is their way of life.

His example can be improved. The problem with it is that nobody knows whether you donate or not, and you don't know whether other people are donating. Here's a better example, though a rather narrower one:

Example: A normal economist will say, “Economics articles should be short, use only the necessary math, be well written, and take on interesting topics rather than just following what the literature is doing” – yet when such articles are sent to them to referee, they will reject them with scorn, saying they are too short, ignore special cases or robustness checks, read like they were written by a child, and are on a topic nobody else is writing on. How can they cope with the cognitive dissonance? Because this psychological experience is alien to them. They speak methodological words to echo the methodological words they hear other people say. They reject articles with that methodology to mimic what they see other people do. What is this “dissonance” of which you speak, weird one?

This, of course, is inspired by personal frustration. See Why Firms Reduce Business Risk Revisited: Projects with Risky Cash Flows Are Harder To Evaluate, which I've given up on publishing.

Arnold Kling

"Why we need a new scoring system," blog, Arnold Kling (2021).

"1. We learn socially, so that most of our beliefs come from other people.

2. This makes the problem of choosing which people to trust the central problem in epistemology.

3. What Eric Weinstein calls our “sense-making apparatus” can be thought of as a set of prestige hierarchies, at the top of which are the people who are most widely trusted.

4. Our prestige hierarchies are based largely on credentials: professor at Harvard; writer for the New York Times; public health official.

5. The incentive systems and selection mechanisms in the credential-based hierarchies have become corrupted over time, allowing people to rise to the top who lack wisdom and intellectual rigor.

I think of electrons in an atom as occupying orbits relative to a nucleus. I have never observed this. I have never done any experiments that would verify this. I believe it because that is what I was taught fifty years ago by my high school chemistry teacher, Dr. Frank Quiring. I have not kept up with chemistry or physics since then.

In The Secret of Our Success, Joseph Henrich drives home the point that almost all of the knowledge that we possess comes from culture rather than from personal experience. ...

Philosophers typically view the problem of knowledge, or epistemology, as one of aligning the beliefs in your mind with the “reality out there.” But because our beliefs about reality come from other people, I think that the choice of which people to trust is the core issue in epistemology. I have made this point to academic philosophers, and they blow me off, insisting that the issue of aligning beliefs to reality is the nub of the problem. Relying on “testimony” (other people’s beliefs) is just one method for trying to solve it. I think that they would say that I choose a person to believe based on how well I think that person’s beliefs align with “reality out there.” But I would counter that I choose who to believe first, and then I choose what to believe. ...

Henrich points out that humans have two types of hierarchies. In a dominance hierarchy, the people at the top gain authority by force, and the people at the bottom reluctantly obey. In a prestige hierarchy, the people at the top gain authority by earning respect, and the people at the bottom willingly try to copy and learn from those at the top. Prestige hierarchies work through competitive mechanisms. ...

The value of competition in choosing the people to trust was driven home to me years ago by David Brin, in his essay on Disputation Arenas. In that essay, he offered a proposal for structured competition on the Internet to improve what I call social epistemology. Better ideas would win. ...

The process of getting ahead in a prestige hierarchy is analogous to the process of earning a bonus in a firm’s compensation system. If the bonus criteria align with the firm’s goals, people who do productive work will earn bonuses and the firm will be successful. If the bonus criteria are not well considered, workers who are not particularly productive will obtain bonuses, and the firm’s performance will suffer.

Bonus systems are like a game. The firm wants to get the most (useful) effort from its workers for the least compensation. Workers want to get the most compensation with the least effort.

My observation is that the longer a specific bonus system is in place, the better workers become at figuring out how to get more compensation for less effort. Incentive systems naturally degrade over time. Management has to revise the bonus systems every few years if the firm is to prosper.

Most incentive systems use a combination of formal measures (“metrics”) and informal judgment (“what your boss thinks”). Neither is perfect. Jerry Muller’s The Tyranny of Metrics describes how the formal approach often goes wrong. Informal judgment can be used as a corrective for imperfect metrics. But judgment also can introduce bias and cronyism.

Our prestige hierarchies of academia and legacy media rely heavily on credentials. Think of the process of obtaining tenure as a professor or the process of obtaining a prestigious position for a newspaper or TV network. Such credentials are awarded on the basis of judgment by incumbents. They reward conformity rather than excellence. Why this has emerged as a problem now more than in the past is a question that I am still pondering for a subsequent essay.

In any case, popular trust in our sense-making institutions has fallen dramatically over the past 70 years. The relationship between elites and the public at large in 2021 is somewhere between troubled and dysfunctional.

Many elites cannot understand why people do not trust “the science,” government officials, leading academics, or the news as reported in legacy media. But more detached observers, such as Martin Gurri in The Revolt of the Public and Yuval Levin in A Time to Build, understand that elite misconduct contributes heavily to the problem."