***************************************
2
That's my LQ. Not sure about my IQ... just like most people I'm guessing it's above average.
So what's LQ? It stands for linvoid quotient. In math they use "x" as the go-to variable. For language I use "linvoid". Not sure what the "quotient" is doing... maybe just hanging out for symmetry.
In this situation I'm using "linvoid" to refer to epic mind changes. I'm not sure if there's a word for an epic mind change so for now I'm just using "linvoid" as the placeholder.
So far in my life I've had 2 linvoids. The first occurred when I was around 11. That's when I exchanged my belief in God for my belief in evolution. The second occurred in my late 20s when I exchanged my belief in libertarianism for my belief in pragmatarianism.
What makes them linvoids is that they were both very unsettling experiences. Not unlike what Neo felt after he took the red pill.
The concept of painful reality vs. comfortable illusion goes back at least as far as Plato's allegory of the cave...
You will recall the wonderful image at the beginning of the seventh book of Plato's Republic: those enchained cavemen whose faces are turned toward the stone wall before them. Behind them lies the source of the light which they cannot see. They are concerned only with the shadowy images that this light throws upon the wall, and they seek to fathom their interrelations. Finally one of them succeeds in shattering his fetters, turns around, and sees the sun. Blinded, he gropes about and stammers of what he saw. The others say he is raving. But gradually he learns to behold the light, and then his task is to descend to the cavemen and to lead them to the light. He is the philosopher; the sun, however, is the truth of science, which alone seizes not upon illusions and shadows but upon the true being. - Max Weber, Science as a Vocation
To make sure that I'm clear, what separates a linvoid from any other mind change is the significant amount of discomfort that results from exchanging a tightly held belief for a new one. Here's Socrates speaking of a released caveman/prisoner...
...when any of them is liberated and compelled suddenly to stand up and turn his neck round and walk and look towards the light, he will suffer sharp pains; the glare will distress him...
If you didn't experience significant mental distress and discomfort when you changed your mind... if the transition from an old belief to a new one wasn't very unsettling... then it really wasn't a linvoid.
The objective here isn't to debate our respective beliefs... although you're certainly welcome to do so. The objective is to conduct a survey...
1. How many linvoids have you experienced?
2. What were your linvoids?
3. Does a person's LQ mean anything?
4. Is there a relationship between LQ and IQ? If so, then what is it?
If you need more context, background, explanation or analysis then please see... Optimal Government Intervention
***************************************
I posted it in the following four forums... (sorted by quantity of responses)
- NationStates (NS) - LQ vs IQ?
- TheScienceForum (TSF) - LQ vs IQ?
- AtheistForums (AF) - LQ vs IQ?
- DebatePolitics (DP) - LQ vs IQ?
In retrospect, I could have done a much better job wording/organizing the survey. While I certainly enjoyed and appreciated the passage by Weber on the Cave Allegory... I probably should have replaced it with a really good example. Here's a decent one that I developed in the process of the discussion...
Let's say that Bob always votes, rain or shine, because he strongly believes that voting is an extremely important civic duty/responsibility. One day Sally challenges Bob's strong belief (SB) by providing him with a huge pile of papers and studies on voting. If, as a result of serious digging, Bob starts to realize that the available evidence does not support his SB... then this will be very unsettling for him. If he modifies his belief to fit the available evidence, despite the considerable discomfort of doing so, then this counts as a linvoid.
Even though my survey had lots of room for improvement... the outcome provided considerable food for thought.
There weren't really enough responses to come to any good conclusion regarding the average number of linvoids that people experience. Plus, I think people who had experienced at least one linvoid would be more inclined to respond than people who hadn't even experienced one. But, if nothing else, I did confirm that I wasn't the only one who had experienced a linvoid.
A defining feature that distinguishes a linvoid from say... an epiphany... is that it feels really crappy. This was my favorite description...
When I stopped believing in Catholicism. I was never very devout but it [Catholicism] was a huge part of my life since infancy. At first, it was a gradual discomfort but nothing major. I still pretty much believed in everything I had been taught until that point. But my cessation of belief happened abruptly during my senior year. I remember it feeling like I had been slapped, repeatedly and it made me almost physically ill. - Nanatsu no Tsuki, NS
Whether the pain is physical or mental, we all have an instinctual desire to try and avoid it. When evidence is like a razor starting to slice into our strong beliefs (SBs) it's completely natural to try and protect ourselves from the harm. It's a fight or flight situation. I think most people choose flight. They ignore the evidence or rationalize it away... which effectively removes the razor from their SBs. The mind is extremely adept at protecting itself. Saying yes to flight is to say no to linvoids.
With fight on the other hand, there's at least the chance that the offensive evidence will be inspected. When I first fought against anarcho-capitalists... of course I really didn't want to read what Murray Rothbard or David Friedman had written. But I couldn't truly win without attacking their arguments. And I really wanted to effectively protect my SB in libertarianism. So I took the risk of inspecting the razor... and in the process of doing so I ended up experiencing a linvoid.
Given that linvoids can only occur when a SB is involved, it's not too surprising that government and religion came up pretty frequently as the subject of the linvoids.
Regarding the third survey question... not very many people who experienced a linvoid felt that the number of linvoids that people experience is meaningful. Can that be right though?
If somebody has had at least one real linvoid, then it stands to reason that they didn't bravely run away. They chose the painful "truth" rather than the bliss of ignorance. They conformed their SB to fit the evidence... and not the other way around. How is this not meaningful?
Regarding the fourth survey question... if you have valuable evidence that contradicts many people's SBs... it stands to reason that you'll get more bang for your buck by sharing your evidence with people who have experienced at least one linvoid. This doesn't guarantee that they won't bravely run away... it just shows that they don't always bravely run away. It seems likely that this has something to do with some type of intelligence. If anything, it would seem to be a different measure of intelligence. And in certain cases, a much better measure of intelligence.
Looking at this topic somewhat differently... here's my comment on a LessWrong discussion post... If you can see the box, you can open the box
***************************************
Interesting topic! I'm a huge fan of "out of the box" thinking. But I prefer to apply "out of the box" thinking to the phrase itself by referring to this type of thinking as epiphytic thinking.
The phrase "epiphytic thinking" helps promote/advertise epiphytes. Did you know that the orchid family is the largest plant family? Around 10% of all plants are orchids... and most orchids are epiphytes.
Epiphytes can help sequester as much carbon as trees do. They also help create a gazillion different niches, which has helped increase animal speciation/biodiversity.
Epiphytes can certainly help save the world. What are boxes good for? Helping you pretend that you're a robot?
Therefore...
epiphytic thinking > "out of the box" thinking
I'll apply some epiphytic thinking to your topic.
Let's say that we have a time machine and we travel back to a hundred years before people discovered that the earth was round. Our mission, which we've chosen to accept, is to try and persuade people that the earth is actually round!
It stands to reason that no two people are going to be equally willing to hear us out. In this sense... perhaps we can say that there's a continuum that ranges from the most close-minded person all the way to the most open-minded person. To help quantify this continuum we'll use a scale from 0 to 10.
The question is... if somebody is a 10 on this scale... does this necessarily mean that they'll believe us that the world is actually round? Just because they'll be really willing to listen to our very different perspective on the shape of the world... does this mean that they'll take our word for it? Not really... because this would imply that our open-mindedness scale was the same thing as a gullibility scale.
So clearly it would help to bring some evidence with us on our mission. Stronger evidence is always better than weaker evidence but let's just say that our evidence is good.
Imagine if we share our good evidence with 100 people who are all a 10 on the open-mindedness scale. What percentage of them are going to change their beliefs accordingly? Of course we can't really know the answer... but it doesn't seem very likely that 100% of them would exchange their belief in a flat world for a belief in a round world.
From our perspective, we would know that anybody who didn't change their belief accordingly was making a mistake. Why did they make the mistake though? Was it a lack of intelligence? Lack of rationality? Lack of critical thinking skills? Was there some sort of bias involved? Or stubbornness?
Just like no two people are equally open-minded... I don't think that any two people are equally, for lack of a better term... "evidence-minded". Is there a better term? "Rationality" seems close but it doesn't seem quite right to refer to somebody as "irrational" just because our good evidence didn't persuade them that their belief in a flat world was wrong.
What's the point here? Well... at one point everybody was really wrong about the shape of the world. So perhaps it's a pretty good idea for us to fully embrace the possibility that we're all really wrong about the shape of... say... the best government. Because if it's really difficult to appreciate the fact that you might be wrong... then it's going to be really difficult for you to accept any good evidence that proves that you are wrong. Therefore, anybody who's a 10 on the evidence-minded scale will probably really embrace fallibilism.
***************************************
The composition of this book has been for the author a long struggle of escape, and so must the reading of it be for most readers if the author’s assault upon them is to be successful,— a struggle of escape from habitual modes of thought and expression. The ideas which are here expressed so laboriously are extremely simple and should be obvious. The difficulty lies, not in the new ideas, but in escaping from the old ones, which ramify, for those brought up as most of us have been, into every corner of our minds. - John Maynard Keynes, The General Theory of Employment, Interest and Money
Our creed is that the science of government is an experimental science, and that, like all other experimental sciences, it is generally in a state of progression. No man is so obstinate an admirer of the old times as to deny that medicine, surgery, botany, chemistry, engineering, navigation, are better understood now than in any former age. We conceive that it is the same with political science. Like those physical sciences which we have mentioned, it has always been working itself clearer and clearer, and depositing impurity after impurity. There was a time when the most powerful of human intellects were deluded by the gibberish of the astrologer and the alchemist; and just so there was a time when the most enlightened and virtuous statesman thought it the first duty of a government to persecute heretics, to found monasteries, to make war on Saracens. But time advances; facts accumulate; doubts arise. Faint glimpses of truth begin to appear, and shine more and more unto the perfect day. The highest intellects, like the tops of mountains, are the first to catch and reflect the dawn. They are bright, while the level below is still in darkness. But soon the light, which at first illuminated only the loftiest eminences, descends on the plain and penetrates to the deepest valley. First come hints, then fragments of systems, then defective systems, then complete and harmonious systems. The sound opinion, held for a time by one bold speculator, becomes the opinion of a small minority, of a strong minority, of a majority of mankind. Thus the great progress goes on, till schoolboys laugh at the jargon which imposed on Bacon, till country rectors condemn the illiberality and intolerance of Sir Thomas More. - Thomas Macaulay