Tuesday, February 24, 2015

Optimal Government Intervention

Reply to comment on Succeeding vs Failing At Other Minds

*******************************************

Imagine a cat stuck in a tree. Do you want to argue that it's a problem that the cat is stuck in the tree? Ok, I'm not going to disagree with you. We have a problem.

The point of contention is how to solve this problem.

Let's say that you want to use the bat signal so that Batman will rescue the cat from the tree. Perhaps you're assuming that Batman has nothing better to do with his time than organize his ties. If this is what you're assuming then I can understand why you perceive that we'd increase the total benefit by having Batman rescue the cat. But if, in reality, Batman were actually coming up with a plan to defeat the Joker once and for all... then we'd greatly decrease the total benefit by having Batman rescue the cat.

In economic terms, Batman is a limited resource. The opportunity cost is too high if having him rescue a cat requires that we forgo the benefit of having him figure out how to defeat the Joker. One person's small benefit doesn't outweigh an entire city's huge detriment.

So I have no problem with you wanting the government to intervene. That's not my issue. My issue is that, with the current system... people can see a public problem... and they want the government to do something about it... which is perfectly reasonable... but they can't see where the required resources are taken from and they have no idea how much benefit is lost as a result.

On the one hand, Batman rescued a cat from a tree. But on the other hand, the Joker destroyed Gotham.

Markets work because you know that any time spent replying to this comment is time that can't be spent doing other things that you also value. So you endeavor to put your time, a limited resource, to its most valuable use.

Pragmatarianism would create a market in the public sector by allowing people to choose where their taxes go. This means that if you want more government intervention in one area... then you're going to have to decide whether it's worth it to have less government intervention in other areas. This is the only way to ensure that government intervention truly maximizes society's total benefit.
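The trade-off at the heart of this mechanism can be illustrated with a toy sketch. This is just a hypothetical illustration, not part of any actual proposal: the department names, taxpayers and numbers are all made up. Each taxpayer has a fixed tax bill and must divide 100% of it among departments, so funding one area more necessarily means funding another area less.

```python
# Toy sketch of pragmatarian tax allocation. Each taxpayer's bill is
# fixed; their chosen shares must sum to 100%, so allocating more to
# one department necessarily means allocating less to another.

def allocate(tax_bill, shares):
    """Split a taxpayer's fixed bill according to their chosen shares."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 100%"
    return {dept: tax_bill * frac for dept, frac in shares.items()}

def total_budgets(allocations):
    """Aggregate every taxpayer's allocation into department budgets."""
    budgets = {}
    for alloc in allocations:
        for dept, amount in alloc.items():
            budgets[dept] = budgets.get(dept, 0.0) + amount
    return budgets

# Two hypothetical taxpayers with different valuations.
alice = allocate(1000, {"defense": 0.5, "cat rescue": 0.1, "courts": 0.4})
bob = allocate(2000, {"defense": 0.7, "cat rescue": 0.0, "courts": 0.3})

# Each department's budget now reflects everybody's valuations.
print(total_budgets([alice, bob]))
```

Because the shares must sum to one, Alice can only signal that cat rescue matters more to her by signaling that something else matters less... which is exactly the opportunity-cost reasoning the Batman example is meant to convey.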

For more info please see... Why I Love Your Freedom.

Let me know if you have any questions.

*******************************************

Here's the relevant illustration (value signals)...

[illustration: value signals]
Optimal government intervention?  We don't want the government to allocate too many resources to an endeavor.  Neither do we want the government to allocate too few resources to an endeavor.  What we want is for the government to allocate the optimal amount of resources to an endeavor.

It's kind of astounding that so many people believe that optimal government intervention is possible in the absence of a market in the public sector.

You really can't have optimal government intervention without pragmatarianism.  It's ridiculous to believe that society's limited resources can be put to their most valuable uses in the absence of nearly everybody's valuations.

The problem isn't that this belief is ridiculous.  The problem is that it's extremely harmful.  You want to believe in God?  The Tooth Fairy?  Santa Claus?  Unicorns?  Ok, go ahead, no problem.  Knock yourself out.  You want to believe that optimal government intervention doesn't depend on earner/inclusive valuation?  Please don't.  When you hold this belief you hurt me, yourself, everybody you know and everybody I know.

How in the world can people be dissuaded from such a harmful belief?  Is my example of Batman rescuing a cat from a tree while the Joker destroys Gotham really the best example?  I sincerely doubt it.  We need a better example... a better story.  A story that's so accessible that anybody who reads it will instantly see the harm caused by their long-held belief.

Is that even possible though?  Can a story be so good that people have no problem relinquishing their long held beliefs?

I think it's entirely natural for the mind to fight against anything that challenges a long held belief.  This is because it's extremely disconcerting to confront the possibility that our perception of reality is fundamentally flawed.

This ties into this recent entry of mine... Who Are You?... where I expressed relief to learn that there's absolutely no evidence to support the possibility that I'm Robittybob1's sockpuppet.  It's hard for me to imagine a story so good that I'd instantly relinquish the belief that I'm not Robittybob1's sockpuppet.

Personally, I was raised to believe in God.  It wasn't a passive belief... it was an active belief... prayer, church and the Bible in very frequent doses.  I'm sure that it wasn't any single story that convinced me to change my belief... which occurred when I was around 11.  It was a lot of different stories that I found in various books/magazines about science/nature.  I still remember the distinct discomfort I felt as my mind gradually replaced one long-held belief with a new one.  The transition definitely wasn't pleasant or enjoyable.  And it's not like there was anybody around to support the transition.  All my family, friends and teachers believed in God.

The second major transition in my beliefs occurred when I was a libertarian recently returned from nation building in Afghanistan.  I suppose I should mention that I sure wasn't raised to be a libertarian.  My interest in politics was nearly nonexistent up until college.  During college, a friend I was telling about some of my thoughts on government informed me that I was a libertarian.  Upon further research it seemed like a pretty good fit.  So it wasn't like I had to give up one strongly held belief for another.

Pretty soon after I returned from Afghanistan, I ended up fighting against anarcho-capitalists on Wikipedia.  It was my first exposure to anarcho-capitalism and the idea of abolishing the government was anathema to my limited government beliefs.  It really rubbed me the wrong way... especially after having spent a year in a country without even a basic government.  So I endeavored to defeat the anarcho-capitalists.  But you can't truly defeat something that you don't truly understand.  After a considerable amount of reading though... I began to entertain the possibility of being wrong.  It certainly wasn't enjoyable.

What made matters especially tricky was that while I was beginning to entertain the possibility that anarcho-capitalists were correct... I was also entertaining the possibility that the free-rider problem wasn't just applicable/relevant to defense, courts and police.  Essentially my belief in limited government libertarianism was being simultaneously challenged from completely opposite directions.  This doubled the discomfort.  My willingness to entertain doubt was drowning me.  This floundering encouraged me to consider the alternatives.  One of which was a hypothetical situation that for some time I had enjoyed posing to friends... what if people could choose where their taxes go?  The more I thought about it the more I realized how well it accounted for both possibilities.  I trusted that introducing the invisible hand into the public sector would reveal the truth regarding the relevance of government.

My life consists of two major transitions in beliefs.  And by "major" I mean very unsettling.

It feels like there should be a technical term for unsettling belief exchanges.  Does anybody know if one exists?  The only thing that pops into my head is "cognitive dissonance"...
...the mental stress or discomfort experienced by an individual who holds two or more contradictory beliefs, ideas, or values at the same time, or is confronted by new information that conflicts with existing beliefs, ideas, or values.
It's close but I want a word for when a long-held belief is replaced, at considerable mental cost, with a new one.  Maybe it's a linvoid?

So far in my life I've experienced two linvoids.  I wonder what the average is?  Does it mean anything if some people have gone through more linvoids than other people?

Not exactly sure why... but I'd be a bit suspicious if somebody has never gone through a linvoid.  They've really never been confronted with enough evidence to convince them that a long-held belief of theirs is wrong?  Either they haven't been considering enough evidence... or their fundamental beliefs have never been wrong.

What about in the other direction?  Would I also be suspicious if somebody has experienced say 10 linvoids?  I'd be like "woah!  guy!  what's going on?"

Are we looking at a continuum that ranges from entirely close minded on one extreme to entirely open minded on the other extreme?

Or is it more accurate to say that we are looking at a continuum that ranges from people who always choose the blue pill on one extreme to people who always choose the red pill on the other extreme?

Where does intelligence fit into all this?

Let's consider the illustration from this blog entry... Progress as a Function of Freedom...

[illustration: progress as a function of freedom]
It should seem straightforward that being led by evidence rather than belief will increase your chances of choosing the right path.

It also seems straightforward to argue that intelligent people are more likely to choose the right paths.

Does this mean that there's a positive correlation between linvoids and intelligence?  Eh?

Are more intelligent people less likely to irrationally cling to incorrect beliefs?

In other words, is sharing pragmatarianism with more intelligent people the same thing as sharing pragmatarianism with people who are led by evidence rather than beliefs?  For some reason I'm resisting the conclusion that it is the same thing.  Am I correct to resist the conclusion?

Perhaps, when deciding whether it's worth it to share pragmatarianism with somebody, rather than asking their IQ I should ask them their linvoid quotient (LQ)?  Errr... not sure if quotient is the right word.  But you get the point.

How high is my IQ?  I don't even know.  I'm sure it's not terribly high.  It's probably barely above average.  But I've gone through two linvoids!  heh

Of course intelligence is hard to pin down.  But it sure seems like some type of smarts when somebody is willing to endure a very unsettling exchange of beliefs when the evidence requires it.  And the more evidence that somebody considers... the more likely it is that they'll confront evidence that requires a very unsettling exchange of beliefs.  
