Free Markets, Free People
If ever there was an apt description of our general problem in this country, Dr. Milton Wolf nails it in the first paragraph of his discussion of the unfolding disaster we call ObamaCare.
The fatal conceits of Obamacare are the absurd notions that the government can spend your money more wisely than you can and that bureaucrats are more capable than you are to make your own most intimate, personal decisions. The antithesis of government-centered Obamacare is what I simply call “Patientcare.” Patients should be at the center of our health care universe, not President Obama and not the government.
We suffer under a landslide of the same fatal conceit applied to literally hundreds of government programs in this country. These fatal conceits (or flawed premises, if you prefer) have cost us trillions of dollars and much of our freedom. Government has essentially decided that its priorities for your money are more important than your priorities for what you earn. And it has also decided that in many areas it can make better decisions for you than you can make for yourself.
But that’s not the problem in full. In full, the problem is exacerbated (and the notion “validated”) by the number of people who, for whatever reason, have bought into these conceits. They believe the flawed premises to be true and willingly cede their money and freedom, believing government does indeed spend their money more wisely and is more capable than they are of making “good” decisions on their behalf.
The problem, of course, is that as long as those people who willingly enslave themselves to government exist in large enough numbers, they’ll succeed in putting the shackles on the rest of us as well. As long as they look to the federal government as their caregiver, they force that on the rest of us too.
One of the reasons we have the debt and deficit problems we currently suffer is that the left has been very successful in selling those flawed premises, via emotional appeal, to low-information (and, frankly, ignorant) voters. They’ve avoided rational discussion with “for the children” campaigns. They’ve often claimed “market failure” where government created the problems through perverse incentives and market intrusion, and then pushed government as the solution.
Years ago we came from a people who knew that nothing was “free”. They knew that there really wasn’t any such thing as a “free lunch”; someone had to pay for it. They knew that you were responsible for your own welfare, self-defense and freedom. And interestingly, so did most of the politicians of the time. Oh, there were certainly those among them who believed as the left does today, but they were a distinct minority. Their creed was considered extreme and, frankly, un-American.
Now it is they who are “mainstream,” and those who call for much less government intrusion in our lives who are considered the extremists. Common sense, the ability to see through the blarney and nonsense, seems to have died. In the so-called information age, we seem to have a growth of ignorance. Part of that I lay at the feet of another government program that has been a woeful failure – public schooling. Common sense tells you that such an institution would be unlikely to teach anything negative about government and, in fact, might even become a bit of a propaganda arm for it. That it might involve itself in a bit of indoctrination. That it might fill fairly benign subjects with information preferred by government and spend less time on information that wasn’t in favor at the time or was contrary to the agenda it prefers. But all of that assumes an ability to teach the core competencies, something most of our school systems seem unable to do with any great success. So we have the misinformed and the illiterate buying into the government’s flawed premises in droves.
Obviously a great many things over the years have led us to this point of dependency on government. And we know how it ends. It is the blue state model, and the blue state model is failing all over the country and the world.
Yet we still hear it extolled by its zealots and lapped up by the ignorant who refuse to look beyond the promises. It still amazes me that we’ve managed to get into this mess and can’t seem to find the intestinal fortitude to say “enough” and begin the very unpleasant task of reversing it. But that’s the problem, isn’t it? It would be unpleasant. And we don’t like unpleasant. So instead, we continue to believe the fantasy.
The problem, of course, is that reality, like Toto in The Wizard of Oz, is going to pull back the curtain very soon and expose the fantasy for the fraud it is. And then we’ll look back at “unpleasant” as something we wish we’d done.
By then, it will be way too late.
That’s the basic message our friend Warren Meyer (of Coyote Blog and now Forbes) makes in an article. His points are both well made and valid. And if one considers how inaccurate the models that have driven debate and spending have been, we should insist on better data before those decisions are made.
Meyer points out that there are few, if any, CEOs of non-financial firms who would invest a penny based solely on computer models. Yet we have a propensity to place far more confidence in such models than they have ever earned.
Last week the Council of Economic Advisors (CEA) released its congressionally commissioned study on the effects of the 2009 stimulus. The panel concluded that the stimulus had created as many as 3.6 million jobs, an odd result given the economy as a whole actually lost something like 1.5 million jobs in the same period. To reach its conclusions, the panel ran a series of complex macroeconomic models to estimate economic growth assuming the stimulus had not been passed. Their results showed employment falling by over 5 million jobs in this hypothetical scenario, an eyebrow-raising result that is impossible to verify with actual observations.
Not only is it impossible to verify, it was issued as a de facto “truth” and the “stimulus” was declared a “success”. And don’t forget the inclusion, now, of one of the world’s best weasel words to pad the results – jobs “saved”. Meanwhile, the administration goes to great lengths to ignore its previous claim that if the “stimulus” was passed, unemployment wouldn’t rise above 8%. One has to guess, given the results, that the computer model was wrong about that.
Meyer goes on to point out how modeling, which can’t predict the complex world of economics, is somehow considered the “gold standard” of predictability when it comes to the exponentially more complex climate. So much so that governments everywhere are basing trillions of dollars in taxes (cap-and-trade) on the results of such models, in a supposed effort to “save the planet”.
While we have been bombarded with hockey sticks and forlorn polar bears, our focus in climate should really be on the computer models. The primary scientific case for man-made CO2 as the main driver of global temperatures is made in exactly the same way that the stimulus was determined to have created 3.6 million jobs: computer modeling. No one yet has been clever enough to structure a controlled experiment to isolate the effect of rising CO2 levels from other changing variables in the complex global climate. So, just like the CEA did in scoring the stimulus, climate scientists use computer models to run virtual experiments, running the models backward over the last century with varying assumptions for CO2 levels.
This modeling approach yields amazingly circular logic. Like macroeconomic models built by devoted Keynesians, climate models are constructed by academics who passionately believe that a single variable, CO2 concentration, is the dominant driver of the whole complex climate system. When run retrospectively, the models they create unsurprisingly give the result that past temperature increases are mainly attributable to CO2. The problem with these models is that when run forward, as in the case of the Washington Redskins election model, they do a terrible job of predicting the future. None of them, for example, predicted the flattening of global temperatures over the last decade.
Yet policy has been proposed and written based on results that are unverifiable and questionable at best. That’s insanity. The purported case for using the results is that if we wait for real data, it may be too late. But when the real data appears (such as the flattening of global temperatures over the past decade), the modelers and proponents of government action want to ignore it and deny its importance.
This all goes back to two themes I’ve been hammering for quite some time – common sense and scientific skepticism. Both are necessary tools of a rational person. And Meyer nails the point:
Our common sense about government stimulus tells us that the government is highly unlikely to invest money more productively than the private entities from whom the government took the money. Unfortunately, we have allowed this common sense to be trumped by computer models. Once our imperfect understanding of the economy was laundered through computer models and presented with two-decimal precision, smart people somehow lost their skepticism.
We are now facing what is potentially an even more expensive decision: to regulate CO2 based mainly on computer models that claim to be able to separate the effects of trace concentrations of CO2 from a hundred other major climate variables. If your common sense is whispering to you that this seems crazy, listen to it. Otherwise all we get is garbage in, money out.
The “garbage in” should be obvious. Unfortunately, the “money out” is money coming out of your wallet to pay for unproven science and unfounded economic models.
This story out of Ann Arbor, MI is a perfect example of bureaucratic inertia and the use of bureaucratic language to evade a common sense solution to a changed situation:
The debate in Ann Arbor, where firefighters are being laid off due to a multimillion dollar budget deficit, is over an $850,000 piece of art.
That’s how much the city has agreed to pay German artist Herbert Dreiseitl for a three-piece water sculpture that would go in front of the new police and courts building right by the City Hall.
The city has the money to do it because in 2007, it agreed to set aside for public art 1 percent of money that went into capital improvement projects that were $100,000 or larger. Most capital projects involve streets, sewers and water.
Anyone – what has changed since 2007? Perhaps the economic climate? So if a city can agree to “set aside” money for public art – a luxury for economically flush times – why can’t it now agree to change that previous agreement? Why can’t it now spend the money set aside on critical jobs jeopardized by the economic downturn?
Well, here’s the city administrator’s answer, I guess:
City Administrator Roger Fraser wrote in an e-mail that the solid waste coordinator position was eliminated as a cost-cutting measure because the solid waste millage had decreased. Fraser wrote that the art coordinator position would be paid for by the public art fund.
Fraser noted that the public art dollars did not come from the city’s general fund, which is used to pay salaries and benefits, and that less than $6,000 of the art money came from the general fund.
The art projects also must have a "thematic connection" to the source of funding, Fraser wrote. The $850,000 art project is water-themed, because the money came from storm water funds.
So there. If that isn’t a pant load of, well, you know what it is. As one resident noted, when it wants to, the city has always found ways to shuffle money from one fund to another. But if it did that, it couldn’t scare the hell out of the citizenry by claiming it was going to have to lay off critical public safety types, and thereby justify increasing taxes, etc.
"Administrators cry poverty while lavishing money on the beautiful people," LaFaive said. "The threat to dismiss firefighters often comes while officials protect golf courses, wave pools and art. No city can cry poverty while it defends recreation and aesthetics such as art."
Have you ever noticed that? Layer upon layer of bureaucrats and non-essential workers stay on staff, but police and fire protection are the first on the chopping block. Meanwhile, almost a million bucks is slated for “water art”. And it is all defended by bureaucratic nonsense – bureaucratese. When they want to do something, the rules mean nada. When they don’t want to, for whatever reason, the rules constrain them.
And Ann Arbor isn’t unique here – the same song and dance is going on at the state and local level.
This is your government at work. The politicians are only the part-time help. Bureaucrats are who really run it all. And the result?
Well, look around you.
I think we often become overwrought about things without ever really taking the time to put the threat into perspective. Nate Silver at FiveThirtyEight throws out some numbers for us to consider as we assess the latest terrorist attempt. Taking the decade of October 1999 to September 2009 (stats for this month and those following September aren’t available yet), and even including the 9/11 attacks (the TSA didn’t appear until after those), there have been six terrorist acts or attempted terrorist acts involving aircraft. Silver breaks down the numbers:
Over the past decade, according to BTS, there have been 99,320,309 commercial airline departures that either originated or landed within the United States. Dividing by six, we get one terrorist incident per 16,553,385 departures.
These departures flew a collective 69,415,786,000 miles. That means there has been one terrorist incident per 11,569,297,667 miles flown.
Wow. Not a huge threat. I take many more chances with my life in Atlanta traffic every day. But, to put it in even better contrast, how about the old stand-by: How do my chances compare to being struck by lightning?
There were a total of 674 passengers, not counting crew or the terrorists themselves, on the flights on which these incidents occurred. By contrast, there have been 7,015,630,000 passenger enplanements over the past decade. Therefore, the odds of being on a given departure which is the subject of a terrorist incident have been 1 in 10,408,947 over the past decade. By contrast, the odds of being struck by lightning in a given year are about 1 in 500,000. This means that you could board 20 flights per year and still be less likely to be the subject of an attempted terrorist attack than to be struck by lightning.
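Silver’s ratios are easy to verify from the raw figures he quotes. A quick sketch (all inputs are the numbers from the excerpt above; nothing else is assumed):

```python
# Sanity check of Nate Silver's figures, Oct 1999 - Sep 2009.
departures = 99_320_309          # commercial departures touching the US
miles = 69_415_786_000           # miles flown by those departures
incidents = 6                    # terrorist acts or attempts involving aircraft
enplanements = 7_015_630_000     # passenger enplanements over the decade
passengers_on_incident_flights = 674

per_departure = departures / incidents   # departures per incident
per_mile = miles / incidents             # miles flown per incident
odds_per_enplanement = enplanements / passengers_on_incident_flights

print(round(per_departure))         # 16553385
print(round(per_mile))              # 11569297667
print(round(odds_per_enplanement))  # 10408947

# A passenger taking 20 flights a year vs. annual lightning odds (1 in 500,000):
annual_flight_risk = 20 / odds_per_enplanement
print(annual_flight_risk < 1 / 500_000)  # True: still safer than lightning
```

The last line is the whole point of the comparison: twenty flights a year works out to roughly a 1-in-520,000 annual chance, which is still slimmer than the lightning figure Silver cites.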
So the answer to the question in the title is an unqualified “yes”. That’s not to say we shouldn’t maintain awareness on flights of idiots like this last one and do precisely what the passengers did to thwart his attack. But then, we also know not to stand on a hilltop in a lightning storm wrapped in copper wire. We take proper precautions, but we don’t obsess over it.
Given these stats and what we have to go through to fly now, I’d say we’re past the obsessive stage and into the downright paranoid stage.
We do this a lot these days. Maybe it is the proliferation of mass communication, which seems to magnify the significance of a story without providing any context like Silver has. Guys like this latest wannabe bomber are not a great threat to us.
We have averaged 50 commercial crashes a decade since the 1950s. Yet for all those decades we happily climbed on board, understanding that our real chances of being in an airline crash were very small. And as you can see, given those numbers, your chances of being in a crash not caused by terrorism are significantly higher than of being in one caused by terrorism. Yet it is the “terrorist” attack over which we obsess.
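Putting the two rates side by side makes the gap concrete. A minimal sketch, using the 50-crashes-per-decade figure above and Silver's departure count (both from the text):

```python
departures = 99_320_309   # decade of commercial departures (Silver's figure)
crashes = 50              # ordinary commercial crashes per decade
terror_incidents = 6      # terrorist acts or attempts per decade

crash_rate = crashes / departures           # ~1 crash per 2 million departures
terror_rate = terror_incidents / departures # ~1 incident per 16.6 million departures

print(round(crash_rate / terror_rate, 1))   # 8.3 -- ordinary crashes ~8x more likely
```

Same denominator, so the ratio reduces to 50/6: an ordinary crash is roughly eight times more likely than a terrorist incident, yet it gets a fraction of the anxiety.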
Life’s a risk. We know that and risk ours every day. We do so because we know that in reality the risk we take is very low, and not doing so would limit how we live our lives to a very mundane and boring routine. We’d hate it. And we normally pride ourselves on understanding that we must take risks to live life to the fullest.
I can’t help but think that every attempted or failed attack like this one that drives the neurotic over-reaction that follows is considered a victory by our enemies. We need to quit enabling that.