That’s certainly one of the factors keeping GDP growth low.
Four years into the economic recovery, U.S. workers’ pay still isn’t even keeping up with inflation. The average hourly pay for a nongovernment, non-supervisory worker, adjusted for price increases, declined to $8.77 last month from $8.85 at the end of the recession in June 2009, Labor Department data show.
Stagnant wages erode the spending power of consumers. That means it is harder for them to make purchases ranging from refrigerators to restaurant meals that account for most of the nation’s economic growth.
Not only that, but unemployment remains historically high years after the “recovery”. The question, however, is why wages are remaining stagnant. The WSJ cites three factors:
Economic growth remains sluggish, advancing at a seasonally adjusted annual pace of less than 2% for three straight quarters—below the prerecession average of 3.5%. That effectively has put a lid on inflation, which has been near or below the 2% level the Federal Reserve considers healthy for the economy. With demand for labor low, prices not rising fast and 11.5 million unemployed searching for work, employers aren’t under pressure to raise wages to retain or attract workers.
Emphasis mine. The Fed is happy with the inflation rate. And the administration, despite numerous claims to be focused like a laser beam on “j-0-b-s”, has done little if anything to address unemployment or economic growth. Finally, given the uncertainty that regulation and new laws (such as ObamaCare) bring to the table, employers are even less likely to hire until the regulatory and legal dust settles and they have a much better idea of how both affect their business and industry. It’s not about “pressure”. It’s about a lack of incentive.
Businesses are changing how they manage payrolls. Economists at the Federal Reserve Bank of San Francisco in a recent paper said that, in the past, companies cut wages when the economy struggled and raised them amid expansions. But in the past three recessions since 1986—and especially the 2007-2009 downturn—companies minimized wage cuts and instead let workers go to keep remaining workers happy. As a result, to compensate for the wage cuts that never were made, businesses now may be capping wage growth. “As the economy recovers, pent-up wage cuts will probably continue to slow wage growth long after the unemployment rate has returned to more normal levels,” the researchers said.
Another point to make, again considering the unemployment rate, is that those working are glad to still have a job. And with the economy still struggling, it is unlikely that many feel the time is right to push for higher wages. In fact, it is a “buyer’s market” right now when it comes to labor. And it will remain one until we get into much higher growth rates and the demand for labor begins to outstrip the supply. We’re not even close to that at this point.
Globalization continues to pressure wages. Thanks to new technologies, Americans are increasingly competing with workers world-wide. “We are on a long-term adjustment, as China, in particular, but all developing countries, get their wages closer to ours,” said Richard Freeman, an economist at Harvard University. According to Boston Consulting Group, there will be only a roughly 10% cost difference between the U.S. and China in making products such as machinery, furniture and plastics by 2015.
Technology is also replacing workers in many industries. Automation is especially tough on low skilled workers. But again, given laws like ObamaCare, the incentive at work is to have fewer employees, not more. Businesses will automate where it makes sense and helps make a profit. It is also a means of closing that wage gap mentioned above, so it isn’t a trend that is likely to end anytime soon.
All of those factors, and what I’ve mentioned in addition to them, combine to keep both unemployment and wage growth static. There simply aren’t any incentives at the moment to hire more people. Certainly not in GDP growth. Certainly not with the plethora of new regulations and laws.
In fact, as is mentioned in the article, at the moment there are only two paths to higher wages:
The only path to wage gains is through a stronger economy or an increase in demand for specialized skills.
The economy is moribund and has been for quite some time with GDP growth under 2% for the last three quarters.
That narrows the path to wage gains to a single one – developing specialized skills. It isn’t a path open to everyone, unfortunately, for a number of reasons.
So how could government help change all of that? Quite simply by getting out of the way – something it seems completely unable to comprehend or do.
And because of that, it continues to contribute negatively to the economic situation we endure.
This week, Michael and Dale discuss Syria, the NSA, and tiptoe ever so carefully around the subject of race in America.
The direct link to the podcast can be found here.
As a reminder, if you are an iTunes user, don’t forget to subscribe to the QandO podcast, Observations, through iTunes. For those of you who don’t have iTunes, you can subscribe at Podcast Alley. And, of course, for you newsreader subscriber types, our podcast RSS Feed is here.
I’ve spent the last 20 years developing software, managing software development, and doing software systems analysis full-time. I’ve been programming since I was 16. The first time I went to college, my major was computer science. Since then, I’ve seen one major revolution in the computer industry, which essentially blew up everything that came before it.
That first revolution was the advent of the PC. I started college in 1982, when there were very, very few PCs. Our computer lab had a couple of Apple IIs, Osbornes and TRS-80s. We played with them. They were cute. But we did our business on a Digital DEC 1133 mainframe with 50 dumb terminals and an old IBM 64 that only took punch card input. Programming was done by an elite group of specialists who logged into the mainframe and wrote in plain text, or who pushed up to a massive IBM card punch machine and typed in their programs one line at a time, punching a card for each line of code.
The punch cards were the worst. At least, on the DEC, you could compile your program while writing it. With the punch cards, you’d stack them up all in order—being very careful to write the card number on the back in magic marker, in case you dropped your card stack and had to re-sort the cards—and turn them in at the computing center. Then you’d go back the next day to learn that you’d forgotten to type a comma in card 200, and you had to re-type the card. You’d turn your stack in for another overnight wait, only to learn you’d missed a comma in card 201.
It was worse than being stabbed.
The PC killed all that. It killed the idea of programmers being this small cadre of elite specialists. And once PCs could talk to each other via a network, they killed the idea that software development had to take a long time, with shadowy specialists toiling away on one large machine in the bowels of the building. By 1995, self-taught amateurs were programming database applications to handle their small business inventory in FoxPro. Corporations employed hordes of programmers to build complicated database applications.
In fact, for all the snide attitudes they get, the people at Microsoft pretty much single-handedly created the worldwide software development community we have today. The Visual Basic for Applications programming interface for Microsoft Office, Visual Basic, and, eventually, the .NET Framework allowed millions of people to learn programming and create applications. Practically every corporation with more than 100 people has its own programming team, building custom desktop software for the organization. Millions of people are employed as freelance software developers. Yes, there are other programming technologies out there, but none of them—none—have had the impact on democratizing software development that Microsoft’s has had.
There are still some mainframe computers, of course. IBM makes 90% of them. It’s a small, mostly irrelevant market, except for governments, universities, and very large corporations. Every computer used to be a mainframe. Now, almost none of them are.
We’ve gotten used to this software development landscape. Even the advent of the Internet—while revolutionary in many ways—has not been particularly revolutionary in terms of software development. Until now, the changes to software development caused by the internet have been evolutionary—building on existing technologies. In fact, in some ways, the internet has been devolutionary. Thirty years ago, workers logged in to dumb terminals with no processing power, using a mainframe to do all the work. Similarly, using the internet until now has mainly meant opening up a dumb web browser like Firefox or Internet Explorer to talk to a web server where all the magic happens. The browser has really been just a display device. The web server takes a request from a browser, retrieves information, gets database data, formats it, and sends it back in simple text so the browser can display it.
Browser-based development, though, is still in its infancy. All of the smooth developer productivity tools we’ve become used to aren’t there yet. It’s still a bit of a pain to program, and it takes longer than we’ve been used to. Indeed, it’s very much like programming was back in the 1990s, when the developer had to code everything.
For instance, in Microsoft’s .NET development environment today, developers have become used to just dragging a text box or check box onto a form and having the software development environment write hundreds of lines of code for them. In a Windows application, a data-driven form that connects to a SQL Server database can be created almost entirely through a drag and drop interface, where the development environment writes thousands of lines of code behind the scenes. The developer has to actually write about 15 lines of code to finish up the form so it works.
This is the second revolution in computing that will change everything we’ve become used to. Right now, a software application generally has to be installed on your computer. A software application designed for Windows won’t install on a Mac or Linux machine. Well, that’s all going away. Browser-based software is platform independent, which is to say, it doesn’t matter what your computer’s operating system is. Do you need an office suite? Google already has a browser-based one online, and there’s probably someone, somewhere in the world, working on a desktop-based version right now. Need to access your database to crunch some data? No need for Access or FileMaker Pro. We can do that in the browser, too. In fact, we’re pretty much at the point where there is no commonly-done task that can’t be done in the browser.
We can now make almost—almost—any software application you can think of, and anyone, anywhere in the world, can run it, no matter what computer they’ve got. This is the second revolution in computing that I’ve seen in my lifetime, and it’s going to change everything about how applications are developed. Software that is dependent on the operating system is essentially as dead as the mainframe. Microsoft’s advantage in building a software development community? That’s dead now. Desktop applications? Dead. In 5 years there’ll be nothing that you can’t do in a browser. On a phone. I think that means the days of Windows having a natural monopoly on corporate computing are now dead, too.
There’ll still be desktop systems, of course. And, as long as your system can act as a web server—and pretty much any desktop or laptop can—you’ll still have software that you run on the desktop. After all, you may want to work locally instead of on a big server run by someone like Google or Microsoft, who’ll be doing God knows what with any data you store on their servers. But your choice of computer or operating system will not be driven by whether the software you want to use is available for it. In fact, in 10 years, when you think of desktop, you may just be thinking of a keyboard and display monitor that you plug your phone into to work more conveniently. Assuming that you just don’t have to talk into it to get things done.
If you’re working in software development, and aren’t embracing the coming wave of platform independent, browser-based programming, you’re not doing yourself any favors. It may take another 10 years or so, but the technology you’re working on right now is dying. For someone like me, who’s invested decades in Windows applications development, it’s a bit sad to see all that accumulated knowledge and experience passing away. It’s not easy to move into an entirely new development technology and go through all of its growing pains. But I don’t see any choice.
Thirty years ago, everything I learned in college about how to work with computers got tossed out the window. All of those hours struggling to write RPG programs on IBM punch cards, learning about mainframes…all of it was utterly useless within a few years when the PC came out. Now it’s happening again.
I remember how, back in the 90s, all the old mainframe Unix guys were grumpy about having to kiss their Unix machines good-bye. In 10 years, I’m not going to be one of those old Unix guys.
Once again, it’s time for a change. This time, I’m rigidly going after a reading-centric style. No graphics. No bells and whistles. Just large, readable text. The body text is done in a Google font called "Vollkorn" that I really like. Even some of you…ahem…more mature folks should find it much more readable.
Everything about the new template is focused on reading the blog. The sidebar has been moved over to the left. The ad banners have been moved so that there is only one in the text area, while the third has been moved to the sidebar. All the sidebar text is much lighter, so that it fades into the background of the blog post text.
Still, I’m not sure I like it. In successive iterations, I’ve gone for a simpler and simpler look. I may have gone too far with this one. This isn’t much different than a web site from 1996. It doesn’t look like progress, with flashy graphics and image sliders and what-not. It’s just…text.
Ah, well, I can always switch back to the previous one. Or the one before that.
Today’s only economic statistic: In a stunning reversal, new home sales plunged to a 394,000 annual rate in July, and the previous two months were revised down sharply. This completely contradicts the upbeat reports on home prices and existing home sales we’ve gotten over the last few weeks.
That’s a legitimate question. The man makes it up as he goes. The latest evidence is his invention of a new category for hurricanes (which is right up there with his invention of the internet for veracity). Yes, friends, his claim came during an “interview” (here, see if you can hit these softballs, Al) by Ezra Klein. In it, he likened “deniers” to slave owners, racists and just about any other bit of nonsense he could muster.
A Union of Concerned Scientists (UCS) expert says Al Gore goofed during his widely circulated Washington Post interview on global warming. Gore, noting stronger storms fueled by climate change, told the paper “the hurricane scale used to be 1-5, and now they’re adding a 6.”
I’m sorry, that’s more than a goof. It’s a lie. It is simply not true. Period. It never has been true, nor have there been any plans to add such a category by the one place that would do it:
“There are no plans by the National Hurricane Center, the federal office responsible for categorizing storms, to create a new category,” she wrote on the environmental group’s website.
Then there was this as well (James Taranto covers it):
Gore uses the interview to claim vindication for his 2006 "documentary," "An Inconvenient Truth": "You mentioned my movie back in the day. The single most common criticism from skeptics when the film came out focused on the animation showing ocean water flowing into the World Trade Center memorial site. Skeptics called that demagogic and absurd and irresponsible. It happened last October 29th, years ahead of schedule, and the impact of that and many, many other similar events here and around the world has really begun to create a profound shift."
But that’s not what Al referred to when he talked about water flowing into the WTC memorial site in his movie:
The reference is to Hurricane Sandy, a Category 2 storm when it struck the Northeastern U.S., flooding parts of New York and New Jersey, including downtown Manhattan. (Sandy peaked in the Caribbean as a Category 3 storm. By comparison, 2005’s Hurricane Katrina went as high as Category 5 and made landfall at Category 3.)
But if we roll the film–which is less than scintillating, but the clip lasts less than 2½ minutes–we find that what Gore predicted in "An Inconvenient Truth" was something far direr than a storm and a flood. He predicted that lower Manhattan–along with vast and heavily populated swaths of Florida, California, the Netherlands, China, India and Bangladesh–would be permanently submerged owing to higher sea levels.
"Think of the impact of a couple of hundred thousand refugees when they’re displaced by an environmental event," Gore intoned in the movie. "And then imagine the impact of 100 million or more." And then keep imagining. While Sandy caused severe temporary disruption and wrought an unusual amount of damage because it happened to hit a population center, it was not different in kind from other natural disasters. Lower Manhattan was soon dry again.
Again, a lie, or at best an extreme exaggeration.
And that has been typical of this entire politically driven “science based” effort to claim that we’re headed to disaster because of man. As Taranto says, “while Al Gore isn’t a scientist, the Climategate scandal showed that some scientists are no more scrupulous than he is.”
Have to agree. Exaggeration and alarm are the only way their science-deficient bunk can get any press. So they indulge in it freely and call their opponents names.
Hurricane Sandy was a Cat 2 hurricane. I’ve actually flown into a Cat 2 hurricane (Alex). Trust me, no one was calling Alex a “super storm”. That’s because it hit Mexico. But Sandy hit one of the world’s biggest media centers (and it hit the area perfectly for maximum effect). Had it bumped into North Carolina instead, it would have just been another Cat 2, already forgotten.
Instead we have this charlatan hyping it for headlines which are easily debunked. You have to wonder why. Certainly in hope of headline-reading low information citizens seeing and believing his bunk. But there’s more to it than that … follow the money.
This time, the selection is a very special car. It costs less than a BMW, and it can provide you with one of the most fun and rewarding driving experiences imaginable. You would have to be a complete lunatic to even think about buying it. As always, please recommend the article if you like it.
The following US economic statistics were announced today:
Weekly jobless claims edged higher by 13,000 to 336,000. The 4-week average fell 2,250 to 330,500. Continuing claims rose 29,000 to 2.999 million.
The PMI Manufacturing Index Flash rose 0.4 points to 53.9 for August.
The FHFA House Price Index rose 0.7% in June, following May’s 0.8% rise. Year-over-year, the index is up 7.7%.
The Bloomberg Consumer Comfort Index fell 2.2 points to -28.8 this week, the lowest level in two months.
The Conference Board’s index of leading indicators jumped 0.6% in July, hinting at accelerating growth in the next 6 months.
The Kansas City Fed Manufacturing Index rose 2 points to 8 in August.
The Fed’s Balance Sheet declined by $0.7 billion last week, with total assets of $3.646 trillion. Reserve Bank credit grew $24.1 billion.
The Fed reports that the M2 money supply fell by $23.5 billion in the latest week.
Here are today’s statistics on the state of the economy:
The MBA reports that mortgage applications fell 4.6% last week, as re-fis fell 8.0%. Purchases rose 1.0%, though. Rising interest rates are what is killing the re-fi market.
Speaking of rising rates, the NAR says panic over them drove existing home sales up 6.5% to a 5.390 million annual rate in July. House prices are steady, but rising rates are forcing buyers to purchase before the interest payments get too high.
One might, if one was inclined, parenthetically remark that rising mortgage rates may signal the inevitability of rising interest rates for Treasury bonds. Or, perhaps, vice versa. Whatever.
Either way, you should keep in mind that a rise of 1% in Treasury yields works out to an additional $160 billion or so in debt service payments per year. Right now we’re paying about $350 billion a year on debt service, with a low net interest rate a bit above 2%.
If the net interest rate goes back to the historical rate of 6%, we’re looking at interest payments of $950 billion or so per year. Keep in mind that the Federal government already isn’t taking in enough revenue to cover payments for Social Security, Medicare, and Debt Service. That means that we’re borrowing money to cover part of our debt service, and everything else the federal government does. There’s no way we can afford to pay $950 billion a year in interest payments.
And we certainly can’t borrow an additional $600 billion per year to pay for the additional interest payments. That would quickly result in a debt death spiral. We could eliminate every single executive department – including Defense – and we’d still have a $1 trillion deficit.
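The arithmetic above is easy to sketch in a few lines of Python. To be clear, the $16 trillion debt figure is my own round-number assumption for illustration, not an official Treasury number; the interest rates are the rough ones cited above:

```python
# Back-of-the-envelope federal debt service arithmetic.
# DEBT is an assumed round figure, not official Treasury data.
DEBT = 16.0e12  # assume roughly $16 trillion in federal debt

def annual_interest(net_rate):
    """Annual debt service (dollars) at a given net interest rate."""
    return DEBT * net_rate

# A 1-point rise in the net rate adds roughly $160 billion a year.
extra_per_point = annual_interest(0.01)

# Current net rate a bit above 2% vs. the ~6% historical rate.
current = annual_interest(0.022)     # roughly $350 billion/year
historical = annual_interest(0.06)   # roughly $950 billion/year

print(f"Per 1% rise: ${extra_per_point / 1e9:,.0f} billion/year")
print(f"Current:     ${current / 1e9:,.0f} billion/year")
print(f"Historical:  ${historical / 1e9:,.0f} billion/year")
```

The gap between the current and historical figures is the roughly $600 billion a year in extra borrowing mentioned above.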
You should be happy the economy is moribund, because that’s keeping interest rates low, and low interest rates are preventing the aforementioned fiscal death spiral right now.
Camille Paglia is someone I disagree with at times but have always found to be, for the most part, refreshingly honest. I like to read her thoughts on current affairs (don’t really care much about the cultural side of it all) and this week, in an interview in Salon, she answered a couple of questions that I think are worth discussing.
Two words: Anthony Weiner. Your thoughts?
Two words: pathetic dork. How sickeningly debased our politics have become that this jabbering cartoon weasel could be taken seriously for a second as a candidate for mayor of New York. But beyond that, I have been amazed by the almost total absence of psychological critique in news analyses of the silly Weiner saga. For heaven’s sake, Weiner is no randy stud with a sophisticated sex life that we need to respect. The compulsion to exhibit and boast about one’s penis is embarrassingly infantile — the obvious residue of some squalid family psychodrama in childhood that is now being replayed in public.
I assumed at first that Huma Abedin stayed married to Weiner out of noble concern for her unborn child, who deserved a father. But her subsequent behavior as Weiner’s defender and enabler has made me lose respect for her. The Weiners should be permanently bundled off to the luxe Elba of Oscar de la Renta’s villa in the Dominican Republic. I’m sure that Hillary (Huma’s capo) can arrange that.
Her first point is the most important – how debased our politics have become. Look at the circus we deal with on a seemingly daily basis. Look at the people we attract. And consider the fact that Anthony Weiner actually figured he had a legitimate shot at being elected.
Look at this idiot mayor in San Diego. He just can’t imagine why he should shuffle off the stage. There are any number of others that need to take the hint as well.
It’s not just a problem on the left. It is a problem on both sides of the aisle. As we have said many times here, we are extraordinarily ill served by our political class today… at all levels and from both parties. And it is we who are to blame for that problem. The fact that Weiner was indeed taken seriously until his latest nonsense was revealed is the point. Eliot Spitzer is another example. The fact that neither demonstrated any character or integrity previously should tell us we don’t need them anywhere near public office. Yet somehow they get signals that they have a chance at a second try. What those signals are I haven’t a clue, but whatever they are, we need to quit sending them pronto.
Any hopes, fears or predictions for the presidential elections in 2016?
As a registered Democrat, I am praying for a credible presidential candidate to emerge from the younger tier of politicians in their late 40s. A governor with executive experience would be ideal. It’s time to put my baby-boom generation out to pasture! We’ve had our day and managed to muck up a hell of a lot. It remains baffling how anyone would think that Hillary Clinton (born the same year as me) is our party’s best chance. She has more sooty baggage than a 90-car freight train. And what exactly has she ever accomplished — beyond bullishly covering for her philandering husband? She’s certainly busy, busy and ever on the move — with the tunnel-vision workaholism of someone trying to blot out uncomfortable private thoughts.
I for one think it was a very big deal that our ambassador was murdered in Benghazi. In saying “I take responsibility” for it as secretary of state, Hillary should have resigned immediately. The weak response by the Obama administration to that tragedy has given a huge opening to Republicans in the next presidential election. The impression has been amply given that Benghazi was treated as a public relations matter to massage rather than as the major and outrageous attack on the U.S. that it was.
Throughout history, ambassadors have always been symbolic incarnations of the sovereignty of their nations and the dignity of their leaders. It’s even a key motif in “King Lear.” As far as I’m concerned, Hillary disqualified herself for the presidency in that fist-pounding moment at a congressional hearing when she said, “What difference does it make what we knew and when we knew it, Senator?” Democrats have got to shake off the Clinton albatross and find new blood. The escalating instability not just in Egypt but throughout the Mideast is very ominous. There is a clash of cultures brewing in the world that may take a century or more to resolve — and there is no guarantee that the secular West will win.
She hits the nail on the head regarding Hillary and Benghazi. I couldn’t agree more with her assessment of that particular situation and the response from Clinton and the administration.
Note too that Paglia’s candidate isn’t another senator. She too has had enough of that brand of clueless fools that have no executive experience (although Clinton can claim exec experience with the Dept. of State, as far as I’m concerned she made a dog’s breakfast of her time there). Hopefully the rest of the country is just as tired of it as Paglia is.