When they can’t meet established performance standards, the left makes up excuses, lowers the standards, and, if necessary, revises history along the way.
In economics, there’s hardly a better example of this kind of deliberate responsibility avoidance than what has happened to the idea of “full employment.”
Full employment is supposed to occur when “all … who want to work and are allowed to work are able to find employment.”
The unemployment rate associated with full employment obviously can’t be zero, because there will always be people out of work who are voluntarily or involuntarily moving from one job to another.
What unemployment rate represents full employment? The architects of the Humphrey–Hawkins Full Employment Act of 1978 thought it should be 4 percent for Americans age 16 and over. That benchmark is what Richard Nixon used when presenting “full employment” budgets during much of his time in office. Yes, it was a gimmicky maneuver designed to make what were then seen as horrific deficits seem more palatable; but the rate did represent the predominant economic thinking at the time. While we’re in the neighborhood, I should note that the deficits incurred during the early 1970s, considered awful at the time, were chump change, even after accounting for inflation, compared to the $1 trillion-plus annual shortfalls seen during most of Barack Obama’s presidency.
Forty years later, communications have improved tremendously. Unfilled job listings are available within seconds at any number of websites attempting to match employees with employers. Applicants send resumes online instead of through the mail. One would therefore expect that the full-employment unemployment rate would have fallen, or at the very least remained the same.
Thus, I was initially quite relieved on September 4 when I sat in on the ADP Employment Report conference call. Moody’s economist Mark Zandi, the report’s overseer, told his audience that he expects that the economy will continue to generate 200,000 or more private-sector jobs each month as far as the eye can see, and that this serendipitous consistency will bring the U.S. economy to full employment by the end of 2016.
He further clarified his prediction by optimistically forecasting that most of today’s workforce dropouts will get back into the game during that time, and that most of those who are currently working part-time but would prefer full-time jobs will find them. Those two assumptions were a bit hard to take, but it’s his conference call, and he can predict what he wants. (The next day’s employment report from the government, which showed only a 142,000 pickup in seasonally adjusted jobs, threw cold water on Zandi’s sunny optimism. He didn’t handle it well.)
But Zandi then noted that all of this would return us to full employment for the first time “in a decade.” That seemed odd, as we’ll explore right after the page break.
The chart above shows where the seasonally adjusted unemployment rate stood from 2004 through 2007.
Though the economy posted unemployment rates of 5 percent or lower for 31 consecutive months, including almost a year at or around 4.5 percent, it never got to what I had understood to be the commonly accepted definition of full employment for decades.
But Zandi said it did. So when the call opened up for questions, I asked him what he thought the unemployment rate would be at the end of 2016 when we hit full-employment nirvana.
I was stunned at the answer: 5.5 percent.
It gets worse.
When I asked him if this benchmark meant that we were somehow at more than full employment in 2006 and 2007, he said “yes,” contending that there was significant upward pressure on wages during that time. Does anyone remember that we had a seller’s market for labor throughout the U.S. in the mid-2000s? With rare exceptions in certain sections of the country, neither do I.
When I mentioned that his full-employment unemployment rate was quite a bit higher — by about 1.5 percentage points — than I was used to seeing, Zandi went further into the land of the absurd. He asserted that full employment was commonly regarded as 5 percent last decade — this 2007 article in the New York Times confirms that — but that the economic damage caused by the recession had moved that standard up to 5.5 percent.
In other words, it’s Bush’s fault — apparently forever — that the rate is now a half-point higher. The economy fell, and it will never entirely get back up. You can’t make this garbage up. This permanent half-point upward move must have been discovered after the Obama administration was done promoting the idea that its 2009 stimulus package would lower the unemployment rate to 5 percent — by the middle of 2013. How convenient.
In a far more efficient communications environment, why did the accepted full-employment unemployment rate rise at all?
Part of the answer is that there are many people who believe that the increase never should have happened. That group, strangely enough, includes card-carrying liberals Jared Bernstein and Dean Baker. It also includes the folks at the American Institute For Full Employment. Its president, John Courtney, goes further: in an email, he specifically asserted his group’s belief that “full employment is below the 4%” rate Bernstein and Baker advocated in late 2013.
It’s hard to disagree with Mr. Courtney, given that a July 2014 table at the government’s Bureau of Labor Statistics showed six states with rates below 4 percent. Only one of them, North Dakota, where the unemployment rate was 2.8 percent and starting wages at Wal-Mart can be as high as $17 per hour, is seeing significant wage pressure. This strongly suggests that the real-world unemployment rate at full employment is about 3.5 percent.
What has really happened is that the left-dominated establishment economics community has lowered the bar for full employment to avoid having to discuss the welfare state’s pervasive work disincentives and their own Keynesian policies’ utter failure to satisfactorily revive the job market.