Ludwig von Mises Institute
I’ve noticed a phrase that is being used with increasing frequency in the media by pundits and commentators. It’s only used by the winning side in various controversies. The phrase is “settled public policy” or, sometimes, “settled law.”

We saw it recently when Mark Oppenheimer in TIME claimed that any “organizations that dissent from settled public policy” should be subject to draconian tax increases (i.e., have their tax-exempt status removed). Oppenheimer was writing on the recent gay marriage decision from the Supreme Court, and in the wake of the decision, he must have sensed his opportunity to declare all his ideological opponents to be heretics, and thus anathema to civilized human society.

Other examples abound as well. In an unsigned op-ed from 2014, The Los Angeles Times said that federal ownership of vast areas of land within states should never be questioned because “settled law” dictates that the matter is no longer open to debate.

So what is “settled public policy?” It is not a description of fact, but an assertion of correctness. For example, when the Supreme Court rules on something, as it did in the case of gay marriage — the decision that prompted Oppenheimer’s authoritarian diatribe — the supporters of the decision hail the matter as “settled” and no longer open to debate. It’s the secular version of Roma locuta est — causa finita est. The US government has spoken, the proponents of “settled public policy” proclaim. No more debate need be tolerated. But here’s the thing: There’s no such thing as “settled public policy.”

You know what was once “settled public policy”? The Dred Scott decision. After all, the Supreme Court ruled with notable finality that black people have “no rights the white man was bound to respect.”

In 1860, this was “settled law.” The Constitution, which is a blatantly pro-slavery document, was fairly clear on this matter. The chief justice at the time, Roger Taney, felt bound to rule according to the written text. 
And he thus concluded: blacks don’t have the same rights as whites. Case closed.

And yet it wasn’t. As I’ve noted before, politics trumps law, and as long as there is no such thing as “settled politics” — which there never is — there will never be “settled public policy.”

Here’s another example of settled law: Korematsu v. United States. In this Supreme Court case, the majority ruled that yes, it’s perfectly fine to round up people based on their race or ethnicity and put them in concentration camps. In fact, that ruling is still legally valid even today. The Court has never explicitly ruled against this precedent. But hey, it’s settled law, so anyone who opposes that sort of thing should just get over it.

Of course, you don’t have to be the world’s biggest cynic to see why those who use this phrase use it in this context. It’s merely a political tactic — because its practitioners know politics trumps law — to shut the opposition up, and to demoralize critics of the government in an attempt to convince them that federal law can establish unchanging doctrine.

In fact, if one attempts to google the phrase “settled law,” Google returns this text at the very top of the search-returns page: “If you do a search for "settled law" on google most of the results revolve around either the ACA [“Obamacare”] or the Roe v. Wade decision. Both of which are highly contested.”

The fact that Google highlights this connection to Roe and the ACA without being prompted tells us that activists and ideologues have been attempting to cast Roe v. Wade and the Affordable Care Act as “settled law.” But Google, like any serious scholar (oddly enough), immediately recognizes that such laws are anything but settled. Even in the short term, they are highly contested, and in the medium or long term, they are even less “settled.”

In spite of this, the appeal to settled law has a certain air of legitimacy for Americans. 
Deeply ingrained in the American mind — especially among conservatives — is the idea that there is some kind of final, unchangeable constitutional law out there. American rightists often rather fancifully believe that there once was a time when most everyone agreed on what the constitution said, and that its text was sacrosanct. So even today, for them, once something is determined to be “the law” it must be respected. The political weakness of this position is apparent.

At the same time, American leftists hold closely to what Murray Rothbard called the “Whig theory of history” in which humanity is forever progressing toward ever greater and more enlightened heights. And thus, whatever the latest ruling is from the courts must be the “correct” and more enlightened one. If a court rules the “wrong” way, it is merely a temporary setback on the way to a correct ruling.

Europeans, on the other hand, are less prone to such unsophisticated thinking. They understand that political reversals of fortune and legal aberrations can be commonplace in the political world. Certainly, one could have argued in 1989 that the right of the Stasi to imprison people for thoughtcrimes was “settled law.” And then, one day, the Stasi was gone — through extra-legal means.

But let us never let actual history get in the way of pressing the political advantage. The slave drivers of old certainly used Oppenheimer’s “settled public policy” tactic with the Dred Scott decision, proclaiming that the northern nullifiers and abolitionists were criminals who opposed what all “reasonable” people could see was settled public policy. “Those reprehensible abolitionists,” they surely said. “They try to overturn our lovely and established rule of law.”

The same political tactics persist today, although the ideologies have changed. And it is ideology that lies at the core of the matter. 
Contrary to the risible myth that courts are above and indifferent to the ideological and political contests of their time, the courts — and especially the US Supreme Court — tend to toe a line very close to whatever will please powerful interest groups at any given time. This is true so long as the court thinks the public at large will at least tacitly accept the decision.

The Court certainly took no political risks when it declared Japanese-Americans to be second-class citizens with Korematsu. Nor did it go out on a limb with Dred Scott. When it has miscalculated its own political strength, though, the Court has suffered significant blows to its prestige. For example, when the Court ruled in Worcester v. Georgia (1832) that the Cherokees had a right to private property within the state of Georgia, President Jackson simply ignored the Court and sent the Cherokees on a death march to Oklahoma. The Court learned its lesson: always make sure your ruling will get support from either Congress or the president.

Certainly the recent gay marriage ruling presented no risk to the Court. It knew it had the full support of the executive branch and half of Congress. Even if most Americans are blissfully unaware of the fact, the Court knows that without political support, legal rulings are meaningless.

But, over time, what the public will accept, and which interest groups and institutions hold the reins of power, can change considerably. As these variables change, so will the courts, and the legal interpretations that their judges make. Nothing is ever “settled.”
Earlier this month, the Chinese government decided to depreciate its currency on three consecutive occasions. On August 13, the price of the US dollar stood at 6.413 yuan — an increase of 3.3 percent against July. The key factor behind the central bank’s lowering of the yuan is a sharp decline in the growth momentum of exports, with the yearly rate of growth falling to minus 8.3 percent in July from 2.8 percent in June.

It is held that by means of currency depreciation it is possible to strengthen the export of goods and services, thereby strengthening the gross domestic product (GDP), which currently displays a visible weakening. The yearly rate of growth of real GDP stood at 7 percent in Q2 against 7.5 percent in Q2 last year and 8.6 percent in Q1 2012.

According to popular thinking, the key to economic growth is demand for goods and services. It is held that increases or decreases in demand for goods and services are behind rises and declines in the economy’s production of goods. Hence, in order to keep the economy going, economic policies must pay close attention to overall demand.

Why Governments Devalue Currencies to Boost Exports

Now, part of the demand for domestic products emanates from overseas. The accommodation of this demand is labeled “exports.” Likewise, local residents exercise demand for goods and services produced overseas, which is labeled “imports.” Observe that while an increase in exports implies an increase in the demand for domestic output, an increase in imports weakens demand. Hence exports, according to this way of thinking, are a factor that contributes to economic growth, while imports are a factor that detracts from the growth of the economy.

From this way of thinking it follows that since overseas demand for a country’s goods and services is an important ingredient in setting the pace of economic growth, it makes a lot of sense to make locally produced goods and services attractive to foreigners. 
One of the ways to boost foreigners’ demand for domestically produced goods is by making the prices of these goods more attractive.

One of the ways of boosting their competitiveness is for the Chinese to depreciate the yuan against the US dollar. Based on this, one can reach the conclusion that, as a result of currency depreciation, all other things being equal, the overall demand for domestically produced goods is likely to increase, while Chinese demand for American goods is likely to fall. This in turn will give rise to a better balance of payments and in turn to stronger economic growth in terms of GDP. What we have here, as far as China is concerned, is more exports and fewer imports, which according to mainstream thinking is great news for economic growth.

Why an Exports Boost Fueled by Depreciation Can’t Grow the Economy

When a central bank announces a loosening in its monetary stance, this leads to a quick response by participants in the foreign exchange market, who sell the domestic currency in favor of other currencies, thereby leading to domestic currency depreciation. In response, various producers now find it more attractive to boost their exports. In order to fund the increase in production, producers approach commercial banks, which — on account of a rise in central bank monetary pumping — are happy to expand their credit at lower interest rates.

By means of new credit, producers can now secure the resources required to expand their production of goods in order to accommodate overseas demand. In other words, by means of newly created credit, producers divert real resources from other activities. As long as domestic prices remain intact, exporters record an increase in profits. (For a given amount of foreign money earned, they now get more in terms of domestic money.) The so-called improved competitiveness on account of currency depreciation in fact amounts to economic impoverishment. 
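The profit effect just described can be sketched with a few lines of arithmetic. The export price and the exchange rates below are hypothetical round numbers chosen purely for illustration; they are not figures from the article:

```python
# Illustrative sketch: how a depreciation raises an exporter's
# domestic-currency revenue while the foreign price stays unchanged.
# All numbers are hypothetical.

foreign_price_usd = 100.0   # price of one exported unit, in US dollars
old_rate = 6.21             # yuan per dollar before the depreciation
new_rate = 6.41             # yuan per dollar after the depreciation

revenue_before = foreign_price_usd * old_rate   # yuan per unit, before
revenue_after = foreign_price_usd * new_rate    # yuan per unit, after

# As long as the exporter's domestic costs are unchanged, the extra
# yuan per unit shows up as profit -- until domestic prices catch up.
extra_profit = revenue_after - revenue_before   # roughly 20 yuan per unit
print(f"Extra yuan per unit sold abroad: {extra_profit:.1f}")
```

The gain evaporates once the monetary loosening that drove the rate down works its way into the exporter's own input prices, which is the point the following paragraphs make.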
The improved competitiveness means that the citizens of a country are now getting fewer real imports for a given amount of real exports. While the country is getting rich in terms of foreign currency, it is getting poor in terms of real wealth (i.e., in terms of the goods and services required for maintaining people’s life and well-being).

As time goes by, the effects of loose monetary policy filter through a broad spectrum of prices of goods and services and ultimately undermine exporters’ profits. A rise in prices puts an end to the illusory attempt to create economic prosperity out of thin air. According to Ludwig von Mises:

The much talked about advantages which devaluation secures in foreign trade and tourism, are entirely due to the fact that the adjustment of domestic prices and wage rates to the state of affairs created by devaluation requires some time. As long as this adjustment process is not yet completed, exporting is encouraged and importing is discouraged. However, this merely means that in this interval the citizens of the devaluating country are getting less for what they are selling abroad and paying more for what they are buying abroad; concomitantly they must restrict their consumption. This effect may appear as a boon in the opinion of those for whom the balance of trade is the yardstick of a nation's welfare. In plain language it is to be described in this way: The British citizen must export more British goods in order to buy that quantity of tea which he received before the devaluation for a smaller quantity of exported British goods.

Contrast the policy of currency depreciation with a conservative policy where money is not expanding. Under these conditions, when the pool of real wealth is expanding, the purchasing power of money will follow suit. This, all other things being equal, leads to currency appreciation. 
With the expansion in the production of goods and services and consequently falling prices and declining production costs, local producers can improve their profitability and their competitiveness in overseas markets while the currency is actually appreciating.

The economic slowdown in China was set in motion a long time ago, when the yearly rate of growth of the money supply fell from 39.3 percent in January 2010 to 1.8 percent by April 2012. The effect of this massive decline in the growth momentum of money puts severe pressure on bubble activities and in turn on various key economic activity data. Any tampering with the currency rate of exchange can only make things much worse as far as the allocation of scarce resources is concerned.
In late June of this year, President Obama signed an executive order and presidential directive clarifying the administration’s hostage policy. Afterward he gave a statement to the press, in which he condemned threats of prosecution against families trying to pay ransom: “the last thing we should ever do is add to a family’s pain with threats like that.”

This follows a review into the government’s treatment of overseas hostages, which found that the family of James Foley, the freelance journalist beheaded by ISIS in August 2014, had been told that they would be taken to court if they tried to negotiate with the terrorist group.

Following the review’s publication, the president was faced with three choices:

(1) A policy of no-concession, whereby the government not only refuses to pay ransom, but continues to threaten the prosecution of private citizens who negotiate;

(2) A policy of laissez-faire, whereby the government does nothing — it neither pays ransom, nor interferes with the negotiations of private citizens; and

(3) A policy of concession, whereby the government foots the ransom bill.

The president’s statement indicates his administration’s support for policy number 2. This article compares said policy — of laissez-faire — with its rivals, policy 1 and policy 3, both of which have recently found public proponents.

No-Concession: Coercing the Victims of Coercion

Advocates of no-concession (e.g., House Speaker John Boehner) argue that if families are allowed to bargain for the release of their loved ones, it will send a message to rogues overseas that American hostages fetch a price. The number of kidnappings will increase as a consequence.

This argument fails on its own terms. While it assumes hostage-takers are rational, self-interested actors who respond to incentives, it does not afford American travelers the same courtesy, instead assuming that they would not act any differently in light of the heightened threat to life and limb. 
In fact, if the risk of visiting lawless, unstable regions has increased appreciably, then marginal visitors (e.g., amateur journalists) can be expected to cancel their excursions; more serious trippers, on the other hand, will spend more on security provisions.

There are now two forces acting, each in the opposite direction: rogues are increasingly on the prowl for victims, but trippers are fewer in number and more vigilant. Whether the number of kidnappings should fall or rise with the payment of ransom by families, then, is a matter of some ambiguity.

But don’t ransom payments give hostage-taking groups the material resources necessary to kidnap more frequently in the future? This objection is vulnerable to the same counterargument: as rogues militarize, some American visitors will armor up, while others will cancel their trips. Just as the lynx population cannot expand indefinitely at the expense of the snowshoe hare’s, the hostage-takers face constraints — as they become stronger and more numerous, they will find the pickings less plentiful, because their prey has in turn become stronger or has simply stayed at home.

In short, a policy of no-concession asks that we tolerate the addition of government coercion to an already coercive situation; that we strip hostages and their families of their best hope for conflict resolution. And yet, in return, it does not even give us reason to believe that the total number of kidnappings would drop. A sorry trade-off indeed!

Uncle Sam’s Deep Pockets

There are currently over thirty American hostages held abroad, and, quite understandably, it is their families who form the dominant lobby for a policy of government-paid ransom. 
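The lynx-and-hare remark invokes a standard predator-prey dynamic. As a purely illustrative aside, the constraint it describes — predators cannot expand without bound, because their growth thins out the prey they depend on — can be seen in a toy Lotka-Volterra simulation (every parameter here is arbitrary; nothing in the article specifies a model):

```python
# Toy Lotka-Volterra simulation of the lynx-hare analogy, using simple
# Euler steps. Parameters and starting populations are arbitrary.

def simulate(steps=20000, dt=0.001):
    prey, pred = 10.0, 5.0       # starting populations (arbitrary units)
    alpha, beta = 1.1, 0.4       # prey growth rate / predation rate
    delta, gamma = 0.1, 0.4      # predator growth rate / death rate
    peak_pred = pred
    for _ in range(steps):
        d_prey = (alpha * prey - beta * prey * pred) * dt
        d_pred = (delta * prey * pred - gamma * pred) * dt
        prey += d_prey
        pred += d_pred
        peak_pred = max(peak_pred, pred)
    return prey, pred, peak_pred

prey, pred, peak = simulate()
# The predator population cycles: it rises while prey is abundant,
# then falls back as the prey is depleted -- it never grows unboundedly.
print(f"final prey {prey:.2f}, final predators {pred:.2f}, peak predators {peak:.2f}")
```

The analogy in the text is looser than the model, of course: here the "prey" also adapts by arming or staying home, which only strengthens the constraint on the predators.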
How might we expect a policy of concession, practiced in some form by every country but America and Britain, to affect the number of kidnappings?

First, we should note that governments do not face the same budget constraints as private individuals, meaning kidnappers could expect to extract a far higher ransom. And even if the government-paid ransom were capped, it would still stimulate kidnappings by rewarding rogues with income security. By contrast, under the policy of laissez-faire which Obama has endorsed, hostage-takers enjoy no such certainty — they may find, after outlawing themselves and expending labor in the process of kidnapping, that their victim hasn’t the means to pay.

Furthermore, under a policy of government-paid ransom, Americans could be expected to visit dangerous regions in greater numbers, and to become less vigilant while doing so. It would, of course, still be unpleasant to be kidnapped, but it would be made less onerous if one knew one’s house needn’t be re-mortgaged to buy freedom. That is, government guarantee of ransom creates moral hazard, partly removing the incentive to guard against risk. Other things being equal, a policy of concession stimulates the kidnapping industry, providing it with more opportunities and a stable, generous bounty per head.

However, all other things are not equal between countries: that is the reason why, in spite of its government’s consistent refusal to pay ransom, the US had the second highest number of citizens held hostage in 2013.

This source reports that America had nine citizens in the hands of hostage-takers in 2013. However, the US government reported in 2015 that over thirty Americans are currently being held. In the interim, IntelCenter reports that only two hostages were taken: 0 in 2013, 1 in 2014, and 1 so far in 2015 (data on yearly kidnappings here). This demonstrates the difficulty of finding consistent data on American hostage-takings. 
It is government policy not to publicize hostages, so as to deny their kidnappers limelight and bargaining power. America’s foreign policy has made political targets of its citizens, with many being kidnapped and killed to influence military decisions. That is not, however, to discount the importance of government ransom policy; the country that beat America in 2013 was France — the French have intervened in places such as Mali and Libya, while at the same time paying out more in ransoms than any other nation since 2008. And that shows in the hostage nationality data.

Conclusion

Allowing families to negotiate without fear of prosecution provides some hope for those seized overseas; at the same time, the moral hazard associated with government-paid ransom is avoided. In short, it is a humane and level-headed response to the lengthening list of American abductees.

However, until such a day as American spears are beaten into pruning hooks, citizens will remain in disproportionate danger of abduction while abroad. John Boehner, in response to Obama’s recent change in hostage policy, lamented, “... you could be endangering more Americans here and overseas.” If, like Boehner, one is truly concerned with the safety of citizens at home and overseas, one must consider the crucial role played by American military intervention and political subversion in creating hatred for the homeland and her people.
[First published in Inquiry, November 12, 1979.]

A half-century ago, America — and then the world — was rocked by a mighty stock-market crash that soon turned into the steepest and longest-lasting depression of all time.

It was not only the sharpness and depth of the depression that stunned the world and changed the face of modern history: it was the length, the chronic economic morass persisting throughout the 1930s, that caused intellectuals and the general public to despair of the market economy and the capitalist system.

Previous depressions, no matter how sharp, generally lasted no more than a year or two. But now, for over a decade, poverty, unemployment, and hopelessness led millions to seek some new economic system that would cure the depression and avoid a repetition of it.

Political solutions and panaceas differed. For some it was Marxian socialism — for others, one or another form of fascism. In the United States the accepted solution was a Keynesian mixed-economy or welfare-warfare state. Harvard was the focus of Keynesian economics in the United States, and Seymour Harris, a prominent Keynesian teaching there, titled one of his many books Saving American Capitalism. That title encapsulated the spirit of the New Deal reformers of the '30s and '40s. 
By the massive use of state power and government spending, capitalism was going to be saved from the challenges of communism and fascism.

One common guiding assumption characterized the Keynesians, socialists, and fascists of the 1930s: that laissez-faire, free-market capitalism had been the touchstone of the US economy during the 1920s, and that this old-fashioned form of capitalism had manifestly failed us by generating, or at least allowing, the most catastrophic depression in history to strike at the United States and the entire Western world.

Well, weren't the 1920s, with their burgeoning optimism, their speculation, their enshrinement of big business in politics, their Republican dominance, their individualism, their hedonistic cultural decadence, weren't these years indeed the heyday of laissez-faire? Certainly the decade looked that way to most observers, and hence it was natural that the free market should take the blame for the consequences of unbridled capitalism in 1929 and after.

Unfortunately for the course of history, the common interpretation was dead wrong: there was very little laissez-faire capitalism in the 1920s. Indeed the opposite was true: significant parts of the economy were infused with proto–New Deal statism, a statism that plunged us into the Great Depression and prolonged this miasma for more than a decade.

In the first place, everyone forgot that the Republicans had never been the laissez-faire party. 
On the contrary, it was the Democrats who had always championed free markets and minimal government, while the Republicans had crusaded for a protective tariff that would shield domestic industry from efficient competition, for huge land grants and other subsidies to railroads, and for inflation and cheap credit to stimulate purchasing power and apparent prosperity.

It was the Republicans who championed paternalistic big government and the partnership of business and government while the Democrats sought free trade and free competition, denounced the tariff as the "mother of trusts," and argued for the gold standard and the separation of government and banking as the only way to guard against inflation and the destruction of people's savings. At least that was the policy of the Democrats before Bryan and Wilson at the start of the 20th century, when the party shifted to a position not very far from its ancient Republican rivals.

The Republicans never shifted, and their reign in the 1920s brought the federal government to its greatest intensity of peacetime spending and hiked the tariff to new, stratospheric levels. A minority of old-fashioned "Cleveland" Democrats continued to hammer away at Republican extravagance and big government during the Coolidge and Hoover eras. Those included Governor Albert Ritchie of Maryland, Senator James Reed of Missouri, and former Solicitor General James M. Beck, who wrote two characteristic books in this era: The Vanishing Rights of the States and Our Wonderland of Bureaucracy.

But most important in terms of the depression was the new statism that the Republicans, following on the Wilson administration, brought to the vital but arcane field of money and banking. How many Americans know or care anything about banking? Yet it was in this neglected but crucial area that the seeds of 1929 were sown and cultivated by the American government.

The United States was the last major country to enjoy, or be saddled with, a central bank. 
All the major European countries had adopted central banks during the 18th and 19th centuries, which enabled governments to control and dominate commercial banks, to bail out banking firms whenever they got into trouble, and to inflate money and credit in ways controlled and regulated by the government. Only the United States, as a result of Democratic agitation during the Jacksonian era, had had the courage to extend the doctrine of classical liberalism to the banking system, thereby separating government from money and banking.

Having deposed the central bank in the 1830s, the United States enjoyed a freely competitive banking system — and hence a relatively "hard" and noninflated money — until the Civil War. During that catastrophe, the Republicans used their one-party dominance to push through their interventionist economic program. It included a protective tariff and land grants to railroads, as well as inflationary paper money and a "national banking system" that in effect crippled state-chartered banks and paved the way for the later central bank.

The United States adopted its central bank, the Federal Reserve System, in 1913, backed by a consensus of Democrats and Republicans. This virtual nationalization of the banking system was unopposed by the big banks; in fact, Wall Street and the other large banks had actively sought such a central system for many years. 
The result was the cartelization of banking under federal control, with the government standing ready to bail out banks in trouble, and also ready to inflate money and credit to whatever extent the banks felt was necessary.

Without a functioning Federal Reserve System available to inflate the money supply, the United States could not have financed its participation in World War I: that war was fueled by heavy government deficits and by the creation of new money to pay for swollen federal expenditures.

One point is undisputed: the autocratic ruler of the Federal Reserve System, from its inception in 1914 to his death in 1928, was Benjamin Strong, a New York banker who had been named governor of the Federal Reserve Bank of New York. Strong consistently and repeatedly used his power to force an inflationary increase of money and bank credit in the American economy, thereby driving prices higher than they would have been and stimulating disastrous booms in the stock and real-estate markets. In 1927, Strong gaily told a French central banker that he was going to give "a little coup de whiskey to the stock market." What was the point? Why did Strong pursue a policy that now can seem only heedless, dangerous, and recklessly extravagant?

Once the government has assumed absolute control of the money-creating machinery in society, it benefits — as would any other group — by using that power. Anyone would benefit, at least in the short run, by printing or creating new money for his own use or for the use of his economic or political allies.

Strong had several motives for supporting an inflationary boom in the 1920s. One was to stimulate foreign loans and foreign exports. The Republican party was committed to a policy of partnership of government and industry, and to subsidizing domestic and export firms. A protective tariff aided inefficient domestic producers by keeping out foreign competition. 
But if foreigners were shut out of our markets, how in the world were they going to buy our exports? The Republican administration thought it had solved this dilemma by stimulating American loans to foreigners so that they could buy our products.

A fine solution in the short run, but how were these loans to be kept up, and, more important, how were they to be repaid? The banking community was also confronted with the curious and ultimately self-defeating policy of preventing foreigners from selling us their products, and then lending them the money to keep buying ours. Benjamin Strong's inflationary policy meant repeated doses of cheap credit to stimulate this foreign lending. It should also be noted that this policy subsidized American investment banks in making foreign loans.

Among the exports stimulated by cheap credit and foreign loans were farm products. American agriculture, overstimulated by the swollen demands of warring European nations during World War I, was a chronically sick industry during the 1920s. It had awakened after the resumption of peace to find that farm prices had fallen and that European demand was down. Rather than adjusting to postwar realities, however, American farmers preferred to organize and agitate to force taxpayers and consumers to keep them in the style to which they had become accustomed during the palmy "parity" years of the war. One way for the federal government to bow to this political pressure was to stimulate foreign loans and hence to encourage foreign purchases of American farm products.

The "farm bloc," it should be noted, included not only farmers; more indirect and considerably less rustic interests were also busily at work. The postwar farm bloc gained strong support from George N. Peek and General Hugh S. Johnson; both, later prominent in the New Deal, were heads of the Moline Plow Company, a major manufacturer of farm machinery that stood to benefit handsomely from government subsidies to farmers. 
When Herbert Hoover, in one of his first acts as president — considerably before the crash — established the Federal Farm Board to raise farm prices, he installed as head of the FFB Alexander Legge, chairman of International Harvester, the nation's leading producer of farm machinery. Such was the Republican devotion to "laissez faire."

But a more indirect and ultimately more important motivation for Benjamin Strong's inflationary credit policies in the 1920s was his view that it was vitally important to "help England," even at American expense. Thus, in the spring of 1928, his assistant noted Strong's displeasure at the American public's outcry against the "speculative excesses" of the stock market.

The public didn't realize, Strong thought, that "we were now paying the penalty for the decision which was reached early in 1924 to help the rest of the world back to a sound financial and monetary basis." An unexceptionable statement, provided that we clear up some euphemisms. For the "decision" was taken by Strong in camera, without the knowledge or participation of the American people; the decision was to inflate money and credit, and it was done not to help the "rest of the world" but to help sustain Britain's unsound and inflationary policies.

Before the World War, all the major nations were on the gold standard, which meant that the various currencies — the dollar, pound, mark, franc, etc. — were redeemable in fixed weights of gold. This gold requirement ensured that governments were strictly limited in the amount of scrip they could print and pour into circulation, whether by spending to finance government deficits or by lending to favored economic or political groups. Consequently, inflation had been kept in check throughout the 19th century when this system was in force.

But world war ruptured all that, just as it destroyed so many other aspects of the classical-liberal polity. 
The major warring powers spent heavily on the war effort, creating new money in bushel baskets to pay the expense. Inflation was consequently rampant during and after World War I and, since there were far more pounds, marks, and francs in circulation than could possibly be redeemed in gold, the warring countries were forced to go off the gold standard and to fall back on paper currencies — all, that is, except for the United States, which was embroiled in the war for a relatively short time and could therefore afford to remain on the gold standard.

After the war, the nations faced a world currency breakdown with rampant inflation and chaotically falling exchange rates. What was to be done? There was a general consensus on the need to go back to gold, and thereby to eliminate inflation and frantically fluctuating exchange rates. But how to go back? That is, what should be the relations between gold and the various currencies?

Specifically, Britain had been the world's financial center for a century before the war, and the British pound and the dollar had been fixed all that time in terms of gold so that the pound would always be worth $4.86. But during and after the war the pound had been inflated relatively far more than the dollar, and thus had fallen to about $3.50 on the foreign-exchange market. But Britain was adamant about returning the pound, not to the realistic level of $3.50, but rather to the old prewar par of $4.86.

Why the stubborn insistence on going back to gold at the obsolete prewar par? Part of the reason was a stubborn and mindless concentration on saving face and British honor, on showing that the old lion was just as strong and tough as before the war.
Partly, it was a shrewd realization by British bankers that if the pound were devalued from prewar levels England would lose its financial preeminence, perhaps to the United States, which had been able to retain its gold status.

So, under the spell of its bankers, England made the fateful decision to go back to gold at $4.86. But this meant that Britain's exports were now made artificially expensive and its imports cheaper, and since England lived by selling coal, textiles, and other products, while importing food, the resulting chronic depression in its export industries had serious consequences for the British economy. Unemployment remained high in Britain, especially in its export industries, throughout the boom of the 1920s.

To make this leap backward to $4.86 viable, Britain would have had to deflate its economy so as to bring about lower prices and wages and make its exports once again inexpensive abroad. But it wasn't willing to deflate since that would have meant a bitter confrontation with Britain's now-powerful unions. Ever since the imposition of an extensive unemployment-insurance system, wages in Britain were no longer flexible downward as they had been before the war. In fact, rather than deflate, the British government wanted the freedom to keep inflating, in order to raise prices, do an end run around union wage rates, and ensure cheap credit for business.

The British authorities had boxed themselves in: they insisted on several axioms. One was to go back to gold at the old prewar par of $4.86. This would have made deflation necessary, except that a second axiom was that the British continue to pursue a cheap-credit, inflationary policy rather than deflation. How to square the circle? What the British tried was political pressure and arm-twisting on other countries, to try to induce or force them to inflate too.
If other countries would also inflate, the pound would remain stable in relation to other currencies; Britain would not keep losing gold to other nations, which endangered its own jerry-built monetary structure.

On the defeated and small new countries of Europe, Britain's pressure was notably successful. Using their dominance in the League of Nations and especially in its Financial Committee, the British forced country after country not only to return to gold, but to do so at overvalued rates, thereby endangering those nations' exports and stimulating imports from Britain. And the British also flummoxed these countries into adopting a new form of gold "exchange" standard, in which they kept their reserves not in gold, as before, but in sterling balances in London.

In this way, the British could continue to inflate; and pounds, instead of being redeemed in gold, were used by other countries as reserves on which to pyramid their own paper inflation. The only stubborn resistance to the new order came from France, which had a hard-money policy into the late 1920s. It was French resistance to the new British monetary order that was ultimately fatal to the house of cards the British attempted to construct in the 1920s.

The United States was a different situation altogether. Britain could not coerce the United States into inflating in order to save the misbegotten pound, but it could cajole and persuade. In particular, it had a staunch ally in Benjamin Strong, who could always be relied on to be a willing servitor of British interests. By repeatedly agreeing to inflate the dollar at British urging, Benjamin Strong won the plaudits of the British financial press as the best friend of Great Britain since Ambassador Walter Hines Page, who had played a key role in inducing the United States to enter the war on the British side.

Why did Strong do it? We know that he formed a close friendship with British financial autocrat Montagu Norman, longtime head of the Bank of England.
Norman would make secret visits to the United States, checking in at a Saratoga Springs resort under an assumed name, and Strong would join him there for the weekend, also incognito, there to agree on yet another inflationary coup de whiskey to the market.

Surely this Strong–Norman tie was crucial, but what was its basic nature? Some writers have improbably speculated on a homosexual liaison to explain the otherwise mysterious subservience of Strong to Norman's wishes. But there was another, and more concrete and provable, tie that bound these two financial autocrats together.

That tie involved the Morgan banking interests. Benjamin Strong had lived his life in the Morgan ambit. Before being named head of the Federal Reserve, Strong had risen to head of the Bankers Trust Company, a creature of the Morgan bank. When asked to be head of the Fed, he was persuaded to take the job by two of his best friends, Henry P. Davison and Dwight Morrow, both partners of J.P. Morgan & Co.

The Federal Reserve System arrived at a good time for the Morgans. It was needed to finance America's participation in World War I, a participation strongly supported by the Morgans, who played a major role in bringing the Wilson administration into the war. The Morgans, heavily invested in rail securities, had been caught short by the boom in industrial stocks that emerged at the turn of the century. Consequently, much of their position in investment banking was being eroded by Kuhn, Loeb & Co., which had been faster off the mark on investment in industrial securities.

World War I meant economic boom or collapse for the Morgans. The House of Morgan was the fiscal agent for the Bank of England: it had the underwriting concession for all sales of British and French bonds in the United States during the war, and it helped finance US arms and munitions sales to Britain and France. The House of Morgan had a very heavy investment in an Anglo-French victory and a German-Austrian defeat.
Kuhn, Loeb, on the other hand, was pro-German, and therefore was tied more to the fate of the Central Powers.

The cement binding Strong and Norman was the Morgan connection. Not only was the House of Morgan intimately wrapped up in British finance, but Norman himself — as well as his grandfather — in earlier days had worked in New York for the powerful investment banking firm of Brown Brothers, and hence had developed close personal ties with the New York banking community. For Benjamin Strong, helping Britain meant helping the House of Morgan to shore up the internally contradictory monetary structure it had constructed for the postwar world.

The result was inflationary credit, a speculative boom that could not last, and the Great Crash whose 50th anniversary we observe this year. After Strong's death in late 1928, the new Federal Reserve authorities, while confused on many issues, were no longer consistent servitors of Britain and the Morgans. The deliberate and consistent policy of inflation came to an end, and a corrective depression soon arrived.

There are two mysteries about the Great Depression, mysteries having two separate and distinct solutions. One is, why the crash? Why the sudden crash and depression in the midst of boom and seemingly permanent prosperity? We have seen the answer: inflationary credit expansion propelled by the Federal Reserve System in the service of various motives, including helping Britain and the House of Morgan.

But there is another vital and very different problem. Given the crash, why did the recovery take so long? Usually, when a crash or financial panic strikes, the economic and financial depression, be it slight or severe, is over in a few months or a year or two at the most. After that, economic recovery will have arrived.
The crucial difference between earlier depressions and that of 1929 was that the 1929 crash became chronic and seemed permanent.

What is seldom realized is that depressions, despite their evident hardship on so many, perform an important corrective function. They serve to eliminate the distortions introduced into the economy by an inflationary boom. When the boom is over, the many distortions that have entered the system become clear: prices and wage rates have been driven too high, and much unsound investment has taken place, particularly in capital-goods industries.

The recession or depression serves to lower the swollen prices and to liquidate the unsound and uneconomic investments; it directs resources into those areas and industries that will most effectively serve consumer demands — and were not allowed to do so during the artificial boom. Workers previously misdirected into uneconomic production, unstable at best, will, as the economy corrects itself, end up in more secure and productive employment.

The recession must be allowed to perform its work of liquidation and restoration as quickly as possible, so that the economy can recover from boom and depression and get back on a healthy footing. Before 1929, this hands-off policy was precisely what all US governments had followed, and hence depressions, however sharp, would disappear after a year or so.

But when the Great Crash hit, America had recently elected a new kind of president. Until the past decade, historians have regarded Herbert Clark Hoover as the last of the laissez-faire presidents. Instead, he was the first New Dealer.

Hoover had his bipartisan aura, and was devoted to corporatist cartelization under the aegis of big government; indeed, he originated the New Deal farm-price-support program. His New Deal specifically centered on his program for fighting depressions.
Before he assumed office, Hoover determined that should a depression strike during his term of office, he would use the massive powers of the federal government to combat it. No more would the government, as in the past, pursue a hands-off policy.

As Hoover himself recalled the crash and its aftermath:

"The primary question at once arose as to whether the President and the federal government should undertake to investigate and remedy the evils. … No President before had ever believed that there was a governmental responsibility in such cases. … Presidents steadfastly had maintained that the federal government was apart from such eruptions … therefore, we had to pioneer a new field."

In his acceptance speech for the presidential renomination in 1932, Herbert Hoover summed it up:

"We might have done nothing. … Instead, we met the situation with proposals to private business and to Congress of the most gigantic program of economic defense and counterattack ever evolved in the history of the Republic. We put it into action. … No government in Washington has hitherto considered that it held so broad a responsibility for leadership in such times."

The massive Hoover program was, indeed, a characteristically New Deal one: vigorous action to keep up wage rates and prices, to expand public works and government deficits, to lend money to failing businesses to try to keep them afloat, and to inflate the supply of money and credit to try to stimulate purchasing power and recovery. Herbert Hoover during the 1920s had pioneered the proto-Keynesian idea that high wages are necessary to assure sufficient purchasing power and a healthy economy.
The notion led him to raise wages artificially — and consequently to aggravate the unemployment problem — during the depression.

As soon as the stock market crashed, Hoover called in all the leading industrialists in the country for a series of White House conferences in which he successfully bludgeoned the industrialists, under the threat of coercive government action, into propping up wage rates — and hence causing massive unemployment — while prices were falling sharply. After Hoover's term, Franklin D. Roosevelt simply continued and expanded Hoover's policies across the board, adding considerably more coercion along the way. Between them, the two New Deal presidents managed the unprecedented feat of making the depression last a decade, until we were lifted out of it by our entry into World War II.

If Benjamin Strong got us into a depression and Herbert Hoover and Franklin D. Roosevelt kept us in it, what was the role in all this of the nation's economists, watchdogs of our economic health? Unsurprisingly, most economists, during the depression and ever since, have been much more part of the problem than of the solution. During the 1920s, establishment economists, led by Professor Irving Fisher of Yale, hailed the 20s as the start of a "New Era," one in which the new Federal Reserve System would ensure permanently stable prices, avoiding either booms or busts.

Unfortunately, the Fisherites, in their quest for stability, failed to realize that the trend of the free and unhampered market is always toward lower prices as productivity rises and mass markets develop for particular products. Keeping the price level stable in an era of rising productivity, as in the 1920s, requires a massive artificial expansion of money and credit.
Focusing only on wholesale prices, Strong and the economists of the 1920s were willing to engender artificial booms in real estate and stocks, as well as malinvestments in capital goods, so long as the wholesale price level remained constant.

As a result, Irving Fisher and the leading economists of the 1920s failed to recognize that a dangerous inflationary boom was taking place. When the crash came, Fisher and his disciples of the Chicago School again pinned the blame on the wrong culprit. Instead of realizing that the depression process should be left alone to work itself out as rapidly as possible, Fisher and his colleagues laid the blame on the deflation after the crash and demanded a reinflation (or "reflation") back to 1929 levels.

In this way, even before Keynes, the leading economists of the day managed to miss the problem of inflation and cheap credit and to demand policies that only prolonged the depression and made it worse. After all, Keynesianism did not spring forth full-blown with the publication of Keynes's General Theory in 1936.

We are still pursuing the policies of the 1920s that led to eventual disaster. The Federal Reserve is still inflating the money supply and inflates it even further with the merest hint that a recession is in the offing. The Fed is still trying to fuel a perpetual boom while avoiding a correction on the one hand or a great deal of inflation on the other.

In a sense, things have gotten worse.
For while the hard-money economists of the 1920s and 1930s wished to retain and tighten up the gold standard, the "hard-money" monetarists of today scorn gold, are happy to rely on paper currency, and feel that they are boldly courageous for proposing not to stop the inflation of money altogether, but to limit the expansion to a supposedly fixed amount.

Those who ignore the lessons of history are doomed to repeat it — except that now, with gold abandoned and each nation able to print currency ad lib, we are likely to wind up, not with a repeat of 1929, but with something far worse: the holocaust of runaway inflation that ravaged Germany in 1923 and many other countries during World War II. To avoid such a catastrophe we must have the resolve and the will to cease the inflationary expansion of credit, and to force the Federal Reserve System to stop purchasing assets, and thereby to stop its continued generation of chronic, accelerating inflation.
In April it was announced that Greece was imposing a surcharge for all cash withdrawals from bank accounts to deter citizens from clearing out their accounts. So now the Greeks will have to pay one euro per 1,000 euros that they withdraw, which is one-tenth of a percent. It doesn’t seem very big, but the principle at work is extremely big, because what they’re in effect doing is breaking the exchange rate between a unit of bank deposits and a unit of currency.

Why would they do this? Why would they want to do this? Well, it’s one of the anti-cash policies that mainstream economists have vigorously been promoting.

PAVING THE WAY FOR NEGATIVE INTEREST

To make the calculations easier, and to illustrate the effect, let’s say that the “surcharge” is ten euros for every 100 euros withdrawn. Now, instead of being able to convert one euro in your checking account into one euro in cash, on demand, you will only be able to buy one euro in cash by spending 1.10 euros from your bank account. That’s a negative 10-percent rate in some sense. That is to say, you can only take out one euro from the bank if you’re willing to pay 1.10 euros. So you would really get only about ninety cents of cash for every euro you wanted to withdraw, and that’s very significant because it means it will be more expensive to buy an item with cash than with bank deposits.

At the same time, the Greek government made it very clear that if you deposit cash in the banks, you don’t get 1.10 euros of bank money for every euro you deposit.

So the system is now structured to lock the money in the banks. Now, what does that allow them to do? If you lose 10 percent every time you withdraw one euro in cash, they can lower the interest rate that you get on bank deposits to negative 5 percent, or negative 6 percent.
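The break-even arithmetic here can be sketched in a few lines of Python. This is a rough illustration using the article’s 10 percent teaching figure, not the actual Greek 0.1 percent fee, and the function names are my own, not from any source:

```python
# With a 10% withdrawal surcharge, 1 euro of cash costs 1.10 euros
# of deposits, so each deposit euro buys only 1/1.10 of a euro in cash.

def cash_per_deposit_euro(surcharge_rate):
    """Euros of cash received per euro of bank deposits spent."""
    return 1.0 / (1.0 + surcharge_rate)

def breakeven_deposit_rate(surcharge_rate, years=1):
    """Most negative annual deposit rate a saver will tolerate before
    withdrawing cash, given the one-time surcharge on withdrawal."""
    return cash_per_deposit_euro(surcharge_rate) ** (1.0 / years) - 1.0

print(round(cash_per_deposit_euro(0.10), 3))   # 0.909
print(round(breakeven_deposit_rate(0.10), 4))  # -0.0909
```

On these numbers, a saver tolerates roughly a -9 percent deposit rate over one year before withdrawing cash becomes the cheaper option, which is why a rate of negative 5 or 6 percent would not trigger withdrawals.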
You still wouldn’t withdraw your cash from the banks even if the interest rate went negative.

What we are witnessing is a war on cash in which governments make it either illegal or inconvenient to use cash. This, in turn, gives governments the ability to spy on and regulate financial transactions more completely, while also allowing governments more leeway in manipulating the money supply.

THE ORIGINS OF THE WAR ON CASH

It all started really with the Bank Secrecy Act of 1970, passed in the US, which requires financial institutions in the United States to assist US government agencies in detecting and preventing money laundering. That was the rationale. Specifically, the act requires financial institutions to keep records of cash payments and to file reports of cash transactions of more than $10,000 as a daily aggregate amount. Of course, this is all sold as a way of tracking criminals.

The US government employs other means of making war on cash also. Up until 1945, there were 500-dollar bills, 1,000-dollar bills, and 10,000-dollar bills in circulation. There was even a 100,000-dollar bill in the 1930s with which banks made clearings between one another. The US government stopped issuing these bills in 1945 and by 1969 had withdrawn all of them from circulation. So, in the guise of fighting organized crime and money laundering, what’s actually occurred is that they made it very inconvenient to use cash. A one-hundred-dollar bill today has $15.50 worth of purchasing power in 1969 dollars, when they removed the last big bills.

THE PROBLEM IS INTERNATIONAL

The war on cash has probably gone furthest in Sweden, and Scandinavian governments in general are notable for their opposition to cash.
In Swedish cities, tickets for public buses can no longer be purchased with cash; they must be purchased in advance by cell phone or text message — in other words, via bank accounts.

The deputy governor of the Swedish central bank gloated, before his retirement a few years back, that cash will survive “like the crocodile,” even though it may be forced to see its habitat gradually cut back.

The analogy is apt, since three of the four major Swedish banks combined have more than two-thirds of their offices no longer accepting or paying out cash. These three banks want to phase out the manual handling of cash at their offices at a very rapid pace and have been doing so since 2012.

In France, opponents of cash tried to pass a law in 2012 that would lower the maximum cash payment per exchange from 3,000 euros to 1,000. The law failed, but then came the attacks on Charlie Hebdo and on a Jewish supermarket, and the state immediately used them as the reason to get its 1,000-euro limit. Why? Well, proponents claim that these attacks were partially financed by cash.

The terrorists used cash to purchase some of the things they needed. No doubt these murderers also wore shoes and clothing and used cell phones and cars during the planning and execution of their mayhem. Why not ban these things? A naked, barefoot terrorist without communications is surely less effective than a fully clothed and equipped one.

Finally, Switzerland, formerly a great bastion of economic liberty and financial privacy, has succumbed under the bare-knuckle tactics of the US government. The Swiss government has banned all cash payments of more than 100,000 francs (about $106,000), including transactions involving watches, real estate, precious metals, and cars. This was done under the threat of blacklisting by the Organisation for Economic Co-operation and Development (OECD), with the US no doubt pushing behind the scenes.
Transactions above 100,000 francs will now have to be processed through the banking system. The reason is to prevent the catch-all crime, of course, of money laundering.

Chase Bank has also recently joined the war on cash. It’s the largest bank in the US, a subsidiary of JPMorgan Chase & Co., and, according to Forbes, the world’s third-largest public company. It also received $25 billion in bailout loans from the US Treasury. As of March, Chase began restricting the use of cash in selected markets. The new policy restricts borrowers from using cash to make payments on credit cards, mortgages, equity lines, and auto loans.

Chase even goes as far as to prohibit the storage of cash in its safe deposit boxes. In a letter to its customers, dated April 1, 2015, pertaining to its “updated safe deposit box lease agreement,” one of the highlighted items reads, “You agree not to store any cash or coins other than those found to have a collectible value.” Whether or not this pertains to gold and silver coins with no collectible value is not explained, but of course it does. As one observer warned, “This policy is unusual, but since Chase is the nation’s largest bank, I wouldn’t be surprised if we start seeing more of this in this era of sensitivity about funding terrorists and other illegal causes.” So, get your money out of those safe deposit boxes: your currency, and probably your gold and silver too.

ONLY (SUPERVISED) SPENDING IS ALLOWED

Gregory Mankiw, a prominent macroeconomist, came up with a scheme in 2009: the Fed would announce that, a year from the date of the announcement, it intended to pick a numeral from 0 to 9 out of a hat. All currency with a serial number ending in that numeral would instantly lose its status as legal tender, causing the expected return on holding currency to plummet to -10 percent.
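The -10 percent figure is a simple expected-value calculation, which can be sketched as follows (illustrative only; the variable names are mine, not Mankiw’s):

```python
# Serial-number lottery: one numeral out of ten is drawn, and notes
# ending in that numeral become worthless, so each note faces a
# 1-in-10 chance of losing all value.
p_voided = 1 / 10

# Expected value of one dollar of currency after the drawing:
# it keeps full value with probability 0.9, loses all with 0.1.
expected_value = (1 - p_voided) * 1.0 + p_voided * 0.0
expected_return = expected_value - 1.0

print(round(expected_return, 2))  # -0.1, the -10 percent in the text
```

Against that expected loss, a deposit or loan paying even -2 or -4 percent looks like the better asset, which is precisely the point of the scheme.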
This would allow the Fed to reduce interest rates below zero for a year or even more, because people would happily lend money at, say, -2 percent or -4 percent, since that would save them from losing 10 percent.

Now, the reason given by our rulers for suppressing cash is to keep society safe from terrorists, tax evaders, money launderers, drug cartels, and other villains real or imagined. The actual aim of the flood of laws restricting or even prohibiting the use of cash is to force the public to make payments through the financial system. This enables governments to expand their ability to spy on and keep track of their citizens’ most private financial dealings, in order to milk their citizens of every last dollar of tax payments that they claim are due.

Other reasons for suppressing cash are (1) to prop up the unstable fractional-reserve banking system, which is in a state of collapse all over the world, and (2) to give central banks the power to impose negative nominal interest rates. That is, to make you spend money by subtracting money from your bank account for every day you leave it there and don’t spend it.

Editor’s Note: This article was adapted from a talk delivered at the New York Area Mises Circle in Stamford, Connecticut.
Economics is dead, and economists killed it.

What we have seen over the course of the last eighty years is a systematic dismantling of the contribution of economics to our understanding of the social world. Whatever the cause, modern economics is now not much more than formal modeling using mathematics dressed up in economics-sounding lingo. In this sense, economics is dead as a science, assuming it was ever alive. Economics in mathematical form cannot fulfill its promises, and neither the scientific literature nor advanced education in the subject provides insights that are applicable to or useful in everyday life, business, or policy.

But apparently what is dead can be killed again. This, at least, appears to be the goal of the present tide of leftist critics who demand that economics be restructured from the bottom up. Why? The real reason is unlikely to be anything but the common leftist fear of what the science of economics reveals about the economy and the world. As often claimed by ideologues on the left, the science of economics “is ideology.” This is evident, we are supposed to believe, when we consult “scientific Marxism.”

The stated reason in the contemporary discussion is different, however. We need to restructure (if not do away with) economics because, we are told, it has failed. Why? Because economics could not predict the financial crisis of 2008.

These critics of economics will never let a crisis go to waste, and not only do they believe that the most recent crisis should be used to prove the Marxist dogma about the inherent contradictions of the market, but it can also be used as an ostensible reason to rethink the whole science of economics. Indeed, it is general knowledge that economists didn’t foresee the crisis, and their prescriptions to solve it quite obviously haven’t worked, either.

You have to applaud the anti-economics left for this rhetorical masterpiece.
They have struggled for decades to sink the ship of economics, the generally acclaimed science that has firmly stood in the way of their anti-market and egalitarian policies, hindered the growth of big government, and raised obstacles to enacting everything else that is beautiful to the anti-economics left. The financial crisis is exactly the excuse the Left has been waiting for. It is a slam dunk: government grows, Keynesianism is revived, and economics is made the culprit for all our troubles.

We see this now in education, as students demand to be taught (and professors demand permission to teach) a more “relevant” economics. Relevance, apparently, is achieved by diluting economics with a lot of the worst kinds of sociology, postmodernism, and carefully structured discourse aimed at liberating us from our neoliberal bias. And, it turns out, we must also teach Keynesian ideas about how government must save the market economy.

We see this same agenda at academic research conferences, where it is now rather common to hear voices (or, as is my own experience, keynote talks) claiming that “it is time” for another paradigm: post-economics. The reason is always that economics “has failed.”

If this weren’t so serious, it would be amusing that the failure of Keynesian macroeconomics (whether it is formally Keynes’s theory or post-Keynesian, new Keynesian, neo-Keynesian, monetarist, etc.) is taken as an excuse to do away with sound microeconomic theory and to replace it with Keynesian and other anti-market ideas. But it is not amusing. If most of the discussions heard are to be believed, the failures of central planning are a reason for central planning, just as socialism is a reason for socialism. The success of the market, on the other hand, is not a reason for the market.

It should not be a surprise that economics has finally become irrelevant after decades of uncalled-for mathematizing and formal modeling based on outrageous assumptions.
This perverse kind of pseudo-economic analysis had it coming, really. One cannot calculate maxima for the social world; it is, as Mises showed almost a century ago, impossible. If mathematical economics is finally dead, then that is above all else an improvement.

But the death of mathematical economics should not mean economics is to be rejected. It should mean a return to proper and sound economic analysis — the state of the science prior to the “contributions” of Keynes, Samuelson, and that bunch. Mathematical economics is a failure, but economics proper is still the queen of the social sciences. And for good reason: she relies on irrefutable axioms about the real world, from which logically stringent and rigorous conclusions are derived. The object of study is the messy and sometimes ambiguous social world, but this does not require that the science be messy and ambiguous too. On the contrary, economics is unparalleled in its ability to provide a proper and illuminating understanding of how the economy works. It is neither messy nor ambiguous. It brings clarity to the processes that make up the market.

This is the reason the Left hates economics. Because it points out that creating a better world through central planning, money-printing, and political manipulation is indeed impossible. The market is neither perfect nor efficient, but it is better than any available alternative. In fact, the unhampered market is the only positive-sum means available for human society. The market is indeed the only way of progress; all else is a step backward.

But the market is also uncontrollable and seems, at least to the non-economist, both unpredictable and unintuitive. This is why the Left hates it — and why the Right despises it.

The Left knows full well that they cannot beat proper economics; their ideology will always fail when put up against economic science.
But they can beat mathematical economics, since it follows in the tradition of Lange-Lerner market socialism and is fundamentally flawed. They finally have. And they are using this as an excuse to kill economics again. Let’s hope for the sake of humanity that they will fail in their undertaking.
The term “Sagebrush Rebellion” is again showing up in newspapers across the American west as states seek more control over federal lands within their own boundaries. As with the original Sagebrush Rebellion of the 1970s and 1980s, several western states, where the federal government owns well over one-third of the land within the state, have begun to look to more local control of lands as an answer to federal indifference, mismanagement, and outright hostility. In at least one case — Utah — the state has initiated a lawsuit in an effort to wrest more control of lands out of federal hands.

The Los Angeles Times, in the socio-economic basket case known as California, dismissed the idea outright in an unsigned editorial, declaring the idea to be an affront against unfairly maligned federal supremacy. The Denver Post, meanwhile, offered last month a more evenhanded assessment, suggesting that the cost to the state of maintaining public lands — in the form of fire-fighting, forestry, and more — is too high to be worth it.

Why Now?

Federal control of lands within states has long been a source of contention between states and the federal government.

Prior to the adoption of the 1787 Constitution, there were few provisions in law that allowed for direct federal control of lands within states. The new constitution, however, was a coup for centralizers who wished to assert more direct federal control over lands in the west. The Louisiana Purchase further strengthened control over western lands by placing huge swaths of land under federal control.

Throughout most of the nineteenth century, though, it was accepted that virtually all of this land should eventually be handed over to states and to settlers.
In the second half of the century, this was mostly done through various homestead acts and by large land grants to the railroads.

The far west and Rocky Mountain west presented new problems, however, because those lands did not lend themselves to homesteading in the way imagined by Congress. Much of the west is too dry to allow for workable homesteads on the small plots permitted under federal law at the time. Thus, the only way such lands could be profitable was if single owners were allowed much larger amounts of land.

Politically, however, this did not work either, since the Progressive movement opposed ownership of large plots by corporate mining operations, or by other owners who did not fit the romanticized image of the self-sufficient homesteader. Over time, the federal government took over these lands permanently, refusing ownership to states, which Progressives portrayed as being in the pockets of corporate interests.

In more recent times, this has led to conflicts in which local economic interests collide with federal plans for lands, and in which federal control of lands has actively hurt local economies. For example, to please environmentalists, the federal government continues to close off roads on federal lands that local residents have long used for a variety of purposes. Moreover, the federal government collects taxes and fees on lands that state governments would rather tax themselves.

But the most memorable recent examples probably occurred in 2013, when the federal government shut down national parks and other federal lands frequented by tourists. The federal government dispatched agents armed with assault rifles who forcibly ejected visitors from the allegedly “public” lands. Meanwhile, nearby towns that rely on tourists for the local economy were powerless to open the parks themselves.
State officials, who are far more sensitive to local economic needs than members of Congress or the White House, were also powerless to do anything. Eventually, after much political pressure was applied, the federal government kindly allowed states to pay millions to the federal government to open the parks again.

But, at that point, the political damage had been done. Many states realized that if they were going to have to pay the federal government to access their own lands, there may be a problem with the arrangement.

And finally, when dealing with the federal government, it’s important to remember that generally only immense political and corporate interests have much chance of influencing federal policy. Because the cost of doing business with the feds in Washington is much higher than the cost of influencing state or local governments, federal lands are primarily a battleground between large, well-funded environmental groups and huge corporate organizations. Small companies, landowners, and conservation groups have little to no hope of influencing how lands are controlled or owned.

Can States Afford to Take Over Federal Lands?

Even those who are sympathetic to the decentralization of federal lands have long questioned whether states would want to assume the cost of managing them. Even if states took over federal lands, in most western states there is little danger of those lands being privatized. Coloradans, for example, like their “public” forests just fine, and would be in no rush to hand them over to the Ted Turners of the world.
This means that the states (i.e., the taxpayers) would have to pay for intervention in forest fires, road maintenance, erosion control, and other expenses associated with managing public lands.

Those who think the transfer would be too costly cite a study of Utah’s public lands which claims that a transfer of federal lands to the Utah government would cost the state’s economy $280 million annually, plus $150 million in federal salaries. Now, it should be remembered that studies like these are pretty sketchy. They take all of the federal monies spent on jobs related to federal lands and then declare that all that money would disappear if the federal government were removed from the equation. They do not consider that, if states were allowed to control these lands, they might manage them better, increase fee income, allow greater levels of economic growth, and avoid local economic disasters like the federally mandated closing of parks.

The Problem of Direct Taxation

But the biggest hole in this analysis is the fact that all of these studies ignore the tax revenues that states like Utah and Colorado pay to the federal government. Federal taxes collected in Utah in fiscal year 2012, for example, totaled $15.6 billion. At the same time, according to several sources, Utah is a net taxpayer state, which means its citizens pay more in federal taxes than the state receives back in federal spending. In Utah’s case, the federal government spends 64 cents for every dollar in federal taxes collected from the state. Thus, we could conclude that the IRS collects $5.6 billion more in taxes than the state receives back. And this happens year after year.

That allegedly “lost” $280 million doesn’t look quite so big compared to the $5.6 billion extracted from the state by the feds every year.
Were that money allowed to remain in Utah, there is no question as to the ability of the state to manage public lands.

Similarly, in Colorado, which is also a net taxpayer state, the IRS collected $41.2 billion in tax revenues. But federal spending in the state totals only 64 cents for every federal tax dollar collected. This means that, every year, Coloradans pay about $15 billion more in taxes than the feds spend within Colorado.

One Colorado official has claimed that, without federal spending, one major forest fire would “obliterate” the state budget. That’s a pretty vague term, but given that Colorado collects only $12 billion in state taxes compared to the federal take of $41 billion, the biggest potential drain on Colorado productivity isn’t exactly wildfires.

From a legal perspective, of course, it’s useless to discuss any of this. Ever since the rejection of the Articles of Confederation and the adoption of the new Constitution, it has been quite clear in federal law that the federal government can directly tax citizens without any regard whatsoever for state and local governments, or for apportionment among the states. Apportionment, a halfway measure that attempted to even out tax burdens among states, was essentially eviscerated shortly after the Constitution was adopted, when early Supreme Court decisions declared that apportionment didn’t apply if it was an inconvenience to federal tax collectors. The Sixteenth Amendment did not create the phenomenon of direct taxation, but it strengthened the federal government’s hand considerably.

So, today, the states themselves are financial bystanders when it comes to federal spending and the ability of states to control taxes, spending, and resources within their own borders. The perpetual drain on state wealth in net taxpayer states is routinely ignored.
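The net-taxpayer arithmetic above can be sketched in a few lines. This is a rough illustration using the article's own figures; the 64-cents-returned-per-dollar ratio is the one stated for Colorado, and the same ratio also reproduces the $5.6 billion net figure given for Utah.

```python
# Rough sketch of the net-taxpayer arithmetic, using the article's figures.
# The 64-cent return ratio is the one stated for Colorado; the same ratio
# also reproduces the $5.6 billion net figure given for Utah.

def net_outflow(taxes_collected, cents_returned_per_dollar):
    """Federal taxes collected in a state minus federal spending returned to it."""
    return taxes_collected * (1 - cents_returned_per_dollar)

utah_net = net_outflow(15.6e9, 0.64)      # federal taxes collected in Utah, FY 2012
colorado_net = net_outflow(41.2e9, 0.64)  # federal taxes collected in Colorado

print(f"Utah net outflow:     ${utah_net / 1e9:.1f} billion")      # about $5.6 billion
print(f"Colorado net outflow: ${colorado_net / 1e9:.1f} billion")  # about $14.8 billion
```

Against recurring outflows of this size, the $280 million annual management cost cited in the Utah study is comparatively small.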
The federal government can now extract $15 billion more — net — than it spends in a state, and then claim it is making a fabulously generous gift to the state when it does spend a fraction of what it has already taken. Utah and Colorado may get lucky and somehow manage to take control of federal lands. But lessened federal spending in those states won’t translate into a lower tax bill for anyone.
The yearly rate of growth of the personal consumption expenditure (PCE) price index adjusted for food and energy stood at 1.3 percent in June — the same figure as in May. Note that, on average, the yearly rate of growth has stood at 1.3 percent since the beginning of this year. Many economists have expressed satisfaction that the yearly rate of growth has been stable so far, notwithstanding that it stood below the Fed’s target of 2 percent. Meanwhile, the yearly rate of growth of the overall personal consumption expenditure price deflator stood at minus 0.01 percent in June, versus minus 0.1 percent in May, and 1.4 percent in June last year.

Stability in the yearly rate of growth of the PCE-less-food-and-energy price deflator is regarded as very important by most economists. It is held that a stable price level makes movements in the relative prices of goods and services more visible, thereby allowing a more efficient allocation of resources.

There is a danger, however, that focusing on so-called price stability may cause economists to overlook the underlying factor that is likely to set in motion an economic bust. In the present case, this underlying factor may be the decline in the growth momentum of the money supply from October 2011 to October 2013.

Note that from 1927 to the end of 1928, the US consumer price index displayed relative stability (see first chart below). This caused many commentators to suggest that the US economy had reached a state of economic stability, and to ignore the fact that the growth momentum of the money supply fell sharply from February 1925 to June 1927. This sharp decline set in motion the stock market bust of October 1929 and the following collapse of economic activity.

According to Murray Rothbard,

One of the reasons that most economists of the 1920’s did not recognize the existence of an inflationary problem was the widespread adoption of a stable price level as the goal and criterion for monetary policy.
… The fact that general prices were more or less stable during the 1920’s told most economists that there was no inflationary threat, and therefore the events of the great depression caught them completely unaware.

(Note that for most economists, inflation means persistent rises in prices rather than increases in the money supply.)

Observe that the yearly rate of growth of industrial production climbed from 4.6 percent in February 1927 to 16.6 percent by July 1929, before plunging to minus 31 percent by July 1932. It is likely that the severity of a bust is dictated by the state of the pool of real wealth, namely, whether or not there are still a sufficient number of wealth generators to support the various bubble activities that have emerged on the back of the loose monetary policy of the Fed.
The US Federal Reserve is playing with the idea of raising interest rates, possibly as early as September this year. After a six-year period of virtually zero interest rates, a ramping up of borrowing costs will certainly have tremendous consequences. It will be like taking away the punch bowl on which all the party fun rests.

Low Central Bank Rates Have Been Fueling Asset Price Inflation

The current situation has, of course, a history to it. Around the middle of the 1990s, the Fed’s easy monetary policy under Chairman Alan Greenspan ushered in the “New Economy” boom. Generous credit and money expansion resulted in a pumping up of asset prices, in particular stock prices and their valuations.

A Brief History of Low Interest Rates

When this boom-bubble burst, the Fed slashed rates from 6.5 percent in January 2001 to 1 percent in June 2003, and held borrowing costs at this level until June 2004. This easy Fed policy not only halted the slowdown in bank credit and money expansion, it sowed the seeds of an unprecedented credit boom which took off as early as the middle of 2002.

When the Fed put on the brakes, pushing rates back up to 5.25 percent by June 2006, the credit boom was pretty much doomed. The ensuing bust grew into the most severe financial and economic meltdown seen since the late 1920s and early 1930s, affecting not only the US but the world economy on a grand scale.

Thanks to Austrian-school insights, we can know the real source of all this trouble. The root cause is central banks’ producing fake money out of thin air.
This induces, and necessarily so, a recurring sequence of boom and bust, bringing great misery for many people and businesses and eventually ruining the monetary and economic system.

Central banks — in cooperation with commercial banks — create additional money through credit expansion, thereby artificially lowering market interest rates below the level that would prevail if there were no credit and money expansion “out of thin air.” Such a boom will end in a bust if and when credit and money expansion dries up and interest rates go up. In For A New Liberty (1973), Murray N. Rothbard put this insight succinctly:

Like the repeated doping of a horse, the boom is kept on its way and ahead of its inevitable comeuppance by repeated and accelerating doses of the stimulant of bank credit. It is only when bank credit expansion must finally stop or sharply slow down, either because the banks are getting shaky or because the public is getting restive at the continuing inflation, that retribution finally catches up with the boom. As soon as credit expansion stops, the piper must be paid, and the inevitable readjustments must liquidate the unsound over-investments of the boom and redirect the economy more toward consumer goods production. And, of course, the longer the boom is kept going, the greater the malinvestments that must be liquidated, and the more harrowing the readjustments that must be made.

To keep the credit-induced boom going, more credit and more money, provided at ever lower interest rates, are required. Somehow central bankers around the world seem to know this economic insight, as their policies have been desperately trying to encourage additional bank lending and money creation.

Why Raise Rates Now?

Why, then, do the decision makers at the Fed want to increase rates?
Perhaps some think that a policy of de facto zero rates is no longer warranted, as the US economy is showing signs of returning to positive and sustainable growth, as the official statistics seem to suggest. Others might fear that credit market investors will jump ship once they convince themselves that US interest rates will stay at rock bottom forever. Such an expectation could deal a heavy, if not deadly, blow to credit markets, making the unbacked paper money system come crashing down.

In any case, if Fed members follow up their words with deeds, they might soon learn that the ghosts they have been calling will indeed appear — and possibly won’t go away. For instance, higher US rates will suck in capital from around the world, pulling the rug out from under many emerging and developed markets. What is more, credit and liquidity conditions around the world will tighten, giving credit-hungry governments, corporations, banks, and consumers a painful awakening after they have been surfing the wave of easy credit for quite some time.

China, which devalued the renminbi exchange rate against the US dollar by a total of 3.5 percent on August 11 and 12, seems to have sent the message that it doesn’t want to follow the Fed’s policy — and by its devaluation it has made the Fed’s hiking plan appear an extravagant undertaking.

A normalization of interest rates after years of excessively low interest rates is not possible without a likely crash in production and employment. If the Fed goes ahead with its plan to raise rates, times will get tough in the world’s economic and financial system. To be sure, it would be the right thing to do: the sooner the artificial boom comes to an end, the sooner the recession-depression sets in, which is the inevitable process of adjusting the economy and allowing an economically sound recovery to begin.
In this article by economist Jeffrey Sachs, pharmaceutical company Gilead is taken to task for selling its hepatitis C cure, Sofosbuvir (sold as Sovaldi), at a price of $84,000 per course of treatment. Sachs says that the actual production cost of Sofosbuvir is about $100.

Sachs says that Gilead is "bilking the taxpayer" by charging the government prices far above production costs — and government is probably paying for most of the Sofosbuvir drugs. Sachs further complains that people will die because of Gilead's refusal to cut the price to something more affordable. Gilead, Sachs says, bought the patent rights to Sofosbuvir for $11 billion in 2011, and took the drug through the last stages of FDA approval, which came at the end of 2013. Gilead made $12.4 billion in 2014 from Sofosbuvir, and Sachs says that first-quarter 2015 sales of the drug brought in revenues of $4.6 billion.

Sachs quite rightly points out that patents are relevant to the issue. But he says that patents are "an important tool to incentivize R&D" which have been "abused" by Gilead, and argues that holders of "life and death" patents should be subject to price controls by the federal government. He goes on to say that patients who are "denied access" (i.e., can't afford the drug) should sue Gilead for reckless endangerment. And finally, Sachs suggests "public outrage and activism."

In an August 6 tweet replying to me and to a physician who had briefly engaged him on this issue, Sachs said, "No way to regard the current arrangements even crudely efficient or equitable. Killing people senselessly."

Sachs is being somewhat disingenuous in representing the production cost of Sofosbuvir as $100, and the markup as 800 times costs (as he did in an August 7 tweet). There are substantial fixed costs involved in R&D, trials, FDA approval, and the like.
Any company incurring those costs expects to recover them by charging a price above marginal cost, and if trade secrets or other features of the market allow it to do so, it will. Sachs’s markup complaint doesn't make sense when applied elsewhere: authors of mass-market paperback novels (and maybe Sachs himself, the author of several popular books) would also have to be regarded as terrible price gougers, because the marginal cost of printing a paperback book is a few cents, while the retail price is $6 to $10. Of course, writing the manuscript is an extremely costly part of the production process. Once that is done, reproduction can be relatively cheap. This is true of a great number of goods and services for which there is a high up-front cost.

Problems with Patents

There are clearly some problems stemming from the intellectual property rules here. The government will prosecute any firm that competes with Gilead in the production of the particular chemical formula Gilead has acquired. Sachs is right, then, that IP is relevant here. But rather than see IP as part of the problem, he defends patents as basically beneficial and proposes using them as a way for the government to beat a company's prices down.

Contrary to the notion widespread among the public and most policy commentators, it is not at all clear that patents are essential to drug innovation. Even where new drugs could be reverse-engineered and copied, innovation could still be rewarded in a world without patent laws. See, for example, this article by Nathan Nicolaisen. First-mover advantages may be important, as could the inevitable delays in ramping up generic drug production. Nicolaisen mentions a survey of R&D labs and company managers indicating that they believed trade secrets to be more effective than patents in securing a return on investment.
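The fixed-cost point about markups can be made concrete with a toy calculation. The price, marginal cost, and $11 billion patent purchase are the article's figures; the number of courses sold is a purely hypothetical assumption for illustration, not a reported figure.

```python
# Toy illustration: a price far above MARGINAL cost can still be a modest
# markup over AVERAGE cost once fixed costs are spread across units sold.
# Price, marginal cost, and the $11 billion patent purchase are the
# article's figures; courses_sold is a hypothetical assumption.

fixed_cost = 11e9        # Gilead's patent purchase (article figure)
marginal_cost = 100.0    # production cost per course (article figure)
price = 84_000.0         # list price per course (article figure)
courses_sold = 200_000   # hypothetical volume, for illustration only

markup_over_marginal = price / marginal_cost              # 840x
average_cost = marginal_cost + fixed_cost / courses_sold  # $55,100 per course
markup_over_average = price / average_cost                # roughly 1.5x

print(f"markup over marginal cost: {markup_over_marginal:.0f}x")
print(f"average cost per course:   ${average_cost:,.0f}")
print(f"markup over average cost:  {markup_over_average:.1f}x")
```

None of this shows that the $84,000 price is a competitive one; it only shows that price-to-marginal-cost ratios, taken by themselves, prove very little.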
For a more extensive treatment of a free-market view of IP, see Jacob Huebert's article here, and for an application to a similar issue involving a life-saving drug, see this article by Stephan Kinsella.

The FDA’s Role in Denying Access to Health Care

Sachs — at least in this article — ignores the role of the FDA in causing death and suffering by keeping drugs off the market. When Gilead bought the patent to Sofosbuvir, it was running the risk that the drug would not be approved, or that approval would be delayed so long that the opportunity cost of its initial $11 billion investment would become quite large. Uncertain but potentially large profits after approval may be quite reasonable, given the risk Gilead took on. The FDA itself injects a politicized uncertainty into the drug research, production, and marketing process, and thereby drives up costs.

These costs can appear as death and suffering as well as dollars. The American public tends to think of the FDA as a protector against dangerous side effects, as with Thalidomide decades ago. But how many Americans have died because of lags in approval? A five-year delay in bringing the antibiotic Septra to the US market may have cost 80,000 lives. A lag in the approval of beta blockers may have cost 250,000 lives. (Miller, Benjamin, and North, The Economics of Public Issues, 18th ed. [2014], pp. 6-7.) The FDA's ban on advertising aspirin as an effective preventer of first heart attacks may have caused the deaths of tens of thousands of Americans every year. Because it is easy to identify those harmed by side effects, but difficult to identify who might have been saved by the earlier introduction of a drug like Septra to the marketplace, the FDA tends to be over-conservative in its regulatory process.

Some Regulation Begets More Regulation

But one of the most interesting features of Sachs's diatribe against Gilead is how well it tracks with Ludwig von Mises's explanation of the natural progression of socialism.
In "Middle of the Road Policy Leads to Socialism," Mises points out that a government facing milk shortages caused by its price controls on milk may add to its initial intervention a second intervention controlling the prices of the factors of production used in milk production, and then — if the government still refuses to acknowledge the fundamental problems of intervention — a third intervention controlling the prices of still other resources. The price system shrinks and is gradually replaced with central planning.

Sachs sees problems with the prices of Gilead's new drug. And I do too — I don't think for a minute that the free-market price of Sofosbuvir would be $84,000 per course. But rather than attack the State's patent laws directly, along with the costly FDA regulatory process and other interventions, Sachs wants price controls on life-saving drugs. This is a well-traveled path toward socialism, and it does not end well.
October 30, 1929. A brisk autumn’s day in Manhattan. The Savoy-Plaza Hotel’s thirty-three stories cast a long shadow over Central Park. At the base of the hotel a financier lies freshly fallen, motionless, while his last breath, wrenched from the lungs by force of impact, is now a red mist of gore in the air.

Sirens and uniforms. The suicide spot quickly becomes crowded with spectators, who form a vision-impairing ring-fence of backs, much to the annoyance of elbow-throwers at the periphery. Winston Churchill stands at his hotel window looking down on the mess. To nobody’s surprise, the police will find an empty wallet and five margin calls in the dead man's pockets. (This is a dramatization of an event reported by Winston Churchill, quoted on p. 7 of Robert P. Murphy’s Politically Incorrect Guide to the Great Depression and the New Deal.)

Churchill’s curtains flutter shut, and we are left to wonder whether anyone — Churchill included — can yet see his clumsy, cigar-wielding hand in it all; whether anyone realizes that, had Churchill as Chancellor of the Exchequer only restored the gold standard at a lower exchange rate, as Keynes had recommended, the Wall Street Crash of 1929 could have been averted (or at least ameliorated).

Alas, by ignoring Keynes in 1925, Churchill triggered a calamity so severe that it not only inspired one man to kill himself beneath the British statesman’s very window but, more insidiously, also provided the impetus for the economics profession’s rejection of the “classical” axioms. As Keynes’s biographer Robert Skidelsky writes, Keynes “did not believe in the system of the ideas by which economists lived; he did not worship at the temple.” And while “in former times he would have been forced to recant, perhaps burnt at the stake, as it was ... the exigencies of his times enabled him to force himself on his church.”

1925: Britain’s Return to the Gold Standard

The pound sterling’s link to gold was severed at the start of WWI.
After eleven years of unfettered inflation, Chancellor of the Exchequer Winston Churchill restored convertibility at the pre-war level of 4.25 pounds per ounce of gold.

Keynes, quite rightly, took exception to this particular detail: expecting Britain’s global customers to go on paying the same gold-price for the weakened pound was unrealistic. At this exchange rate the pound would be overvalued, and the only cure would be a sustained period of deflation — which was “certain to involve unemployment and industrial disputes.” Indeed, in 1926 a general strike crippled Britain for nine days.

What Keynes did not predict, however, was how Churchill’s blunder would later bring about an easing of monetary policy in America. And even supposing Keynes had predicted this side effect, would he have understood its implications for long-run sustainability? (Recall that both F.A. Hayek and Keynes predicted a crash would occur in 1929: Hayek because interest rates were too low, Keynes because they were too high!)

1927: At the Fed (With Cap in Hand)

American sellers (in particular) were accepting British gold in exchange for goods, but were dissuaded from returning it due to the unfavorable rate of exchange. As a result, Britain’s gold supplies diminished at a rapid rate, which made the authorities understandably twitchy: how could they keep their pledge to convert pounds into gold if they had none?

In response, the Governor of the Bank of England, Montagu Norman, set off across the Atlantic and, with much pleading, persuaded the Federal Reserve to ease monetary policy.
By lowering interest rates and raising inflation, the Fed stemmed gold flows into America, giving the British a much-needed respite from the ill effects of Churchill’s costly pound. With this episode of soft-hearted internationalism came an upswing in the Wall Street boom, and “from that date,” wrote Lionel Robbins, “according to all the evidence, the situation got completely out of control.”

In The Great Crash, a very popular account of the lead-up to the Great Depression, John Kenneth Galbraith writes:

the rediscount rate of the New York Federal Reserve was cut from 4 to 3.5 percent. Government securities were purchased in considerable volume with the mathematical consequence of leaving the banks and individuals who had sold them with money to spare. The funds that the Federal Reserve made available were either invested in common stocks or ... they became available to help finance the purchase of common stocks by others. So provided with funds, people rushed into the market.

Galbraith goes on to quote a member of the Federal Reserve Board who, with hindsight, called the operation “one of the most costly errors” committed by a banking system “in 75 years.” Galbraith finishes: “the view that the action of the Federal Reserve in 1927 was responsible for the speculation and collapse which followed has never been seriously shaken.”

John Maynard Who?

When Keynes wrote against returning to the gold standard at pre-war parity in 1925, he did so with the expectation that he might actually influence policy. As a younger, unknown man he had worked at the Treasury for a brief stint, leaving a legendary impression; and by 1925, six years after his best-seller The Economic Consequences of the Peace, he was a famous man whose words carried weight.

It is not outlandish, then, to imagine a world in which Keynes got his way. In such a world, the Wall Street crash and ensuing depression might never have happened — without the costly pound, the Fed would have had no impetus to inflate.
Keynes would subsequently have found the economics profession less rattled, less willing to abandon its “classical” axioms in favor of his new-fangled approach. Keynes might have averted Keynesianism.
Although leftists often like to condemn nullification as right-wing kookery, the left is quite good at employing the tactic. Certainly, the most successful nullification trend going on right now is state refusal to enforce federal drug laws. Four states (Colorado, Washington, Oregon, and Alaska) have unilaterally declared recreational marijuana to be legal.

Left-Wing Nullification

While the successful passage of these measures did require support from libertarians and conservatives, the Colorado, Washington, and Oregon efforts especially benefited from the support of people most would identify as so-called left-wing “liberals.” Many other states have unilaterally legalized marijuana for medical use as well, and it’s in the GOP heartland (i.e., the South and Great Plains Midwest) where resistance to this form of nullification remains the strongest. Indeed, GOP elected officials from the South and Midwest are currently suing Colorado to force greater federal enforcement of drug laws.

Meanwhile, leftists are the driving force behind the so-called “sanctuary city” movement, in which local officials refuse to cooperate with federal officials on enforcement of federal immigration laws.

In addition to passing these nullification measures, the left then follows up with pressure on federal enforcement agencies to take no action. It’s not an accident that the Obama administration has decided to take a hands-off approach in both cases. The administration has been pressured by key interest groups within the party to let the whole thing slide. Whether a Republican administration would be as laissez-faire in the matter is unclear, but in both drug nullification and local nullification of immigration law, the tactic is working, for now.

Right-Wing Nullification

The conservatives have had some successes in their own way. Eight states (at the prompting of conservatives) have passed laws that nullify federal gun laws within their own borders.
Like the marijuana nullifiers, the gun-law nullifiers simply refuse to assist the feds in enforcing federal gun laws. In addition, conservatives in Michigan helped pass a law prohibiting state officials from assisting the federal government in indefinite detention under the NDAA.

There’s also been much talk of nullifying Obamacare (via refusal to set up state-level exchanges) and of nullifying the gay marriage mandate (via refusal to issue marriage licenses), but neither of these seems to be gaining much traction.

It would make sense that conservatives, who claim to be for so-called states’ rights and “local control,” would promote such measures. But conservatives are also among the most enthusiastic defenders of the NSA and the USA PATRIOT Act, so we should not be surprised when conservatives attack their own on the nullification issue on the basis of its legality. The influential (among conservatives) Heritage Foundation, for example, back in 2012 issued a fact sheet condemning nullification as unconstitutional. Other conservative groups have strongly opposed it as well.

Conservatives continue to be preoccupied with the legality of nullification, and having correctly figured out that it is in fact illegal — because federal law says what the federal judges say it says — they quickly retreat to the more respectable and “legal” means of opposing federal power. Leftists are then more than happy to pile on and condemn these conservative attempts at nullification while carefully avoiding mention of their own uses of the very same tactic.

Political Tactics vs. Legal Tactics

Nullification has never been a legal tactic. It has always been an extra-legal and political one. Law and politics are two different things, and the fact that something is illegal does not mean it’s politically unfeasible or impossible. Certainly, the nullification of the fugitive slave acts was highly successful, and also very illegal.
It was so successful, in fact, that it was a major factor in Southern secession. We know this because the South Carolina declaration of secession specifically says so:

But an increasing hostility on the part of the non-slaveholding States to the institution of slavery, has led to a disregard of their obligations, and the laws of the General Government have ceased to effect the objects of the Constitution. The States of Maine, New Hampshire, Vermont, Massachusetts, Connecticut, Rhode Island, New York, Pennsylvania, Illinois, Indiana, Michigan, Wisconsin and Iowa, have enacted laws which either nullify the Acts of Congress or render useless any attempt to execute them. In many of these States the fugitive [i.e., escaped slave] is discharged from service or labor claimed, and in none of them has the State Government complied with the stipulation made in the Constitution.

The South Carolinians won the legal argument, but even without the Civil War, they would have lost the political argument. Federal law was clear that runaway slaves were a federal matter, and anyone who interfered was in violation of the law. But that didn’t stop those who lied to federal officers and helped slaves evade bounty hunters. Northern nullifiers simply weren’t going to respect federal law in this matter because they viewed it as immoral. The South Carolinians correctly concluded there was little they could do about this, and they included it among the reasons they seceded.

Nullification Works When You Want Government Inaction

When nullification enjoys either the indifference or the support of a sizable portion of the local population, and is based on encouraging government inaction, it tends to work. In the case of marijuana-law and immigration-law nullification, local governments have refused to enforce federal law, and the same was true of anti-slavery nullification.
In fact, the slave owners, who were a very powerful interest group at the national level, and who had both the Constitution and the federal courts on their side, were at a lopsided disadvantage in the individual states, because they needed more government action and enforcement of pro-slavery laws on the part of state governments. Even worse (from the pro-slavery perspective), enforcement was expensive. The anti-slavery activists, on the other hand, merely needed their state governments to look the other way.

Then as now, the federal courts were not on the side of nullification. But when the federal government is faced with enforcing federal law over the objection of local law enforcement — which is more sensitive to local sentiments — and when at least one branch of the federal government is on the side of the nullifiers, things become legally and politically murky.

And this, apparently, is what the left understands. They know that the feds can only do so much to enforce federal law on their own, without help from local government. Yes, the feds have their own federal agents, but federal police forces are actually quite small compared to state and local police forces (unless, of course, the feds call in the military). And big federal enforcement operations tend to be PR disasters, as in the case of the Bundy Ranch standoff. It's always better to get help from the locals, and when they refuse, it's hard to force their hand.

The reverse, however, does not work as well. That is, if nullification requires an active role for state and local officials, follow-through is a problem. For example, if states attempted to nullify Roe v. Wade, they would run into trouble, because that sort of nullification would consist of actively shutting down physicians and abortion clinics. That's different from simply refusing to take action.

But even "do nothing" attempts at nullification remain on very shaky legal ground.
According to modern interpretations of the Constitution, there is no legal provision for state and local officials (including state courts) to refuse to enforce laws that modern constitutional scholars claim are part of "the law of the land." And if they so choose, federal officials can still selectively enforce the law using their own agents.

But on a practical level, these sorts of nullification laws do in fact limit the federal government's ability to enforce its own laws. The feds can always call out the National Guard and force local compliance, but that can't be done on a permanent basis. Moreover, the feds avoid big showy displays of federal supremacy not because they are constrained by the law, but because they are constrained by political opinion. And political opinion is the crucial factor that conservatives tend to ignore. The right seems to think that legal realities trump political ones, while the opposite is more often true.
Asset price inflation, a disease whose source always lies in monetary disorder, is not a new affliction. It was virtually inevitable that the present wild experimentation by the Federal Reserve — joined by the Bank of Japan and ECB — would produce a severe outbreak. And indications from the markets are that the disease is in a late phase, though still short of the final deadly stage characterized by pervasive falls in asset markets, sometimes financial panic, and the onset of recession.

Global Signs of Danger

A key sign of danger, recognizable from historical patterns of how the disease progresses, is the combination of steep falls in speculative temperature in some markets with still-high — and in some cases soaring — temperatures in other markets. Another sign is some pull-back in the carry trade, featuring in particular the uncovered arbitrage between low (or zero) interest rate currencies and higher-rate currencies. For now, however, this trade is still booming in some areas of the global marketplace.

Specifically, we now observe steep falls in commodity markets (and also in commodity currencies and mining equities), the original area of the global marketplace where the QE asset price inflation disease attacked (back in 2009–11). Previously hot real estate markets in emerging market economies (especially China and Brazil) have cooled, at least to a moderate extent. Most emerging market currencies — with the key exception of the Chinese yuan — once the darlings of the carry traders, are in ugly bear markets. The Shanghai equity market bubble has burst.

Yet in large areas of the high-yield credit markets (including, in particular, the so-called covenant-lite paper issued by highly leveraged corporations), speculative temperatures remain at scorching levels. Meanwhile, Silicon Valley equities (in both the public and private markets) and private equity funds enjoy fantasy valuations.
Ten-year Spanish and Italian government bond yields are hovering below 2 percent, and hot spots in global advanced-economy real estate — whether San Francisco, Sydney, or Vancouver — just seem to get hotter, though we should qualify the last two observations by noting the slump in the Canadian and Australian dollars. Also, there is tentative evidence that London high-end real estate is weakening somewhat.

How to Identify Late Stages of Asset Inflation

We can identify similar late phases of asset price inflation, characterized by highly divergent speculative temperatures across markets, in past episodes of the disease. In 1927–28, steep drops of speculative temperature in Florida real estate, the Berlin stock market, and then more generally in US real estate occurred at the same time as speculative temperatures continued to soar in the US equity market. In the late 1980s, the crash in Wall Street equities (October 1987) did not mark the end-stage of asset price inflation but a late phase of the disease, which featured still-rising speculation in real estate and high-yield credits.

In the next episode of asset price inflation (the mid-to-late 1990s), the Asian currency and debt crisis of 1997, and the bursting of the Russian debt bubble the following year, accompanied still-rising speculation in equities, culminating in the Nasdaq bubble. In the episode of the mid-2000s, the first quakes in the credit markets during summer 2007 did not prevent a further build-up of speculation in equity markets, and a soaring of speculative temperatures in commodity markets, especially oil, in winter 2007–08 and spring 2008.

What insights can we gain from identifying the QE asset price inflation disease as being in a late phase?

The skeptics would say: not much. Each episode is highly distinct, and the disease can "progress" in very different ways. Any prediction as to the next stage and its severity has much more to do with intuition than with scientific observation.
Indeed, some critics go so far as to suggest that diagnosis and prognosis of this disease are so difficult that we should not even list it as a disease at all. Historically, such critics have ranged from Milton Friedman and Anna Schwartz (who do not even mention the disease in their epic monetary history of the US) to Alan Greenspan and Ben Bernanke, who claimed throughout their years in power — and these years included three virulent attacks of asset price inflation originating in the Federal Reserve — that it was futile to try to diagnose bubbles.

We Can't Ignore the Problem Just Because It's Hard to Measure

Difficulties in diagnosis, though, do not mean that the disease is a phantom or can safely be ignored as a minor nuisance. That observation holds as much in economics as in medicine. And indeed there may be a reliable way to prevent the disease from emerging in the first place. The critics do not engage with those who argue that the free society's best defense against the asset price inflation disease is to follow John Stuart Mill's prescription of making sure that "the monkey wrench does not get into the machinery of money."

Instead, the practitioners of "positive economics" demonstrate an aversion to analyzing a disease which cannot be readily identified by scientific measurement. Yes, the disease corrupts market signals, but by how much, where, and in what time sequence? Some empiricists might acknowledge the defining characteristic of the disease as "monetary disequilibrium empowering forces of irrationality in global markets." They might agree that the flawed mental processes described by the behavioral finance theorists become apparent at such times. But they despair at the lack of testable propositions.

Mis-Measuring Increases in Asset Prices

The critics who reject the usefulness of studying asset price inflation have no such qualms with respect to its twin disease — goods and services inflation.
After all, we can depend on the official statisticians!

In the present monetary inflation, a cumulative large decline in equilibrium real wages across much of the labor market, together with state-of-the-art "hedonic accounting" (adjusting prices downward to take account of quality improvements), has meant that the official CPI has climbed by "only" 11 percent since the peak of the last business cycle (December 2007). The severity of the asset price inflation disease makes it implausible that the official statisticians are correctly measuring the force of monetary inflation in goods and services markets.

What Is the Final Stage?

A progression of the asset price inflation disease into its final stage (general speculative bust and recession) would mean the end of monetary inflation, and with it the end of inflation in goods and services markets. What could bring about this transition? Most plausibly, it will be a splintering of the rose-colored spectacles worn by investors in the still-hot speculative markets, rather than Janet Yellen's much-heralded "lift-off" (raising official short-term rates from zero). What could cause the splintering?

Perhaps it will be a sudden rush for the exit in the high-yield credit markets, provoked by alarm at losses on energy-related and emerging market paper. Or financial system stress could jump as a consequence of the steep falls of speculative temperature already occurring (including in China and commodities). Perhaps there will be a run from those European banks and credit funds which are up to their necks in Spanish and Italian government bonds. Or the Chinese currency could tumble as Beijing pulls back its support and the one-trillion-US-dollar carry trade into the People's Republic implodes. Perhaps scandal and shock, accompanied by economic disappointment, will break the fantasy spell regarding US corporate earnings, especially in Silicon Valley. As the late French President Mitterrand used to say, "give time to Time!"
Dan Price, the CEO of Gravity Payments, took a $930,000 pay cut to raise the minimum salary of his employees to $70,000. The plan was announced in April 2015 and set to be phased in over the course of three years. Both his employees (especially the ones with a larger pay increase) and proponents of income equality celebrated the move. It garnered considerable publicity and rippled through social media, with mostly positive but some negative reactions.

In the New York Times piece that reported on people's initial reactions in April, Rush Limbaugh was quoted calling it "pure, unadulterated socialism," and an economist from the American Enterprise Institute said, "A lot of people have the sense that this may work for this one firm, but it is nothing we should take general lessons from." Another economist, from Stanford University's Hoover Institution, took a different stance and predicted, "This is going to be great for his business."

As usual, most of the praise and uproar from the respective proponents and critics is either wrong or right for the wrong reasons (if there's not already a name for this phenomenon, there should be). But we can say this even without the benefit of hindsight, which has shown that the CEO's actions have had some negative consequences he did not anticipate.

The Strategy Backfires

The New York Times published another piece about three and a half months later, reporting turmoil and struggles for the Seattle-based firm, directly and indirectly related to the new pay structure. Some clients of Gravity Payments left because they viewed the action as a political move or because they expected fee increases as a result. But the number of new clients has more than offset those that sought payment processing services elsewhere, meaning Gravity Payments had to hire more employees, who now come in at a minimum of $70,000 a head.

The firm's real problems are internal, though. According to the New York Times article, "Two of Mr.
Price's most valued employees quit, spurred in part by their view that it was unfair to double the pay of some new hires while the longest-serving staff members got small or no raises."

Also, Dan Price's brother, Lucas Price, has sued over alleged violations of his rights and benefits as a minority shareholder of Gravity Payments. Lucas also accused Dan of taking excessive CEO pay (beyond the stipulations of their contract), which was $1 million before Dan's voluntary pay decrease. So one major reason for the charitable restructuring of pay may have been to get public opinion on Dan's side — quite the ignoble scheme for a seemingly noble move.

How Does Economic Theory Tie In?

It's tempting to pull in arguments against minimum wage legislation for this case, but the ostensibly applicable claims from economic theory don't actually apply here. Dan Price voluntarily increased his employees' pay. All of his employees are still earning no more than their expected discounted marginal revenue product. It's just that some of their "product" may be non-monetary or "psychic" for the CEO, in the form of the good feeling Mr. Price gets from charitable giving, or the reputation Mr. Price wants as a CEO. The benefit he would get by having public opinion on his side and against his brother in their dispute would also qualify as psychic profit.

Entrepreneurs hire laborers on the margin, meaning they make decisions about hiring an additional laborer based on what that laborer would be paid and how much that laborer would help produce output and therefore generate revenue from the sale of output. Because of this, a laborer's discounted marginal revenue product is the maximum any entrepreneur is willing to pay for a given laborer ("discounted" because there is a time difference between the laborer's pay and the sale of output).

The situation with Gravity Payments requires that we distinguish between factor payments and charitable gifts.
Suppose Dan Price hires a laborer at $70,000/year, but the laborer only brings in $50,000/year of increased revenue for the firm. The extra $20,000 is, in effect, a gift: it means Mr. Price values that worker having $20,000 more per year at least as much as keeping the money himself, whether in the name of income equality, a happy-workers-are-productive-workers philosophy, or just plain charity.

The situation is the same with any sort of charitable gift. If A donates $100 to B, it means A prefers B having the $100 (and not A) to A having the $100 (and not B). Charity isn't "socialism" (per Rush Limbaugh); it's people doing what they want to do with their own money, i.e., capitalism.

This, then, is the extent of the economics of the situation. It starts and ends with the coordinated preferences and expectations of the entrepreneur and the workers. On the other hand, there's much to be said about Mr. Price's business strategy and its social, psychological, and organizational implications.

Fairness and Equal Pay

Workers prefer to be treated fairly, which doesn't necessarily mean they all want the same pay. Maisey McMaster, a former financial planner for Gravity Payments, argued against the move and ended up leaving her job because of it. In her words, "He gave raises to people who have the least skills and are the least equipped to do the job, and the ones who were taking on the most didn't get much of a bump."

Mr. Price also lost Grant Moran, a talented web developer, who felt the new pay structure wasn't fair: "Now the people who were just clocking in and out were making the same as me." He also said, "It shackles high performers to less motivated team members."

Many of the employees didn't like their pay information being open to the public eye, especially with all of the politically motivated attention. Other employees said they didn't feel they deserved their new, higher pay.
One even admitted, "I didn't earn it."

So it seems that even the workers of a progressive and trendy firm (it is Seattle-based, and many of its clients are part of the ultra-hip Pike Place Market) don't equate "fairness" with "equal pay" — in fact, the new pay structure has spawned envy, guilt, and ill feelings toward their boss and coworkers. But this isn't some inexorable law of human behavior. We could easily imagine a situation where workers do demand equal pay and collectively bargain for such a result. The economics of that type of situation would be different from the one at Gravity Payments, though (see Man, Economy, and State, chap. 10).

Economic theory pertaining to minimum wage legislation, unions, or socialism can't be applied here directly. We can, however, step outside the scope of economics and take the social, psychological, and organizational implications of an entrepreneur's voluntarily chosen minimum salary (as at Gravity Payments) and reasonably apply them to government-mandated minimum wage and equal pay schemes. Imagine millions of people thinking the same things as Maisey McMaster and Grant Moran, who felt unfairly treated under the new pay structure. Or even more people saying, "I don't earn my wage."

If these sorts of negative consequences arise from a voluntary equal pay scheme, I don't think we can expect anything better from an involuntary one imposed at the national level.