Finding Balance in U.S. Foreign Policy

Finding the “right” foreign policy mix is never easy for the typical nation.  Each must weigh the national interest, resource availability, power relations and hierarchy, ethics, and the global common good, to name a few considerations.  For America, however, these questions are in some ways far harder to answer, and the decisions far more complex, than those other nations face.  For America is not the typical nation.

Indeed, since the fall of the Soviet Union in 1991, America has found itself in the somewhat awkward position of being the world’s last remaining superpower.  Even today, despite the rise of China and other emerging nations, the United States retains by far the world’s largest economy in absolute terms, and a military might (and budget) that dwarfs any other.  This has undoubtedly brought multiple benefits – vast geopolitical influence, favorable terms of trade, cultural hegemony, and so on.  Sitting at the top of the world system, however, has also invited intense scrutiny and criticism of America’s behavior.  For in return for the benefits of unipolar hegemony, the world expects certain services: international security (especially for trade routes), respect for sovereignty, economic and non-economic aid, and respect for and maintenance of international institutions, agreements, and norms.  This American-led and American-arranged world system has existed since the end of World War II, and has been remarkably successful.  Since 1945, no global wars have occurred, inter-state peace has (for the most part) been maintained, and new phases of globalization and free trade have flourished, benefiting both America and the other participants in the world system.

With the collapse of its political, ideological, and economic rival in 1991, however, and with no other country anywhere near matching American economic and military clout, the last two decades have placed new strains on this American-led world system.  Whereas the rivalry between the Soviet Union and the U.S. once helped each to restrain the other, there was now virtually no one to restrain the one remaining superpower.  Although America has, in my opinion, used its unique position to benefit the world on net (not only since 1991 but since the world system came into being in 1945), it has still struggled to strike a good balance between restraint and the need to be assertive on the world stage.

Take, for example, America’s behavior following 9/11.  Up until that terrible event, American military spending and aggressiveness had been on the decline.  This was due in part to the Cold War’s end, which meant that much related spending could be unwound.  Additionally, finding itself the new unipole, and perhaps not wishing to abuse that status, America decided to ease its interventionist impulses (UN-related mishaps in Somalia in 1993 didn’t help).  No doubt the particular character of America’s government at the time – a domestically focused Democratic president and budget-conscious Republican Congresses – also helped shape the retrenchment of the 1990s.  9/11 quickly changed all of that, and an “offense is the best defense” strategy soon prevailed, shaped in no small part by the neoconservative influence within the Bush Administration.  Defense spending surged, and because the 9/11 attacks came from non-state actors who tended to reside in states that “harbored” them, a new focus emerged on militarily intervening in hostile and/or unstable states.  The resultant wars in Afghanistan and (especially) Iraq have since provoked furious debate over their necessity and “legality” within the world system.  Was America fulfilling its duty to maintain long-term global stability and acting on its sovereign right to defend itself?  Or was this imperial overstretch, with America attempting to impose its will on others?

Perhaps the answer is both, and when one looks at the considerations facing policymakers at the time, in some ways America’s actions appear quite rational.  First, Afghanistan was harboring al Qaeda (the group responsible for 9/11), which needed to be rooted out to prevent future attacks.  Second, to further prevent terrorist attacks, some “nation-building” in Afghanistan was required.  Both aims could be said to benefit the US and the world alike (at least in theory – the actual results are a whole other matter).  Following 9/11, Afghanistan therefore seemed like a reasonable place to intervene.

But why Iraq?  After all, Iraq had no direct connections with al Qaeda, and it certainly wasn’t the only hostile state in the world at the time (e.g. Iran, North Korea).  The original rationale – that several nations believed Iraq to have WMDs – doesn’t exactly explain why the U.S. decided to intervene there alone, as other hostile nations already had WMDs or were attempting to acquire them.  Nor does the need to bring “democracy” and “freedom” – again, if that were really a main consideration, then the United States should have intervened in a significant portion of the world that lacks both.  No, I think the real reasons are a blend of the following:

a) America badly needed to (or thought it needed to) project power into the Middle East to show that it was in control of the world system following 9/11.  Why the Middle East?  Because it is, generally speaking, the region where Islamic terrorism originated and where the ideology is fomented (though certainly not the only such region).  In this way, the invasion of Iraq was actually a quasi-response to 9/11.

b) Iraq had repeatedly been uncooperative with UN weapons inspections throughout the 1990s, and that – combined with past use of WMDs, such as chemical weapons in the 1980s – violated international norms.  In a way, America was punishing noncompliance with international norms and a disregard for international security, though ironically it did so partly by violating established international norms itself (e.g. proceeding with the invasion of Iraq without UN authorization).  The punishment could also be read as a message to other regimes (like Iran and North Korea) to get their act together – and this arguably worked somewhat as intended, as the Libyan regime, at least, gave up its WMD pursuits following the 2003 invasion.

c) Plans for regime change had long been discussed and advocated within both the Clinton and Bush administrations, as Saddam’s regime was widely perceived as a threat.  The failure to remove him from power during the Gulf War was seen as a weakness of the United States.  Since 9/11 was also seen as a moment of weakness, this looked like the perfect time to demonstrate that America had power and would “get the job done.”  Rahm Emanuel’s “never let a crisis go to waste” line naturally comes to mind.

d) Rightly or wrongly, Iraq (along with Afghanistan) was seen as an opportunity to spread democracy and other American values, arguably to help better maintain world security (refer to the democratic peace theory).

Clearly, these are rational reasons for America’s turn toward interventionism, which can arguably be said to serve both the maintenance of the global system and American interests (the two are not mutually exclusive).  Then again, many of America’s actions (in Iraq especially) seemed unrestrained and negligent.  In the post-invasion phase, for example, Paul Bremer made serious errors as head of the Coalition Provisional Authority (i.e., effectively the temporary governor of Iraq).  He unilaterally abolished the Iraqi military, initiated a “de-Ba’athification” program to root out Ba’ath Party influence, and failed to hand sovereignty and self-governance back to the Iraqis quickly.  These quasi-totalitarian actions (and inactions) not only looked like overreach; they directly contributed to the insurgency and instability of the years following the invasion, instability that threatened to spill over into the wider international arena.  An even broader question is whether Iraqi culture is compatible with democracy at all.  Was it being imposed?  Was it right for them?

I think America had the maintenance of the world system, self-interest, and, perhaps, some quasi-imperialistic ambitions in mind when it came to Iraq.  By imperialistic, I don’t necessarily mean creating a colony or subjugating the Iraqis to American rule in the long run.  I do mean an attempt to impose American-style governance and values in the naive belief that this could only enhance American and world security.  And in the post-9/11 environment, we felt justified in being so bold, seeing it as necessary to preserve our image as leader of the world system.  Imposing American values and maintaining a secure international order (including America’s position at the top) became one and the same to us.  Regardless of our intentions, however, it is unclear whether the intervention in Iraq can ultimately be said to have benefited either America or the world.  While a hostile regime was removed, its replacement by a “democratic” Shi’a government leaves Iraq vulnerable to Iranian influence, and Saddam’s fall removed a check on Iranian behavior.  The conflict was very costly in lives and treasure.  Additionally, our negligent actions and inactions post-invasion have, ironically, arguably made Iraq a new breeding ground for terrorists, leaving us worse off security-wise.  Only time will tell what becomes of Iraq.

The shift away from overreach began during the remainder of the Bush administration and has continued into the Obama presidency, the various troop “surges” notwithstanding.  Sovereignty was slowly returned to the Iraqis, and the recognition that costly military intervention must be followed by equally costly post-invasion nation-building (lest the world system be left more threatened than before the invasion) led the United States to adopt a policy of “selective engagement” during the Obama years, with as little military intervention as possible.  The problem is that the pendulum risks swinging much too far toward non-interventionism and non-assertiveness, depriving the world of proper American leadership and of the fulfillment of America’s hegemonic duty as the implicit maintainer of world security.  Inaction following Syria’s breach of Obama’s vague “red line” and an unwillingness to keep any military presence in Iraq or Afghanistan (despite continued internal security concerns) undermine the United States’ credibility and its position as hegemon, and threaten the security of the world system it maintains.  In some ways, I think too much emphasis is being placed on diplomacy.

Like in the 1990s, military spending and aggressiveness are again on the decline.  But surely the pendulum will swing back, perhaps this time to help address the rise of China, or simply to correct for too much retrenchment since the late 2000s.  Regardless of America’s actions, China’s rise will surely disrupt or even upend the world system that has existed since the end of World War II.  The considerations America has faced during its “unipolar moment” since the 1990s have been uniquely complex.  Now, with the structure of the entire world system shifting, striking a good foreign policy balance will be more difficult than ever before.


On America’s “Great Stagnation”

This post briefly discusses the historically weak growth since the end of the 2007-2009 “Great Recession” (though the “stagnation” can also be defined to reach back into the 2000s, or “aughts”, or even further).  It argues that, though we may have reason to be alarmed at slower long-run economic growth, by many measures living standards have improved at a rapid rate and will continue to do so for the foreseeable future.


Since the end of the last business cycle trough in June 2009, countless observers have noted – and lamented – the historically anemic growth rate of America’s economy.  In the 19 quarters since the beginning of the recovery, GDP growth has averaged around 2% annually – well below the 4% average for recoveries after 1960, and barely enough to generate the jobs needed to absorb entrants into the labor market.  Indeed, in the first quarter of 2014 the economy logged a growth rate (at a seasonally adjusted annual rate) of -1%; in other words, it registered an actual contraction, the first quarter since 2011 to do so.  Although the particular severity of Q1’s stagnation is likely temporary, it nonetheless does well to highlight the unique sluggishness that has characterized this recovery from the beginning.  Also highlighting the recovery’s weakness is the economy’s failure to quickly return to potential output, reflected in the continued existence of a large output gap (see charts 1 & 2).

[Charts 1 & 2: real GDP vs. potential GDP, showing the persistent output gap]
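As a concrete illustration of the arithmetic behind those headline figures, here is a minimal Python sketch of how quarterly GDP levels translate into annualized growth rates.  The GDP levels below are made-up placeholders, not actual BEA data:

```python
# A sketch of how quarterly real GDP levels map to the annualized
# growth rates cited above. The GDP levels here are hypothetical
# placeholders, not actual BEA figures.

def saar_growth(prev_level, curr_level):
    """Quarter-over-quarter growth compounded to a seasonally
    adjusted annual rate (SAAR), in percent."""
    return ((curr_level / prev_level) ** 4 - 1) * 100

def avg_annualized_growth(levels):
    """Average annualized growth over a run of quarterly levels."""
    quarters = len(levels) - 1
    return ((levels[-1] / levels[0]) ** (4 / quarters) - 1) * 100

# Hypothetical quarterly real GDP, billions of chained dollars
gdp = [15000, 15075, 15150, 15226, 15302]

print(f"Latest quarter (SAAR): {saar_growth(gdp[-2], gdp[-1]):+.1f}%")
print(f"Average over recovery: {avg_annualized_growth(gdp):+.1f}%")
```

Note that because of the compounding, a single bad quarter like Q1 2014’s -1% SAAR corresponds to a dip in the actual level of only about a quarter of a percent.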

It is true that economic growth remains far from normal – especially considering the depth of the preceding recession, since deep recessions are usually followed by sharp “bounceback” recoveries (as pent-up consumer and investor demand is unleashed).

However, there are a couple of things to keep in mind:

1) This was not a “normal” recession.  Normally, recessions are sparked by mild shocks to aggregate demand or aggregate supply, oftentimes instigated by contractionary monetary policy.  This time, however, there was an extreme shock to aggregate demand: a plummet in housing prices pushed down household consumption (the “wealth effect”), and the deterioration of financial institutions’ balance sheets froze credit markets.  “Balance-sheet” recessions like these are typically severe and have long-lasting effects.  Growth tends to be much weaker in the decade following a financial crisis than after a normal recession, as households and institutions “deleverage” to repair their balance sheets.  Since the United States had not, until now, experienced a true financial crisis since the Great Depression, this sluggish recovery can be considered historically unique.

2) Growth and potential growth have been slowing for decades.  When one looks at real GDP and potential GDP over long periods (see charts below), it becomes clear that long-term growth has been slowing for decades.  Especially recently, each successive recovery has been weaker than the one before it.  Although it only shows data through 2011, the second chart below clearly demonstrates this pattern.

[Charts: long-run real vs. potential GDP, and successively weaker recoveries (data through 2011)]

3) As the population continues to age and retire, sluggish growth is only to be expected – unless productivity growth accelerates.  An economy essentially grows for two reasons: the population/labor force increases and/or labor productivity (output per hour or, more generally, the amount of output produced with a given set of inputs) grows.  The latter is especially important in boosting living standards, as more efficient production allows more income to be distributed and goods and services to be produced at lower cost.  Historically, especially during the “golden age of capitalism” from the 1940s to the 1970s, the economy benefited from both labor force growth and productivity growth.  Beginning in the 1970s, however, the second factor – productivity growth – registered a marked slowdown, even as the labor force continued to expand (especially with the increased participation of women).  The reasons for this slowdown are unclear.  Was US inflation distorting incentives and resource allocation?  Were newer technological waves delivering less of an impact than earlier ones?  And were these changes driven more by changes in the accumulation of capital stock or by total factor productivity (TFP)?  Regardless, the productivity slowdown has continued to the present day, interrupted only by a brief revival in the late 90s and early 2000s (see the chart below, which I made using data from the Bureau of Labor Statistics; the data represent quarterly % changes at annualized rates, with labor productivity defined as output per hour).

[Chart: quarterly labor productivity growth (annualized % changes), BLS data]
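To put rough numbers on this two-source story, here is a minimal sketch of how hours growth and productivity growth combine into output growth.  The growth rates below are illustrative assumptions, not BLS estimates:

```python
# Sketch of the two-source growth decomposition described above:
# output = hours worked x output per hour, so output growth is
# (approximately) hours growth plus labor-productivity growth.
# The growth rates below are illustrative, not BLS estimates.

hours_growth = 0.005          # 0.5% annual growth in total hours worked
productivity_growth = 0.015   # 1.5% annual growth in output per hour

# Exact relationship: (1 + g_Y) = (1 + g_H) * (1 + g_P)
output_growth = (1 + hours_growth) * (1 + productivity_growth) - 1
print(f"Exact output growth: {output_growth:.4%}")

# The usual approximation for small rates: g_Y ~ g_H + g_P
print(f"Approximate growth:  {hours_growth + productivity_growth:.4%}")
```

The approximation shows why a slowdown in either source feeds nearly one-for-one into slower output growth: if hours growth fades as the boomers retire, only faster productivity growth can make up the difference.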

All of this points to a couple of things.  Comparisons between this recovery and past ones should be taken with a grain of salt, because

a) it follows a historically unique financial crisis, unlike other recoveries, and thus can be expected to be slow in the short term;

b) the growth trajectory has long been slowing, making many recoveries naturally more sluggish than those that preceded them – suggesting that, even without the financial crisis, stagnation could have been expected anyway.

While some of this decline in long-term growth appears natural (such as the decline in labor force participation due to an aging population), other parts – such as the productivity element – may or may not be.  Productivity growth can arguably be influenced by deliberate policy and private-sector choices far more than labor force participation can (at least given that most older baby boomers will have to retire at some point soon).  Should the public and/or private sectors, for instance, be investing more in public and private capital?  Maybe.  But the urgency of that question depends on how much the sluggishness is translating into a stagnation in actual living standards.

Certainly, there are good arguments that American living standards have shown signs of stagnation as of late (and not just following the 2007-2009 recession).  For instance, as the chart below demonstrates, median household income (the income of the household at the exact midpoint of the distribution of all household incomes) has registered virtually no net growth since the late 1980s.

[Chart: real median household income, showing little net growth since the late 1980s]

Other trends are worrying as well.  Poverty rates as defined by the Census Bureau have made almost no net progress since the 1970s, and the prevalence of employer-provided health insurance and retirement plans (think defined-benefit pensions) has eroded (at least until recently).  Combine that with worrying increases in health care costs and tertiary education tuition, and the typical American household has indeed seemingly experienced a “stagnation” for a fairly long period of time.

However, despite all of this negative “evidence”, I would still contend that living standards have registered marvelous improvements, and will continue to do so.  First of all, GDP and productivity growth figures do not account for a crucial, often under-appreciated aspect of capitalism: the long-run, rapid improvement in product quality and capability.  While such figures may capture value added in the production process, they cannot completely account for improvements in product capability and the additional satisfaction these new capabilities give consumers.  Think about cars, for example.  Economic statistics may reflect the total output of cars, the efficiency of their production, and so on, but they often ignore how much the typical car has changed.  Many cars are now equipped with sensory technology that makes driving smoother and more comfortable.  Anti-lock brakes, air conditioning, and even GPS systems, all once reserved for those with the most cash, are becoming increasingly widespread and standardized within the industry, improving the driving experience of millions of consumers.

Additionally, I think too much emphasis is placed on incomes, while it is often ignored how dramatically consumer costs have fallen in many industries.  For example, according to statisticbrain.com, the average price per megabyte of RAM decreased from approximately $411 million in 1957 to less than six-thousandths of a dollar in 2013.  This has greatly increased the purchasing power of the typical consumer, and the pattern has been replicated in many other sectors of the economy.
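Taking those statisticbrain.com figures at face value, a quick back-of-the-envelope calculation shows just how relentless the implied annual price decline has been:

```python
# Quick arithmetic on the RAM price figures cited above (taken at face
# value from statisticbrain.com): the implied compound annual rate of
# change in price per megabyte, 1957-2013.

price_1957 = 411_000_000   # dollars per MB, 1957
price_2013 = 0.006         # dollars per MB, 2013 (< six-thousandths)
years = 2013 - 1957

annual_change = (price_2013 / price_1957) ** (1 / years) - 1
print(f"Implied compound annual price change: {annual_change:.1%}")
# Roughly -36% per year, sustained for over half a century.
```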

It is true that some very important industries that affect the middle class – namely healthcare and education – have shown rising costs, and that is a concern that should be addressed.  But even here, the increases largely reflect improvements in quality.  New (albeit costly) technologies and healthcare procedures, for example, have given consumers innovative, state-of-the-art choices, greatly increasing people’s quality of life – something the statistics cannot ever fully reflect.

Overall, while I do think America has entered a “Great Stagnation” (not just in the short run but over the past couple of decades) in terms of economic growth, I do not think this should be assumed to be entirely a bad thing.  Indeed, the term is somewhat misleading – despite slower growth, many elements of living standards (which I have only briefly touched upon) continue to make rapid progress, even if other components (e.g. household income, health insurance coverage) have stalled.  While this is certainly no excuse for complacency – we would do well to figure out ways to sustainably boost long-run growth – it is reason to think twice about repeated pronouncements of a supposed “decline” in American affluence and its middle class.  The trends are a bit more complex than that.