Crisis Government

Philip Wallach

Summer 2020

Henry Kissinger once quipped: "There cannot be a crisis next week. My schedule is already full." That was back in the 1960s, when America's federal government was busy handling a dozen different state-building projects at home and abroad, when responding to crises required a break from this ongoing work. We still naturally think of crises that way. But in our own century, Washington appears to keep its calendar clear for crises — and to spend the rest of its time just waiting for them to occur. To be sure, our national politics offers plenty of sound and fury (to say nothing of idiocy), but during the non-crisis moments of the 21st century, this has signified very little. It is in times of crisis, for better and worse, that we find the federal government can do previously unimaginable things with remarkable alacrity.

Yet living through the out-years of the late 2010s has left us with the impression that American government is completely stuck. Year after year, we seem less and less able to change our laws. Any action that has taken place is executive-initiated, extensively litigated, and liable to be reversed by the next president. To fund the federal government's existing activities, we've held annual mock crises — though it has become more and more difficult to take these very seriously. And over and over again, we've been told that polarization and partisan enmity are the fundamental defining conditions of contemporary American life, that little short of an actual civil war can shake our national institutions from their torpor.

A global pandemic, it turns out, has served this purpose quite nicely — though, to be fair, our responses to the September 11th attacks in 2001 and the global financial crisis in 2008 were not marked by paralysis, either. It seems our system in the 21st century has heeded the wisdom of Barack Obama's chief of staff, Rahm Emanuel: "You never want a serious crisis to go to waste. And what I mean by that [is] it's an opportunity to do things that you think you could not do before." Indeed, in the out-years, our sense of what we can get done has been so anemic that crisis has been the health of the state.

Nonetheless, we still view crisis times as anomalous, so professional observers of American political life have tended to characterize our era with reference to the defining features of the non-crisis years. But that no longer offers an accurate picture of our politics. When we step back and regard 21st-century American politics, we ought to see that the crisis responses vastly exceed the "normal" actions in terms of importance. This change of perspective compels us to reject the idea that polarization is the defining feature of our era, and we must reassess our understanding of the American political system's capacities and infirmities accordingly. The overall picture is still a negative one, of course, but for reasons that differ from those we are used to hearing about.

IS CRISIS GOVERNMENT NEW?

"Events are in the saddle and ride mankind" is an old cliché, so before we consider how a crisis-dominated view of 21st-century American government should change our thinking, we must first establish whether our federal government's patterns of action really have changed. The short answer is that crisis response has always been a major component of federal activity. But during the second half of the 20th century, crisis responses were only a small portion of the federal government's overall portfolio of activities and enactments.

When we look at major acts of Congress in those decades — such as those included in political scientist David Mayhew's list of significant laws — we see that the federal government was devoted mainly to developing the welfare, civil-rights, and regulatory state, as well as building public infrastructure. Those projects were most active in the 1960s and 1970s, but they continued into the final decades of the 20th century. And they did not suddenly cease when the Twin Towers fell — the 111th Congress, coinciding with the first two years of Barack Obama's administration, stands out as a throwback to that earlier era of ambitious program-building.

But Congress has shown little ability to keep existing domestic programs up to date in light of new developments and an aging population — let alone confidently chart new paths. The "normal" way the federal government has dealt with most non-urgent problems in the 21st century is to have executive-branch agencies cobble together disparate existing authorities designed for other purposes, leaving Congress to complain after the fact about the awkwardness of these improvisations and the judiciary to determine whether the agency's behavior was excessively outlandish or arbitrary.

Meanwhile, Congress's activity has become increasingly devoted to two kinds of crisis response. The first, most straightforward type occurs when some dramatic shock to the nation's system prompts a significant change to the law. To be sure, America has arguably received more such shocks in this first fifth of the 21st century than it did during the second half of the 20th, but part of what has happened is that Congress has come to feel more responsible for producing dramatic action in order to justify its activity. There was no major legislative response to the Cuban Missile Crisis of 1962, for instance, whereas Congress responded to the September 11th attacks by passing a spate of new laws.

The second type of crisis that has come to occupy Congress's attention is less obvious. In fact, one could dispute whether it deserves to be called a "crisis" at all. In these cases, some social condition becomes politically urgent enough to elicit the rhetoric of emergency. We have had a crack-cocaine crisis, which was followed by an opioid crisis, as well as a long-running debt crisis that is largely of Congress's own making.

Whereas Mayhew's list of Congress's most important laws, by my count, averaged roughly one crisis response per Congress from the 1960s through the 1990s, in the 2000s and 2010s, there were around four such responses per Congress. Taken as a portion of all laws in the dataset, crisis responses comprised less than a tenth from 1951 to 2000, but have been about a third from 2001 to 2020. Classifying laws as responses to crises involves some subjectivity, of course, but a change of this general magnitude is unmistakable.

To be sure, one could take the Pollyanna approach and attempt to justify the change in the objects of major national legislation by arguing that we of the 21st century have inherited a fully built-out ship of state that no longer needs expanding or upgrading; it simply needs to be steered through choppy waters. According to this way of thinking, the legislative foundation necessary to protect individuals from life's mundane hazards has already been laid. The predominance of crisis response in recent years, therefore, reflects a kind of natural maturation of the state in which only the most unpredictable developments require novel legislative action.

Yet such a view neglects the ways in which crises result at least in part from policy failures that are preventable. In each of the big three crises of the 21st century, actors within and without the federal government warned of dire risks — and were promptly ignored. Of course, information is hard to filter, and each of those events was truly extraordinary; fortuna will always shape a nation's life. The "mature state" defense is more clearly refuted, then, by two other kinds of crises that have in recent years elicited major federal responses: crises of fiscal policy, and responses to natural disasters. Viewed in light of legislative action, these two types of events are more closely related than they seem. And together, our responses to them do not inspire confidence.

First, since the 1980s, America has experienced a notable increase in fiscal crises generated by Congress's own taxing and spending choices. In the 1990s and 2010s, these crises were accompanied by dramatic political showdowns in which failure to reach an accommodation threatened to produce either a default on the obligations of the United States (thankfully averted) or a protracted government shutdown (which we have witnessed a handful of times in the last quarter-century, compared to zero in the country's previous history). We appear to have created an environment in which every attempt to allocate money feels like a kind of emergency. There is undoubtedly something phony in this feeling — a point to which we will return later — but these crises, and our reactions to them, hardly seem the mark of a mature political class calmly running the machinery of government.

For all the consternation over setting spending levels, we have also become much more accustomed to departing from our choices because of other "emergencies." Prominent among these are natural disasters, many of which are eminently predictable — at least in the sense of our being able to anticipate that costs will be incurred.

The federal role in disaster aid has come a long way over the years. Whereas that role was once limited to designating the American Red Cross as the government's official agent in coordinating private aid, Congress set up a permanent structure for addressing disasters beginning with the Disaster Relief Act of 1950, which requires governors to ask the president to designate federal disaster areas. That framework was amended and gradually built out over the following decades, with President Jimmy Carter creating the Federal Emergency Management Agency in 1979. But it was only with the Stafford Disaster Relief and Emergency Assistance Act of 1988 that the modern system rounded into form. As late as 1993, in the wake of the Great Flood of the Mississippi and Missouri Rivers that year, "King of Pork" Senator Robert Byrd could still insist that "disasters are not spending opportunities."

Since then, however, federal disaster spending has grown considerably, often in ways that make it seem that spending opportunities are precisely what disasters have come to represent. Whereas no 20th-century relief bill passed in response to a hurricane had ever been counted as a significant law, three have been counted as such in the 21st century ($14 billion in 2004, $29 billion in 2005, and roughly $140 billion in 2017-2018).

And the pattern is evident beyond these largest bills as well. As Justin Bogie of the Heritage Foundation has noted, the number of disaster declarations has leapt from 28 per year during the Reagan administration to around 130 per year in the 21st century, with spending rising significantly as well. Some of this rise represents a simple re-allocation of responsibility from state and local governments to the federal government, but some seems more about repackaging what is actually long-term investment as emergency response. For instance, after the devastating hurricanes of 2017, Congress appropriated $15 billion to the Army Corps of Engineers, compared to its regular annual appropriation of less than $7 billion for 2018. Bogie argues that what is really happening is that lawmakers are using disaster relief to sponsor local infrastructure projects, thereby circumventing the self-imposed earmark moratorium in place since 2011.

THE IRREGULAR AS REGULAR

The trend toward such irregular appropriation covers much broader territory than responses to natural disasters. In recent years, for instance, supplemental spending (determined outside of the normal appropriations calendar) has made a comeback. Supplementals were prevalent in the 1970s, when they sometimes constituted as much as 12% of total spending. But back then, such spending was driven by causes largely unfamiliar to us today: federal pay supplementals, appropriations delayed because Congress was waiting for program authorizations (we no longer bother with such quaint propriety), and funding for newly legislated programs. In the 1980s, supplementals fell to just 1.1% of total appropriations, and in the 1990s, they were less than 1%, with the Persian Gulf War occasioning the only large bill. From 2001 to 2019, however, supplementals averaged 2.3% of (much expanded) federal spending, despite five years in the 2010s in which there was essentially zero supplemental spending.

Those numbers don't give anything like a full picture of Congress's irregular spending, though. In the realm of defense appropriations, non-base spending has ballooned, especially through Overseas Contingency Operations (OCO) — a spending category that has become infamous in budget circles. From 1970 to 2000, non-base spending averaged around 2% of all defense spending. Over the last two decades, this number has soared to around 20%. Only about half of that spending can be justly classified as supporting temporary operations, according to the Congressional Budget Office, with the other half being used for long-term, predictable operational expenses that Congress could budget for in a normal manner. In fact, in recent years, one of the primary reasons for pushing spending into OCO instead of base defense spending has been to evade the budget caps, which exempted OCO from their control.

One can understand this lurch toward routine use of irregular spending as a direct adaptation to the broken budgeting process. Especially in the 2010s, legislated caps did not match legislators' actual spending preferences. Lawmakers coped with this discrepancy by waging a continuous two-front war in which they simultaneously tried to raise the caps (and waive pay-as-you-go requirements) as an official matter while in practice working to subvert their importance by exploiting every available loophole. In other words, the system as designed encouraged the use of "emergencies" to close the gap between legislated spending levels and what Congress actually wanted to spend.

Indeed, because the law is written to exempt formally designated "emergency spending" from budget rules (basically without limitation), "emergencies" have become the pre-eminent magic asterisk of our time. In practice, this has enabled Congress to make huge expenditures completely outside of any planning process.

This includes the banner headlines of the age of emergency government. When a bipartisan Congress quickly approved $40 billion in aid in response to the September 11th attacks, lawmakers were already pushing the frontier of what the federal government could do in a crisis. But 2008 was the watershed year in which Congress proved willing to vote through emergency spending in amounts previously unimaginable: more than $150 billion in February 2008 for stimulus, $300 billion in July 2008 for Fannie Mae and Freddie Mac, $700 billion in October 2008 for the Troubled Asset Relief Program (TARP), and $787 billion in February 2009 for stimulus. Much of the money spent in response to the financial crisis was used to further priorities that had little to do with overcoming the emergency.

By way of comparison, the federal government's total outlays during 1944 were around $1 trillion in 2008 dollars. Turning the tide against the Nazis and the Japanese was apparently much cheaper than providing the emergency funds necessary to combat the 2008 recession.

Our fight against the economic devastation of the coronavirus — which is far from over — has already made the response to those first two crises of the 21st century look like child's play, at least fiscally. First came the warm-up acts — an $8.3 billion emergency supplemental appropriation, signed on March 6, and the Families First Coronavirus Response Act, signed on March 18, which has been scored as including $95 billion in new outlays and $94 billion in decreased revenues. Then came the mammoth Coronavirus Aid, Relief, and Economic Security (CARES) Act on March 27, providing more than $2 trillion in federal funds, with a scored budgetary impact of around $1.7 trillion over the next decade — including $326 billion in extra discretionary spending, $988 billion in extra mandatory spending, and a $408 billion decrease in revenues. That was a tough act to follow, but less than a month later, the president was signing the $484 billion Paycheck Protection Program and Health Care Enhancement Act, deepening the previous law's commitments to businesses and hospitals. We may yet see more aid packages of comparable size, including a possible bailout of long-suffering state pension funds now posturing as victims of the pandemic.

Astonishingly, these figures don't even include what has turned out to be the largest emergency resource: the balance sheet of the Federal Reserve. This went from around $900 billion in mid-2008 to $2 trillion by the end of that year, then to $3 trillion by 2012 and $4.5 trillion in 2015. It is now at over $7 trillion, likely heading up to $8 trillion or more by the end of 2020.

THE PRESIDENT'S SURPRISINGLY LIMITED ROLE

We tend to think of crises as moments when American presidents must rise to the occasion and leave their mark on history. Franklin Roosevelt is our paradigm, having taken on two of the last century's severest tests with inspiring resolve. Both during the banking crisis of 1933 and throughout the Second World War, Roosevelt was the undisputed master of government. Congress not only took its cues from him; it quite literally took bills from his administration and passed them with little or no modification. In the American historical imagination, Roosevelt single-handedly stared down the Great Depression and then steeled the nation to defeat the Axis powers of Germany, Italy, and Japan. That is a caricature, of course, but it is not entirely without basis.

By contrast, when we think of the responses to our 21st-century emergencies, the president is a much less focal figure. This resulted in part from the particular trajectory of George W. Bush's presidency. Bush was given an opportunity to follow the FDR model in the immediate wake of the September 11th terrorist attacks, with the country rallying behind him as he promised to identify the perpetrators and bring them, and anyone who gave them comfort, to justice. Bush's administration initially took the opportunity to explore the inherent powers of the presidency. But, as the nation became entangled in two complicated and indecisive wars, he failed to deliver a clear resolution to the terrorist threat.

Bush's attempts to deal with the awful wreckage of Hurricane Katrina badly disappointed the American people's expectations (which, as Patrick Roberts has argued in these pages, are very poorly aligned with the federal government's legal powers). By the end of Bush's second term, when the financial crisis arrived, he was the most unpopular president of the modern era, with barely a quarter of the country supporting him.

In fact, the iconic crisis responses in 2008 did not feature Bush at all. One such response that stands out is the meeting convened in Speaker Nancy Pelosi's conference room on September 18, at which Treasury Secretary Henry Paulson and Federal Reserve chairman Ben Bernanke impressed upon Congress's four top leaders the extent of the financial carnage. Asked what would happen if Congress failed to act, Paulson gravely replied: "God help us all." By Paulson's recollection, the congressional leaders were left "ashen-faced" and eager to empower the Treasury and the Fed to undertake a massive rescue effort. Together, they hastened to push through the $700 billion rescue bill in spite of widespread suspicion of the plan on both sides of the political spectrum.

In many ways, this has become the model for our 21st-century emergency government: congressional leaders transcending their mutual partisan animosity in order to massively empower the appointed members of the executive branch. The president is hardly absent, given his position at the top of the executive branch and his centrality in the popular media. But aside from shoving wheelbarrows of money at the problem, none of the nation's elected officials, executive or otherwise, has shown a clear desire to shoulder the basic decisions of crisis governance. After all, it was the unelected Paulson, Bernanke, and Timothy Geithner (first as president of the Federal Reserve Bank of New York, then as Treasury secretary under President Obama), not President Bush, who were most decisive in 2008 and 2009. Similarly, during the ongoing pandemic crisis, a bipartisan Congress has massively empowered Treasury Secretary Steven Mnuchin and Federal Reserve chairman Jerome Powell. The most prominent face of the crisis response has been that of Dr. Anthony Fauci, whose position as director of the National Institute of Allergy and Infectious Diseases would seem to make him an unlikely political figurehead. These are not the "constitutional dictators" that Clinton Rossiter's work on Abraham Lincoln, Woodrow Wilson, and Franklin Roosevelt prepared us for.

In each of these cases, power has flowed from Congress to the executive. It has not done so free of all constraints, of course; in each debate, legislators have recoiled at the idea that they would provide a "blank check," and they have made some effort to ensure that the final laws include reporting and oversight mechanisms, time limitations, and various requirements for private actors who benefit from public largesse. Still, the basic theme is massive delegation of discretionary power — along with enormous financial resources — to executive actors. In 2008, for instance, Congress went out of its way to give Treasury Secretary Paulson the ability to use TARP money for purposes other than providing relief for troubled assets, which he promptly acted on. And in 2020, Congress encouraged the Treasury and the Fed to use every trick in their financial-crisis playbook and more, seemingly with very few reservations.

There are good reasons to turn to the Fed during times of crisis. It is undoubtedly one of the most competent components of America's state apparatus, in part because it interfaces easily with Wall Street firms. (It recently contracted with BlackRock, the world's largest asset manager, to help stand up its bond-buying efforts, as it did in 2008.) And given the Fed's vaunted "independence," empowering the central bank seems like a way to liberate an issue from partisan politics.

For all that, however, the repeated prominence of the Fed is, frankly, a somewhat surprising development. The Fed's unusual legal status — in which its regional reserve banks are technically owned by their member commercial banks — makes it a strange vehicle for overwhelming shows of political will. Its unwavering devotion to Ph.D. economists would not seem to suit it for political high-wire acts. And there is a long tradition of populist distrust of the Fed, whose detractors call it "the creature from Jekyll Island" after the secretive 1910 conference at which the plans for the central bank were first drafted. In fact, in the wake of its financial-crisis responses, the Fed experienced a serious backlash, including a Ron Paul-led effort to "end the Fed" that resonated on the Tea Party right. President Trump was never connected to such efforts, but for the first three years of his administration, he too showed an unusual willingness to beat up on the Fed's monetary policy; at times, he even seemed determined to drive the chairman (whom Trump himself had appointed) from his job.

Yet all of these negatives have suddenly evaporated in the midst of the coronavirus crisis. The trust put in the duo of Powell and Mnuchin is near-total. It is now genuinely difficult to recall how much skepticism and outright ridicule the Fed's first round of quantitative easing occasioned. If there is a backlash coming in the 2020s, it has so far been strikingly hard to detect.

The contrast of the Fed's role with the role of the president is extreme. President Trump has declared himself a "wartime president" against an "invisible enemy," and nobody working in his government is allowed to forget who is boss. Throughout the coronavirus crisis, the president's superpower of press magnetism has remained undiminished. But for all of the attention and gestures of obeisance he commands, does anyone imagine Trump is really leading or even deliberately shaping his own administration's response? He is in fact a lagging, and faulty, indicator of policy.

One might chalk this up to this particular president's distinct lack of preparation, interest, or administrative ability, but the same pattern could be seen even in the early days of the Obama administration. Obama was, by all accounts, keenly interested in the details of the financial crisis and quick to master complicated facts. All the same, his administration never looked anything like Franklin Roosevelt's juggernaut. The massive crisis bills, as well as the Affordable Care Act that came to bear the president's name, were developed and written in Congress — often with significant Republican input, if not ultimately Republican votes. Bernanke, Geithner, Lawrence Summers (director of the National Economic Council), and even people further down the organizational chart — like Steven Rattner, nicknamed the administration's "car czar" — were all arguably as important as Obama himself in determining the shape of the administration's response. The progressive vision of president-directed government has receded, and with it the hope that the president-as-administrator can impose a degree of orderliness across all the federal government's actions.

Analyzing trends in presidential governing styles is a tricky business because we have so few cases available to us. But our historical attunement to the 20th century's debates over the imperial presidency may have caused us to miss a contrary trend in the 21st. We appear to live in an age of massively empowered non-presidential executive actors. This is an unfamiliar framing, but one we should perhaps get used to.

POLARIZATION'S SURPRISINGLY LIMITED ROLE

If it's hard to wrap our heads around emergency-but-not-president-centered government, it may be even harder to contemplate the idea that polarization has not been decisive in shaping the responses to the nation's most imposing challenges of the 21st century. We have been trained to understand polarization as the defining characteristic of our era; the political-science profession has devoted itself to proselytizing this truth with great zeal.

And, if one lives by daily news coverage alone, the headlines in Politico provide assurance that polarization is yet alive and well. An April 17 headline read, "Phone tag and snubs: Leadership feuds undermine pandemic relief," accompanied by the sub-headline, "Party leaders are squabbling like normal, even as the stakes are so much higher." Immediately following the Senate's approval of the CARES Act addition in late April, the headline was: "Senate passes $484 billion coronavirus deal after weeks of deadlock." Time is certainly of the essence in handling this crisis, but the story's opening words are breathless to the point of self-parody: "After two weeks of stalemate and days of frenetic negotiations...."

It is true that many habits of bitterly polarized politics persist in moments of crisis response. Members continue to say nasty things about the opposing party, and negotiations do not instantly resolve through warm embraces (and not only because of social-distancing rules). But we ought to be able to look past the surface hostilities to see how remarkably consensus-based the crisis response has been in 2020. The supplemental spending bill passed 96-1 and 415-2; to the extent there was controversy, it was only about how much more money should be added. The entitlement-creating Families First Coronavirus Response Act passed 90-8 and 363-40. The CARES Act passed 96-0 in the Senate and by an overwhelming voice vote in the House — a procedural option available only because the demand for a recorded vote by gadfly Thomas Massie, a Republican representative from Kentucky, failed to attract the support of the one-fifth of members present needed to force one; more than four-fifths of those present preferred to shield absent colleagues (of both parties) from criticism.

In the wake of September 11th, Congress also acted with a striking degree of consensus — perhaps unsurprisingly, given the tendency to rally around the flag in times of crisis. The Authorization for Use of Military Force passed just three days after the attacks by votes of 420-1 and 98-0. The USA PATRIOT Act, which would become quite controversial over time, passed in October by votes of 357-66 and 98-1. A year later, the vote for the Authorization for Use of Military Force Against Iraq was divided along party lines, with the majority of House Democrats opposing the resolution. Even then, however, Senate Majority Leader Tom Daschle lent it his support, as did most other Senate Democrats.

The responses to the 2008 financial crisis were famously controversial as they happened, but the divisions did not break cleanly along partisan lines — as the hyper-polarization model would have led us to expect. Instead, the basic split was between the leadership and populist-leaning dissidents on both sides of the aisle. Inspired by the warnings delivered by Bernanke and Paulson, Republican leaders John Boehner and Mitch McConnell formed a united front with Democratic leaders Nancy Pelosi and Harry Reid. When the House failed to pass the $700 billion bailout on September 29, 2008, it was because two-thirds of Republicans and nearly half of Democrats banded together to defy the wishes of the president and of their leaders. After markets reacted violently to the rejection and the Senate passed the bill by a vote of 74-25, that same House brought along those members whose populist desire to stick it to the banks proved limited in the face of possible meltdown and passed the law 263-171.

The unifying theme in each of these episodes is not partisanship, then, but the dominance of leadership and the leaders' ability to make deals. Again, the 111th Congress of 2009-2011 stands as the lone exception, given Democrats' success in pushing through an economic stimulus, a health-care reform law, and a financial regulatory overhaul with almost no Republican support. But it is the exception that proves the rule.

Claims about polarization are most often supported with reference to indices based on observed roll-call votes. The glory of these measures is that they are, or seem to be, purely descriptive; they simply tell us how similarly members of Congress vote. And in recent years, members have sorted themselves by party more than at any time in the past century. As far as this goes, it is a neat piece of number crunching. But when we turn to interpreting the results, their meaning is far from self-evident. Even if leaders intentionally arrange votes so as to accentuate partisan differences — the better to frame nationalized elections — most people will nevertheless interpret the increased polarization in voting patterns as evidence of legislators' increased ideological polarization. The standard description is thus "Republicans are more conservative than ever, and Democrats more liberal," even though all the evidence shows on its face is that Republicans and Democrats vote more differently from each other than they did previously. These methods of analysis give no leverage at all over the question of whether "liberal" and "conservative" retain constant meanings, nor do they have anything to say about how members' views are evolving on issues that are kept off the agenda. And since these techniques focus on differences between members, they cannot assimilate any information from votes in which the whole chamber moves together.

When we consider the recent coronavirus-response votes, it ought to be clear why these polarization measures are unable to tell us anything important. On what are unquestionably some of the most important matters that have faced Congress in a generation, partisan difference has all but melted away. It turns out that there are no meaningfully different "conservative" and "liberal" approaches to reviving the economy in the face of a catastrophic pandemic; both sides are overwhelmingly committed to massive deficit spending and liquidity support provided by the Fed. This is hardly the only conceivable position. There are still differences, of course; as I write, there is a much-watched fight over how liberally to hand over federal tax dollars to ailing state and local governments. But focusing on those differences should not blind us to the rather striking degree of bipartisan unity these responses demonstrate.

Since the main narrative about polarization is that it is choking off Congress's ability to act, the basic absence of polarization in our crisis moment may seem like good news. But it is by no means clear that this unity serves us well. Robust partisan competition would be capable of producing public deliberations with real gravity. Such deliberations are necessary in moments of crisis for both their performative and their information-generating properties. When instead we end up with massive deals rapidly whipped up by the two parties' leaders, there is very little meaningful deliberation. Indeed, the leadership makes sure that members understand how very unwelcome deliberation is, threatening to treat any member who insists on debate as a dangerous obstructionist. With very few exceptions — most notably the obstreperous Massie — members have accepted this arrangement. As a result, some very difficult questions about how relief should be targeted (and especially the question of whether it should be directed toward individuals or firms) were treated as non-issues.

GOVERNMENT DYSFUNCTION RECONSIDERED

Leadership dominance is the organizing factor that underlies both of the alternating postures of 21st-century American government: polarized stalemate in normal times, and massive, rapid action in crises. If the agenda during out-years is taken up with matters designed to accentuate differences rather than move policy, that suits the leaders' desire to arm their troops ahead of the "most important election in the history of the republic" — i.e., the next one. Polarization is not "fake" so much as epiphenomenal. And if, during crises, leaders can reliably (if not without friction) deliver enormous interventions, then we can be sure that when the nation's needs are most acute, we will get what we need.

Taken as a whole, this evaluation is much less dire than is the common diagnosis of paralysis by polarization. Our system seems to be living up to the apocryphal Churchill-ism that "Americans will always do the right thing — after exhausting all of the alternatives." That is of course much better than not being able to do anything, ever. It is not, however, anything to be proud of. Our relatively poor response to the coronavirus crisis ought to make clear that, in spite of the dominance of crises in our politics, one of the things we are worst at is preparing for emergencies before they happen — or, better yet, preventing them from becoming emergencies altogether.

This is not because everyone is oblivious during normal times, but rather because the absence of decentralized deliberation makes the system unable to sensibly sift through overabundant information and prioritize matters that are most serious. The system responds only to urgency — and so artificial cliffs are designed to facilitate the maintenance of the status quo. Meanwhile, sensible efforts to proactively deal with problems before they become unmanageable languish in congressional committees that no longer have the clout to shape their chambers' agendas.

Pandemic preparation is a clear example of this phenomenon. It was predictable — and predicted! — that our existing disaster-relief system would be a poor fit for dealing with a widespread outbreak. A 2017 article in the American Journal of Public Health rather matter-of-factly assessed the situation: "The mere existence of federal mechanisms to mobilize assistance to state and local governments does not guarantee adequate and timely funding during an outbreak (especially since the 'no year' PHEF was depleted in 2012)." It concluded: "The historical precedents that support state and local leadership in preparedness for and response to disasters are in many ways at odds with the technical demands of biological risk preparedness and response."

The PHEF in question is the Public Health Emergency Fund, a federal pot of money meant to provide ready resources in the event of a public-health emergency. Unfortunately, Congress failed to fund it throughout the 2010s, despite proposals to do so having been introduced in several forms. In 2019, Congress did create a fund specific to the Centers for Disease Control and Prevention (CDC) called the Infectious Disease Rapid Response Reserve Fund, which in early 2020 had $105 million available. Needless to say, we burned through that amount in no time at all, leaving the CDC to scrounge for money elsewhere in the crucial month of February 2020.

Neither of these funds is to be confused with the Public Health and Social Services Emergency Fund that the CARES Act funded to the tune of $100 billion, which is now being used to keep hospitals from going out of business and to help them grapple with the immense demands of treating patients during the pandemic. The relative magnitudes of these efforts speak volumes: a hundred million for preparedness, a hundred billion for response.

HOW TO RECOVER NON-CRISIS GOVERNMENT

It would be too harsh to say that our government completely wastes its time during normal years. Members of Congress (and their staffs) build expertise and relationships that they put to use during crises, and federal agencies' planning efforts have not been entirely in vain. But all too often, governing in the 21st century amounts to little more than piling up raw materials that we then scramble to process and put to use in moments of crisis — rather than processing those materials ahead of time so we can act decisively as soon as those moments hit.

Instinctively, we think of such failures as essentially managerial; we then suppose that the solution must be more effective top-down administration. There is no question that more able presidential management could make a real difference in the quality of our crisis-response efforts. But imagining that a newly empowered president could swoop in, cut through the noise, and prioritize important threats is just a fantasy. A single person, no matter how talented, lacks the scope of attention needed to make sense of America's many troubles. And, in the age of social media, American presidents (and their immense White House apparatuses) may be irretrievably mired in the news cycle, leaving them little time to imagine and prepare for the threats looming on the horizon.

We would be better served by envisioning a government with greater decentralized processing capacity running on a more continuous basis. Some form of strong cabinet government might satisfy this need, but there is no American tradition of strongly empowered cabinet secretaries thriving without the president's help. Perhaps, with our elevation of the Treasury secretary and Fed chairman in recent crises, we are inventing one on the fly, and perhaps we should not too quickly dismiss the merits of such a system. But unelected officials are at an inherent disadvantage in fashioning legitimate crisis responses: At any moment, the public can turn on them as fundamentally unrepresentative — as working for "them" rather than "us." In our political tradition, there will never be an effective rebuttal to such claims.

A response more firmly established in the American tradition would involve strengthening congressional committees, especially by restoring their powers to set agendas independently of the elected party leadership in each house. Leadership in the modern era allows longstanding committee work to shape each chamber's agenda only sporadically. What is needed to incentivize committees to engage in deep, forward-looking work — the kind of work that will allow government to govern effectively — is assurance that such efforts will not simply be kept on ice until some crisis makes them urgent, but that they can be converted into real, constructive changes beginning today.

The remaining question, as always, is how we can get there from here. After all, leadership dominance has only been consolidated during the current crisis, as rank-and-file members have absented themselves from Washington and public committee work has nearly ceased.

The answer will require members themselves to insist that there is no substitute for representatives from around the nation deliberating together on matters of pressing public importance. Even during a crisis, the purpose of their convening is not merely to decide how much fuel to pump into some proven crisis-fighting machine, because no such machine exists. Every problem is different, and it is not self-evident which parts of a problem are susceptible to federal action. Bringing lawmakers together to deliberate — forcing them to engage with one another, to persuade those who do not already agree — will open up new ways of thinking about problems, often along with useful, innovative governing structures for dealing with them.

This point is especially urgent given the push by many well-meaning reformers to cope with the coronavirus pandemic by reducing Congress to a remote-voting body — no longer an "assembly" in any real sense, but a ratifier of bargains struck elsewhere by leadership. That model is, in many ways, a comfortable fit for what our 21st-century Congress has become. But this is the opposite of an endorsement.

Philip Wallach is a resident senior fellow in governance at the R Street Institute.

