From Progressivism to Paralysis
abstract. The Progressive Movement succeeded in replacing laissez-faire with public oversight of safety and markets. But its vision of neutral administration, in which officials in lab coats mechanically applied law, never reflected the realities and political tradeoffs in most public choices. The crisis of public trust in the 1960s spawned a radical transformation of government operating systems to finally achieve a neutral public administration, without official bias or error. Laws and regulations would not only set public goals but also dictate precisely how to implement them. The constitutional protections of due process were expanded to allow disappointed citizens, employees, and students to challenge official decisions, even managerial choices, and put officials to the proof. The result, after fifty years, is public paralysis. In an effort to avoid bad public choices, the operating system precludes good public choices. It must be rebuilt to honor human agency and reinvigorate democratic choices.
In January 2020, University of Washington epidemiologists were hot on the trail of COVID-19.1 Virologist Alex Greninger had begun developing a test soon after Chinese officials published the viral genome. But he needed Food and Drug Administration (FDA) approval to deploy his test in-house. Greninger spent 100 hours filling out an application for an FDA “emergency use authorization” (EUA). He submitted the application by email. Then he was told that the application was not complete until he mailed a hard copy to the FDA Document Control Center. After a few more days, FDA officials told Greninger that they would not approve his EUA until he verified that his test did not cross-react with other viruses in his lab, and until he also agreed to test for MERS and SARS. The Centers for Disease Control (CDC) then refused to release samples of SARS to Greninger because it was too virulent. Greninger finally got a sample of coronavirus that satisfied the FDA. By the time it arrived and his tests began in early March, the outbreak was well on its way.2
Modern government has a major flaw. It is structured to preempt the active intelligence of people on the ground. This is not an unavoidable side-effect of big government, but a deliberate precept of its operating philosophy. Law will not only set goals and governing principles, but it will also dictate exactly how to implement those goals correctly. The tools are familiar: volumes of detailed rules, mandatory procedural paths prior to approvals, and—when there is disagreement—adversarial proceedings aimed at discovering objective truths.
The effect is a kind of paralysis—specifically, an institutional inability to act in a timely and appropriate way in the particular situation. The complex shapes of life rarely fit neatly into legal categories. Decisions slow to a snail’s pace as people are diverted towards compliance and anxiety about the legal correctness of choices. As a result, things do not work as they should in schools, in hospitals, in workplaces, on playgrounds, and especially, within the government itself. Permits for needed infrastructure can take upwards of a decade.3 People cannot grab hold of problems and solve them.
COVID-19 is the canary in the bureaucratic mine. Public-health officials across America were prevented from using their own tests, buying them overseas or using local labs.4 When the virus spread to New York and other cities, hospitals found they could not cope with the surge of cases except by tossing the rulebooks to the winds.5 Providers did not have the time to keep track of or comply with the myriad of granular legal dictates6—for example, maximum hospital capacity limits,7 staffing requirements,8 diet dictates,9 ambulance equipment regulations,10 restrictions on telemedicine,11 and rules providing for same-sex attendants when transporting disabled patients from state facilities.12
The ripple effects exposed counterproductive rigidities in virtually all sectors of government. When schools were shut down because of coronavirus, needy students no longer had school meals.13 Katie Wilson, executive director of the Urban School Food Alliance and a former Obama administration official, quickly got an agreement in principle to transfer federal meal funding to a program that provides meals during summer months.14 But the red tape required a formal waiver by each state, which in turn required a formal waiver by Washington. The bureaucratic instinct was relentless even when waiving rules. Each school district in Oregon was first required “to develop a plan as to how they are going to target the most-needy students.”15 Meanwhile, as time went by, the needy children were getting no meals. New York Times columnist Bret Stephens, interviewing Wilson, summarized her plea to government: “Stop getting in the way.”16
There is a paradox here. COVID-19 illustrates the vital need for public oversight and action in an interdependent world, but it also illustrates the folly of thinking that rigid rules and procedures are the best way of achieving that oversight. Public frustration with government has rarely been higher. Sixty percent of Americans support “very major reform,” more than twenty points higher than two decades ago.17 But what should that reform be? The political parties focus on the goals of government and present competing Manichean visions—either deregulate or add more programs to deal with inadequate healthcare coverage and other modern challenges.18 Neither party has a vision for fixing the operating system of government.19 Not one question in the 2020 Democratic presidential primary debates focused on how to make government work better.20
Change is in the air. Americans are starting to take to the streets. But the unquestioned assumption of protesters is that someone is actually in charge and refusing to pull the right levers. While there are certainly forces opposing change,21 it is more accurate to say that our system of government is organized to prevent fixing anything. At every level of responsibility, from the schoolhouse to the White House, public officials are disempowered from making sensible choices by a bureaucratic and legal apparatus that is beyond their control.
Since the dawn of the Progressive Era in the late nineteenth century, government has assumed growing responsibilities to oversee public health, products and markets, and to provide services and safety nets. Most of these goals, in my view, are essential to enhance freedom in the interdependent modern world. Individuals cannot readily check the purity of food and water or test for lead paint in toys. As the demand for public oversight and services has grown, however, there has been only episodic attention to how government does its work—even as public frustration has led to growing alienation and polarization.22 This Essay argues that the public operating system, as it has evolved since the Progressive Era and especially since the 1960s, is incapable of making practical public choices, responding to citizen needs, or promoting accountability in our democratic hierarchy. Part I traces changes in the bureaucracy’s “operating philosophy” since its inception during the Progressive Era—and details the flawed precepts, rooted in distrust of official authority, leading to our current discontent. Part II explains why prior reforms have failed. And Part III concludes by proposing a simplified framework, focused on goals and guiding principles, where clear lines of accountability replace detailed dictates and procedures for many public choices.
Distrust of government is a consistent theme in American history. Americans “consider all authority with a discontented eye,” Tocqueville observed.23 From the beginnings of the new country, the drive to disempower officials has been evident—for example, in the toothless Articles of Confederation. The main argument against the Constitution by the “Anti-Federalists” was that its powers were too vague and that it gave too much authority to the different branches of government—“‘like A Fiddle, with but few Strings,’ so that those in power might ‘play any tune upon it they pleased.’”24 A few decades later, distrust of the permanent bureaucracy in Washington motivated Andrew Jackson’s introduction of patronage hiring. “The doctrine of rotation-in-office was thus in large part conceived as a sincere measure of reform,” according to historian Arthur Schlesinger, Jr.25 The judicial philosophy of laissez-faire, in which courts struck down legislative efforts to regulate business during the nineteenth century, reflected a broad distrust of democratically elected legislatures.
Progressive Era reformers sought to minimize political authority, and conceived of the administrative state as a body of apolitical experts who could handle scientific and technical issues that political bodies would be ill-equipped to address.26 Reformers in the 1960s went one giant step further and sought to guarantee neutral public choices by transforming regulation into a kind of software program: detailed rules would dictate proper compliance, overseen by public employees who, like technicians wearing white jackets, have no authority other than to guarantee strict compliance.27 When a choice was necessary—for example, to use a local lab to test for a virus, or to give a permit for new infrastructure, or to maintain order in the classroom, or to fire a bad cop—extensive procedures would strive to protect against human error or bias.
Today, as a result of these reforms, governance takes place, not mainly in the halls of Congress or the Oval Office, but in dictates imposed by thick books of regulations, administered mechanically by public employees. The body of regulatory law—about 150 million words for federal law alone28—imposes detailed obligations upon citizens and limits on public authority. Together with detailed public-union collective-bargaining agreements, it also limits management authority over public employees.29 Regulatory goals and principles make up only a tiny fraction of this dense legal framework; the majority of these regulations prescribe the one correct way to achieve the goals.30
A report by the New York Times found, for example, that a family-owned apple orchard was subject to 5,000 rules from seventeen different regulatory programs.31 The regulatory programs are aimed at valid public goals, such as making sure apples are clean. However, most rules specify exactly how to achieve the goals.32 One regulation, for example, requires covering the cart of picked apples with a cloth to protect against bird droppings.33 This level of granularity is hard to keep track of—the farmers have thirteen clipboards hanging in their office—and sometimes inspires ridicule. The apples in the cart, for example, have been growing on the trees for five months, fully exposed to birds.34 A few minutes more on the trip to the barn is unlikely to advance public welfare.
The Progressive Era’s premise of neutral government, untainted by the judgments of imperfect humans, was always flawed. The effort to avoid official judgment has resulted in a framework where the “merit system” has become a sinecure without merit and the drive to dictate correct regulatory choices has evolved into a paralytic version of central planning. This Part discusses each development in turn.
The Progressive Era is best known for supplanting the laissez-faire governing philosophy with regulatory oversight—antitrust law in 1890,35 the Pure Food and Drug Act in 1906,36 the Federal Trade Commission in 1914,37 and child-labor laws and other laws dealing with work conditions. The first major progressive reform was not about regulation, however, but rather about a better government operating system. Enacted a few years before the full age of reform, the Pendleton Act of 188338 created a federal civil service to reduce reliance on the “spoils system” in appointing federal officials. The main harm of the spoils system was that unqualified political hacks got public jobs as a kind of property right for their campaign help.
Civil service not only avoided the harm of spoils, but also provided a trusted way to expand the role of government. The regulatory oversight envisioned by progressives could not be achieved without a professional public service that could faithfully enforce new legal duties.39 The Pendleton Act provided for hiring by competitive tests of initially about ten percent of the federal workforce.40 There were no restrictions on firing, both because of constitutional concerns about interfering with the President’s executive powers under Article II and because a neutral hiring protocol was all that was needed to get rid of the spoils system. As reform leader George William Curtis said, “if the front door [is] properly tended, the back door [will] take care of itself”41:
[I]t is better to take the risk of occasional injustice from passion and prejudice, which no law or regulation can control, than to seal up incompetency, negligence, insubordination, insolence, and every other mischief in the service, by requiring a virtual trial at law before an unfit or incapable clerk can be removed.42
Accountability for performance was a core assumption of civil service. Government employees would get and keep their jobs on the basis of merit—hence its shorthand name: the merit system.
The reform ideal, which remains with us today, was to purify and professionalize government. No longer would government suffer, as reformer Carl Schurz said, from the “demoralizing influence of the patronage.”43 “Administration,” Woodrow Wilson wrote in 1887, “lies outside the proper sphere of politics.”44 Instead of party hacks, the public would be served by officials with “skill, ability, fidelity, zeal, and integrity.”45 Government would be run not by political leaders but by professionals, carefully organized to fulfill the designated tasks. “A bureaucracy devoid of political appointees,” as historian Michael Nelson put it, “would be like a royal guard of eunuchs—an agency with no distracting wants of its own to impede the execution of its assigned tasks.”46
Civil service had its critics. Opponents described the idea as “snivel service,”47 filled with detached experts doing what they, rather than the public, wanted. But these critics were party hacks, ignored or derided by those devoted to good government.48 After a few years, however, observers began to note an alarming truth in this critique, but not exactly for the anticipated reasons.
Civil service became permanent patronage. Although Congress initially “classified” only ten percent of federal employees as professional civil servants, it empowered the President to decide whether to increase the ranks.49 Presidents Cleveland and Harrison made their usual political appointments and then “blanketed in” the party faithful by simply redesignating their patronage jobs as civil service.50 By 1900, more than eighty thousand party hacks had become proud members of the merit system, representing half of all civil servants,51 virtually all without competitive examination. President McKinley, under pressure from their burgeoning ranks, then perfected this alchemy of civil service into super-spoils by closing the “back door” to give political supporters lifetime tenure.52
Reformers who had struggled for the “merit system” barely knew what hit them. President Theodore Roosevelt, a longtime civil service reformer, “tried to increase presidential control over federal employees by, among other things, making dismissals for cause easier to obtain.”53 But at this point, federal employees had become their own political force. And another change decades later made the problem worse: collective bargaining rights, further insulating civil servants from accountability.
Until the 1960s, even pro-union political leaders believed that, as President Franklin Roosevelt put it, “collective bargaining, as usually understood, cannot be transplanted into the public service”54 because it pitted public servants against the common good. But President Kennedy’s Executive Order 10988 allowed federal public unions to engage in collective bargaining.55 The task force that recommended these changes, chaired by Arthur Goldberg, characterized collective bargaining as a way that unions could help make government work better,56 but most knew that the task force was a fig leaf to disguise a political payback.57 Once the federal government acceded to collective bargaining, it was only a matter of time before unions persuaded states to do the same. New York authorized collective bargaining in 1967, and California in 1968.58
Collective bargaining was motivated not by practical problems with the prior framework, but by the political power of millions of public workers—representing fifteen percent of the total workforce59— and swept in by the powerful tide of the rights revolution. Public workers had not been at high risk of termination: public managers tended to avoid the potential for controversy in unpleasant personnel decisions. But the rights revolution presumed that the fair approach to any dispute is to favor the individual over the executive running a department. Without any controversy, Congress enshrined collective bargaining as a statutory right in the Civil Service Reform Act of 1978.60 Representative Patricia Schroeder of Colorado, the lonely voice of caution, warned that “in years to come[,] [civil servants] will have to put up with colleagues who do not pull the load.”61 That proved to be a vast understatement.
Many government departments became virtually unmanageable. Contracts hundreds of pages long set out specific work rules and job descriptions, strictly control supervisory judgments on job performance, and provide pension and other benefits far richer than those provided by private employers. Featherbedding became common in some states, driving up the cost of governmental services. Construction of the Second Avenue Subway in New York City cost $2.5 billion per mile, about five times the cost of a subway in Paris using similar machines, in part because the union contract required twice as many people as needed.62
Terminating poor performers became virtually impossible. A negative comment in the file, for example, gives rise to a right to confront the supervisor and sometimes to a challenge in an adversarial proceeding. As a result, over ninety-nine percent of federal civil servants receive a “fully successful” job rating.63 The standard is not poor performance in the abstract but whether the individual is demonstrably worse than others, and the burden of proof is on the supervisor. The need to build a record means, as a school superintendent put it, “dismissing a tenured teacher is not a process, it’s a career.”64 California manages to dismiss for poor performance only two of its almost 300,000 teachers per year.65 Civil service came full circle: where the progressive goal was to end public jobs as the property of political spoils, public service after the 1960s became a property right of the public employees themselves. Public employees answer to no one.
The inability to manage public employees made headlines when, in May 2020, an eight-minute video showed the slow suffocation of George Floyd by Minneapolis policeman Derek Chauvin. Chauvin had been the subject of eighteen complaints over his career,66 but, under the union collective-bargaining agreement, police supervisors had no practical ability to terminate him. Minneapolis, like most cities, has a poor record of holding police accountable. According to the Wall Street Journal, out of 2,600 complaints since 2012, only twelve resulted in an officer being disciplined.67 The most severe penalty was a suspension for forty hours. Union rules protecting police—for example, restricting use of a prior record of misconduct—make it almost impossible to hold officers accountable for even extreme misconduct.68
Critics of public unions identify accountability as the key missing tool for managing police departments, schools, and other public institutions.69 Creating a public culture with energy and pride is also difficult when everyone within an agency knows that individual performance is optional.70 But unaccountability had a larger system-wide effect—the inability to hold public employees accountable for actual results entrenched a rule-based regulatory structure where they are accountable instead for formal compliance. Instead of asking, “What’s the right thing to do here?” or, “What does it take to get this done?” the relevant inquiry is, “What does the rule require?”71
A human longing for security, described by Plato in his allegory of prisoners not wanting to leave the cave, is reflected in the progressive reformers’ ideal of neutral government that could be run as a kind of legal machine. As philosopher David Hume observed, people “are mightily addicted to . . . rules.”72 Experts would oversee systems that, if managed properly, would run government automatically, neutrally and uniformly, “to straighten the paths of government,” as Woodrow Wilson put it, and to “purify its organization.”73
As the industrial revolution matured into corporate combines, the power of organization became a belief system. Frederick Winslow Taylor’s influential 1911 book, The Principles of Scientific Management, taught that even the smallest choices could be organized into efficient patterns.74 The assembly line’s efficiency became the model for organizing government. Progressives had campaigned for decades for a neutral administration that would rid government of the taint of patronage and partisan politics. With scientific methods, government could be both pure and efficient. In 1923, Congress passed the Classification Act, dividing civil servants into grades with designated tasks and assigned salaries. New York State divides its workers into over 4,500 specific job categories.75 Over time, as Paul Light has chronicled, government added layers of organizational complexity, based on models like “P.O.S.D.C.O.R.B.” (planning, organizing, staffing, directing, coordinating, reporting, and budgeting).76
Organization only went so far, however. It did not remove the authority of officials to adapt to situations before them. Progressive reformers welcomed official discretion.77 Consequently, most Progressive Era regulations set forth goals and principles, not detailed dictates, and could be enforced by officials using their judgment. Section 2 of the Sherman Antitrust Act, for instance, is only five lines long.78 Such delegations were “so commonplace,” as Daniel Ernst describes, that in 1951 administrative law expert Kenneth C. Davis “could compile a long list, including grants of the power to set ‘just and reasonable rates,’ to remove ‘unreasonable obstructions’ to navigation, to end ‘unfair methods of competition,’ and to regulate broadcasters in accordance with ‘public convenience, interest, or necessity.’”79
Traditionalists saw administrative decisions as undermining the rule of law, and had two main objections. The first was an inadequate focus on “whether private interests are adequately safeguarded.”80 The second was delegation of legal authority to nonelected officials. Their answer was to make law more prescriptive through legislative restrictions and judicial oversight.81 But their narrow focus on protecting private rights collided with the broader need for public oversight. Government needed to act. As then-Harvard professor and future Supreme Court Justice Felix Frankfurter put it: “After all, we can’t consider whether private interests are safeguarded without equally considering the public interests that are asserted against them.”82 Northwestern Law School Dean John Henry Wigmore concluded that the only way to achieve public goals was to give officials room to use their judgment: “The bestowal of administrative discretion, as contrasted with the limitation of power by a meticulous chain-work of inflexible detailed rules, is the best hope for governmental efficiency.”83 Ultimately, New Dealers, such as James Landis, adopted and implemented these theories of discretion.84
Debates over administrative power stalled during World War II, as wartime mobilization demanded delegation to a hierarchy of appointed officials and military officers.85 When the debate revived post-war, all sides were more willing to compromise—and that compromise culminated in the 1946 passage of the Administrative Procedure Act (APA).86 The APA basically incorporates into regulatory agencies all three branches of government—not only executive powers, but the legislative and judicial powers through regulations and adjudications.
The APA was intended to facilitate regulatory change; instead, it became a tool of the status quo. Procedural arguments began to trump substance, allowing officials to evade responsibility—and special interests to seize control of the regulatory process. By the 1960s, regulatory “capture” by airlines and other industries of the agencies supposedly regulating them became notorious.87 Washington began to settle into a comfortable equilibrium of regulators and regulated keeping public machinery predictably moving in the same direction.
Then came the 1960s. The tumult of that decade prompted new policies, many long overdue, such as civil rights and environmental protection. It also prompted radical changes in the operating framework of government, dramatically altering how public choices are made and introducing a new concept of individual rights over public authority. The debate in the New Deal over whether the executive branch or judicial branch should have final say was replaced by a new vision: No one would have authority. Detailed rules would prescribe exactly how to do things: “Between 1969 and 1979 the Federal Register nearly quadrupled in length, expanding not just the scope of regulation, but the granularity of its mandates.”89
Some decisions, such as whether to grant a permit, can be preset by rules. A new approach to governing philosophy, the “legal process movement,” suggested that right and wrong should be replaced by procedures in which the correct public choice could be shown by objective evidence in a hearing with public participation.90 The need to demonstrate correct choices promoted even greater detail in rulemaking. What better justification than an explicit requirement in a rule?
Diminishing the authority of officials with rules and procedures, however, was not sufficient to salve the wounds of racism, pollution, unsafe cars, abuse of disabled children, and lies about Vietnam. In a decade where “Don’t trust anyone over 30!” was a popular refrain,91 reformers sought a way to guarantee that no public decisions would harm anyone. Charles Reich had an answer: make government decisions the “new property” of people who were affected.92 The distrust of government from the nineteenth century came full circle. A main precept of conservative laissez-faire jurisprudence had been “the freedom to challenge any administrators’ deprivation of a private right” in a court of general jurisdiction.93 Then, in the early twentieth century, Ernst Freund and other legal traditionalists had battled, unsuccessfully, to focus judicial review of administrative action on the affected individuals. Now Reich and other liberals sought to reassert individual rights as the primary focus of judicial review.94
The Supreme Court was not immune to the cultural upheavals, and issued a series of rulings expanding the rights of individuals over public officials. It embraced Reich’s theory that public jobs and benefits were a “new property,” and applied the constitutional protections of due process to public benefits, public jobs and school discipline.95 Officials no longer had authority to make basic decisions unless they were prepared to prove why it was fair to whoever complained.
The governing philosophy of the New Deal—to give officials “grants of power with which to act decisively”96—was turned upside down. Governing was too important to be left to officials exercising their judgment. Clear rules, extensive public procedures, and individual rights would guarantee that unfairness would be no more. Authority at all levels was put in the penalty box.
Scrutinizing official decisions through the lens of individual rights, we are taught, is the essence of the rule of law.97 But there is a difference between protecting citizens’ liberty or property against state coercion—the stated purpose of the due process clause in the Fifth Amendment—and making judgments needed to manage a public agency or a school. Regulations to achieve public health and safety similarly cannot operate sensibly unless officials can balance the public need for oversight against individual burdens. Courts can oversee the fairness and regularity of these choices, applying well-known standards such as having “substantial evidence” for agency findings. Due process, however, embodies a presumption against state action—a prohibition until the state meets its burden. Sometimes that is appropriate, as, say, with criminal penalties or the taking of property. In many cases, as with school discipline, the effect is to paralyze necessary choices.98
Distrust of public choices has not produced a stable, if slow, governing framework. Putting almost any decision by a public employee under legal klieg lights has set in motion a downward spiral of legal complexity, as new assertions of rights have led to greater granularity of regulations and more extensive processes. Environmental reviews mandated by the National Environmental Policy Act of 1970, for example, were supposed to illuminate key issues, and regulations stated that they should generally be no more than 150 pages long.99 A proposal in 2009 to raise the roadway of the Bayonne Bridge to permit a new generation of “post-panamax” ships to use the port of Newark required an environmental assessment of 10,000 pages, plus another 10,000-page appendix.100 All this detail was provided not because it increased public transparency or informed public decisions, but to provide justification when, as occurred in that case, some group sued to stop the project, arguing inadequate review. An article co-authored by former Environmental Protection Agency General Counsel E. Donald Elliott estimates that ninety percent of the detail in environmental reviews is included mainly to provide arguments when someone sues.101
Officials’ inability to use their judgment for the public good, in the name of protecting individual rights, has the unintended side effect of eroding citizen freedoms. The same rules that constrain officials also shackle citizens. Whether or not rules make sense, the farmer must cover the applecart and comply with thousands of other rules. The breakdown of discipline in schools, correlated directly with the rise of regulations designed to satisfy due process,102 compromises the ability of students to learn. Instead of advancing public goals, rigid regulations, untempered by judgment on the spot, often prove counterproductive. A study of nursing-home regulation, for example, concluded that quality suffers when aides focus on compliance instead of the needs of their residents.103 A focus on compliance disempowers officials from using their best judgment at the point of implementation. Detailed rules also disempower citizens: What good are the parents’ ideas if the principal cannot act on them? Finally, the rulebound state weakens democracy. Elections do not matter much if officials lack the authority to deliver on their promises.
What has gotten lost in this formalistic framework is the ability of officials to make judgments for the common good—the need, as Frankfurter put it, to do “substantive justice both to public and private interests.”104
The progressive dream of neutral administration, untainted by partisan goals and human fallibility, was always doomed to failure. Governing requires officials to make choices, applying their values and judgment to arrive at fair and sensible outcomes. To keep environmental reviews focused on what’s important, an environmental official must decide what the key issues are. To manage a community-friendly police department, the police chief must be able to decide which officers can earn local trust. The mismatch between the ideology and the reality of automatic government, purged of human judgment and partisan values, was a central theme of many wise observers of the twentieth-century state. Hannah Arendt saw evil in thoughtless acceptance of detailed rules, which she called “the rule by Nobody.”105 Management expert Peter Drucker concluded that “government has outgrown the structure, the policies, and the rules designed for it,” with the result that it is “bankrupt, morally as well as financially.”106 Czech President Vaclav Havel called for modern societies “to abandon the arrogant belief that the world is merely a puzzle to be solved, a machine with instructions for use waiting to be discovered, a body of information to be fed into a computer in the hope that, sooner or later, it will spit out a universal solution.”107
The civil-service reformers, according to historian Henry Steele Commager, had “no real faith in democracy.”108 Who else is going to hold the bureaucrats accountable but those we elect? What happens when a value judgment needs to be made? Almost every decision—indeed, even the timing of a decision—involves the exercise of personal judgment and values. Progressives thought that bureaucrats could pursue the “public interest” instead of partisan goals. But, as historian Alan Brinkley noted, “[t]he idea that a ‘public interest’ exists somewhere as a kernel of true knowledge, untainted by politics or self-interest, is an attractive thought. But it is also a myth. We cannot identify a public interest outside of politics.”109
Government was bound to drift ever further from the needs of its citizens because no political leader had authority to make the value judgments needed to keep it aligned with those needs. As columnist Walter Lippmann put it in 1914,
[T]he confusion of political life . . . [comes from] insist[ing] upon looking at government as a frame and governing as a routine. . . . [P]olitics has such an unreal relation to actual conditions. Feckless—that is what our politics is. . . . [I]t has been centered mechanically instead of vitally. . . . [W]e have hoped for machine regularity when we needed human initiative and leadership, when life was crying that its inventive abilities should be freed.110
By disconnecting public choices and accountability from decisions at the ballot box, we succeeded not in purifying government but in severing the cord connecting government to the country it supposedly serves. The idea of neutral civil service, historian Paul van Riper observed, ultimately “carried with it the seeds of its own potential destruction.”111 Instead of neutrality, we got the pathology of a democracy purged of the beliefs of its constituents.
Government became accountable not to voters but to courts for compliance with the ever-thickening accretion of rules, rights, and restrictions. Slowly but inevitably a sense of powerlessness has pervaded the public and private institutions of our society. That’s a common thread of many of the frustrations that drove Americans to the streets in 2020.112 The delayed responses to COVID-19; the inability to hold bad cops accountable; and the continuing failures to fix broken schools, infrastructure, and other public services all share a common flaw: the people with responsibility lacked the authority to get the job done. The discontent is not new, but it is reaching the breaking point. For decades, Americans and their elected leaders have pushed for bureaucratic reform but failed. The next Part explains why nothing much was fixed.
Every President since Jimmy Carter has promised to rein in bureaucratic excesses. But none has succeeded.113 Most proposed reforms have focused on reducing the size or scope of government. For instance, President Carter came to Washington promising “to reorganize a Federal Government which had grown more preoccupied with its own bureaucratic needs than with those of the people.”114 He deregulated trucking, airlines, and other industries, to good effect.115 Carter also advocated sunset laws that would periodically clean up programs that were not working as intended.116 However, his reform efforts got sidetracked by problems with the economy and in Iran, and he ultimately proved unable to alter bureaucratic operation fundamentally. Carter’s successor, Ronald Reagan, famously said that “government is not the solution to our problem; government is the problem.”117 Reagan appointed the Grace Commission, which recommended nearly 2,500 ways to reduce the bureaucracy, most notably by instituting a base-closing commission to make politically difficult decisions about which military bases to close.118 But at the end of his term, government was denser than before.
Reform efforts continued into the 1990s. Vice President Al Gore’s Reinventing Government Initiative streamlined internal processes and sponsored successful pilot programs for goals-oriented worker-safety regulation.119 The initiative’s idea was to treat citizens as “customers,” and to use market-based mechanisms to achieve public results. But it had no theory of authority to replace red tape or to hold officials accountable, and, in the vast sea of federal government, the totality of its permanent reforms made only small ripples. President Obama appointed Cass Sunstein as his “regulatory czar” to promote simplifying regulation. Sunstein made sure new regulations were more coherent, but he did not have the time, staff, or political will to engage in much “retrospective review,” intended to make old programs work better.120
And, of course, President Trump was elected on a campaign promise to “drain the swamp.”121 But Trump decided early on to focus on unilateral executive orders, not statutory overhaul. He immediately reversed many of Obama’s executive orders122 and initiated some reforms to simplify interactions with government, including for infrastructure permitting,123 nutritional compliance in schools,124 and regulation of small banks.125 But like his predecessors, he proposed no new operating vision that might make government more responsive and less intrusive.
In hindsight, Republican presidents since Reagan failed to fix broken government because they focused on getting rid of programs altogether. Deregulation sounds good on the stump, but when push comes to shove, voters want clean air and Medicare, and they do not want to deregulate. Democrats failed because they cloaked themselves in the virtuous goals of government programs, without acknowledging the operational failures of a framework designed to avoid official judgment. Democrats instead tried to alleviate voter frustrations by striding into the legal thicket with pruning shears. But pruning the jungle has no demonstrable impact beyond a few insiders and often creates new ambiguities that lead to more rules.
What both sides share is the operational frame of reference: that good government requires avoiding human agency in implementation of public decisions. Both sides assume that public choices should be laid out in advance, to achieve neutrality, or proved to be correct in formal hearings. Letting officials take responsibility to achieve public goals and abide by governing principles is beyond their comprehension. So is letting other officials have authority to hold subordinates accountable, or to provide checks and balances.126 The ideal structure is what they call “clear law.” There will be little or no room for government overreach, the theory goes, when human judgment is purged from daily public choices. Because accountability of public officials has not been possible since the turn of the last century, the only tools available to reformers are either to get rid of government programs altogether, or to compound the complexity with new legal rules and procedures. Distrust of any modicum of authority has welded shut the door to the reforms that could fix broken government.
Democracy is supposed to be a system that authorizes officials to use their judgment, not to bar them from using it. The idea of a “republic” was grounded in the precept that elected representatives would serve the public by acting on their best judgment. Officials in the executive branch were to have a similar responsibility. As Madison put it,
It is one of the most prominent features of the constitution, a principle that pervades the whole system, that there should be the highest possible degree of responsibility in all the executive officers thereof; any thing, therefore, which tends to lessen this responsibility is contrary to its spirit and intention . . . .127
A functioning democracy requires two gears that the giant bureaucratic machine is designed to avoid. The first is to give back to officials and citizens the authority needed to do their jobs. What is needed is an open framework of goals and principles, such as the Constitution, that re-empowers Americans, at all levels of responsibility, to roll up their sleeves and start getting things done.
Re-empowering officials to make common choices is, as noted, the key to our own freedoms. The authority of officials, judges, and others with responsibility to assert and act on norms of reasonableness is what liberates everyone else to act reasonably. Students will not be free to learn unless the teacher has authority to maintain order in the classroom. To get a permit in a reasonable time frame, the permitting official must have authority to decide how much review is needed and to resolve disputes among different agencies. To prevent judicial claims from being a weapon for extortion, the judge must have authority to draw lines on whether a claim is excessive or far-fetched. To contain a virulent virus, the public-health official must have authority to respond immediately.
The second gear needed for a functioning democracy is an unbroken chain of accountability. There’s no need for detailed dictates if officials can be held accountable when they abuse the public trust. That requires returning to the original principles of the merit system. For democracy to be responsive, elected leaders must have the ability to manage the officials actually doing the work. As Madison put it, the President must “possess alone the power of removal from office,” in order to create an unbroken “chain of dependence . . . the lowest officers, the middle grade, and the highest, will depend, as they ought, on the President.”128 The possibility of abuse of personnel authority for partisan reasons can be protected against by giving an independent personnel agency the authority to protect against partisan reprisal, as was provided in the Lloyd-LaFollette Act of 1912.129
Migrating from a command-and-control legal framework to responsibility-for-results would be as liberating to Americans as it would be historic. But designing simplified structures defies the laws of legislative physics, and would quickly get bogged down in the horse-trading among numerous interest groups crowding the halls of Congress. The mechanism for overhauling dense legal structures, used throughout history,130 is to delegate the job of making proposals to small committees of experts. Just as it delegates to an independent commission decisions on which Department of Defense bases to close,131 Congress should appoint recodification commissions to propose simplified codes.
The last major recodification in America was the Uniform Commercial Code, created in the 1950s by a small committee led by Karl Llewellyn.132 Replacing the tangle of separate state contract laws with one set of understandable principles provided a reliable platform for the post-war growth of interstate commerce.133 The transformative effects of recodifications have been demonstrated since ancient times.134 In the 1980s, Australia replaced its prescriptive regulations for nursing homes with thirty-one general principles—for example, to provide a “homelike environment,” and to respect the “privacy and dignity” of residents.135 Within a short period, nursing homes had dramatically improved, researchers found, because the aides were focusing on the needs of the residents instead of keeping track of detailed rules.136
Simplifying structures to give officials the needed authority does not require trust of any particular person. Responsibility will prove far less dangerous, as studies on official corruption have found,137 than bureaucratic labyrinths that serve to obscure wrongdoing. What’s needed is to trust the overall system and its hierarchy of accountability—as, for example, most Americans trust the protections and lines of accountability provided by the Constitution. There is no detailed rule, or objective proof, that ultimately determines what constitutes an “unreasonable search and seizure” or “freedom of speech.” Those protections are nonetheless reliably applied by judges who, looking to guiding principles and precedent, make a judgment in each disputed situation.
But the idea of giving anyone authority is like poking a raw wound. What if they are venal, or worse? The answer to distrust is also authority—give other people the authority to hold them accountable. Holding people accountable is even more terrifying to the modern mind. Who are you to judge?
We cannot have it both ways. Without traffic cops saying “stop and go,” a crowded society is soon stuck in gridlock. Instead of moving forward, people honk their horns and blame others for not getting anywhere. Our choice is to wallow in distrust and accept paralysis by red tape, or to get society moving forward by restoring clear lines of authority.
Breaking free of the current public operating system will be difficult, notwithstanding popular pressure for change. The natural instinct of reformers is to focus instead on unmet public challenges such as climate change or income stagnation. But much of the public frustration stems from government’s inability to deliver. That operational failure cannot be reformed within the existing framework. Nothing much can work sensibly because no one is free to make it work. We must acknowledge that the progressive reformers’ dream of pure public choices, unsullied by human judgment on the spot, produces failure and frustration.
The giant bureaucratic framework designed to replace human responsibility cannot be repaired. It must be replaced. Releasing the legal stranglehold on official responsibility will be difficult, especially to experts who live in the legal thicket. They must come to grips not only with the failure of this system, but with two misconceptions about the rule of law. The first is that law should strive to dictate correct choices and judgments in advance. The second is the belief that so-called “clear law” avoids abuse and enhances freedom.
The goal of law, broadly, is to protect and enhance our freedom. It does this by prohibiting bad conduct, such as crime or pollution, not (generally) by prescribing correct conduct. Think of law as a corral surrounding the field of freedom. The legal fence protects against outside misconduct and defines the open field within which people can act freely without being unduly fearful or defensive. As Isaiah Berlin put it, law provides “frontiers, not artificially drawn, within which men should be inviolable.”138
The modern bureaucratic state, too, aims to be protective. But it does this by reaching into the field of freedom and dictating how to do things correctly. Instead of protecting an open field of freedom, modern law replaces freedom.
The logic is to protect against human fallibility. But the effect, as discussed, is a version of central planning. People no longer have the ability to draw on “the knowledge of the particular circumstances of time and place,” which Nobel laureate Friedrich Hayek thought was essential for most human accomplishment.139 Instead of getting the job done, people focus on compliance with the rules.
At this point, the complexity of the bureaucratic state far exceeds the human capacity to deal with it. Cognitive scientists have found that an effect of extensive bureaucracy is to overload the conscious brain so that people can no longer draw on their instincts and experience.140 The modern bureaucratic state not only fails to meet its goals sensibly, but also makes people fail in their own endeavors. That is why it engenders alienation and anger, by removing an individual’s sense of control of daily choices. This anger is unlikely to dissipate until there’s a new framework of law that honors human agency.
It is received wisdom that detailed law achieves better clarity. Sometimes law can be both precise and clear, as with speed limits or effluent discharge limits. But no human can comprehend, much less comply with, thick rulebooks, such as the thousands of detailed rules mandated by federal worker safety law.141
For most human activity, clarity in law is usually achieved not with precise rules but with goals and principles that people can readily understand and internalize. As Richard Posner observes in The Problems of Jurisprudence, “Standards that capture lay intuitions about right behavior . . . may produce greater legal certainty than a network of precise but technical, non-intuitive rules . . . .”142
Precise law, conservatives believe, prevents officials from acting arbitrarily or corruptly. To the contrary: the inability of mortals to comply with thousands of rules puts arbitrary power into the hands of each official. That is part of why Americans go through the day looking over their shoulders. Is your paperwork in order?
Uniform application of clear rules is also believed to guarantee fairness. But circumstances matter, and mechanical application of clear rules often guarantees unfairness. Disciplining an eight-year-old under “zero tolerance” laws for bringing to school plastic soldiers carrying rifles is absurd.143 So too is a life prison sentence for someone who stole three golf clubs, under a “three strikes and you’re out” law, because of prior theft convictions.144 As Benjamin Cardozo put it, “Justice . . . is a concept by far more subtle and indefinite than any that is yielded by mere obedience to a rule.”145
Law achieves trust and supports practicality only when applied with human values and understanding. “The first requirement of a sound body of law,” Oliver Wendell Holmes wrote in The Common Law, “is that it should correspond with the actual feelings and demands of the community . . . .”146 The way law achieves this is that people are able to draw on norms of fairness and reasonableness at the point of implementation. Otherwise law is brittle, and words of law are parsed for selfish purposes. Legal philosopher Jeremy Waldron puts it this way: “[T]he Rule of Law is, in the end . . . a human ideal for human institutions, not a magic that somehow absolves us from human rule.”147
conclusion: american government at a crossroads
No one designed this bureaucratic tangle. No experts back in the 1960s dreamed of thousand-page rulebooks, ten-year permitting processes, doctors spending up to half of their workdays filling out forms, entrepreneurs faced with getting permits from a dozen different agencies, teachers scared to put an arm around a crying child, or a plague of legal locusts demanding self-appointed rights for their clients. America backed into this bureaucratic corner largely unthinkingly, preoccupied with avoiding error without pausing to consider the inability to achieve success.
We tried to create a government better than people. Without our noticing, the quest for hands-free government started paralyzing daily choices. Now the broad sense of powerlessness is causing frustrated Americans to pound the table for change. America is at a crossroads. Just as the Progressive Era, the New Deal, and the 1960s rights revolution caused tectonic changes in how government worked, American government again seems ripe for overhaul. What is missing is a vision for a new operating system. To craft one, we must return to first principles and rebuild government on the Framers’ vision of a republic activated by human responsibility and accountability.
Philip K. Howard is Senior Counsel at Covington & Burling.