This article was originally published by The American Conservative on February 23, 2023.

The United States has long prided itself on its skepticism of authority, and this skepticism extends to the health care sector. Balancing the nation’s commitment to individualism with the realities of modern, scientifically informed mass medicine has proved to be a daunting challenge. The longstanding tension between what Michael Lind refers to as the “Jeffersonian” and the “Hamiltonian” visions for American society provides a useful framework to understand how this process has played out.

From its quintessentially Jeffersonian origins in the practices of unevenly trained independent physician-craftsmen, America’s health care system would gradually incorporate more centralized Hamiltonian measures aimed at fostering medical expertise and making these services broadly available to its public. However, the resulting hybrid system represents the worst of both worlds, failing to provide either the freedom of a more decentralized Jeffersonian approach or the efficiencies of a more centralized Hamiltonian one. For the American health system to achieve its full potential, it must become more Jeffersonian in its approach to supplying care and more Hamiltonian in its approach to organizing it.

From the American republic’s outset, the medical environment differed significantly from the state-controlled models of continental Europe. While many physicians entered the profession through apprenticing under a more established practitioner, doing so was not a long-standing legal requirement as it was in France. And the United States certainly had no appetite for anything resembling Prussia’s “medical police” tasked with enforcing health and medical standards. Care for the sick was largely delivered outside of hospitals by yeoman practitioners, most of them educated at commercial vocational schools with varying grounding in the science of the time. All of this was partly a reflection of the fact that the United States was still a young, largely agrarian country, but it was also a product of the Jeffersonian ideals, such as individual autonomy and limited government, that animated the country in the years after the revolution.

Much in the same way that the Jeffersonian vision conflicted with Alexander Hamilton’s goals of industrializing the young republic through state support, a similar conflict of visions brewed regarding the advancement of medicine itself. In contrast with the decentralized laissez-faire ideal associated with Jeffersonianism, the Hamiltonian approach is associated with an active government role in achieving broader societal objectives. As physicians academically trained at the nation’s early medical colleges filtered throughout the country, momentum toward professionalization began to build, leading to the passage of licensure laws in various states and cities through the 1820s.

Yet just as momentum was building for the professionalization of medicine, America entered a period of Jeffersonian backlash. This period, characterized by pervasive anti-elite and anti-monopoly sentiment throughout politics and society, began with the election of President Andrew Jackson in 1828. This backlash against established authority led to the de-licensure of many professions across the country, including medicine, and further progress toward licensure largely stalled.

This Jeffersonian backlash reflected the growing anti-authoritarian sentiment of the time. New religious movements, such as the Adventists and Latter-Day Saints, were taking shape during this period. Ralph Waldo Emerson’s 1841 essay “Self-Reliance” illustrates the individualist currents of the time. These social currents, premised on the belief in the power of the individual to take control of their own well-being, certainly extended to matters of health. Alternative medical practitioners at odds with the academic medical orthodoxy rose in popularity during this period, such as Thomsonianism, eclecticism, and homeopathy. Americans’ distrust of medical expertise was not entirely without basis either—the rudimentary medical techniques available at the time often exposed patients to greater health risks than the less invasive folk therapies on offer.

Whereas the trajectory of American medical professionalization in the early nineteenth century was not too dissimilar from England’s, the public skepticism, internal sectarianism, and political indifference that took root in the Jackson era had put the United States on a decidedly different path. While England would regularize its own physician workforce with the passage of the 1858 Medical Act, the same would not be achieved in the United States for another fifty years.

The changing economic conditions in the years following the American Civil War aided the reconciliation of medical expertise with Americans’ skepticism of concentrated power. As large corporations began to dominate American economic life, perceptions of medical licensure also evolved in directions more compatible with the country’s Jeffersonian ethos. Licensure was now framed as a bulwark against the corrupting influence of faceless corporations, and regulations requiring an academic degree in addition to a board examination were reintroduced. By the turn of the twentieth century, these regulations had spread to every state.

The rise of corporations in America also enabled the accumulation of unprecedented wealth among business elites. As they reflected on how to contribute their funds toward larger societal objectives, many became aware of how far American medicine lagged behind Europe’s application of laboratory science. Ambitious aspiring doctors traveled to places such as Germany to study medicine, returning as zealous advocates for importing this model to the United States. When Baltimore businessman Johns Hopkins died in 1873, he bequeathed his fortune to the establishment of an institution in this mold. This model would be rolled out nationwide following the release of the Flexner Report in 1910. The report, which decried the quality of North American medical instruction, precipitated the shutdown or reorganization of a vast number of teaching institutions, with major foundations devoting roughly half of their giving to medical education.

Among physicians, a professional culture emerged that sought to insulate American medicine from the pressures of both the democratic state and the market economy. This is illustrated by statements from the American Medical Association decrying the influence of the “spirit of trade” on the profession, as well as by William H. Welch’s statement to a Senate Committee in 1900 that “all medicine asked of Congress was not to interfere with its progress.”

American health care remained more personalistic and less hospital-based than its European counterparts. Physicians had access to support staff and equipment furnished by hospitals without being their employees, and even modest forms of corporate organization were opposed as threats to professional autonomy. The Jeffersonian skepticism of concentrated power had been reconciled with clinical expertise, resulting in an American health care system led by highly trained physicians.

As America progressed toward the frontier of scientific medicine, the conflict between Jeffersonian and Hamiltonian ideals came to a head with the challenge of providing health care services to the mass public. While charitable financiers played a significant role in America’s medical professionalization, their interest and funds were insufficient to bankroll health care for the public as a whole. To provide high-quality medicine for the bulk of its citizens while still respecting their Jeffersonian skepticism of government power, the United States would gradually establish an opaque patchwork of programs and subsidies aimed at financing health care broadly while maintaining the appearance of personal responsibility.

With physicians increasingly professionalized, the gradual shift in hospitals’ operational model from charity to cost-reimbursement created growing problems for patients attempting to access the system. Whereas European governments had gotten involved in financing health insurance as early as the 1870s, small-government attitudes prevented similar social insurance schemes from being established in the United States. Indeed, up until the Great Depression, American physicians widely opposed even voluntary insurance, viewing such arrangements as threats to their autonomy.

The dam on federal involvement in financing health care broke in earnest with the establishment of a tax subsidy for employer-provided health insurance during World War II. After the war, the Hill-Burton Act of 1946 provided federal matching funds that led to a substantial expansion of hospital construction in communities across the country. This development shifted the power centers in the American health care industry, with hospitals and health insurers gaining influence that rivaled that of physicians. The American health care industry became characterized by a tripartite separation of powers between physicians, hospitals, and health insurance companies.

With federal subsidies having decisively reoriented health care financing around employers by the 1960s, the gaps in health care coverage for those outside that system, namely the elderly and the poor, had become increasingly apparent. In spite of opposition from the industry’s players, the federal government enacted bills authorizing the creation of Medicare and Medicaid, aimed at providing public health insurance to these populations. Reflecting the considerable skepticism these programs faced from industry players at the time, as well as the broader American mistrust of government meddling in health care, these programs were set up to mimic the relations within the private market. Unlike in other countries, government provision of health insurance in America did not entail placing the system on a budget.

The tension between Jeffersonian and Hamiltonian ideals in American health care reached a tipping point in the 1970s amid rapidly rising prices. The government’s growing commitment to paying for health care clashed with its longstanding hands-off approach to delivery. Overutilization and waste were believed to be the problems, but getting the government directly involved in care decisions was a political nonstarter. In response to this dilemma, the government pursued policies aimed at reducing health care costs through indirect means. One example is “certificate of need” regulation, which aimed to control utilization by restricting the availability of health care services. The other prominent strategy was encouraging the growth of managed care, empowering private third parties rather than the government to control utilization. The resulting health care system is perhaps best labeled as “decentralized Hamiltonianism,” reflecting a mix of Jeffersonian and Hamiltonian elements.

By the 1980s, the health care industry had grown alarmed by these tightening control measures. Reforms to Medicare reimbursement as well as the looming specter of managed care led many in the industry to believe that the good times were effectively over. One influential report released in 1981 projected a “physician surplus” unless the profession’s size was constrained. Federal support for physician education and training was withdrawn, and M.D.-granting medical schools put a voluntary moratorium on new schools and enrollment. Hospitals followed suit and began consolidating rapidly to accommodate themselves to a managed care world. At the time, few were concerned about the impact on competition, as these changes were viewed as consistent with the conventional wisdom that American medicine would increasingly resemble the Taylorist industrial factory floor, with a lean, highly specialized workforce coordinated through large, vertically integrated health systems.

The administrative complexity of American health care was exacerbated by this turn toward managerialism. Owing to Americans’ skepticism of concentrated power, authority had always been more dispersed, spread across various stakeholders including health insurers, hospitals, employers, the courts, levels of government, and non-profit bodies. This gradual supplanting of clinical expertise with managerial authority was helped along by events such as the highly publicized Libby Zion case, which shook the public’s confidence in the safety and quality of American medicine. These trends empowered not only administrators within health system facilities and government, but also the various non-profit regulatory bodies that derive power either directly as government-designated accreditors or indirectly through malpractice law.

By the early 2000s, the long-heralded managed care revolution had largely collapsed. The drive toward consolidation and specialization undertaken to accommodate managed care created the conditions for its overthrow. Many of the cost-control measures seemed draconian. The lack of insurance choice in our employer-based system allowed this dissatisfaction to build until it eventually reached a breaking point. Industry groups, realizing the public was on their side, supported laws that substantially constrained managed care. Against the backdrop of a strong economy, health care providers realized that all they had to do was say “no,” and the system fell apart. After a decade-long pause, American health care costs resumed their pre-managed care trend of rapid ascent.

The system we are left with today might alternatively be described as “monopolistic Jeffersonianism” or “incoherent Hamiltonianism.” No matter what you choose to call it, the fact remains that it is a mess. Despite the retreat of managed care, the trend of hospital consolidation and physician specialization continued, driven by financial incentives to gain market power. No reexamination of the decisions made during the managed care era took place, highlighting the continuing conflict between the Jeffersonian and Hamiltonian approaches in the American health care system. Nowhere is this lack of clarity better illustrated than by the Affordable Care Act, enacted in 2010, which combined health insurance expansions with a grab bag of managerialist tweaks drawn from incompatible visions of the direction in which the American health care system should evolve.

The paradox of the American health care system is that its Jeffersonian heritage, with its mistrust of unchecked power, has led to a system so complex that accountability is impossible. The once simple fee-for-service model has given way to a fragmented mess, based on a patchwork of insurers and countless uncoordinated oversight bodies. The resulting health care system is needlessly cumbersome, wasting hospitals’ budgets on administrative overhead and forcing doctors to spend countless hours on paperwork instead of patient care. Patients, too, face myriad difficulties navigating the labyrinthine system. To restore order, a more Hamiltonian approach is needed to streamline administration, reduce complexity, and increase efficiency.

Limiting the supply of health care facilities and professionals is not the answer to the problem of waste. Such restrictions do not ensure that the most valuable care is prioritized, and they cause even sensible utilization limits to be experienced by patients as draconian. Instead, we need a more Jeffersonian commitment to ensuring that health care markets are competitive, balancing supply and demand so that patients can receive high-quality medicine and maintain personal relationships with their providers.

To realize the potential of the American health care system, we must strike a better balance between Jeffersonian and Hamiltonian impulses, taking the best from each aspect of our heritage. The resulting health care system could be described along lines such as “state-capacity Jeffersonianism” or “competitive Hamiltonianism.” But no matter how we choose to categorize it, the fact remains that the United States needs to embrace Jeffersonianism more thoroughly in how it delivers health care and Hamiltonianism in how it organizes it.

This article is part of the American System series edited by David A. Cowan and supported by the Common Good Economics Grant Program. The contents of this publication are solely the responsibility of the authors.