To improve the future of US healthcare, we must grapple with the past. As we create change, we must ask ourselves: how can we be grateful for where we are now, and what can we learn from our past?
Before the rise of professional medicine, patients were mostly cared for by the domestic household or by lay healers. The caretaker in the home was usually the matriarch of the family, who maintained a close emotional bond with the patient. The treatment provided in these domestic settings was often rooted in traditional care practices, including herbal remedies, home-cooked foods, and emotional support. Lay healers likewise rooted their practices in experience, tradition, and community knowledge. In the 1800s, most lay healers were part of the Thomsonian (also known as the Popular Health) movement, which favored natural remedies over professionalized medicine. They believed in the body’s ability to heal itself and emphasized treatments using botanicals.
Thomsonianism arose as a reaction against the harsh and invasive medical practices of the time, such as bloodletting and the use of toxic substances like mercury. It gained significant traction among the working class, who appreciated its simplicity, affordability, and independence from elite, professional physicians.
[Image: 1834 cartoon]
“Authority… involves a surrender of private judgment, and nineteenth-century Americans were not willing to make that surrender to physicians.” - Paul Starr
Despite its popularity, Thomsonianism faced criticism and legal challenges from the emerging medical establishment, particularly as physicians sought to regulate medical practice through licensing laws. By the mid-19th century, Thomsonianism began to decline, giving way to other alternative medical movements, such as homeopathy and eclectic medicine, which built on its ideas and sought a middle ground between traditional and conventional medicine. This pattern of alternative medicine rising in popularity, facing professional opposition, and either evolving or being absorbed into mainstream medicine has continued throughout U.S. history. Movements such as naturopathy, chiropractic care, and functional medicine have followed similar trajectories, often emerging in response to perceived gaps in conventional care and adapting to regulatory pressures while maintaining public appeal.
“There can be no good reason for keeping us ignorant of the medicines we are compelled to swallow.” - Thomsonian writer
Physicians battled for authority in the late 1800s and early 1900s by professionalizing medicine through the establishment of medical schools, standardized curricula, licensing laws, and organizations like the American Medical Association (AMA). The war waged by medical schools and societies came with its losses; these are detailed in The Social Transformation of American Medicine by Paul Starr:
Physicians ultimately gained authority in the late 19th and early 20th centuries through key reforms. Medical schools standardized education, adopting rigorous scientific curricula influenced by the landmark Flexner Report of 1910, which highlighted the inadequacies of most schools and led to the closure of weaker institutions. Licensing laws, enforced by state governments, restricted entry into practice, elevating standards and eliminating competition from untrained practitioners. These reforms, along with the country's growing trust in science and specialized knowledge, established medicine as a respected profession.
“[Physicians] claim authority, not as individuals, but as members of a community that has objectively validated their competence… In justifying the public’s trust, professionals have set higher standards of conduct for themselves than the minimal rules governing the marketplace and maintained that they can be judged under those standards only by each other, not by laymen.” - Paul Starr
With the rise of the profession, medical breakthroughs defined the 20th century. Antibiotics beat infections head-on. Vaccines stopped polio and smallpox in their tracks. New surgical methods and organ transplants gave patients second chances. X-rays, MRIs, and CT scans let doctors see inside the body. These changes didn't just improve medicine - they transformed how long and how well people could live.
Hospitals were founded because “there was no other option. Fellow creatures could not be allowed to die in the streets,” as Charles Rosenberg once said. Without formal doctors, those who did not have a caretaker were left on the street.
In the early 1800s, these hospitals were concentrated in major cities like New York, Philadelphia, and Boston, operating in poor conditions to serve homeless populations. Religious organizations, particularly Christian groups, established these facilities as part of their mission to care for those in need, without requiring payment. It is important to note that early hospitals were not seen as places of healing for the general population but rather as facilities for the poor, the homeless, and the chronically ill: people who had nowhere else to go for care. Early hospitals frequently assessed patients’ credibility and moral worth before admitting them, selectively excluding those deemed “undeserving” to protect their reputations and control mortality rates. These institutions were funded through a mix of charitable donations, religious organizations, and government support. As the demand for medical services grew, hospitals began allowing private physicians to treat paying patients in their facilities. This system eventually shifted toward the European approach, in which hospitals directly employed physicians.
“Our country has always tried to provide essential medical care to those who are ill and unable to provide for their own care.” - We’ve Got You Covered