To some, it’s the greatest idea since price supports for agriculture: a government assumes responsibility for its citizens’ healthcare, paying every cost and eliminating the guesswork. To others, it’s an infringement on individual autonomy, the transfer of private decisions about health to a taxpayer-funded bureaucracy.
A euphemism for “government-run,” “single-payer” means that instead of every person in the marketplace paying for his or her own healthcare, there is just one payer: a monopsony. In some parts of the world, such a system has been entrenched for so long that it’s difficult to conceive of any other way. In others, the United States in particular, there’s still plenty of debate on the issue. It’s easy to talk about a fundamental “right to healthcare,” but the issue gets complicated once one realizes that entitling a person to someone’s time and resources means obligating someone else to provide them.
An Old Idea
Advocacy for a single-payer system in the U.S. is nothing new. In the fall of 1945, just after the end of World War II, recently inaugurated President Harry Truman addressed Congress with a plea for a national healthcare system. The American Medical Association opposed the idea, and it eventually faded away.
Still, incremental steps continued through the decades. Medicare and Medicaid were established in 1965, essentially becoming de facto single-payer systems for certain groups of the population: senior citizens and the poor (including many young children), respectively.
Brought Back in Recent Times
In modern times, the strongest push to nationalize healthcare in the world’s largest economy came in 1993. When her husband’s administration was only months old, then-First Lady Hillary Clinton spearheaded the Health Security Act. Known commonly as “Hillarycare,” the bill required all citizens to enroll in a government-approved health plan and forbade them from ever exiting that plan.
Hillarycare also called for the creation of a National Health Board, a seven-member panel whose duties would include determining what constitutes “an item or service that is not medically necessary or appropriate” [Section 1141(a)(1)]. The bill was a bureaucrat’s dream, setting criteria for everything from a new tax on cigarette rolling papers [Section 7113(a)] to payment limits on certain drugs. When prominent members of the President’s own party began to question the bill’s feasibility, its support weakened further. The bill officially died a few weeks before the 1994 midterm congressional elections, which were seen as something of a referendum on Hillarycare.
One fact often used to defend the concept of a single-payer plan is that the U.S. spends more of its gross domestic product (GDP) on healthcare than do other nations.
Mexico and Turkey each spend barely a third as much on healthcare, relative to GDP, as the United States does. Among countries that aren’t part of the Organization for Economic Cooperation and Development, the numbers can go even lower. Equatorial Guinea, for instance, spends less than a quarter as much of its GDP on healthcare as the United States does. But Equatorial Guinea’s 13.4-percentage-point savings over the U.S. also nets the country 27 fewer years of life expectancy and an infant mortality rate 12 times that of the U.S.
But it’s probably most instructive to compare U.S. healthcare expenditures to those of the nation’s “peer group” of other developed nations. Canada, for example, has a life expectancy of 81 years, versus 79 in the U.S., and its infant mortality rate is five per 1,000 live births, as opposed to six in the U.S. Yet Canada spends $2,233 less per capita on healthcare than the U.S. does.
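That $2,233 gap follows directly from the per capita figures used later in this article (roughly $6,000 for Canada versus $8,233 for the U.S.); a quick sanity check of the arithmetic:

```python
# Sanity check of the per capita spending gap cited in the text.
# Figures are the article's: ~$6,000 (Canada) vs. $8,233 (U.S.).
us_per_capita = 8233
canada_per_capita = 6000

gap = us_per_capita - canada_per_capita
print(f"U.S. spends ${gap:,} more per capita")  # → U.S. spends $2,233 more per capita
```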
Is Socialized Really Better?
Just ask citizens of Canada or the United Kingdom, two nations famous for their universal healthcare systems. Many Canadians love to talk of their “free” healthcare system, forgetting that if a free lunch doesn’t exist, then a free colonoscopy can’t either. Neither doctor salaries nor cardiopulmonary bypass pumps are cheap, and the money to pay for them has to come from somewhere.
Canadian healthcare expenditures work out to just shy of $6,000 per capita per year, compared with $8,233 in the top-ranked U.S. In Canada, nearly all of that $6,000 is funded via taxes: less than half from income taxes, with the bulk of the remainder bankrolled by corporate and sales taxes.
Increases in healthcare spending in Canada have kept pace with those in the U.S.; total Canadian expenditures have more than tripled since the mid-1970s, from $39.7 billion to $137.3 billion. The Canadian government not only acknowledges that many of its citizens wait a long time for care, but recently spent an additional billion dollars to examine the issue. In the meantime, watching the months pass is an unavoidable component of Canadian healthcare. If you want a new hip or knee, prepare to live with your old one for at least half a year.
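The growth multiple implied by those two figures is easy to verify:

```python
# Growth in Canadian healthcare spending per the article's figures:
# $39.7 billion (mid-1970s) to $137.3 billion.
start, end = 39.7, 137.3
multiple = end / start
print(f"Spending grew {multiple:.2f}x")  # 137.3 / 39.7, roughly 3.46x
```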
Wait times are a fact of life under socialized medicine in the United Kingdom, too. The U.K.’s National Health Service says you shouldn’t have to wait longer than 4.5 months for an approved service, yet recent reports say patients can wait as long as eight months for cataract surgery.
Wait times in Canada are increasing as well, up 95% since 1993 by one measure. At least one Canadian doctor has pointed out the absurdity of dogs being able to see specialists faster than humans can. In the U.S., such wait times aren’t even an issue.
The Bottom Line
It wasn’t all that long ago that healthcare was a market no different from that for furniture or electronics: you paid as you went, usually out of pocket. Then rising costs led to the notion of a single payer. When a party other than the patient or the provider starts making healthcare decisions, it’s easy to lose sight of whose interests should be paramount in a healthcare transaction. Governments and private insurers often have conflicting agendas regarding treatment, but a sick person never does. He or she has just one goal: recuperation.