
Actual serious question for my left-leaning friends, because I can't get my head around this one and Google isn't helping.
How is health care a right? I could easily grasp "equal access to health care" as a right. I mean, having a job isn't a right, but it is a right to have equal access to what jobs are available. Is that what's meant?
But for health care itself... it's a service, provided by other people. If - I know this is a bizarre hypothetical, but - if we ran out of doctors, or ran short... then what? Does the government draft high-performing college students, put them through medical boot camp, and send them off to provide health care for a two-year stint or something? The draft is about the only comparable thing I can think of, where the government decides the collective rights of the nation outweigh the individual right to self-determination.
(It's a philosophical question. I doubt all the doctors are going to move to Tahiti anytime soon. But if they did, and if health care is a right... then what?)
(Also, this question should not be construed as a statement on the health care reform debate, but rather as something brought to mind by it. It does feel like something that should be ironed out before the reform is done, though... wouldn't the reform look different if the underlying principle were a right to health care, a right of access to health care, an ethical responsibility to provide health care, or a fiscal responsibility to restructure health care costs? Those are four different ways to approach the reform right there, and I'm sure there are more.)