If you want to drive a car, you must be insured. Nobody seems to have a problem with that. You don't pay, you don't drive. If you get caught driving without car insurance, you are in serious trouble. I don't see the difference between mandated car insurance and mandated health insurance, yet people seem to whine about the latter.
If you want to see a doctor, use an ER, need a hospital stay or an operation, or need an ambulance to come get you when you're dying, then you should be insured. People who are not insured seem to get care somehow anyway: there are billions of dollars in unpaid medical bills out there. Why are these compassionate conservatives, who whine about people on welfare and food stamps, not upset about healthcare bloodsuckers whose unpaid bills drive up everyone else's insurance premiums to absorb the cost? People say insurance is "too expensive," and yes, it is, but a year of premiums is still cheaper than paying the rack rate for, say, a routine appendectomy, which a good friend of mine is currently paying off. (Or she could just refuse to pay and let the hospital pass the loss on to you. Which would you prefer?)
Conservatives want everyone to be accountable for themselves: get a job, don't mooch off the system, don't have kids you can't afford, and so on. If this is a virtue they value in our citizens, then why aren't they all behind requiring everyone to purchase their own health insurance? Please explain the logic.