Quote:
Originally Posted by The Dude
Because all people will eventually require medical care of some sort. If you don't have insurance then who is going to pay for your medical bills?
That's the whole point of insurance.
Insurance - Definition and More from the Free Merriam-Webster Dictionary
: an agreement in which a person makes regular payments to a company and the company promises to pay money if the person is injured or dies, or to pay money equal to the value of something (such as a house or car) if it is damaged, lost, or stolen
If you don't pay for insurance, then the person who gets injured is supposed to pay, duh. It's business. And if you believe people should receive free or universal health care, then go study to become a doctor and spend the rest of your life providing health care for no pay.