Coming from Canada, where health care is a right, it seems illogical to me that in America people are going bankrupt over medical bills. I see the commercials running on TV that argue against universal coverage, but they are packed full of lies. I cannot understand why Americans seem to think that only the rich deserve to be taken care of. I understand that insurance companies pay big bucks for lobbyists who push the government to shoot down any kind of universal health care, but why is the average Joe American so scared of it?