The government already forces you to buy car insurance in order to legally and safely operate a vehicle. If you're not buying it, god help us all.
We are not FORCED to purchase car insurance. If you want to drive a vehicle, then you must purchase insurance. Those who don't drive don't have to purchase car insurance.
mandatory health insurance just because you're a U.S. citizen VS. mandatory car insurance IF you decide to drive a car on public roads = APPLES AND ORANGES
This is a tired, lame, and completely false comparison that I see on every left-wingnut site.