Why is public health care in the USA extremely limited/non-existent?
It seems that in nearly every Westernized/developed country (and quite a few others, I believe), public health care is free at the point of use. For example, here in the UK, if you get ill you can go to an NHS hospital. You might have to wait months for treatment, but it will be done at the expense of the state.
If you get ill in the USA and don't have enough money or insurance, the state essentially leaves you to die. How can the USA call itself the greatest country in the world while this goes on?