The United States is the richest country in the world, yet it is the only major industrialized nation that does not guarantee healthcare as a right to its citizens. An estimated 45,000 uninsured Americans die each year (What The US). For a nation built on the ideals of "life, liberty, and the pursuit of happiness" and on the idea that government is responsible for protecting people's fundamental rights, it is a great source of shame that the United States does not have universal healthcare. It is the government's job to guarantee the rights of its citizens, not to profit from their suffering or from the denial of those rights.

Universal healthcare could save lives and alleviate suffering physically, financially, and emotionally. It would lift a large financial burden from each individual, and from the nation and government as a whole, by sparing the per capita spending that is currently wasted without it. It would even benefit capitalism, because people would be more willing to take risks without the fear of going without medical insurance (Why the United States). By allowing its people to suffer and die, especially for profits that are themselves largely wasted, the United States government commits a great immorality. Aren't human lives more important than the profits private companies make from their suffering and deaths? A country that is willing to go to war to protect the fundamental rights of foreign peoples should, at the very least, guarantee those same rights to its own citizens at home.