This is a poll. Just curious to know what you think.
The news we get, even in the age of citizen journalism, is largely negative. Based on what we see on TV and the internet, we have every reason to believe the world is getting worse, and a lot of people seem to think it actually is. But is it really?
Is it getting better? Is it getting worse? Or is it a cycle of prosperity, health, disease, and war, with no significant net gain or loss?
In 2011 we no longer have Hitler or the Black Plague. Slavery is less prevalent than it once was. Society has come a long way socially. We have amazing charities and a spirit of generosity. But we do have AIDS, obesity, and a variety of cancers that are much more common because of the indulgent lifestyles we lead. We have huge debt, and we still have wars.
What do you think?