Americans Tell Us If America Is Actually Getting Better

It’s a tumultuous time in America. Political, cultural and economic divides feel starker than ever before, and arguments that begin on social media spill over into real-life debates at dinner tables and in workplace break rooms.

Optimism has always been part of the American DNA: however bad today may be, the American instinct is to believe tomorrow will be better.

Is that still true?

VICE News traveled across the country, stopping in Colorado, Ohio and North Carolina to ask as many people as we could one question: Is America getting better?
