My parents came to the United States because it was the land of opportunity. It was a place where you could, on the basis of your abilities and hard work, make a good life for yourself and your family. It isn't a utopia, far from it, but it's a lot better than many other places in the world. We still have our issues and problems, but fundamentally the "dream," as I always understood it, was that you could succeed through hard work.
Somewhere along the way, the American Dream came to mean owning a home. I saw it mentioned many times as the housing markets fell and we stumbled our way into the financial crisis. When did that happen? Is that what most people think of when they hear "the American Dream"?
I know that Fannie Mae and Freddie Mac were created, in part, to make homeownership easier. If someone is willing to buy up pretty much any mortgage loan you originate, you can keep making loans and people can keep buying homes (and you can see where abuse comes into play). However, I don't think that means homeownership is the American Dream. It might be part of it, but I don't think it embodies it.
What does the American Dream mean to you? And do you think it’s dead? Or just taking a breather?