America is the home of many great things. When you think of America, you think of Apple, Facebook, Chevy, Nike, the NFL, the NBA, and so on. But when you look at some of the most iconic American brands, are American companies still making everything in America? I would say that back in the '80s, there was a true sense of everything being American made by American companies, but things have changed.
In our American culture, we're influenced by society to buy American and support American-made products. While we have big brands such as Apple and GM that are American companies, they're no longer making their products strictly in the US. Most production of GM cars and Apple products is handled in other countries. While the sales ultimately benefit the American companies, the manual labor is benefiting another country and that country's economy.
On the flip side, we're led to believe as a society that import brands such as Honda and Toyota only benefit Japan, not the US and our economy. In my opinion, this couldn't be further from the truth. Although Honda and Toyota are technically Japanese companies, they also have factories and corporate staff based here in the US. If we're truly judging companies by the label "American made," Honda and Toyota are making cars right here in the US. Their products are built here, and they help support local economies.
At the end of the day, not everything made in the US comes from an American company anymore. That's not a bad thing; it's more a sign that companies are geared toward globalization rather than being tied down to one particular market.