I've been watching a few American TV shows, and it blows my mind that American workers put up with such atrocious working terms and conditions.
One show was about a removal company where any damage at all, even when it isn't the workers' fault, is taken out of their tips. There's no insurance from the multimillion-dollar business. As they're not paid a living wage, the guy on the show described times when he and his family went weeks with barely any income, and this was considered normal?!
Another example was a cooking show where the prize was tickets to an NFL game. The lady who won explained that she'd be waiting in the car so her sons could experience their first live game, because she couldn't otherwise afford a ticket for herself. Where I live, they give away tickets to football games for free, for no reason at all.
Yet another example was where the workers got a $5k tip from their company, and the reactions were as if that amount of money was even remotely life-changing. It saddens me to think the average American's life could be made so much better with such a relatively small amount of money, yet they don't unionize and demand far better. The company in question was on track to make a billion bloody dollars while its workers are on the poverty line and don't even have all their teeth?
It's not actually this bad, and the average American lives the pretty good life we're led to believe, right?
Sounds accurate for the East Coast from NYC to Florida. Is it better out west?