Friday, June 06, 2008
Regular reader Kevin wrote in this week with a question regarding my post about our dreadful public education system, in which I laid partial blame on the insidious teachers' union....
I have an off-topic question. What do labor unions do that actually benefits America?
There was a time in America, before the EEOC, labor law, and watch-dog groups, when labor unions protected workers' rights to a safe work environment and fair compensation, shielded them from unfair dismissal, and negotiated for benefits like health care, severance, vacation, and higher wages. There are now so many worker protections in place that the unions are pretty much unnecessary. In fact, in the 70's, when these laws were put in place, the union bosses resented being phased out, so they essentially became an organized crime group, hellbent on maintaining their grip on power over the companies. They forced employees to join the union, then extorted protection money (the union bosses called them dues) from the workers. Anyone who didn't join the union had his house burned down.
Today, unions do little but force companies to maintain huge, unnecessary pools of workers who sit around and do nothing but collect huge checks. This keeps the union rolls well-populated and the dues coffers full of cash. Moreover, unions force wages far higher than the labor market can bear, and they insist on exorbitant severance packages, retirement packages, and health-care benefits for even the most uneducated, unskilled workers. Many American companies like GE, GM, Ford, Chrysler, and others are no longer primarily car and truck builders. They are first and foremost union-forced health-care and retirement providers for retired and unemployed former workers. And democrats complain when those companies go to Mexico or China to build cars.
It is a common myth in America that employer-provided health care is a birthright. It is not! Labor is, or should be, subject to market forces just like anything else. Health care is an incentive, just like vacation, higher salaries, and company cars, that employers sometimes dangle to entice the best workers to fill positions. There is no birthright to health care, retirement, or high wages at somebody else's expense. All are subject to market competition, including labor.
Finally, you may wonder why labor unions are always, always aligned with the most liberal of democrats. It's because unions, by definition, work against companies. Democrats pretend to be the champions of the little guy against the evil rich. Naturally, unions want democrats in power because they know democrats will pass laws that support their insidious, anti-capitalist practices. Democrats like unions because union bosses contribute union dues to their campaigns and force members to vote democrat. Also, a dependent labor force is the democrats' wet dream for America, so of course they support any organization, no matter how corrupt, that can suppress and intimidate that many citizens into voting democrat.
Teachers' unions aren't quite that bad, but they do extort huge amounts of money from taxpayers that gets wasted on administrative boondoggles. The US spends more per student than almost any modern industrialized nation, yet we fall far below our peers in comparative testing and academic performance. The teachers' unions demand more and more money and then indoctrinate our kids about global warming, how evil America is, and the joys of sexual experimentation.
Unions and punitive taxation are by far the two biggest reasons that many otherwise profitable companies leave the US to do business somewhere else. So to answer your question: in my opinion, labor unions do nothing that benefits America. They do far more harm than good.