What’s The Real Reason There Are More Women In The Workplace?

For the past several years, there has been much publicity about the growing number of women in the workplace. The hiring of women has greatly outpaced the hiring of men in certain jobs. Advertising, marketing, healthcare, and many other industries are becoming dominated by women. The statistics would lead you to believe that the U.S. has finally achieved gender equality.

But before you begin applauding American corporations for their enlightenment, you may want to consider another, not quite so flattering, reason for the change. In their never-ending quest to increase profits and pump up stock prices, corporations may simply be hiring more women because they can pay them less.

That’s right. American corporations have cut employee-related costs by increasing productivity, automating production lines, and shipping high-paying jobs overseas, where workers are paid less and receive virtually no benefits. Many have hired illegal immigrants to replace workers at the lowest end of the pay scale. They’ve used independent contractors in place of full-time office workers to avoid paying Social Security, health care benefits, disability insurance, and unemployment insurance. They’ve even found ways to use the Internet to pare the costs of marketing, advertising, and design. So what’s left?

Women have always been able to do most jobs as well as men (and many better). But their salaries have long been suppressed. (A recent study found that female attorneys at elite law firms were paid an average of $66,000 a year less than their male counterparts.) So why not take advantage of them once again?

Hiring more women is a sign of progress toward gender equality. But the reason for it is not necessarily one that corporations should be proud of.