Are Women Becoming the More Dominant Gender?

Throughout history, men have been considered the more dominant gender. When ancient tribes needed food or shelter, the men built and hunted. When towns and colonies needed leaders to fight in wars and battles, men went out and fought. When factories first appeared and white-collar office work became more common, men took the more respectable jobs. These traditionally masculine skills, such as hunting and fighting, are no longer necessary, because today's society is built on intellect and emotion rather than physical strength (Gross). Women, who possess both intellectual and emotional intelligence, are now needed in more jobs, hold more power, and have more control over their lives than ever before; not so many years ago they did not even have the right to vote.

Society has evolved, and so have people's needs. Excelling at fighting, hunting, and building no longer marks a successful person; modern society creates intellectual and emotional demands instead. Women are becoming more dominant than men because men's abilities are primarily physical, women have shown greater motivation to learn and succeed in life, and women possess the emotional, social, and intellectual skills that modern society demands.

Large companies have been hiring more women than before, after recognizing their high potential in the business world. Take Sheila Bair, for example, who is regarded as one of the most powerful businesswomen to date. She is the 19th Chairman of the Federal Deposit Insurance Corporation ("Board of Directors and Senior Executives"). Her ability to connect with others has enabled the corporation to interact with other organizations professionally, allowing them to
