
Empowering Pink-Collar Professions

Picking a major was a struggle for me. I never really had one concrete thing that I wanted to be when I grew up, and I found myself constantly changing my mind throughout high school as I found new interests. So when I finally made my decision and chose something as concrete and rewarding as nursing, I thought I would feel supported, just as all my peers had been when they discussed their own decisions. Instead, I received almost instantaneous criticism every time I brought it up. “Why just a nurse?” “Oh, but you’re going to go to medical school after undergrad, right?” “Have you changed your mind yet?” “But you’re smart enough to be a doctor, why be just a nurse?”

I was honestly shocked. I had never realized how much nurses were looked down upon until that moment. And it didn’t really make sense to me. What is so wrong with being a nurse? I naively thought that maybe once I started college, people would take the hint that their pestering wasn’t going to change my mind. But alas, I was wrong yet again. Teachers still brought it up when I visited my high school, family members still asked me about it, and I even heard that current seniors at my high school were saying that it’s “weird” and “doesn’t make sense” that I don’t want to be a doctor.

So all of this made me wonder why. Why are nurses treated as inferior? If I had said that I wanted to be a teacher, no one would have said to me, “But why just a teacher, when you could be a professor?” And even if I had said I wanted to be a doctor, no one would have said, “But why be just a doctor when you could be a surgeon?” To me, it seems the profession is viewed as inferior because it is dominated by women.

Nine out of 10 registered nurses are female. Nursing is considered a “pink-collar” job and is seen as a caregiving profession, one that people turn to if they can’t make it as a doctor. Work that is done largely by women is grossly undervalued. According to a study from Cornell University, when women enter a field in large numbers, pay for that field declines, even when the work itself is exactly the same as before. Gender bias leads people to assume that if a woman is doing a job, it must not take as much skill or be as important. So when a field is already female-dominated, not only is there still a pay disparity (men are still paid more, what a surprise), but the work itself is trivialized and deemed less significant.

Even though I know all of this, when I read about how powerful women are breaking barriers and entering male-dominated fields, I often feel an intense guilt. It makes me think that everyone is right and that maybe I should change my career. I’m a feminist, and I believe in female empowerment, so why am I doing what is expected of me? Shouldn’t I be doing my own part to prove that we are capable of anything? And selfishly, shouldn’t I want a job that is going to be more valued?

But I’ve begun to realize that this thinking is wrong. By doing whatever the hell I want to do, I am empowering myself and my fellow women. Just because a career is traditionally feminine does not mean that it is less worthy than others or not something to aspire to. These jobs exist for a reason, and they make incredibly important contributions to society. I’m not choosing this profession because I feel it is my place as a woman or because I don’t think I could make it as a doctor. I chose it because it felt like the right path for me. So no, I am not looking to be just a nurse. I am going to be a nurse, period, and a badass one at that. And to any women feeling these same pressures: do whatever makes you happy, and forget about everybody else.