It Might Save My Life

I remember when she told us. My sister and I were snuggled in my parents’ bed listening to my dad read us Harry Potter. It was The Order of the Phoenix, I think. She walked in, really calmly, and sat down on the edge of the bed. We thought nothing of it until my dad stopped reading and said my mom had something to tell us. “I have cancer.” I think I was in shock, or didn’t really understand. We talked about it briefly, my mom saying she would be at the doctor’s a lot and really tired for a few months. She also showed us the wig she was going to start wearing. That’s when I got really upset. I had seen the wig box in the trunk of the car the day before and thought it was a new cat. I really wanted a new cat.

I don’t remember her going into surgery. Honestly, I don’t even think I knew until she came home with the bandages. I never saw her throw up from the chemo, or even stay in bed for a day. She never showed me what she looked like without her wig, even when swimming at the lake house. Parents of kids in my fifth-grade class would come up to me and say, “I’m so sorry to hear about your mom. I hope she’s doing okay!” I would just smile and nod, wondering why they were making such a big deal about it. My mom seemed fine to me. She would pick us up from school, watch us eat dinner (I realize now she didn’t eat because of the nausea), and then we would all listen to my dad read Harry Potter.

That wasn’t my family’s first encounter with breast cancer. My mom’s own mother died from it when I was very young. My grandma was 70 when she got it. My mom was 45. I didn’t really grasp the implications of that until much later, when I started to understand genetics. BRCA1 and BRCA2 are genes that produce tumor suppressor proteins. Between 55 and 65 percent of women who inherit a harmful BRCA1 mutation, and around 45 percent of women who inherit a harmful BRCA2 mutation, will develop breast cancer by age 70. By comparison, about 12 percent of women in the general population will develop breast cancer sometime during their lives[1]. Luckily, my mom was aware of these odds, which is why she got checked at an age when most people don’t even think about it. That’s why when I go home for break my mom picks me up from the airport with open arms and (hopefully) a snack.
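
For the numerically inclined, a few lines of Python put those odds side by side. This is only a rough sketch using the figures quoted above, and the comparison is loose, since the carrier figures are risk by age 70 while the baseline is lifetime risk:

```python
# Breast cancer risk figures as quoted above (NCI statistics).
general_population = 0.12            # ~12% of women overall, over a lifetime
carrier_risks = {
    "BRCA1 (low estimate)":  0.55,   # 55-65% develop breast cancer by age 70
    "BRCA1 (high estimate)": 0.65,
    "BRCA2":                 0.45,   # ~45% by age 70
}

for label, risk in carrier_risks.items():
    # Compare each carrier risk against the general-population baseline.
    print(f"{label}: {risk:.0%} risk, about {risk / general_population:.1f}x the baseline")
```

Run as written, that puts a harmful BRCA1 mutation at roughly four and a half to five and a half times the baseline, which is exactly why getting checked early made sense for my mom.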

My roommate Emily burst into our suite a few weeks ago and loudly announced, “GUYS! Did you know I’m related to Bono? That must be why I have such a great singing voice!” Just to clarify: she doesn’t. She explained that her dad made her take a genetic predisposition test to see if she had any risk factors for common diseases, including breast cancer. She doesn’t have any serious family history of these diseases, but her dad thought it was better to be safe than sorry. I knew these tests existed (I got one after my mom was diagnosed), but I had never met anyone who had gotten one without a serious disease running in their family. But why not? The tests are easy to find, relatively inexpensive (compared to the medical bills they could prevent), and could possibly save your life[2].

In this case (like in so many cases), I believe the solution is knowledge. If people knew how easy it is to screen their DNA for risk factors, they would be more aware of potential health risks (and even some potential famous relatives). This awareness could lead to action: life-saving action. Of course, the tests could cause unneeded panic. Not everyone with a risk factor for cancer ends up getting it, but, like Emily’s dad, wouldn’t you rather be safe than sorry?

We were on The Half-Blood Prince when my mom finished chemo. Her hair grew back, although significantly curlier than before (apparently this is common). She would venture outside more often without her wig—a feeling I imagine to be one of the most liberating in the world. Life returned to normal. But now I know. Thanks to genetic screening, I know my chances of getting cancer are relatively high. I know I need to get checked. And I know that if it does happen, I will catch it before it progresses too far. That $300 test might eventually save my life.

 


Study Drugs

“I’m going to take an Adderall and study all night.”  There’s no telling how many times those words have echoed around campus.  Today students face extreme academic pressure to get the best grades possible.  In high school, the stakes are getting into a good college.  In college, however, the stakes become much higher: bad grades can keep you out of the med school you want, or cost you an internship that could lead to your dream job.  The stakes are different for everyone, but one thing is certain at a top university like Vanderbilt: everyone wants to succeed.  Because of the rigorous academic work and competitive scholastic environment at universities across the country, students look for any advantage they can get.  Today, the answer for so many is Adderall, also known as the ‘study drug’ (1). Many have been prescribed it; many more have taken it.  Because of the drug’s effects, many students will find ways of obtaining it, such as buying it from a friend or even faking symptoms to get a prescription.

Adderall is prescribed for people with ADHD or narcolepsy.  However, since the drug’s effects on the brain increase a student’s ability to focus, many people fake symptoms in order to get the drug so they can use it to study.  A recent study showed that 93% of students were able to fake ADHD symptoms and get a false positive diagnosis.  Therefore, many of the people who are prescribed ADHD medicine do not necessarily need it, which means doctors are overprescribing it.  Furthermore, one experiment found that “students could successfully get a false positive diagnosis with just five minutes of Google searching on ADHD symptoms” (2).  Thus, the testing currently in place is ineffective and does not accurately assess the symptoms of ADHD. The fact that students can so easily obtain a prescription if they desire one is scary, considering the perceived competitive advantage the drug gives in terms of studying and grades.

So how prevalent is Adderall usage?  This week I made a survey and asked people around Vanderbilt’s campus to find out how many students are using Adderall or other drugs prescribed for ADHD.  Of the people who completed the survey, only 38% were prescribed Adderall or a similar drug; however, 79% said they had taken one.  This means that roughly 40% of all students surveyed had taken the drug without a prescription, for studying or other use.  With such a large share of respondents saying they have taken the drug, I was curious to see if there were any specific characteristics of the people taking it.  There seemed to be equal percentages of males and females taking Adderall.  However, a disproportionate number of people in fraternities or sororities were taking the drug, and the older the respondent, the more likely they were to have taken it.  Thus, on Vanderbilt’s campus it seems the people most likely to take these study drugs are older members of fraternities or sororities.
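
The arithmetic behind that “roughly 40%” figure is simple; here is a minimal Python sketch of it, assuming (as a lower bound) that everyone with a prescription has actually taken the drug:

```python
# Survey percentages reported above, as fractions of all respondents.
prescribed = 0.38   # had a prescription for Adderall or a similar drug
have_taken = 0.79   # had taken such a drug at least once

# If everyone with a prescription has taken the drug, the remainder of the
# "have taken" group must have used it without a prescription.
unprescribed_users = have_taken - prescribed
print(f"{unprescribed_users:.0%} of all respondents took it without a prescription")
# prints: 41% of all respondents took it without a prescription
```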

At a top academic institution such as Vanderbilt, students are often in great competition with each other.  So is it fair to have a drug that helps you study? At a competitive college, if your friends are taking it and showing positive results, you have a huge incentive to do the same, because those are the people you’re competing with for jobs.  As bad and unethical as that may sound, many people who don’t need the drug have resorted to Adderall in order to increase their GPA.  As one student put it, Adderall is “steroids for the brain.”  The analogy fits: in the early 2000s steroids became a huge problem in baseball, as one player taking them would see a higher batting average and more home runs, and more and more players began taking steroids to keep up.  While Major League Baseball eventually curbed the problem, the same principles apply to academics in college.  As some people begin taking these ‘brain drugs’ and seeing results, more and more will join them.

Students have gone beyond taking these drugs for studying, and many college kids taking them are unaware of the severe consequences.  “Between 2005 and 2010, emergency room visits related to ADHD stimulant medications used non-medically tripled from 5,212 to 15,585 visits” (2).  Nearly half of those visits involved mixing Adderall with alcohol; college students are now using Adderall to stay up late into the night and be able to drink more.  The solution to students abusing these drugs is twofold.  First, and easiest, awareness of the side effects and consequences of taking Adderall should be preached to college students.  Second, and more effective, would be developing a more accurate test for ADHD so that only those in need are prescribed the medicine.  Until that is done, students will always be able to fake the symptoms, and doctors will be unable to tell.

  1. http://www.wjla.com/articles/2013/11/adderall-use-rising-among-college-students-97245.html
  2. http://www.thedailybeast.com/articles/2013/12/02/7-things-you-need-to-know-about-adderall.html
  3. http://www.rxlist.com/adderall-side-effects-drug-center.htm

 


Remembrance

As with most advancements in society, events of the past have greatly shaped the way we live today. Language originated from communication through physical actions and ancient pictographs, and the Arabic numeral system used today came from the Indian Brahmi numerals of the 3rd century BCE. We learn about the history of these advancements in school, but it’s easy to forget them. After all, why should we have to remember these details of the past, as long as we continue to progress into the future?

But the progression to today’s society has not always been a clean road like the one from fire, to candlesticks, to oil lamps, to fluorescent light bulbs. Some things were accomplished through immoral, inhumane means, and for this reason some people believe that ignorance is bliss. But in the case of science, is it right to be ignorant of the millions of people who underwent controversial and unfair medical treatment, all for the sake of propelling science forward? Should we simply forget the millions of lives lost in the medical experiments on humans during the Holocaust?

Despite the evidence preserved in the hundreds of stories of family members and friends who survived the Holocaust, there are those who deny that the Holocaust happened at all. Those in denial argue that the Nazis had no official intention to exterminate the Jews, that the poison gas chambers at the Auschwitz-Birkenau concentration camp never existed, and that bodies were never used for human experiments. They also distort the facts of the Holocaust by claiming that the deaths in the concentration camps were caused primarily by starvation or disease rather than by policy, that the figure of six million Jewish deaths is an exaggeration, and that the stories told by survivors and left behind in the writings of victims were fabricated. These people wish, for various reasons, for the Holocaust to be forgotten, for the stories of the millions of lives that were mutilated and lost to be erased from our history books and documentaries.

But the truth is, six million Jewish men, women, and children were killed in concentration camps, and thousands more were executed by Nazi courts for minor crimes. In a cold, heartless sense, the Nazis’ use of the bodies of the killed for experimentation followed a logic: German scientists had been complaining to the government about the lack of bodies available for medical experiments. So when Nazi courts began to execute Jews by the thousands in the 1930s and 1940s, it became tragically convenient for their bodies to be used by German anatomists, whether or not those scientists were allied with the Nazis. Some anatomists who did hold Nazi beliefs conducted experiments in the sole attempt to advance the racial and ideological theories of the Nazi worldview. Josef Mengele, an officer and physician at the Auschwitz concentration camp, directed experiments to determine how people of different races reacted to contagious diseases such as malaria, tuberculosis, and infectious hepatitis. August Hirt, chairman at the Reich University in Strasbourg, conducted experiments in an attempt to “establish Jewish racial inferiority.” Other scientists attempted to develop procedures for the mass sterilization of Jews and other groups the Nazis considered inferior. These unethical experiments were always conducted on people without their consent, and they often resulted in death, permanent disfiguration, or disability. The medical records and images confirming these experiments, along with the stories told by survivors and witnesses, all refute the claims of those who deny the Holocaust.

Why should we remember the corrupt past of science and medicine? Thousands of victims lost their lives to brutal mutilation; understandably, people might like to forget such a violent past and instead focus on a brighter future. However, these experiments also resulted in discoveries that fed into the science of today, as well as in the Nuremberg Code, which prohibits unethical human experimentation. It’s important to keep in mind that society as it is now did not always have such innocent origins. Remembering the lives lost in these human experiments cannot undo the injustice brought upon the victims, but it at least gives them the respect and remembrance that they deserve.


The Stigmatization of Mental Illness

Preston Evans

02/23/14

 


 

Mental disorders: an estimated 57.7 million people in the U.S. suffer from them every day. These people are not only discriminated against and denied the opportunities given to their mentally healthy counterparts, but they also suffer from abnormal mood patterns, disordered eating habits, and, in extreme cases, the inability to communicate thoughts or ideas to those around them. A multitude of people with mental disorders struggle with day-to-day activities that many of us take for granted. As such, society should not further stigmatize or marginalize those with mental disabilities, but rather strive to become more understanding and aware of how to help them.

The negative connotation that mental disorders carry in most of society serves only to hinder the progress and help offered by therapy and regular doctor visits. According to the United Nations’ World Health Organization, mental health care facilities around the world offer the mentally ill poor-quality care that often hinders their recovery. As a result, the mentally ill receive worse care than “normal” people even though they pay higher premiums on their medical insurance. The WHO’s mental health policy coordinator, Michelle Funk, said in an interview with Voice of America that those living in mental health facilities face “high levels of abuse and violence,” and that they often receive “degrading treatment.”

For example, many facilities will overmedicate patients in order to keep them docile. As if zombifying one’s own patients weren’t enough, patients can also be locked in cells or put in restraints for days (and in certain cases, months) without food, water, or human contact. Funk also points out that these violations “are not restricted to inpatient and residential facilities,” as many who seek outpatient or community help are treated with contempt and often denied their basic right to medical aid. For instance, in her article “When Doctors Discriminate,” Juliann Garey recounts being refused medical aid twice because of the prescription cocktail she took for her bipolar disorder. Garey further professes that when it is revealed that she has a mental illness, “it wipes clean the rest of my résumé, my education, my accomplishments.”

Studies done by the WHO demonstrate that Garey is not the only mentally ill patient to have been reduced to a diagnosis. The human rights violations are numerous and astonishing, if not disturbing, especially considering that these health workers are supposed to be providing medical care and support. This dehumanization and marginalization needs to stop; people’s lives and good health depend on it.

Aside from this “hidden” human rights crisis, the stereotypes, stigmas, and discrimination associated with “unsatisfactory” mental health have led to dire consequences in our society. Many people fear that if they are diagnosed with a mental condition, their insurance will become more costly, they will be stripped of basic medical needs, or they will be viewed by neighbors and friends with newfound contempt. The unfortunate truth is that they probably will. As a result, they stop seeking the help they need. The National Alliance on Mental Illness, or NAMI, reports that only 39% of those with serious mental illnesses in the United States sought and received actual mental health services in 2010. According to NAMI, a sizeable portion of those who required the most support (those with an illness that affected their ability to function normally day to day) still did not receive adequate care, due to high costs or a strong refusal to seek help.

What’s more, the number of people who reject the idea of getting help has been rising in recent years; at the same time, advances in medicine have made it possible to substitute a daily prescription for therapy. As this method is more discreet, the use of medication to combat mental disorders has increased drastically while the use of therapy has declined to only 7%. Unfortunately, these prescriptions are written and handed out fairly easily and quickly in today’s fast-paced culture. As a result, there has been a dangerous rise in the number of people who abuse these prescription drugs.

Furthermore, untreated mental illness can often lead to suicide. Suicide is among the leading causes of death in the U.S., and suicide rates have been rising steadily since the year 2000. According to NAMI, over 90 percent of people who commit suicide have been diagnosed with a mental illness. What’s scary is that more than one in three of these victims were intoxicated when they died, usually by their own medicine or other prescription drugs bought illegally. Out of masculine pride and adherence to the societal norms set for them by previous generations, men generally reject help more often than women and so tend to go untreated. Consequently, men are also four times more likely than women to commit suicide.

Perhaps if these people had been getting the proper treatment and care they deserved, they would not have resorted to such drastic measures. These are very real risks with very real solutions. We need to take a step back and realize that helping these people, rather than discriminating against them and viewing them with disdain, will yield more benefits and well-being for all of society, as well as help solve pressing issues such as suicide and prescription abuse.

 

 


Academic and Occupational Discrepancy

When it comes to the issue of women and their place in the workforce, I am extremely unqualified to express an opinion. First and foremost, I am not a woman. Therefore, I have never experienced the cruel double standard that so often holds women back in both salary and position; I have never experienced the stress of juggling the dual responsibilities of raising a child and building a career; and I have never experienced the arbitrary marginalization or ridicule that many professional women face each and every day at the office. But my incompetence at addressing this phenomenon doesn’t end there: I am not just a man, but an unemployed man, one who has never even entered the corporate sphere where this discrimination takes place. All of my awareness of the problem is secondhand, stemming from indifferent fifteen-second news clips as well as desperate, impassioned articles like Sarah Kendzior’s “Mothers are not ‘opting out’ – they are out of options.” If she weren’t advocating as loudly as she is, I would never have known that, “for nearly all women, from upper middle-class to poor, the ‘choice’ of whether to work is not a choice, but an economic bargain struck out of fear and necessity.” And if it weren’t for some further research required to write this essay, I would never have known that, on average, in 2012, year-round female workers made about $12,000 less than year-round male workers, or that this figure is a huge improvement over those of the past (catalyst.org).

The reason I make such a point of these disparate corporate abuses of women is that this reality is completely incongruous with the only relevant arena in which I have had the opportunity to observe their competence: school. Through shrewd personal observation, in both high school and college, I have watched intelligent, motivated female students exhibit a slew of scholarly attributes and skills that their intellectually equivalent male peers simply do not utilize. While one could justifiably accuse me of overgeneralizing, I stand by this statement, because for years and years I have watched smart girls attend more classes, pay more attention and take better notes during those classes, put more effort into their homework, studying, and other assignments, and generally function as better students than their male counterparts.

Researchers have even conducted several studies whose conclusions support these claims, including one “Evaluation of Comparative Academic Performance of Undergraduate Students at University Level,” which found that less than 50 percent of male students perform at a GPA above 3.00, compared to almost 60 percent of female students (Journal of Animal & Plant Sciences). And Christopher Cornwell, an academic and researcher at UGA’s Terry College of Business, prefaces another recent, applicable study with a claim similar to my own: “You can think of ‘approaches to learning’ as a rough measure of what a child’s attitude toward school is: it includes six items that rate the child’s attentiveness, task persistence, eagerness to learn, learning independence, flexibility, and organization. I think that anybody who’s a parent of boys and girls can tell you that girls are more of all of that.”

The more I read and learn about the imbalance between women’s achievements during their formative academic years and the lack of respect they receive in the white-collar world, the more I realize that something is being lost in translation. One of the main purposes of school is to prepare the student to land a good job. So the fact that so many women are consistently killing it in the classroom, yet still getting murdered in the job hunt and on the pay scale, shows just how flawed our society is. We have set ourselves up to fail by ignoring the results of the public education system that we commissioned to create the most qualified candidates. We have set ourselves up to fail by ignoring the full value that women can bring to a company. We have set ourselves up to fail by neglecting to pay them appropriately in the instances where we do utilize their skills.

With the way things are now, I can envision two possibilities for the future. The first is a bright one, in which employers cease to discriminate against women as a result of the natural forces of the market. In this future, employers who want their companies to succeed will hire the best candidate for the job, regardless of gender; and in this future, the best candidate for the job will increasingly turn out to be female. But the second possibility is much bleaker. In that future, we continue along the path we have been on for decades, and we suffer the exact fate for which we have been preparing ourselves all along. We have set ourselves up to fail, and that’s exactly what we do: we fail miserably, we fail divided, and we deserve every bit of our failure.

 

Works Cited
Kendzior, Sarah. “Mothers Are Not ‘Opting Out’ – They Are Out of Options.” Opinion. N.p., n.d. Web. 16 Feb. 2014.
Khan, B.B., R. Nawaz, K.M. Chaudhry, A.U. Hyder, and T.M. Butt. “Evaluation of Comparative Academic Performance of Undergraduate Students at University Level.” The Journal of Animal & Plant Sciences (2012): 798-801. Web. <http://www.thejaps.org.pk/docs/v-22-3/47.pdf>.
“Knowledge Center.” Catalyst. N.p., n.d. Web. 13 Feb. 2014.
“UGA Today.” New UGA Research Helps Explain Why Girls Do Better in School. N.p., n.d. Web. 16 Feb. 2014.

 


The Ongoing Women’s Rights Movement

For centuries, men and women in the United States have had differing roles both in the home and in the workplace: while men have been stereotyped as being in charge and earning the family’s money, women have been stereotyped as staying at home, cooking, and taking care of the kids. These traditional prejudgments have often led to inequality in opportunity and treatment, usually not in favor of women. Fortunately, decades of efforts to dissolve the barriers between men and women have resulted in much fairer chances for success and opportunity on both sides. However, disparity between the two sexes is still present in modern society, as discrimination continues to flourish.

The women’s rights movement began in 1848, when the first convention was held in Seneca Falls, New York. During this convention, 68 women and 32 men signed the Declaration of Sentiments, which outlined grievances against the inequality between men’s and women’s rights and demanded equal opportunity for women in education, employment, and voting [1]. This set the stage for more conventions, organizations, debates, and protests; two years later, the first National Women’s Rights Convention, in Worcester, Massachusetts, drew more than 1,000 participants, and these national conventions were held yearly until 1860. In 1869, Susan B. Anthony and Elizabeth Cady Stanton, two of the most important contributors to the women’s rights movement, founded the National Woman Suffrage Association, urging the passage of a constitutional amendment for women’s voting rights [2]. Later that same year, Wyoming became the first territory to give voting rights to women; Colorado was the first state to do so, with the rest of the states following thereafter.

As women eventually attained the right to vote and to serve as judges, they gained more power to further the women’s rights movement in other aspects of society, such as the workplace and schools. In 1903, the National Women’s Trade Union League was founded to advocate higher wages and better working conditions for women. In 1963, Congress passed the Equal Pay Act, which prohibited employers from paying a woman less than a man for the same job. In the following year, the famous Civil Rights Act of 1964 made race- and sex-based discrimination in employment illegal [3], and in 1972, the Education Amendments prohibited sex-based discrimination in school enrollment and educational activities, resulting in a dramatic increase in women’s participation in professional education and athletic programs. Additionally, the Supreme Court has ruled in favor of women’s rights in numerous cases throughout the movement.

Over the course of more than a century, women have fought to have the same rights as men. And it’s true that they have come a long way: women can now vote, receive an education, work in safe conditions, and earn higher wages. However, even today disparities in the rights of men and women can be seen, especially in the workforce. In 2010, the median income for men was $42,800, while the median for women was $34,700. Furthermore, just two years ago the gender wage gap widened again: in 2011, women earned 82.2% of what men earned, but in 2012 this figure decreased to 80.9% [5]. Studies have shown that a root cause of this inequality is that, despite decades of effort toward equality between the sexes, discriminatory stereotypes still influence the workplace; the belief that men are naturally more skilled than women at certain tasks still leads some to think that certain jobs are meant for men, while other, usually lower-paying, occupations are meant for women. These gender stereotypes remain prevalent in society today, as further indicated by the fact that the Supreme Court continues to hear cases involving discriminatory actions against women; in 2011, when Betty Dukes sued Wal-Mart for sex discrimination in pay and promotion policies, the Supreme Court declared that the plaintiffs “did not have enough in common” to pursue a class action, ruling in favor of Wal-Mart [6].
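
As a quick consistency check, the 2010 median incomes cited above imply a ratio right in line with the cited percentages; a minimal Python sketch, using only those figures, shows it:

```python
# Median year-round incomes cited above (2010 figures).
median_men = 42_800
median_women = 34_700

# The wage gap expressed as women's earnings per dollar of men's earnings.
ratio = median_women / median_men
print(f"Women's median earnings were {ratio:.1%} of men's")
# prints: Women's median earnings were 81.1% of men's,
# in line with the 82.2% (2011) and 80.9% (2012) figures cited.
```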

Through countless debates, protests, and court cases over the decades, women’s roles in society have expanded from the household out to the workplace and schools. While women in the early 1800s could not vote or attend schools and were expected to stay at home, women in the modern age can vote, obtain higher education, and apply for jobs in their field of interest. With such progress toward equality between men and women, it might seem as though there is not much left to fight for. However, discriminatory social values are still clearly present, as shown by the current gender wage gap and differences in work and promotion conditions. With this discrimination still prevalent, the women’s rights movement continues to move forward in today’s society.

[1] http://history.house.gov/Exhibitions-and-Publications/WIC/Historical-Essays/No-Lady/Womens-Rights/

[2] http://www.brynmawr.edu/library/exhibits/suffrage/nawsa.html

[3] http://www.infoplease.com/spot/womenstimeline2.html

[4] http://www.infoplease.com/spot/womenstimeline3.html

[5] http://www.huffingtonpost.com/2013/03/07/gender-wage-gap-2012_n_2830173.html

[6] http://www.huffingtonpost.com/2011/06/20/walmart-too-big-to-sue_n_880930.html


Girls Do Not Have Cooties

Diet Coke is for women. Coke Zero is for men. At least, that’s Coca-Cola’s strategy. Released in the 1980s, Diet Coke was Coca-Cola’s sugar-free, low-calorie alternative to its original drink. It’s marketed (in the US) as a diet drink, packaged in a standout white and silver can. However, since its release, Diet Coke’s consumers have largely been female. In 2005, Coca-Cola sought to change this male avoidance of its diet soda by releasing Coke Zero, a low-calorie version claiming to mimic the taste of original Coke. It’s packaged in a black can, often featuring sports or other manly icons. Though the two drinks differ only by 0.3 calories and an additional artificial sweetener in one, Coke Zero proved much more popular among men.

This Diet Coke/Coke Zero gender disparity was fundamentally shaped by gender portrayals in marketing. Men were opposed to Diet Coke because it was perceived to be feminine: the word “diet” and the female-targeted promotional strategies and advertisements made it seem as if Diet Coke was created for women, and men refused to be seen drinking it. According to Jill J. Avery, a senior lecturer at Harvard Business School and former brand manager for Gillette, Braun, Samuel Adams, and AT&T, Coca-Cola realized that despite the practicality of men drinking low-calorie or diet sodas, it “couldn’t bridge the gap image-wise without a new brand and product just for them.” By creating Coke Zero, Coca-Cola was giving men a drink of their own, untainted by female affiliation and feminization, thus avoiding gender contamination.

Gender contamination is a marketing term describing what happens “when one gender is using a brand as a symbol of their masculinity or femininity, and the incursion of the other gender into the brand threatens that.” Although its main usage now lies in branding and public relations, gender contamination is part of a much larger picture. It is a remnant of ancient history and culture that continues to affect our contemporary society.

Gender contamination has long been feared, and men have feared it far more than women. The sources of these long-established fears are reflected in the beliefs of several ancient cultures. In some, it was believed that only men could touch certain magical totems or talismans; if a woman touched these objects, they would lose their powers. In others, female speech and the female gaze were themselves considered forms of gender contamination, inherently polluting and as dangerous and disruptive to men as death. In such male-centered societies, it becomes much less socially acceptable for men to show feminine traits than for women to show masculine traits. Across many patriarchal, androcentric cultures, there has been a historic fear of gender contamination, especially among men.

Despite these deeply rooted fears and hesitations, however, history has shown that female incursion is not always a negative. Time and again, women have broken down barriers, entered domains previously closed to them, forever changed the way things were done, and left their mark on society. Take, for example, the women of the US during the World War II era. America entered the war almost overnight and suddenly faced a rapidly growing need for supplies and machinery, with little time to produce them. At first, many companies underestimated the labor shortage, and women were left out of the workplace even as facilities were expanded, production ramped up, and companies signed large, lucrative wartime contracts with the government. Eventually women were hired to take the place of the men leaving factories and industrial plants to go to war. Without the admission of women into the workplace, previously regarded as a primarily male domain, the war effort would have been severely harmed; at-home support and wartime production would have been dramatically lower. This opened the door for all women, not just minority and lower-class women, to continue working outside the home. In this case, female incursion not only helped the situation at the time, it paved the way for social change in the future.

In the end, even if women stick to Diet Coke and men commit completely to Coke Zero, there is still something to be said about gender contamination. Your drink of choice may not change the world, but so many possibilities can arise from letting the opposite gender into a new domain. Despite the time and space these fears span, girls do not have cooties, and giving women opportunity and a little bit of faith can forever revolutionize the way we function as a society.

 

Sources:

http://hbswk.hbs.edu/item/7149.html

http://moses.creighton.edu/jrs/2009/2009-7.pdf

trey-sullivan.com/images/CocaCola.docx

http://www.princeton.edu/~achaney/tmve/wiki100k/docs/Diet_Coke.html

http://www.huffingtonpost.com/2012/01/11/diet-coke-vs-coca-cola-zero_n_1199008.html

http://www.ajc.com/news/business/coke-zero-becomes-a-hero-for-coca-cola-co/nQkHh/

https://www.inkling.com/read/marketing-dhruv-grewal-michael-levy-3rd/chapter-8/coca-cola

http://www.nps.gov/pwro/collection/website/rosie.htm

 


Choice and Compromise

Nowadays, with more women holding full-time jobs outside the home and a narrower gap between women’s and men’s earnings than before, society has raised plenty of concerns about work-family conflict.  According to John H. Johnson, who examined data from the Survey of Income and Program Participation, adding ten hours to the average husband’s work week raised the probability of divorce by a small percentage point, while adding ten hours to the average wife’s work week doubled that number.  He also found that higher income for men reduced the probability of divorce, whereas higher income for women increased it.  All of this data points to the struggle many women face in balancing ambitious career plans with their traditional roles as wives and mothers.  Although both husband and wife must weigh additional factors, such as family dynamics, career scenarios, and lifespan, when they pursue their career goals, the situation is far more disadvantageous for employed women, given their family obligations and limited occupational opportunities.

I realize it may be a little impractical and unpersuasive for a college sophomore like me to comment on the relationship between marriage, family, and career, but let me introduce the example of my mother.  In my childhood, my mother played a traditional role in my family; her occupation was the so-called “housewife.”  As I grew up and became more independent, her workplace suddenly shifted from home to the business world.  As a compromise to tighter and tighter family finances, she had to step out of the home and run a business together with my father.  With no experience in accounting or management, it was still her responsibility to adapt quickly to the role of household financial provider.  Struggling to be a good mother and a qualified businesswoman at the same time, my mom took on a large share of the housework and child care while spending ten hours a day at the workplace.  Her hectic, overwhelming lifestyle continued until I went to college.  Most women make choices and compromises throughout their careers to strike a balance between their own desires and those of their families, but sometimes the goal becomes too unrealistic to accomplish.

The main reason more and more women enter the job market and compete with men for leadership positions is education.  According to the U.S. Department of Education, in the 2005-06 school year, women made up 57.5 percent of all students earning bachelor’s degrees, nearly 60 percent of students earning master’s degrees, and 48.9 percent of students earning Ph.D.s.  As a result of this educational advancement, women are earning more money and controlling more of the nation’s wealth than ever before.  The major problem, however, is that although the definition of women in the workplace has changed dramatically, the definition of women in the family has stayed static for centuries.  It is still considered their primary responsibility to take on a large share of the housework and raise the children.  The situation has become so extreme that for some women, once they have made the choice between work and family, there is no middle ground for compromise; they simply have to give up the other, or scramble to run two lives – one at home and one at work – until the pressure squeezes out all of their energy.

Even though many women sacrifice their jobs for their families or end up in tragic divorces, some women do find a workable balance.  Leslie Crossman is a clinical psychologist and a mother of three.  Her initial plan was to work in private practice, but she took a job with a school district so that she would have summers free with her children.  However, not all women are able to find a flexible job, given limited opportunities, that allows them to spend ample time with their children.  Anne-Marie Slaughter, a well-known Princeton professor and former top aide to Hillary Clinton, argued that the best solution to the conflict between work and family for women is to “close the leadership gap: to elect a woman president and 50 women senators; to ensure that women are equally represented in the ranks of corporate executives and judicial leaders. Only when women wield power in sufficient numbers will we create a society that genuinely works for all women.”  However, even such an intelligent woman fails to give a clear answer as to how women can occupy the top positions in these fields without sacrificing home or family.


The Alarming Stress of Young People

 

The harm of stress is often thought to be limited to the old and the infirm: a high-earning businessman has an early heart attack because of his job, or the stress of losing a loved one sends a senior citizen into illness. Yet the stress behind this harm can begin many years earlier, in youth. Unfortunately, high stress in young people is exactly what has been discovered. Workplaces have made young people highly stressed, and this is both a short- and long-term risk to their health.

Changes in the character of the workplace forecast worrying trends for the places where young people will forge their professional lives. These trends have been extensively reported on and, in many cases, relate to changes ushered in by the internet and other digital technology. For example, the now-commonplace smartphone has enabled email on the go, and services such as Google Docs allow collaborative drafting without meeting in person.  In short, working is now more convenient, and this trend is unlikely to stop.  Unfortunately, convenience is part of the problem. Because employees have gained the ability to communicate at nearly any time of day, their managers expect them to do so, and this results in more hours worked outside the office. This type of work has destabilizing effects: more work outside the office impedes the ability of young people to regulate their schedules and devote time to personal pursuits and stress reduction. Another factor: the amount of part-time work in the American economy is rising, and part-time work has risen in prevalence especially among young Americans. Almost a fifth of all part-time work is done by people between 20 and 24 years old. Unfortunately, part-time work comes with its own set of difficulties. Workers earn less, and overcoming this deficit by working multiple jobs means dealing with multiple sets of requirements and a difficult schedule.

Statistics show that young people in particular are highly stressed, and much of this stress relates to their work arrangements. The American Psychological Association’s annual Stress in America survey gathers nationwide data on perceptions of stress and stress-related symptoms. The 2013 survey breaks respondents into four age groups: millennials, gen Xers, baby boomers, and matures. The youngest of these groups, millennials (18- to 33-year-olds), shows the most concerning responses in a number of categories. Millennials are tied for the highest level of reported stress, and 39 percent report that their stress has risen in the last year, the highest of the age groups. Only 29 percent of millennials report doing an excellent or very good job of managing stress, the lowest of the four groups. Millennials also show the highest percentage of individuals reporting work as a “somewhat or significant stressor,” at 76 percent, 11 points higher than the next highest group. This could suggest one of two things: either the workplaces in which millennials work are stressful by nature, or millennials are not equipped to deal with workplace stress. Regardless of the reason, stress in these young people is painful and can cause mental and physical health to degenerate.

One group of young people is particularly vulnerable to rising stress in the coming years: single mothers. The percentage of American mothers raising a child while unmarried rose 80 percent from 1980 to 2007, and 62 percent of women between 20 and 24 years old who gave birth in 2011 were unmarried. Single mothers in the United States have more difficulty than those in other countries: over 85 percent of these mothers report “severe time shortage,” a higher percentage than in any other country in the developed world. There is simply too much to do, and in many cases work must take priority over spending time with children because there is no other source of income. These women may turn to marriage to relieve financial pressure, but this comes with a loss of control over the specific conditions of marriage; as a recent Atlantic article puts it, “wanting a certain lifestyle, or fighting against societal pressures to marry, are both questions of privilege.” If the trend of increasing single motherhood continues, the young women of today will increasingly be subjected (or are already being subjected) to highly stressful situations.

Stress is not something to be dismissed; it causes real harm to its victims and disproportionately harms those already suffering from other difficulties. Stress is correlated with increased morbidity and death from heart disease, diabetes, common colds, immune deficiency syndromes, and more. One might think the work stress these young people face is a transient problem, overcome as they settle into long-term jobs. But even transient stress can have long-term impacts. Stress causes illness through degeneration, so the damage it does may only show up as illness many years later. Stress is also correlated with the recurrence of stress, which multiplies its harm over time. As Stanford biologist Robert Sapolsky writes, “early-life stress…with every passing bit of aging, gets harder and harder to reverse.” Stress also exacerbates the harm that poverty does to the individual: socioeconomic status early in life has been shown to correlate with degenerative disease, regardless of present income. And if certain workplace trends continue, then higher levels of workplace stress might be here to stay. We should be concerned: anything that threatens the health and equality of society deserves attention. Workers should demand measures to reduce the stress their employment causes.

 


You Don’t Have To Choose

My mom is one of the smartest people I know. No, she didn’t find the cure for cancer or design a rocket that shoots humans into space. She didn’t invent the world’s fastest computer or write a classic American novel. But she did, and continues to, raise two (fairly well-adjusted?) kids while working a full-time job and handling my father, who, like a typical man, can barely boil water or load a washing machine. A lot of stories in the news say women have no choice: that they are ‘forced’ into choosing to stay home with their young children. Well, from my experience, they aren’t.

My mom is great, but she doesn’t have superpowers. She doesn’t cook…ever. Well, there was that one time with the taco fiasco, but I won’t go into that now. Sometimes she was late picking me up from soccer practice or school, pulling up sharply, her hair frazzled from a long day at work, with an out-of-breath “sorry, guys.” You get used to it. She didn’t come to every soccer game or bake for every fundraiser. But she understood the important stuff. After every school musical, even though I never had the lead, I could always count on a huge hug and a beautiful bouquet of flowers. She stayed up all night with me my junior year of high school, helping me write the Andrew Jackson term paper that I artfully put off until the last second. Old habits die hard, I guess. When I felt sick as a kid, she would read Greek myths to me until I fell asleep.

What I’m trying to say is that I never felt neglected. Sure, sometimes it got frustrating when Katie’s mom was always first in the carpool line, smiling her megawatt smile under silky, coiffed blonde hair. But with a little patience, my mom would eventually pull up, sometimes even with snacks from the bakery in her office. Sometimes it was annoying not to see her in the stands cheering me on, but she would always ask a million questions about the game when I got home. The point is, she always found a way to show me how much she really cared. She didn’t always have to be there.

I remember the first time I visited my mom’s office. We had to weave through DC rush-hour traffic just to reach a building that looked like every other building on the block. Grey. We walked through halls lit with fluorescent lights, printers and fax machines and copiers whirring incessantly all around. I honestly didn’t understand why anyone would want to work there. But then I realized this was her place. She had to share our house with a dog that never seemed to stop shedding, a cat whose only hobby was clawing at furniture, a husband who thought oatmeal was an acceptable dinner, and two daughters who accidentally (I promise!) got makeup on the walls and left clothes literally everywhere. At her office, she was in charge of her space. The only finger paint on the walls was hung there on purpose. It’s like she lived in two worlds, one ruled by chaos and school science projects and the other ruled by cases and laws.

I think that is the key to the freedom of choice. A lot of women try to merge their worlds, letting work blend into home. Then, even when they are with their families, they feel the weight of work on their minds. That’s why they feel they have to choose: they can’t separate the experiences, and when the home world starts to close in, they have nowhere to escape to. Somehow, my mom figured it out. She’s not superhuman. I promise you she sleeps (and snores really loudly). She doesn’t have Hermione’s time-turner. She just managed, luckily, to find that perfect balance. She didn’t come to every game, she was rarely on time, and she cooked a homemade meal maybe once a month. But when she did these things, she was fully present. She skipped nimbly between her worlds, choosing to occupy both.

 
