Pop Culture Wasteland

Preston Evans

ENGL 120W

03/16/14

 

Pop Culture Wasteland:

When MTV Stopped Being Music Television

If you grew up watching re-runs of Saved by the Bell or laughing at the mishaps and shenanigans of awkward teenager Cory Matthews from Boy Meets World, chances are you also watched your fair share of Music Television, or MTV. In my self-conscious and undoubtedly graceless youth, I remember waking up in the morning (after hitting the snooze button a few times) and turning on the television to find music videos. As my only other exposure to new music came from long car rides with Mom, the music videos featured on MTV helped me discover new artists and develop my own taste in music (not that I didn't love listening to Eric Clapton's Unplugged on repeat, Mom). Unfortunately, it seems MTV and its producers have completely phased out the music video in recent broadcasting. As CNN critic Jarrett Bellini writes, the MTV network has essentially become "a giant bouncy castle for America's drunk, tanned, and pregnant." So why has MTV decided to cast away from its roots? Why does Music Television provide the consumer with drunk, overly dramatic Italian-Americans instead of music from the latest emerging artist?

The answer is simple: MTV is a business. Former MTV executive Peter Hoare stated in November of last year that MTV "won't play music videos because [the audience] won't watch them." Exactly. Perhaps YouTube and other video archival and distribution outlets have thwarted the efficacy of MTV's former programming. Instead of waiting around for a music video we want to see on TV, we can simply type the name into YouTube or Google and have the video, lyrics, band biography, and countless other informational tidbits about the song at our fingertips. This accessibility and our developing consumer culture (where, obviously, we want everything as quickly and conveniently as possible) have pushed the music away from Music Television. We, the consumers, are the reason that MTV has stopped having bands like Nirvana and Sir Mix-a-Lot (don't act like you don't remember all the words to that glorious one-hit wonder—you know you do) appear in regular programming and has instead spent the past twenty-two years showing multiple groups of sexually crazed, ill-tempered twenty-somethings living together (usually in some sort of beach paradise)…

Yep, how old does that make you feel? MTV's The Real World began in 1992, only 11 years after the channel's founding in 1981. A quick search for a list of programs broadcast on MTV (past and present) practically made my head spin when I saw how much the programming had changed since the '90s. In the late '90s and early 2000s MTV aired 63 music series (along with various other news, reality, comedy, drama, and talk show series); today, however, MTV airs only six. That's right, six. Of these six, only two have stood the test of time: MTV Unplugged has been running since 1989, and Making the Video first appeared in 1999. On the other hand, MTV also currently airs eight reality shows. What's more, seven of these eight shows premiered in 2009 or later. Maybe they should just rename the station RTV? After all, there are more reality shows than music shows on Music Television. It just doesn't add up.

I suppose I should cut MTV some slack. After all, the station is a multi-billion dollar company that is consistently found on Forbes' "World's Most Valuable Brands" list. As of November 2013, the MTV brand value was listed at $5.6 billion. I assume it's safe to say that MTV is a station that knows what it's doing, even if it does sacrifice its traditions and values to turn a profit (but what big business doesn't these days?). In his weekly column "Apparently This Matters," Bellini also states that "if there was money in music videos, that's what [MTV] would show." Unfortunately, it seems the real money comes from reality television: shows where we watch drunken and frighteningly immature "adults" brawl over a girl (who doesn't even seem that attractive in the first place), or a newer series called "Friendzone" that I haven't even bothered to look up, nor have any desire to, as the name alone is enough to send me running. Maybe I'll even run all the way over to VH1, MTV's sister station, so I can watch a different, though equally trashy, group of people competing for a washed-up country or hip-hop artist's love and affection.

In MTV's defense, the station broadcast 12 hours of music videos last Independence Day in an attempt to nostalgia-trip some of its lost (and older) audience back into the vortex that is MTV. Sadly, this daylong ritual did almost nothing to convince me that MTV is still truly Music Television, especially considering that the next show to come on portrayed a screaming 16-year-old going through extreme labor pains. Yikes. So MTV should no longer be called "Music" television. And maybe that's our fault.


What is Wrong with the Doctor’s Office?

Dan Petrovitch

Sunday, February 23rd, 2014

Essay #5: Medicine and Medical Ethics

I always feel uncomfortable while paying a visit to the doctor's office. I never end up asking most of the medical questions that I originally intend to, and I rarely express any of the concerns or anxieties that sometimes crop up throughout the year, during the indefinite intervals that separate these half-hearted check-ups. Here's another confession: I am never completely honest with my doc. For example, I tell her the truth about my diet, my exercise habits, and my sleeping patterns, but I lie to her about my sex life and my consumption of any mind-altering substances, such as caffeine, tobacco, or alcohol. Yet I am fully aware of the modern Hippocratic Oath and the confidentiality it promises, and I am supremely confident in the miraculous healing powers of properly applied medicine.

So, to ask the obvious and natural question: where does the problem lie? Why don't I just trust my doctor, so that I can learn as much as possible and make the most of the best resource available on my unique quest toward superior well-being? These questions arise from the inherent paradoxes of our current medical/pharmaceutical complex. Medicine is a unique, contentious, and incredibly influential field because it shakily straddles the gap between the purely personal and the purely professional spheres.

In a sense, one's health is the most intimate and private aspect of his or her life. The way a person's parts function and interact is often messy, embarrassing, painful, or tragic; poor health can be the limiting factor of someone's dreams, or of their career, or of their physical, social, reproductive, and sexual experience as a human being. However, despite the stakes, we rarely take responsibility for our own health, instead choosing to blindly trust an overpaid legion of allegedly qualified professionals and their arsenal of verifiably greedy corporations to look after it. We allow the doctors and the pill companies to determine our fates by actively muting our own voices in the debate over our own health; we forfeit our own votes on some of the most important personal questions of our lives. Why?

Just because our health care system offers the required privacy doesn’t mean that it offers the equally necessary libertarian comfort of personal and discrete choice. The unfortunate state of modern medicine is just as Juliann Garey of The New York Times presents it: “Perhaps the most notable of these efforts–and so far the only one of its kind–is the narrative medicine program at [the C.U.M.C.], which starts with the premise that there is a disconnect between health care and patients and that health care workers need to start listening to what their patients are telling them, and not just looking at what’s written on their charts.” Although doctors and nurses hold the most qualified opinions about the medically factual, on paper details of a given patient’s situation, only the patient can really hold any valid opinion on the actuality of their experience: the fear, the pain, the symptoms, the side effects of treatment, and the heavy, lonely weight of a grim diagnosis or a cold prognosis.

What we need is a medical community that recognizes and properly respects the consequences of arbitrarily blending this dichotomy of personal versus non-personal, problem versus solution. We desperately need self-aware doctors who understand their roles as foreign entities, paid to carefully cross private and personal boundaries in order to cure their patients. And, even more desperately, we need self-aware doctors who fully comprehend their responsibility as the gatekeepers to the world of prescription drugs. A good doctor is obliged to protect his patients from the myriad evils that undoubtedly reside in such a potent realm: for example, the doctor who over-diagnoses countless unprepared nine- and ten-year-old boys with ADD and ADHD, and then prescribes them unhealthy, excessive daily doses of amphetamines, should lose his job. But the doctor who selfishly refuses to prescribe the correct drugs in the correct situation should likewise lose his job. The philosophical leap that we need our medical community to make is the one that asserts every situation is the correct situation, as long as it involves a rational, mature adult making a highly informed and fundamentally individual decision about which chemicals to introduce into the soup of his or her brain chemistry. As a medical patient, as a paying customer, and as a sovereign human being, you ought to confidently possess the right to this choice. And the degree to which you consider this to be a radical proposition is indicative of the degree to which you treat yourself as both a product of and a slave to the novel, unproven, and unfairly strict socio-medical paradigm of our time.

 

Works Cited
Garey, Juliann. “When Doctors Discriminate.” The New York Times. The New York Times, 10 Aug. 2013. Web. 22 Feb. 2014.
Tyson, Peter. “The Hippocratic Oath Today.” PBS. PBS, 27 Mar. 2001. Web. 21 Feb. 2014.

The Right to Die

“I would not want to live with a tube in my neck and not be able to move a finger. I wouldn’t… that to me is not life.”

Dr. Jack Kevorkian

Euthanasia is the act of ending someone's life in order to prevent suffering in circumstances of incurable illness or injury. Derived from the Greek for "the good death," the word has been stigmatized by those who believe that assisted suicide in any form is murder. Proponents of the practice believe that for terminally ill patients who are mentally capable of making such a decision, it is just. Neither side is wrong. It is neither possible nor worthwhile to attempt to alter someone's belief in the sanctity of life. It is worthwhile, however, to argue that there are cases in which assisted suicide is necessary, cases in which life is nothing more than agony and death is merciful. When a terminally ill individual is so aware of his or her imminent death that he or she welcomes it as an alternative to suffering, that person should retain the right to die painlessly and mercifully.

In June of 1990, Dr. Jack Kevorkian conducted the first of over 130 assisted suicides he would ultimately perform. His patient was suffering from Alzheimer's disease. Over the following eight years, Kevorkian gained notoriety, along with the nickname Dr. Death, as he was continually scrutinized by the public, law enforcement, and judges at all levels of the United States justice system. Dr. Kevorkian believed that the responsibilities of the medical profession included assisting patients with death [1]. He served eight years of a 10-to-25-year prison sentence for second-degree murder and was released in 2007. Perhaps more important than the relief he brought to individual patients, Kevorkian put the issue of euthanasia before the American people and began a conversation that continues today.

On the other side of the conversation are those like New York Times contributor Ross Douthat, who regard Kevorkian as a murderer rather than a revolutionary [2]. Critics like Douthat argue that suicide in any form is morally wrong and that terminally ill patients should not be held to a separate moral code from the rest of us. Others who do not believe in the right to die, as it has become known, cite religious texts as condemnation of euthanasia. While religious freedom is a core American value, so is the principle that no one's liberty should be infringed upon because of someone else's religious beliefs. From a constitutional point of view, commentators like Nadine Strossen, former president of the American Civil Liberties Union, feel that the government determining end-of-life decisions can be interpreted as cruel and unusual punishment in some cases [3]. It is these differences of interpretation and belief that make issues of this nature tremendously difficult to resolve in the public arena, particularly through legislation.

The debate about euthanasia is similar to a number of others facing the American people, contraception and abortion specifically, in which the conflict between individual liberty and religious freedom is central. Legislation on issues such as these is few and far between; however, it is clear in the euthanasia debate, as it is with abortion, that public opinion is increasingly liberal. In fact, according to a 2013 Gallup poll, 70% of Americans believe that in cases of incurable disease, doctors should be able to assist in ending the patient's life by painless means [4]. With this type of public support, it is only reasonable to expect policy makers to react sooner rather than later.

Euthanasia remains illegal in all U.S. states; however, physician aid in dying (PAD), commonly referred to as assisted suicide, is legal in Oregon, Washington, Montana, and Vermont. Yet patients in the other 46 states are suffering as a result of the religious beliefs and lack of compassion of others, a great injustice from any perspective. Those who tout the sanctity of life as an argument against the right to die may allow themselves and their loved ones to suffer; that is their right. But those who wish to seek death as salvation from the hell in which they live are not immoral or wrong, and to do so should be their right.

 

[1] http://www.nytimes.com/2011/06/04/us/04kevorkian.html?pagewanted=all&_r=0

[2] http://www.nytimes.com/2011/06/06/opinion/06douthat.html

[3] http://en.wikipedia.org/wiki/Laws_against_suicide#United_States

[4] http://www.gallup.com/poll/162815/support-euthanasia-hinges-described.aspx

 


How Can It Be a Problem if They Are Prescribed?

Lynette Scavo, one of the women on the TV show Desperate Housewives, had a lot on her plate. She was a working mother of three sons and a daughter. Two of her sons were twins, and both had Attention Deficit Disorder. When she felt overwhelmed, she often turned to their medication to give her a boost of energy to keep going through the day.

When people think of illegal substance abuse, they often think of drugs like marijuana, cocaine, or heroin. What they usually do not think of are prescription drugs; after all, if a drug is prescribed, how can it be abused? Prescription drug abuse is a very serious problem, but one that often goes unnoticed because the substances themselves are legal. According to the Foundation for a Drug-Free World, studies show that teens are more likely to have abused a prescription drug than an illegal drug. The abuse of prescription drugs is an incredibly dangerous and underrated problem.

Abuse of prescribed drugs can be a problem both for the patient the drug is prescribed for and for others who have managed to acquire it. The most abused drugs are painkillers, depressants, and stimulants. Why are people abusing these drugs anyway? Painkillers can produce a high in addition to the relief they provide. Central nervous system depressants can also create a sort of high that leaves the user calm and relaxed. Lastly, stimulants are used not only for the feeling they create but also to help the user be more productive. This in particular is a problem across college campuses, given the workloads of today's college students.

The first element of the problem is the perception surrounding these medications. Most people, especially young adults, think that because the drugs are prescribed, they are okay to take. The problem is that what might be okay and safe for one person is not necessarily safe for another, and could even be fatal. Because these substances are legal, people think it is okay to share them with their friends or even sell them to others. All of these drugs affect the nervous system, and all of them carry a high risk of addiction.

The next element of the problem is that these drugs are abused not only by people who acquire them without a prescription, but also by the people to whom they have actually been prescribed. Because the substances have such a strong effect, they are very easy to become addicted to. Patients then begin to take doses stronger than recommended and to self-medicate. The most famous example of this problem is the death of Michael Jackson: his doctor prescribed him medications that he did not necessarily need, and he eventually overdosed. This example also points to the problem of doctors overprescribing these types of medications. This is especially common with drugs like Adderall and Concerta. Parents believe their child has Attention Deficit Disorder or Attention Deficit Hyperactivity Disorder, and it then becomes relatively easy for the parents and the children to go through the tests and get the drugs prescribed. This affects the child's development, and the child may go on to sell the drug later after realizing he or she does not really need it.

There are other side effects of prescription drug abuse that are very dangerous as well. Stimulants can cause an unusually rapid heartbeat, paranoia, and high fevers. They also alter the mind, and people might do things they would not normally do. Depressants can cause shortness of breath, slurred speech, fatigue, and even seizures. Overdosing on depressants or mixing them with alcohol can easily be fatal. Abuse of painkillers can cause nausea and drowsiness. All of these drugs alter the way one thinks and acts. Because of this, the chances of abusers contracting HIV or another STD are much higher than for those who do not abuse these substances.

This is a growing problem, but how do we address it? It would be impossible to completely eliminate these drugs from use, because there are people who use the substances responsibly and genuinely need them. The best way to fix the problem would be to eliminate the reasons people feel they need to abuse the drugs in the first place. Ultimately, the abuser is in control of the solution and must take it upon himself or herself to stop abusing prescribed drugs.

Works Cited

http://www.healthline.com/health-slideshow/prescription-drug-addictions#2

http://www.drugfreeworld.org/drugfacts/prescription-drugs.html

http://teens.drugabuse.gov/drug-facts/prescription-drugs

 


Rhabdomyosarcoma and Proud

Age 1.

The age I was diagnosed with Rhabdomyosarcoma. In plain terms, Rhabdomyosarcoma is a rare type of soft tissue cancer that often occurs in children, and I had it in my right pelvic area. When my parents found a firm area on my body, they took me to the doctor. Of the 4 stages of cancer, I was diagnosed with stage 2.5, and the cancer was nearing the stage of metastasis, when it would start spreading through my body. I was 18 months old, and I could have died from a disease that I could not even pronounce. As a baby, I had no idea what was happening to me and had to rely on my parents, who also had no idea what was going on because they had just immigrated to the United States from China. I began receiving cancer treatment, which consisted of chemotherapy, radiation therapy, and surgery. I lost much of my weight and all of my hair. I couldn't eat normally; instead, doctors inserted a feeding tube into my chest. There were nights when my white blood cell count dropped almost all the way to zero. The doctors performed procedures so risky that there was a huge chance I would never be able to use the bathroom normally or walk around by myself. But finally, after 10 months of exhausting treatments, confusing medical jargon, and fear of getting worse, I was declared cancer-free.

Age 11.

The age I realized I was both the same and different. On the outside, I seemed like a normal girl. On the inside, I felt different from everyone else because of the physical and emotional aftereffects of cancer. I thought that I was alone, because no one seemed to understand my life or what I was dealing with physically. I gave up trying to tell my friends because I figured they would not understand that their harmless Mac n' Cheese was my creamy plate of disaster. I gave my friends excuses instead of telling the truth about my physical condition. I told them I was lactose intolerant instead of telling them that my body could not handle digesting certain foods. I was tired of constantly reminding my friends of my condition and embarrassed that I was different. My family members were the only people who knew the details of my life. In 6th grade, I met my English teacher, Mrs. Fuller, who changed my entire perspective on cancer. She was 65, had been diagnosed with cancer, and was ready for whatever came, even if it meant losing her life. Mrs. Fuller was the first person outside my family I opened up to about my cancer story. I told her about my anger, embarrassment, and sadness. She showed me her strength and positivity. She helped me look at cancer differently; instead of seeing it as a limitation on my life, I could be proud that I had overcome such a deadly disease. She helped me realize that the things I was ashamed of were also the things that proved my strength, and that I should be proud to share my story. Mrs. Fuller became my mentor and role model. We talked about her fight with cancer, and she inspired me to become stronger and more optimistic. A year later, she sadly lost her battle with cancer and passed away before 8th grade started. When I found out, I cried harder than I ever had in my entire life. The mentor who had helped me through some of my toughest challenges was gone forever. I decided that in her honor, I would follow her advice and be a proud survivor of cancer. I had promised her that I would never again be ashamed of winning my battle, and that I would live my life helping others find their strengths.

Age 18.

The age I reflect and look forward. When I look back on my journey, I think how foolish I was to feel ashamed of cancer. Whenever I meet people now, I tell them the whole story of my cancer battle, because I am proud of being a survivor and unembarrassed about being different. Sure, I still cannot eat Mac n' Cheese like everyone else, but I no longer say that I am lactose intolerant; instead, I say that I am a cancer survivor with certain aftereffects from my medical treatments. I try not to talk about it too much, as I don't like to make it the focus of my life, but I also don't try to ignore it anymore. Cancer is a significant part of my development. This past year, I was diagnosed with lymphedema, a rare condition that causes swelling due to fluid retention when the lymphatic system is damaged. It can occur in people whose lymphatic systems have been harmed permanently, including by cancer treatments. My right leg swells up because lymph gets trapped there and my lymphatic system cannot cycle it through my body like a normal person's. My life and activity are compromised today, but I remain strong and proud of having beaten cancer and of now fighting lymphedema. I remain positive about my life and thankful that I am blessed to live such a great one. It is as Mrs. Fuller used to quote from Joshua J. Marine: "Challenges are what make life interesting and overcoming them is what makes life meaningful."


Medications Reach Too Far

In contemporary high-technology society, when people face physical and psychological issues, they tend to turn to various medications for help in pursuing "normality."  With the passage of legislation that loosens restrictions on medical marijuana in many states, controversies regarding the widespread practice of prescribing medicine have raised many people's concerns.  Have we gone too far with "normalization" technologies?

Take, for example, the development of medicine to treat short stature.  Statistically, taller men are more socially popular than shorter men; they are more likely to be hired or to be elected as leaders.  Because of our culture's bias against short stature, it seems beneficial for short kids to grow taller with the help of medication.  As a result, some pharmaceutical companies and doctors advocate more research on human growth hormone injections for very short children.  According to those experts, besides physical and psychological advantages, the treatment of short stature can also prevent disorders like back problems.  However, do growth hormone treatments really bring a healthier life to short kids?  A study of "very short girls" revealed that the hormone trials did leave the treated girls, on average, three inches taller than the control group, but failed to show any psychosocial benefits for those girls.  In fact, the treatments backfired.  According to the psychological follow-up, many girls who received growth hormone reported feeling "abnormal" and different from the other children around them.  One girl even underwent a significant personality change.  Many parents began to wonder whether a normal physical appearance outweighs the potential sacrifice of psychological well-being.

Perhaps the most controversial medications for teenagers are those prescribed to treat ADHD.  Last summer, when I worked as a medical assistant in a pediatric neurology office, I saw an average of four to five children a day visiting the doctor with their parents to discuss symptoms of ADHD.  Those kids were often labeled "behavioral problems" by their teachers.  Teachers in elementary school usually have a checklist: based on my observation, children showing distractibility, poor academic performance, or an inability to pay attention, follow instructions, or sit still would be seen as potential ADHD patients.  Their parents, urgently informed by the teachers, usually believed their children really suffered from ADHD and brought them to neurologists.  After the doctor prescribed a common ADHD medication, many parents felt a sense of relief, convinced that their children would finally be able to share the same social and academic life as other children.  Moreover, some parents even pushed the doctor to prescribe medication when the doctor did not think their children had ADHD, because if their children did not show some "improvement" in school, the teachers would constantly call them in.

I don't want to question, doubt, or attack the roles that medication, teachers, or parents play in the treatment of ADHD.  I believe that, after years of scientific research into human behavior, doctors and teachers both know more than I do about ADHD symptoms, but I really doubt the necessity of such widespread use of medication.  In fact, there are other ways to treat ADHD besides medication, though there is far less research on these alternatives because stimulants are the fastest therapy with the best results.  "Pills don't teach skills," said Dr. Timothy Bilkey, a Canadian adult psychiatrist in private practice.  Medication can treat ADHD, but it cannot really solve behavioral problems.  Other approaches, such as psychological therapy or cognitive behavioral therapy, may be more crucial than simply taking medication.  For example, a type of talk therapy that redirects people and helps them get out of their own way can be adapted to target ADHD.  Some nutraceuticals (supplements derived from natural foods), such as fish oil, can also have some effect in modifying people's behavior.

There is a story often retold by Sir Ken Robinson.  Gillian Lynne was believed to have a learning disorder when she was eight years old.  Her mother took her to see a doctor.  The doctor asked Gillian to wait in the room for a few minutes while he spoke to her mother outside, turning on the desk radio as they left.  As soon as they closed the door behind them, Gillian was on her feet, dancing and moving around to the music.  The doctor told her mother, "Mrs. Lynne, Gillian isn't sick, she's a dancer.  Take her to dance school."  Gillian Lynne later became the choreographer of Cats and one of England's best ballerinas.  I'm not trying to point out the limitations of our educational system; I just believe that there are many better approaches to educating our children and addressing adults' behavioral problems than medication alone.

 


The Unnecessary Good

Aimee Copeland lost her hands and her right leg to flesh-eating bacteria.[1] It all started when she fell from her zip line into the Little Tallapoosa River and suffered a cut to her leg. The subsequent infection endangered her life and prompted the doctors to amputate the affected regions. With stories like this, it is easy to understand why people become suspicious of the "outside" world. Who in their right mind would want to contract a bacterium that ate away at their skin? Considering that monstrous diseases are abundant everywhere, it follows that parents would want to keep their children protected in safe areas where they can't get hurt. However, this obsession with hygiene and cleanliness may be hurting children in the long run. Although medical drug use and prevention measures are useful, they should be used in moderation to prevent negative consequences.

Prevention measures are prevalent in our society. The notion that children should have minimal contact with germs is prominent in many families. Parents are not excited to see their child playing outside with a handful of dirt entering the child's mouth. However, the microbes that parents are appalled by may be good for us during childhood. Michael Zasloff, an immunologist at Georgetown University Medical Center, stated that our immune systems use these early microbes to learn the correct immune responses to harmful and harmless objects.[2] This general idea is part of the hygiene hypothesis, according to which too much cleanliness actually makes us sicker. A lack of the early exposure necessary to develop the immune system could result in a hypersensitive immune system that overreacts to particles like pollen. This creates a link between this lack of exposure and allergies. Along with allergies, asthma may also have strong roots in early exposure (or the lack of it) to microbes. Kathleen Barnes, an immunogeneticist at Johns Hopkins University, believes that the environment heavily influences the prevalence of asthma because the rate of asthma has increased in the last 50 years. This implies that genetics is not the only factor at play. For instance, a study observed farm and non-farm children and found that the children who lived on farms had lower rates of asthma, hay fever, and eczema.[3] Since these children interacted with livestock and drank raw milk instead of microbe-free pasteurized milk, they had a chance to introduce foreign microbes to their immune systems. The emphasis on cleanliness in modern society restricts the development of our early immune system and thus may be harming us in the long run.

Another concern for the long run is antibiotic resistance through excessive medical treatment. Antibiotic resistance refers to bacteria's ability to withstand antibiotic treatment, so that the bacteria survive after being subjected to the drugs. Multidrug-resistant bacteria, or "superbugs," pose a serious problem because they are incredibly difficult to kill. If a considerable number of bacterial species become multidrug resistant, then human society may be in big trouble. Without an easy cure, these bacteria may become the next disease, like HIV, to ravage populations. At least 23,000 people already die every year due to antibiotic-resistant bacteria.[4] This number will most likely increase as the use of antibiotics goes up and the bacteria adapt to them. The main issue is that the overuse of antibiotics is concentrated in livestock: around 80 percent of the antibiotics sold in the United States are administered to farm animals.[5] These antibiotics are used in a "nontherapeutic" way, which means that the livestock are not even sick when given the drugs. This situation allows the bacteria to evolve easily and become resistant to the antibiotics being thrown in their direction. With antibiotics being used in excess, many strains of bacteria adapt and become resistant, thereby threatening the future of our health and safety.

To preserve our health, we do want to avoid life-threatening diseases like flesh-eating bacteria. Administering antibiotics to cure a potentially deadly bacterial infection is the correct procedure as well. However, we should exercise caution and avoid using antibiotics in a reckless manner. Flesh-eating bacteria are currently treatable with antibiotics, but terrible things would happen if they became multidrug resistant. This concern is not meaningless fear-mongering. Superbugs already exist and are raising health concerns. In a similar manner, the rise of asthma and allergies in the past 50 years is also not a fantasy. The truth is that we were meant to experience the world and its potential dangers. We should not preemptively try to avoid potential dangers by condemning our children to lives of hermitage. Only when a real risk arises that necessitates action should we apply the solution. Life is about living freely and experiencing the unknown, rather than being restricted by the chains of potential dangers.


[1] Lupkin, Sydney. “What Aimee Copeland Has Gained Since Losing Limbs.” ABC News. http://abcnews.go.com/Health/aimee-copeland-achieved-accident/story?id=19559033 (accessed February 23, 2014).

[2] Telis, Gisela. “Hypercleanliness may be making us sick.” Washington Post. http://www.washingtonpost.com/national/health-science/hypercleanliness-may-be-making-us-sick/2013/03/25/9e6d4764-84e9-11e2-999e-5f8e0410cb9d_story.html (accessed February 23, 2014).

[3] Holbreich, Mark, Jon Genuneit, Juliane Weber, Charlotte Braun-Fahrländer, Marco Waser, and Erika von Mutius. “Amish children living in northern Indiana have a very low prevalence of allergic sensitization.” Journal of Allergy and Clinical Immunology 129, no. 6 (2012): 1671-1673.

 

[4] Falco, Miriam. “CDC sets threat levels for drug-resistant ‘superbugs’.” CNN. http://www.cnn.com/2013/09/16/health/antibiotic-resistant-infections-cdc/ (accessed February 21, 2014).

[5] Mckenna, Maryn. “Update: Farm Animals Get 80 Percent of Antibiotics Sold in U.S..” Wired.com. http://www.wired.com/wiredscience/2010/12/news-update-farm-animals-get-80-of-antibiotics-sold-in-us/ (accessed February 23, 2014).

 


Life or Death

I don't like to think about death, which is interesting, considering the fact that death is the only thing in life that is certain. Nevertheless, death has always seemed just a little bit too permanent for me. I can't escape it though – everyone talks about death once in a while, and sometimes issues surrounding death make the front headlines. So, living in a society in which death is such a scary, sensitive, intriguing topic, and in which the conclusions we reach define medical practice, I cannot help but wonder why issues dealing with death are so difficult for us to resolve. More specifically, I wonder why medical decisions that have the power to sustain or remove life from a living being seem so complicated and so unresolvable.

Why can't we make a rule that says, if person A has been in a coma for X number of days or has X number of pain receptors activated, then he or she has the right to receive assisted suicide? Why are there so many ifs, thens, buts, and so on?

I think that part of it has to do with our morals. In our "civilized" society, doing things that are morally and ethically correct extends far beyond doing what is most pleasing, or doing what others are telling us to do. That's why, even though a medical patient may be experiencing critical levels of pain, it is so difficult for us to accept assisted suicide as an option. Because it isn't just about ending their immediate pain. Our morals don't work like that. When we, as humans, consider what is morally right, we also think about what we take away by relieving pain. We think about life, how precious it is, and how impossible it is to get back. The patient is suffering and is telling us to take out the IV, to shut the power down. But because we have been raised to think along certain ethical standards, it is impossible for many of us to ignore the consequences. We are thinking into the future, not just in the present.

This also shows how greatly we value living. We don't even have a true definition for living. Many of us have built our lives upon understanding life, what makes life, and what is so special about it. Yet despite our lack of understanding (and maybe because of it), life is something that many of us feel is more valuable than anything else. As we contemplate ripping the IV out, we realize that we are taking something away that can never be returned. The patient may be in a coma. He or she may not ever wake up and may never realize that he or she is still grasping onto the life force. We comfort ourselves, however, by holding onto the fact that the life is there. It may be slipping through our fingers, but it's still there. That's what is most important. And as a result, the question of assisted suicide becomes massively complicated. It's not just the patient who is suffering; we are suffering too. No one wants to watch their father in a debilitated state, unable to talk or unable to move. No one wants to be around for the screaming and pleading. At the same time, we are thinking about how much we value our lives, and how much we value the lives of our loved ones. We are driven by our moral standards. Doing the right thing, we tell ourselves, isn't just about yielding to the wishes of the suffering patient; it's about realizing that life isn't something we have the right to take away.

This all leads to the second reason why making life decisions is such a source of conflict for us. It's interesting how much the dilemma centers around us, the people who are making the decision. We think a lot about the implications that assisted suicide has for us. We wonder how we will manage to live the rest of our lives knowing that we have somehow had a part in the death of someone we care about. Somewhere in the process, I think that our attitude shifts. We start thinking that we know better. That the person suffering is blinded by emotion, pain, anger, whatever. So it's like trying to solve a math problem while someone is screaming (literally or not) at you, trying to tell you what they think the solution should be. We try to think through the intricate logic and make sense of all the factors and reasons. Nevertheless, this isn't easy. We want to listen – after all, this is someone else's life on the line. At the same time, we feel as though it all has to make sense for us. When we place so many factors into the situation, finding the optimal solution is just so much more difficult.

In the end, the topic of death and suicide is such a difficult one because of how many people it affects and because we hold ourselves to a high moral standard. Death hurts everyone, emotionally and physically. Oftentimes, the burden of decision that is placed on our own shoulders is overwhelming. The fact is that we value life; some of us value it even more than pleasure, happiness, or ease. And it isn't just the patient we're talking about. We have to live with the decisions we make, and so we search for a rationale. Euthanasia, death, and assisted suicide – it's never easy to make the "right" choice, but we try to balance the factors and do what is best for everyone. I hope the above, although limited by my own experiences and beliefs, is relatable or at least adaptable. If not, one can consider this essay a record of my own troubles, and of the reasons why the topic of death and the question of assisted suicide are so difficult for me.


The FDA Unnecessarily Puts Patients At Risk

The Food and Drug Administration is supposed to protect the people from bad drugs, unsafe food, and greedy companies trying to make a profit at the people's expense.  But sometimes the FDA harms the people more than it helps them.  Dr. Scott Gottlieb, a physician and resident fellow at the American Enterprise Institute, wrote an article in the Wall Street Journal criticizing the FDA for allowing patients to undergo fake surgeries as part of FDA testing (http://online.wsj.com/news/articles/SB10001424052702304680904579365414108916816).  This is unethical for two reasons.  First, patients were exposed to unnecessary risks such as infection.  Second, the medical device in question, tested in the Symplicity HTN-3 trial, has already been approved and used in Europe.  Not only did the FDA unnecessarily put American citizens at risk, but it also needlessly delayed an approved treatment from entering the market.

For most studies, the FDA has utilized the practice of giving patients placebo pills in order to test the efficacy of actual pills that may enter the market.  The FDA has now extended this methodology to medical device makers.  This means some patients are being cut open and exposed to infection without receiving the medical device.  This is much more dangerous than a sugar pill.  In essence, these patients are getting all the downsides with none of the benefits.  On the surface, this seems like an effective way to carry out an experiment in order to get the best results.  In most elementary science classes we were taught to control all variables in order to get the best results, and this seems to be what the FDA is trying to do.  However, Dr. Gottlieb points out that "…other scientific approaches also allow us to get rigorous evidence of safety and benefits, while enabling patients to get real rather than fake therapies."  He goes on to say that 10 years ago the Food and Drug Administration would compare patients getting the new procedure to patients getting a different standard treatment for the same condition.[1] This way, both groups of patients receive treatment, while regulators are still able to compare the effectiveness of the new treatment to that of the conventional treatment.

Even more startling, the device has already been approved by European regulators.  However, the FDA did not think that was enough, because those regulators did not use sham surgeries.  It would be one thing if this medical device were completely new and had never been tested; yet the FDA insists on exposing patients to the risk of infection for a product that has been approved by regulators who are known to be especially stringent.  So while I do understand there may be some situations in which sham surgeries are necessary – such as when the procedure is very small and non-invasive or when the device is truly groundbreaking – this was not one of them.  An opponent may counter that patients were aware that they might get a sham surgery.  This is true.  The FDA does let people know that they may either get the medical device implanted or receive a sham surgery.  This does not change the fact that the practice is still unethical.  Just because someone is told the consequences of something and still agrees to do it does not make it ethical.  For instance, if one person paid another $1,000 to jump off a cliff and the other person did, the payment would not make it moral, even though that person agreed to it.  Many of the patients who agree to the possibility of receiving a sham surgery have no other choice.  Other treatments aren't working for them, and this method might be their only possibility of survival.  It is worth it for them because this may be their only chance to get better.  Their choice is either to live with a debilitating condition or to try a promising medical treatment while also running the risk of a sham surgery.  These people should never have to make this choice.  They could get this medical treatment without having to face a sham surgery and the risk of infection that accompanies it.

I realize that the FDA is trying to protect the safety of consumers.  I understand that it is being cautious to protect our interests.  But the FDA could do all of this without jeopardizing the health of patients or causing delays for a medical treatment.  I'm not ruling out the possibility of sham surgeries for certain medical devices.  But in this case, the FDA can rely on the work of European regulators as well as test the new medical device against existing treatments.  By doing this, the FDA would be able to uphold its duty of protecting patients without hurting them in the process.

 


Aesthetics By Knife

Double-eyelid surgery. Rhinoplasty. V-line surgery. These are some of the most common procedures in South Korea. A Western crease, a higher and smaller nose, a slimmer V-shaped jawline. Women all across the country spend thousands of dollars, endure weeks of recovery time, and risk serious side effects to achieve these "more beautiful" features. In just the past few decades, the demand for and popularity of cosmetic surgery have been on the rise. Today, one in five Korean women has had some sort of cosmetic surgery, compared to one in twenty American women. But why is there such a drastic demand for these procedures in South Korea? It stems from a particular historical beginning and from a modern society driven by the pressure to be beautiful and sustained by cutting-edge technology.

In 1954, American plastic surgeon Dr. Ralph Millard travelled to South Korea to work for the US Marine Corps in the post-Korean War period. Though his job was to help treat Korean accident and burn victims, during his time there Dr. Millard performed the first recorded double-eyelid surgery. He claimed that giving Asians a more Western look would help them "assimilate better into an emerging international economy." After that, the surgery quickly gained popularity. It started with Korean prostitutes, who got the surgery in an attempt to appeal to American soldiers. Then it spread to Korean mainstream culture. Pretty soon after that, the double-eyelid surgery was common and very much normal. Even more, the hype extends far beyond those who reside in metropolises and industrialized areas. It pervades even small rural towns far from the capital or other big cities, such as Gumi, which sits 115 miles from Seoul. It has become a nationwide obsession.

Nowadays, surgery has become an ordinary part of growing up for many Koreans. Chae Jeongwon, a 16-year-old student from Gumi, explains her acceptance that she will one day have double-eyelid surgery: "It's a present for senior schoolgirls. They say, Mommy, if you get my eyes or nose [done], my scores [will be] better than before." Kang NaYeon, Jeongwon's classmate, says she will get the double-eyelid surgery when she finishes her school exams. Double-eyelid surgeries and rhinoplasties have become so commonplace as to be synonymous with a graduation present or an allowance; in fact, they have become so popular and mainstream that they are no longer referred to as "surgeries" but as "procedures." These operations have become a routine part of adolescence for many girls across Korea.

Driven by a society that marries beauty and productivity, Korean adolescents seek to be the best they can be through intense schooling as well as cosmetic surgery. In Korea, the standards regarding women's appearance are much harsher and stricter than in most of its Western counterparts. Though beauty is highly regarded almost everywhere, the emphasis on it is particularly open and upfront in Korea. For example, most job applicants are required to include a headshot with their resumes, as employers look for physical attractiveness in addition to skills and qualifications. In today's society, it's not enough to just be capable; you must also be beautiful. Ever since the economic crisis in 1997, people have been trying to gain an advantage in the job market any way they can—even if it means getting plastic surgery to do it.

Beyond the cultural burden that spurs this surgical boom, Korea has all the right technological advancements to keep up. Korea prides itself on constant change: in just a few decades the country has seen drastic improvements in technology and industrialization. In Korea, technology infiltrates and dominates almost every part of life. Sixty-seven percent of Koreans use smartphones, the highest rate in the world, and ninety-five percent of households have Internet access. From karaoke studios to keyless doors, Korea's technology is constantly improving. Ultimately, this, combined with the high demand, allows Korean surgeons, more so than any others in the world, to consistently modernize and experiment with cutting-edge techniques, making Korea a hub for cosmetic surgery and medical tourism.

On the other side, there has been a lot of backlash from those who claim that Korean women get these procedures to look "more white." Dr. Hyuenong Park, a Korean surgeon, disputes that: "A small and slim face is ideal to most of people now. Even though many Caucasians have small and slim faces, it doesn't mean Asians want to look like Caucasians. If you inspect some Caucasian celebrities, you find many examples of prominent jaws and high cheekbones. But if you inspect Asian celebrities, they all have small jaws and cheekbones." He, like many others, attributes these quickly rising, commonly accepted practices to K-pop (Korean pop music) culture. This culture has redefined beauty in a way that may, in some ways, mirror Caucasian features, but it by no means seeks to copy them. It presents a package not limited to just music; young girls see their idols, like how they look, and then want to look like them too.

Perhaps one day the social pressures will lessen, and the emphasis on beauty and the perceived necessity of cosmetic procedures will eventually dissipate. Or perhaps one day technology will become so advanced that everyone will get surgery and be beautiful. Either way, for the time being, this booming national obsession can be seen as the product of a country and culture that pressure their people to be the best they can possibly be.

 

Sources:

http://www.theatlantic.com/health/archive/2013/05/the-k-pop-plastic-surgery-obsession/276215/

 
