Consumers visit "doctors" all the time while shopping. A funny post over on mental_floss discusses the medical backgrounds of these doctors, specifically Dr. Brown, Dr. Scholl, Dr. Martens, and Dr Pepper. This type of branding reflects how America's attitudes toward physicians have changed over the past 100 years. At the turn of the century, doctors were seen as respectable members of society who were authorities on a wide range of subjects. The title "doctor" carried weight. The logic of marketing dictates that manufacturers would not have branded something "Doctor X's Tonic" unless the label "Doctor" added value. Would you buy a drink called "Sergeant Pepper" (heh)? A shoe insert named "Commodore Scholl's"? I think not.
However, society today views physicians differently. Popular media portrays physicians as on edge, as we often are. Or, perhaps worse, as ditzy and sex-starved (um, as we often are?). People are skeptical of their doctors' advice and often turn to the internet or other non-traditional sources instead. Why the change?
It seems that the seeds of medicine's demise were sown in its success. As one of my surgery attendings mentioned, the advent of modern medicine changed people's expectations. His father had been a pediatrician before the days of vaccines, when many children were stricken by diseases such as polio. Parents were fearful, and physicians often provided a calming presence, even if they could not provide any solutions. In some sad cases, patients would even die, yet no one blamed the pediatrician; in fact, they would sing his praises at the funeral.
Then vaccines arrived. Over time, parents stopped seeing the crippling effects of diseases like polio and instead were left to deal with crying babies and sore arms. Rather than seeing the physician as an authority, people (especially the kids) came to see the pediatrician as someone unduly inflicting pain. Over the decades, this has led some people to question the rationale behind vaccines altogether. Of course, vaccines are not 100% safe, and physicians and parents should remain vigilant. But vigilance is not the same as rolling back decades of progress against these diseases by arguing that vaccines are harmful.
Of course, the story is anecdotal and only one part of the issue. The "evolution" of the American healthcare system, dramatic shifts in the structure of society, and increased consumer awareness have all served to tear down the pedestal upon which doctors once stood. While it may be beneficial for sodas and shoe inserts to have "doctors" tied to their brands, physicians should be mindful of their own "brand" and how it is perceived in the marketplace of society.