In the past few years, as a surgeon, I have become increasingly aware of the scourge of the wellness industry. I am seeing patients who opt for diets, supplements or magical therapies instead of the less seductive – though scientifically grounded – medicine I have to offer. Like everyone else, I am constantly bombarded with messages in advertisements and from well-meaning friends telling me that this diet or that vitamin is the key to health, longevity, beauty and status.
For doctors like me, the rise of this brand of wellness is distressing. However, medicine, as both a profession and a science, has no doubt played a part in the genesis and growth of big wellness. For virtually the whole of its existence, medicine has disenfranchised women and, to varying degrees, continues to do so. Even as medicine has modernised, with an emphasis on patient autonomy and on resolving bias, it remains, at times, paternalistic and patriarchal.