I think the fact that pharmaceutical advertising is so rampant in this country says a lot about how messed up the US healthcare system is. Your health and well-being have been relegated to a mere business: an opportunity for multi-billion-dollar corporations to make massive profits.
Look through just about any magazine these days. Watch any program on television. The advertising is everywhere. There's something very wrong when we're being told to request certain medications from our physicians, as if we're window-shopping for whatever looks good. Instead, our physicians - the actual medical experts - should be the ones recommending medication, and only when they feel it might actually help our condition.
We are taking medication for everything now. It's like we're afraid of feeling anything even resembling pain anymore. Or as I call it... life.