I was reading this post by Vegan Mainstream about needing more vegan doctors in the world. This would be awesome. I know a few, the usual suspects: Dr. McDougall, Dr. Campbell, Dr. Esselstyn, my friends over at Exsalus, and my vegan heart doctor friend Heather. I'm so happy that these doctors exist and continue to dispel the common myths of Western medicine.
It's very rare to find a regular Western doctor who knows anything about nutrition. They take maybe 1 or 2 credits of nutrition in all their schooling. It's mind-boggling to me that a whole system of medicine was built without any real consideration of how food affects the body, when food is behind most of the diseases in this country. You don't have to be a doctor to see the correlation! As we continue to overeat processed crap, exercise less, and become overweight and obese, the "lifestyle" diseases in this country continue to rise and kill more and more people each day.
In the East, a doctor's job was to keep you well, and if they didn't do that job, you didn't have to pay for their services. The doctors of the West are in the business of keeping people barely alive. Why would anyone see a doctor who can't help them prevent or treat the ailments they currently have? Why would you put your trust in someone who is overweight, smokes, or has had two triple-bypass surgeries? They obviously know nothing about living a healthy lifestyle! Why would you pay someone to prescribe a pile of expensive medications you must take every day when you could simply change the way you eat? We have spawned an on-the-go generation: no time in the day because I'm working two jobs to pay the bills and feed my children, so it's easier to just hit the drive-thru and grab a meal for the whole family for around $5. It's very sad, but we aren't entirely to blame. Yes, we vote with our dollars, and ultimately we are in charge of our own decisions and health, but the government has not done a bang-up job of making healthy food affordable, or of making our food system a priority at all.
We are living in scary times, people. The children of today will not live as long as their parents, and their children (if they are even healthy enough to produce offspring) will live most of their lives overweight, in pain, and on piles of prescription medicines, unless we DO something NOW.
Back to the question in the title of this post: Who needs doctors anyway? Well, I'm here to tell you that if you eat a healthy plant-based diet, over time you won't need to see doctors anymore. I actually haven't seen a doctor the whole time I've been vegan, because I didn't have any poor health conditions that needed treating in the first place. I am my own doctor now. I determine my state of health with my fork. But if you do have a health condition, find a doctor (or alternative practitioner) who can treat both your symptoms and the underlying problem, not one who will just slap a band-aid on your condition in hopes that it becomes more manageable. Who wants to live that way? Why not treat the cause so the condition can go away completely?
Sorry for the rant! Our medical system really irks me!
What is your opinion of doctors and Western medicine? Have you noticed you don't need to see the doctor after switching to a plant-based diet?