doc·tor (n): a learned or authoritative teacher.
This is the definition of a doctor. A doctor's job is to teach you how to reach maximal health.

Any knowledge you gain is taught by a teacher. Not just the kind that comes to mind when you hear the word "teacher," which is probably your 2nd or 3rd grade teacher (at least it is for me). Teachers come in a variety of shapes and forms. The preacher at your church teaches you how to get closer to God (hopefully), your mechanic teaches you what's wrong with your car, and your personal trainer teaches you how to become fit or lose weight. Even when you gain knowledge through self-discovery, something or someone sparked the thought or paved the way. Even a walk through nature can teach you a lesson or two.

My question is: is your doctor teaching you how to become healthy? When he put you on a statin for your cholesterol, did he teach you how to exercise, or what diet changes you should make, so that you could get off of that drug? Does he teach you how your body works, or what it means when you are sick and fighting off disease? Does he say, "Gee, you're in my office quite a bit. Why are you sick all the time? We need to see what we can do to strengthen your immune system, because it's weak"? Or does his visit consist primarily of scribbling the name of a drug on a pad and handing it to you?

Would that fly in other aspects of your life? What if your child came home from school and all the teacher offered was a recommendation for a good math book, with a promise to see him next on the day of the test? Or if your mechanic handed you a bill without explaining what was wrong and what needed to be done in order to fix it?

If your doctor isn't teaching you, I recommend getting a different one.