I hate doctors. Last Tuesday I started getting sick. I had a fever and a sore throat. I played it off as the flu, drank water, took Advil, and relaxed. When Thursday came I called my girlfriend to pick me up from school and take me home for the weekend so I could rest up at my house and get better.

By Friday my throat is on fire and I can't even swallow. I go to the doctor, and as I'm getting inspected she takes the flashlight and the little wooden tongue depressor and examines my throat. "Ahh, you're fine, just a little inflamed and red from coughing. Go home and rest." So I do that, and it only gets worse.

Saturday morning I go to the old family doctor, convinced there's something wrong with me. This time during the inspection: "Holy crap, this is one of the worst cases of strep I've seen. I don't even need to swab your throat." The doctor handed me a prescription for antibiotics, and I was feeling dandy by Sunday night.

Now what kind of horse shit is this? What kind of doctors do we have, where one has no clue what she's looking at and the other knows perfectly? Does every asshole in pre-med get their degree? Like, what the fuck is this? It's not like she was diagnosing some rare illness only seen a few times a year. It's freakin' STREP THROAT! I shouldn't have to play Russian roulette when I go to the doctor, hoping I get a good one. Right? What happens when I go for a check-up and that strange lump on my back gets written off as just a "knot" instead of aggressive cancer? How will I know, if I can't even trust doctors? This is ridiculous. If you have a family doctor you trust, stick with him or her, because I took a chance on a care center and it came back to bite me.