While I was reading your story, I had to check and see if you were from Florida. It sounds exactly like most of my doctors here. My doctors didn't/don't seem to care unless it involved big money (surgery) for them.

I'm still plagued with a lot of problems that are practically crippling me, and all of my doctors have written me off. Write the Rx's, pass the buck, bla bla bla. They all seem to be the same. A person is left with disabilities that keep them from leading a normal life, and just because the doctors say, "you've been treated," that's what everybody believes and goes by. God forbid they should ask the patient how they really feel. In these cases, who gives the doctor the authority to always be right and have the last word? If the doctors are worthy of it, I have no problem with them being in charge. Unfortunately, that doesn't seem to be the norm in a lot of areas.

If anybody wants to know what's wrong with healthcare in this country, this is one of the big ones. The doctors are complacent, uncaring, money-hungry, and arrogant, and they take no time to learn about their patients.

It's such a shame that you took such care in keeping records about your own health issues and no one had enough respect to listen to you. I am so sorry, and I can relate completely. You look forward to that doctor visit for weeks, only to find out the doctor isn't as interested as you are.

Wow!! This hits close to home.