Is it normal to feel that women doctors are more compassionate?
Over the years, most of my doctor visits were with male doctors, but I finally saw a woman doctor. I have to say that I now see women for all my medical needs, including health, vision, and dental, and I find they are way more compassionate and understanding. Has anyone else had this experience?