Is it normal to be taught to lie to your doctors?
At some point when I was a child, I was taught to lie to doctors. I no longer remember who exactly taught me, most likely my grandmother, nor what their reasoning was. For many years I did this, and it never really occurred to me that it was strange. There were times when I would feel somewhat frustrated that I couldn't be entirely open with my doctors, but I knew that if certain people in my immediate family found out, they would be very angry with me. I was also taught not to trust my doctors. I was constantly paranoid around them and fearful that if I said the wrong thing, they would call the police or put me into a mental institution.
I am an adult now, and this has been bothering me. I feel bad that I lied to my doctors, because I never realized it could cause problems for them, like being unable to treat me properly. While I don't plan to do this with any doctors I have in the future, I worry that I may panic over saying something wrong and that they might actually live up to my family's bullshit. This "belief" also extended to any adult outside the family, especially during certain incidents.
Is any of this normal?