Hi all,
This topic has been on my mind since I was first diagnosed 7 years ago, and it keeps reappearing after every recurrence. Do you feel stigma related to your cancer diagnosis? Do you feel like sharing your diagnosis and treatments with people outside of the cancerverse is met with fear, denial, or negative reactions?

I can honestly say that my closest friends have become acquaintances since my diagnosis. None of them could even say the word "cancer" to me, and they would brush off anything I mentioned about my cancer life. I understand where they're coming from (it's not their fault), but is this common for you guys?

Another thing - I've met a couple of doctors from non-oncology specialties who either feel so bad for you or basically write you off. For example, the GI doc who did an endoscopic biopsy on me delivered the news that the lesion in question was cancerous (this was 3 years ago) while bawling. It felt odd having to comfort him as he told me this in tears. The doctor who did my gyne surgeries commented when I met him, "Oh, stage 4! Not many treatment options, huh??"

I myself am an oncology nurse, so cancer is present in both my personal and professional life. I feel like I'm comfortable discussing it, but I definitely feel the stigma weighing more heavily on me the longer I deal with this disease.