Is Teaching Really That Bad?
I don't know if this sub is strictly for teachers, but I'm a high school senior hoping to become one. I want to be a high school English teacher because I genuinely believe America needs more common sense and the tools to analyze rhetoric, evaluate the credibility of sources, and spot propaganda. I believe all of these skills are either taught or expanded on in high school English/language arts. However, when I told my school counselor that I wanted to be a teacher, she made a face and asked if I was *sure*. Pretty much every adult, and even some of my peers, has had the same reaction. Is being a teacher really that bad?