Well, I have a question for you. In American movies, we often see that teachers at universities in the U.S. (especially in California) are not very respected. You said you taught there; do you agree with this? Or do you think it's a general perception, or just something in the movies?
Thanks for answering.