I'm at a point where I'm having difficulty determining what exactly is politically correct, and what women would want in this particular situation. Okay, in my "nonphobias" post, I mentioned that I don't have a fear of dentists, and that people who are afraid of dentists need to toughen up. Now, I wanted to add that females should be excluded from this, since they are females, and are naturally scared of more things, but I felt that it might be condescending, and held back from saying it.
Now my question to my female readers is, what do women these days want/expect from us guys? Should we be sensitive to gender differences, and acknowledge the fact that women are different from men in many respects, or should we treat you as complete equals in ALL respects, and in my example tell you that you need to "man up" when it comes to visiting the dentist? This really is a legitimate question, and my suspicion is that the women of this country are currently split into two different camps over this issue, but I'm just interested in hearing your opinions. All discussion, as always, is welcome... aww shit, damnit, fuck, I just delved into the realm of politics, which is something that I promised not to do in my mission statement. But this post is already too long, and it might actually help the male readers become more enlightened on this critical issue, so I will let the post stand.