From: Jonathan Paul Cowherd [mailto:jpcowh01@slug.louisville.edu]
Sent: Friday, June 08, 2001 11:25 AM

As a woman, have you found that the men you encounter have tried to assert dominance or something along those lines with you? Or do you feel that men are always trying to assert dominance regardless of who it is or what it is about? ... like we are compensating for something ... :)
I haven't felt any dominance play going on with the men I'm surrounded by. The most common thing I get from the men I encounter is a sense of discomfort. It seems that a lot of guys in the workplace are afraid to say anything around a girl that might be construed in negative terms, and the consensus seems to be: just ignore the female and maybe she'll go away. I haven't encountered any really assertive behavior, nothing to make me feel inferior; I'm just not one of the guys.

Mandie Smith

"Beware of programmers carrying screwdrivers." -- Chip Salzenberg