Is it me, or have the roles that men and women traditionally play changed? Have men and women switched places? I see dudes walking around wearing tight clothes and earrings, and we're supposed to call them "metrosexual." Women have always been naturally independent, but these days it seems like that behavior has gotten aggressive! Women have always wanted to be seen as equal to men, but have some gone overboard? Men seem to be more sensitive and feminine than ever! Is it me, or is the line drawn between genders slowly disappearing?