I love wearing makeup, contrary to what one might think based on my opinions about society's expectations. I can't wear makeup to work during the summer because I sweat in the hot attics, on roofs, and while carrying around the 28 ft ladder.
It's cool enough to wear makeup now, and this has me thinking about the whole concept and people's perceptions of makeup. So... I'm posting a poll. In addition to the poll, I'm wondering whether people think women look more professional with makeup on, particularly Real Estate agents, women in the corporate field, saleswomen, and pretty much any "white collar" field. Also, does this change if the woman works in a field that's dominated by men?
I'm going to go finish getting ready for work and put on my makeup now...