Tell Them I Was Happy
I remember years ago
Someone told me I should take
Caution when it comes to love
I did, I did

And you were strong and I was not
My illusion, my mistake
I was careless, I forgot
I did

And now when all is done
There is nothing to say
You have gone and so effortlessly
You have won
You can go ahead tell them

Tell them all I know now
Shout it from the rooftops
Write it on the skyline
All we had is gone now

Tell them I was happy
And my heart is broken
All my scars are open
Tell them what I hoped would be
Impossible, impossible
Impossible, impossible
Smiling is one of the warmest gestures one person can offer another. It's especially warm when children smile, because it's a sign of genuine happiness; even if it lasts only a moment, that smile is appreciated. This is why I am extremely uncomfortable when strange men tell me to smile. It's overbearing, invasive and slightly eerie for men to tell women they've never seen or met before to smile. I can't help but wonder whether the same men who command women to smile also tell other men to smile. Telling a woman to smile, even if your intent is purely innocent, is dictatorial, and it shouldn't happen.
Now, I know some men and even some women will read this and assume I'm being a radical feminist, but let me ask you: when was the last time someone, a stranger even, demanded you do something you didn't want to do? What if you're having a bad day? Perhaps you have severe cramps, or maybe you're late for work, and there is a stranger looking at you, telling you to smile.
It might seem like a friendly gesture, but there is nothing friendly about a man pressing a strange woman to smile. The sexualization behind telling women to smile is alarming. It makes women feel that we are only meant to be happy and pretty, and it's a passive way to engage in an unwanted conversation. Asking a woman to smile is a selfish act, and it's rarely delivered in a caring tone; it's condescending, and it turns a simple gesture into something sexual.
Instead of asking a woman how she actually feels, or being open-minded to the idea that she might not be interested, some men will berate a woman into doing something she isn't comfortable doing. That is unacceptable.
Tell Them I Was Happy And My Heart Was Broken
'Men tell women to smile because society conditions men to think we exist for the male gaze and for their pleasure. Men are socialized to believe they have control over women's bodies. The result is men giving unsolicited instructions on how we should look, think and act. Essentially, what a man is saying when he tells a woman - one he doesn't even know - to smile is that his wants outweigh her own autonomy over how she exists in the world,' a writer and activist explains to me, sharing her views on men telling women they don't know to smile.
Just about every woman has a story similar to mine. Viera recalls the time she was embarrassed on a flight: 'It's frustrating.
Women are just trying to get from point A to point B without commentary from men on our bodies or telling us to smile. During the Christmas holiday, I was on a plane when a military guy embarrassed me in front of everyone in our section by tapping me on the shoulder and telling me, "You need to learn how to smile."
I was stunned speechless. The longer I thought about how embarrassed I was, the more I couldn't help but let him know that telling women to smile is both corny and sexist. So I passed him a nice "Fuck you" note explaining that I could have just lost a parent or gone through something tragic. It is not men's place to tell women to smile.'