I'm currently between jobs, yes, again, but that's not what I want to talk about. I'm home every day, and I don't do especially well with silence, so I usually have the TV or stereo on. I prefer the stereo, but I only know one volume level for it: extra loud. Having a sleeping teenager in the house has prevented me from doing that of late, so I have the TV on all day long for background noise.
I've noticed a trend that I find quite disturbing. A few female celebrities have recently stood up to say positive things about sex, and that has been met with quite a bit of criticism. The general consensus seems to be that sex is something of an obligation for women.
This is not necessarily a new concept for me. I very distinctly remember my mother telling me sex was something "you put up with." I remember her saying, "Just lay there and let him do what he wants, don't think about it, and it will be over soon."
Thank goodness, I didn't believe her, and I did not embrace that mindset. Because of the way I was raised, because of the incest that occurred, sex has been a part of my life for my whole life. It started out as something very, very negative, and I could have left it at that. I chose not to. I took the journey I needed to take to transform the negative into a positive.
Some may judge the way I went about doing that. Some have judged me. My own family members, knowing what my background was, have called me names; slut and whore were two of their favorites. It hurt at the time, but I continued on undeterred, because I knew they were wrong, because I was not willing to accept what they wanted to force-feed me.
I learned that my body is capable of incredible things, incredible pleasure, that I have yet to find in any other way. I learned that when this act is shared with someone you love, the connection is like no other in this world. It touches my soul like nothing else can. It's a small slice of heaven here on earth.
I was always very frank with both my kids. I taught them that sex, within a loving relationship, is the very best thing. I also made sure to teach them to respect themselves and not let people use them. I taught them to be responsible about sex, a concept my daughter clearly didn't get, but that isn't because I didn't try.
It makes me sad that society today still can't seem to embrace the fact that, for women, sex can be a good thing. It makes me sad that we still can't talk about it in a positive way. Yes, I understand sex is not everyone's focus.
I believe that, given the opportunity to explore in a safe and positive environment, without judgment, sex can be a very fulfilling activity. It creates energy and connections to others that you can't achieve elsewhere.
I'm left wondering: what is it going to take for society to change? How long will it be before sex is no longer presented to women as degrading? Society as a whole exploits women and sexuality to sell everything under the sun, but let a woman step forward and say she enjoys sex, and they cut her down and criticize her for it. Does this seem a bit hypocritical to anyone?