For many of us, the sex education we received as teenagers wasn’t very comprehensive—especially for anyone who wasn’t heterosexual.
That void of information was often filled by one particular source: pop culture. Whether it was learning how to kiss, having sex for the first time, or adding some adventure to our bedroom antics, a lot of our inspiration—and instruction—came from studying certain scenes in movies and television.
The thing is, Hollywood often warps our perceptions of sex and love in ways that don’t hold up in the real world. In other words, we’re conditioned to believe that certain pop-cultural tropes are the norm: finding “the one,” always (vocally) achieving orgasm, and the idea that self-worth comes from being in a relationship or, at the very least, having regular sex.
We also see only a narrow set of clichés about sexuality; rarely is it represented as the wonderfully nuanced spectrum it actually is. (And don’t even get us started on how pop culture often equates female puberty with the supernatural.)
What’s more, pop culture can mess with our body image. Until recently, we’ve mostly been presented with narrow stereotypes of what counts as “sexy,” none of which encourage us to embrace the perceived imperfections of our (and our partners’) bodies. It’s hard to feel confident about your body when you’re bombarded with images of an unattainable ideal.
Fortunately, things are slowly changing, and we’re beginning to see more pop culture that represents sex and sexuality as they really are.
Here's to real sex—even when it’s super awkward.