I read a lot of literature, studies, and books about escorting and the sex industry. I was curious whether you guys (and ladies) felt that one reason sex work is still marginalized and seen as immoral or negative is the way the media often portrays escorts or strippers in such a degrading way. Or do you feel it's still our deeply Victorian values about sexuality? I always watch Law and Order and shows of that nature, and there hasn't been one damn show that ever portrayed a successful, happy escort. It's always about how they get killed, how they're drug-addled or not very bright, or how sleazy the sex industry is. The other night I was watching reruns of Criminal Minds, and they depicted a high-class escort who actually came from a wealthy family but was 'angry' at her father for having affairs with escorts, so she was going around killing clients. I thought, you know, jeez.
Or is it that many people simply think sex work is so 'horrifying' that no one wants to think or believe that some of us are actually well-adjusted, intelligent, and successful, and ACTUALLY LIKE what we do?
It doesn't matter to me what society thinks, or why they think it, but I was just curious as to what your opinions are.
I also believe that it obviously isn't men who think lowly of sex workers. Here in Nevada, and in the West generally, prostitutes were treated with respect by the lonely bachelors, cowboys, and miners. It was only when families started moving into these areas that ladies of the night were forced to ply their trade in remote places. In the 1950s and 1960s, it was legal here for a lady to work out of her home as a prostitute. Then once the families started moving in, they were told to go work in brothels in remote counties. As if that was going to stop men from driving around to pay for play.