Bernita's post yesterday got me thinking. And ticked off, truth be known. It seems that writers, romance writers in particular, are not only responsible for hordes of unsatisfied, lonely, crumpet-challenged housewives, but are now also to blame for any impressionable readers having unprotected sex if condoms aren't specifically mentioned in the bouncy-bouncy scenes.
I have one question. Why do we keep having this conversation? It may happen and I'm just not aware of it, but I haven't read of mystery authors accused of alienation of affection because a beloved family member or friend always turns out to be whodunit, or western authors scolded for promoting violence when the sheriff shoots the bad guy. I know fantasy authors are accused of teaching children witchcraft all the time, but that's on the parents if they want to keep Junior's eyes off Harry Potter.
People are trying to convince me that simply by writing a story, I am responsible for influencing the decisions of a GROWN PERSON. Not just to inspire, to comfort, or even to thrill, but to convince them it's okay to engage in risky behavior. I ask you: why on earth would even a not-quite-fully-functioning adult look at something like this...
...and think the story gives them sound advice about anything?
I think I have a reasonable expectation that authors are going to do their research; say, if an author states boldly that orange juice and goat cheese are a cure for cancer, I'd better see a footnote. Even so, I'm going to check with my doctor and read all I can before planting an orange tree and buying a goat. But not explicitly showing condom use? Okay by me. It's fantasy. I can ignore it or supply those in-between-the-lines details all by myself.
Even if I am convinced that a Greek billionaire is going to kidnap me, spirit me away on his yacht, and bribe me into marrying him.