Sunday, February 28, 2016

Why it took me so long to read To Kill a Mockingbird.



This week, I made Harper Lee a Rursday. Unfortunately, it was a posthumous honor, but at least I managed to read To Kill a Mockingbird before she died.

As I said in my review, it began when I admitted on Facebook not long ago that I had never read the novel. A number of people whose opinions I respect were surprised, and told me I needed to get on it right away. So I did.

But why did it take me so long?

I suspect it's partly my Northern upbringing. I understood that To Kill a Mockingbird is a Southern novel, and even in the 1960s and '70s, the South was thought of as a different sort of place than where I grew up. And yet, Maycomb, Alabama, in the 1930s is not all that unlike northern Indiana in the 1960s, in terms of the way whites treated blacks. Our schools were integrated, but our neighborhoods were not. We didn't have separate water fountains for whites and "coloreds," as the South did, but the prejudice was certainly there, and so was the mistrust between the races.

A few blocks from the house where I grew up was a little neighborhood beside the railroad tracks where several black families lived. (At the time, the correct term was Negro; it became "black" while I was still in school. Eventually, America realized that not every "black" person is actually black, so the preferred term now is African-American. The derogatory term has remained consistent; I don't use it, although some of my relatives did, and do.) I have a clear memory of a girl who lived in one of those houses riding her bike down my street. She wasn't in my grade, but we rode the same bus to school. She came into the yard, and we played outside for a little while. Then we went inside. I don't remember which of us proposed it, and it doesn't matter; what matters is that my mother immediately told us to go back outside. And she made it very clear to me later on that it was not to happen again.

I thought the whole thing was crazy. We went to the same school. And my father was friends with one of the men who lived in the girl's neighborhood; the gentleman in question had helped my father and my uncle build our house.

Anyway, when I had kids of my own, I was determined to raise them without any sort of prejudice. We lived in an integrated neighborhood, and my kids' books and TV shows had characters of all sorts of colors and ethnicities. I remember practically bending over backwards to describe people in terms other than skin color. And it worked -- until the first time the girls' school had a Black History Month program. Kitty came home from school incensed: "Why didn't you tell us any of this stuff?"

Well, because I wanted my kids to think skin color doesn't matter. Because it doesn't matter -- any more than does eye color or hair color or native language, or headscarves, or henna, or that dab of ashes Catholics wear on their foreheads in the spring.

And yet, of course, it does matter. Because humans have not yet reached the level of maturity at which skin color and religion are just another way to describe our fellow humans -- and not a reason to hate, mistrust, and even kill them.

Tomorrow is the last day of this year's Black History Month. I wish we lived in a post-racial world, but we don't. In too many ways, we're still stuck in the mindset of Maycomb, Alabama, circa 1935.

And yet I still hope that someday, we can get to the place where religion and skin color are nothing but ways to describe our fellow human beings.

***
These moments of bloggy diversity have been brought to you, as a public service, by Lynne Cantwell.

Sunday, February 21, 2016

Musings on "pay the writers."

Distributors of creative content are notorious cheapskates. Back when I was in radio ("uh-oh, here she goes again..."), it was well-known that radio stations in Florida, say, paid less than did those in less salubrious climes. Station owners could get away with paying lower salaries because people would apply in droves to work in a place where they could go to the beach every day. There was even a term for it: We said these stations paid in sunshine.

So last October, when I heard Wil Wheaton had turned down an offer from the Huffington Post to reblog one of his posts, I wasn't surprised when I learned the reason why: they weren't going to pay him. The editor he spoke with couched it in these conciliatory terms: "Most bloggers find value in the unique platform and reach our site provides..." Translation: We pay in exposure.

Keep in mind this was Wil Wheaton the editor was talking to. Wil Wheaton, who played Wesley Crusher on Star Trek: The Next Generation. Wil Wheaton, internet sensation, who plays a version of himself on the TV show The Big Bang Theory. That Wil Wheaton. HuffPo would have benefited from his "unique platform and reach" -- not the other way around.

The blogosphere blew up over it. And then the news cycle moved on, as it does, and folks simmered down -- until last week, when the editor of the Huffington Post UK, Stephen Hull, ripped off the scab. He admitted that the UK site has 13,000 contributing bloggers, none of whom are paid. Why? Because it's more "authentic." Because that way, everybody knows the bloggers didn't give up their words for, you know, filthy lucre.

This from a company that might be worth as much as $1 billion. So it's not like they can't afford to pay their bloggers. They just don't. And Hull says he's proud of that.

Needless to say, the blogosphere lit up again. Chuck Wendig has told everybody what he thinks of the idea, in his own endearingly NSFW way. Kristen Lamb has followed suit. Both of them are calling for a ban on linking to any HuffPo blogs until the company starts paying bloggers actual cash money for their posts.

And I agree with them. HuffPo is exploiting its bloggers. It doesn't matter whether I approach you with an idea for a column, or whether you approach me first; if I write it and you publish it, you have hired me, and you need to pay me -- and not in sunshine, or in your "unique platform and reach." My landlords won't let me pay them in sunshine. They're funny that way.

I part ways with Kristen, however, when she extends the "pay the writers" drumbeat to ebook sales. Why? Because now we're talking about two different ecosystems. Big publishers who hire writers need to pay them fairly for their work, period. But indie authors who are trying to get their names out there need to do what's necessary to disseminate their work as widely as possible -- and if that means writing a guest post for a book blog, or even giving away several thousand copies of their novel, then that's what they need to do. The first is a work-for-hire relationship; the second is a marketing strategy.

I've heard the arguments: indies ought not undervalue themselves; selling their work for a dollar is too cheap and giving it away is outrageous. I get it. I do. But you can't let your professional pride in your writing cloud your judgment. Nobody's going to pay five or ten bucks for an ebook by someone they've never heard of. If your book will only sell for a dollar at first, sell it for a dollar. You'll earn 35 cents on each copy. What do you think a publisher would pay an unknown author as an advance? $3,500? Then make it your goal to sell 10,000 copies. If you price your book at $2.99 instead, you'll make $2.09 per copy, and you'll only need to sell 1,675 copies.
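The arithmetic above can be sketched in a few lines. This is just a back-of-the-envelope check, taking the post's per-copy royalties (35 cents at 99 cents, $2.09 at $2.99 -- figures that imply Amazon-style 35% and 70% royalty tiers) as given, and working in integer cents to avoid floating-point rounding surprises:

```python
# How many copies does it take to match a $3,500 advance?
# Per-copy royalties are the post's figures: 35 cents at a $0.99
# list price, $2.09 at $2.99. All amounts are in integer cents.

def copies_needed(goal_cents, royalty_cents):
    """Smallest number of copies whose royalties reach the goal."""
    return -(-goal_cents // royalty_cents)  # ceiling division

goal = 3500 * 100  # a $3,500 advance, in cents

print(copies_needed(goal, 35))   # 10000 copies at $0.99
print(copies_needed(goal, 209))  # 1675 copies at $2.99
```

The `-(-a // b)` trick is just ceiling division on integers: you can't sell a fraction of a copy, so any remainder means one more sale.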

Writers who get insulted over 99-cent price tags aren't thinking like content distributors, but that's what you have to do to sell books. As I said, content distributors are cheapskates. But one of the joys of going indie is that you get to pocket not just the writer's cut, but the distributor's cut, as well.

***
These moments of bloggy sunshine have been brought to you, as a public service, by Lynne Cantwell.

Sunday, February 14, 2016

Happy Black Sunday.

We're all friends here, right? So I'll be frank: I don't have a lot of use for Valentine's Day.

I'm in my sixth decade on this planet. For the first ten years or so, I could count on Valentines from my parents and my classmates (except for the year my teacher ruined Valentine's Day); for another ten years, I was married, and could count on something from the husband. For the remaining three-plus decades, I've been in Valentine limbo -- either wondering whether I'd have a reason to celebrate, or simply waiting for the day after, when Valentine candy would be fifty percent off.

Why do we celebrate this day, anyway? To start with, it was the feast day of a Catholic saint. St. Valentine of Terni, Wikipedia tells me, was a Roman who was martyred on this date in the third century. Or maybe it was two different guys, both named Valentine and martyred in different years. I guess the church records were a little sketchy in those days. In any case, the Catholic church didn't establish February 14th as the feast day of St. Valentine until 496 -- and then the church sort of demoted him in 1969. He's still a saint, but local churches don't have to observe his feast day.

Traditionally, St. Valentine is the patron saint of beekeeping, epilepsy, and bubonic plague. Romantic, no?

It was Geoffrey Chaucer who first connected St. Valentine to romance in his Parlement of Foules. People in the Middle Ages believed that birds paired off in mid-February, according to the Julian calendar; St. Valentine's feast day was the 14th; hey presto, we've got a holiday for love and romance. Some scholars over the centuries since have argued that Valentine celebrations were invented by the church to supersede and replace the pagan celebration of Lupercalia, but that has pretty much been debunked.

Anyway, within the past hundred years or so, Valentine's Day has been commercialized like almost every other holiday -- and like many other holidays, that has led to...oh, let's not beat around the bush here: if you don't have a date on Valentine's weekend, popular culture considers you to be something of a failure.

That's a lot of hooey. There are all sorts of reasons why people might not be paired off on a random day in mid-February. And some people who are paired off likely wish they weren't.

My only suggestion for getting through Valentine's weekend is that we all try to lower our expectations. If you have a date, great. If not, hey, it's one day. Stay away from romantic movies, and ads for flowers and jewelry. It'll all be over tomorrow.

So to those of you who are happily paired off today, happy Valentine's Day. To the rest of you, I'll see you at the drugstore tomorrow for half-price chocolate.

***
Here's one bit of cheerful holiday news: I Heard It on the Radio is just 99 cents through next weekend. This anthology has something for everyone -- including a story by me. Consider it our Valentine to you, Dear Reader.

***
These moments of black-hearted blogginess have been brought to you, as a public service, by Lynne Cantwell.

Sunday, February 7, 2016

No more denial: Bring back the Fairness Doctrine.

So much for rose-colored glasses; this calls for prescription specs.

I didn't want to write this post. I really didn't want to write this post. But I can't live in denial any longer: the news business isn't what it used to be -- and that needs to change.

I was quite the idealist in college, and in my early years as a reporter. I took pride in being a member of the vaunted Fourth Estate, the final check-and-balance on the three branches of government enshrined in the U.S. Constitution. Freedom of the press was guaranteed by the First Amendment, right beside freedom of speech. Numero uno, baby!

The press is supposed to be the ultimate whistleblower -- the institution that keeps an eagle eye out for unfairness and corruption, in government as well as in other facets of society. And when unfairness and corruption are found, the press is supposed to be the first institution to raise a stink about it, to hold officials' feet to the fire, to demand explanations, and to keep making noise until the problem is fixed.

I got my first job in broadcast news in late 1978, at a time when broadcasters still took that mission seriously. They had to. Thanks to the Federal Communications Commission and its Fairness Doctrine, local radio and television stations had to operate in the public interest -- or lose their licenses. "Public interest" involved keeping listeners and viewers apprised of what was going on in their cities and towns -- not just crimes, but also stuff like what the city council and the county zoning board were up to. Those meetings were as dry as dust for a reporter to cover, usually, but local regulations affect people far more directly than the actions of the federal bureaucracy. The Fairness Doctrine also required stations to provide airtime for opposing viewpoints.

The Doctrine was instituted in 1949, at a time when the big broadcast networks controlled such a large chunk of the airwaves that public discourse might have been in danger of being stifled. It was still in place when I graduated from college. But by 1987, pro-business Reagan appointees were running the FCC. They argued that with the advent of cable, the broadcasting "marketplace of ideas" had become sufficiently diverse and the Fairness Doctrine was therefore obsolete. So the FCC repealed it.

And over the past thirty-ish years, broadcast news operations have become ever more polarized, while at the same time there has been a frenzy of mergers and ownership changes. Today, according to Freepress.net, ten companies own a huge chunk of the information and entertainment industry:

  • CBS Corporation, which owns Showtime and Simon & Schuster, among others;
  • Comcast Corporation, which owns NBCUniversal, MSNBC, and Fandango, as well as chunks of other media companies;
  • Gannett Corporation, which owns USA Today and a bunch of websites, including Careerbuilder.com;
  • News Corp., which owns Fox News, the Wall Street Journal, and HarperCollins, among others;
  • Time Warner, which owns a bunch of magazines (including Time and People), HBO, CNN, and DC Comics, to name a few; 
  • Tribune Corporation, which owns the Chicago Tribune, WGN, and the Food Network, among others; 
  • Viacom, which owns MTV, Paramount Pictures, Nickelodeon, and more;
  • Walt Disney Company, which owns ABC, ESPN, and the Marvel Universe, as well as the Disney theme parks and other companies;
  • The Washington Post Company, which owns the Washington Post, Slate, Kaplan (the educational test prep people) -- and which, in turn, is owned by Jeff Bezos, who also owns a little internet sales site called Amazon; and
  • A couple of venture capital companies, which currently own Clear Channel, the largest radio station owner in the United States. Clear Channel owns more than 850 radio stations nationwide, as well as Premiere Radio Networks, which distributes the Rush Limbaugh, Glenn Beck, and Sean Hannity shows.
Ten huge corporations own so much of what we see, hear, and read every day -- including news operations. So there's nobody left to hold their feet to the fire. "News" has become all about the ratings. And when other big corporations send jobs overseas and give all their profits to shareholders and the guys in the C-suites, who's to blow the whistle on them?

Once again, I'm grateful that I got out of the news business when I did. But more importantly, I think it's time to enact a new Fairness Doctrine.

***
This call for bloggy fairness has been brought to you, as a public service, by Lynne Cantwell.