
An Offensive Hypothetical Slips Through Newsroom Oversight Process

Paul Sakuma / AP

The investigative reporters at ProPublica turned up a disturbing story about Facebook "taking money to connect advertisers with anti-Semites," as Morning Edition host Rachel Martin phrased it last week. NPR's reporting on the story, however inadvertently, exposed disturbing lapses of its own in the newsroom.

Listener Deirdre Brennan, of Minneapolis, succinctly summed up the problem with the NPR piece, which was an interview between Martin and reporter Aarti Shahani:

"The story today about Facebook's selling ads aimed at anti-Semitic people contained an example to explain the word 'psychographics' that to me seemed to propagate the kind of racism and bigotry that ProPublica was trying to expose in its Facebook investigation. The example to explain the word psychographics associated young, African-American men with the term 'cop-killer.'"

Here's what Shahani, who covers technology for the business desk, said in explaining to Martin how Facebook's ad targeting worked:

"Facebook's business is based on letting advertisers do exactly what ProPublica did, which is targeting the most personal, even insidious parts of ourselves. OK? There's an industry term for this. It's called psychographic marketing. In the old days, if you were placing ads, you relied on demographics. But with psychographics, you go deeper. You don't just advertise to, say, men in Baltimore, age 19 to 35, who are black. You can add interests, like cop killer. And if Facebook finds and zaps that term, you pick a proxy — you know, say, a band or a movie that's all about mowing down cops."

"What an offensive hypothetical — no better than what Facebook was allowing," wrote Brooklyn, N.Y., listener Amanda Harrington.

Nathan Ashley Sterner, who hosts Morning Edition at the Baltimore NPR member station WYPR, wrote: "If hypothetical, the example's inclusion seems unnecessary, as targeting ads to 'Jew hater,' as ProPublica did, is enough (for me at least) to explain the situation without raising the specter of young black men from Baltimore who are interested in killing cops (I'd also add that a Baltimore [city] police officer hasn't been murdered in more than a decade)."

Later on Friday, the newsroom added this editor's note:

"We heard back from a lot of listeners on this story. Many complained about the example we gave to portray the dangerous search terms used by some Facebook advertisers that use targeted ads. The intent of the example was to illustrate how online advertisers searched extreme subgroups. We didn't mean to either offend anyone or perpetuate a stereotype; the specific example we used was provided by a leading online marketer that uses Facebook tools. We should have made that clearer during the conversation."

The newsroom is essentially saying the unidentified online marketer provided the racist stereotype, although that does not explain why it was included in the first place, on top of the example in the ProPublica reporting. Perhaps it was to show that other disturbing targeting is possible on Facebook? If so, that should have been made clear (not "clearer"). And was that the only other example provided by the online marketer?

Sterner still had questions, too. "Why is the 'leading online marketer' left anonymous? If this marketer's example was trenchant enough to be cited, should that individual not be identified? What is the journalistic interest in keeping this person's identity secret?" Sterner wrote to my office, adding, "I still don't know for certain whether the example was hypothetical. Has anyone marketed ads to 'men in Baltimore, age 19 to 35, who are black' with interests in 'cop killer'? If so, I'd like to know."

Mark Memmott, NPR's standards and practices editor, who replied to my request to the newsroom for an explanation, told me the marketer "spoke with NPR on condition of anonymity due to the sensitive nature of the subject." The example, he said he was told, "was a hypothetical."

Mistakes happen, especially in live interviews, as this one was. This one was serious, but an even bigger question is why no one on the Morning Edition staff or in the newsroom raised an immediate red flag on hearing the interview. Instead, it was distributed in the morning podcast Up First and aired several times, until it was finally pulled from the lineup for the newsmagazine's final airing of the morning at 11 a.m. ET. This despite the listener emails that started coming in just before 6 a.m.

Memmott, speaking for the newsroom, told me: "Our editorial process didn't work as well as it should and we ended up with comments in the piece that needed more context. When we realized we'd left a misimpression, we discussed what to do and came up with the solution you see — removing the piece from the show's final feed and adding an editor's note to the story page, where anyone who came to our digital platforms to confirm what they heard would find it. We try to learn from our mistakes and aim to do so once again."

By "learn from our mistakes," he meant, he later told me, "that all those in the process are aware of what went wrong, have discussed it and will have learned from it to be more vigilant."

Some three dozen listeners wrote to NPR to register concerns about this interview, and several wrote back to say they remained unhappy with the editor's note ("lukewarm," as one listener put it). NPR's response indeed skirts the issue: NPR gave voice to an offensive stereotype without an explanation for why it was doing so, and many layers of newsroom staffers who should have immediately realized there was a problem — or reacted quickly when apprised of it — did not. More vigilance going forward is needed, yes. But also, in this case, a more forthright acknowledgment of just what went wrong (and why it was wrong) would have been in order.


Elizabeth Jensen was appointed as NPR's Public Editor in January 2015. In this role, she serves as the public's representative to NPR, responsible for bringing transparency to matters of journalism and journalism ethics. The Public Editor receives tens of thousands of listener inquiries annually and responds to significant queries, comments and criticisms.