Thursday, August 29, 2019

What's Wrong with News?

Since Jimmy Wales came to speak at the University of Alabama last weekend, I've been thinking more about WikiTribune, his new-ish news curation/creation venture. This seems like a rare opportunity to contribute to the building of a tool that has the backing of the creator of one of the most popular, influential, important websites in the world. Of course the venture might fail, because most new ventures fail and because news is a tricky thing to get right, maybe trickier in a lot of ways than building something akin to a library or encyclopedia. WikiTribune seems to be functioning right now as a collectively curated news aggregation website; it might evolve into something else in the future, but I'll assume that's what it is for now.

What might be the starting points for a platform like this? If news is broken (which seems a fairly uncontroversial take in 2019), what part of it could be fixed by a Wiki-type platform?

First, there's the issue of factual accuracy. A decent way of getting a newsfeed that contains only factually correct news is simply to link to stories from sources that can be held accountable, that have reputations to protect, and that typically follow standard journalistic practices. You weed out the parody stories, the polarizing disinformation, the deliberate attempts to poison discourse with fake news. This seems like something a dedicated group of volunteers could do.

I feel like the factual accuracy problem isn't as widespread as some believe; networks of upvoting and sharing bots only make it seem as though a handful of untrue stories are being very widely read and believed. Of course, there are most likely relatively small pockets of people who actually believe false stories and act on those beliefs, and despite their small numbers, they can be very harmful to society. And various characteristics of popular social media platforms such as Twitter, YouTube, and Facebook (algorithms that don't vet information, the ability for anyone to post and share anything, being too big and too fast to moderate) increase the footprint and influence of false stories beyond what it was in the pre-internet days.

I doubt WikiTribune would lure the type of folks who seek out false stories away from their favorite hyperpartisan sites, but who knows what might happen if it eventually became even a tenth as popular as Wikipedia. Maybe it would end up as a kind of standard system for current events information vetting, a better middle layer between journalism and audiences than social media currently is. Most people would then recognize the false news stories and sources in the way that most people did before the internet: as a kind of inevitable fringe to the information ecosphere, relegated to a recognizable outskirt rather than popping up in the midst of a feed of journalism vetted in the traditional way (as tends to happen with news via social media). It's important to know whether factually inaccurate news exists, how much of it exists, and who is sharing and reading it, but it's also important to think about where, in our information environment, it resides. Is it concentrated or distributed? In the center or on the periphery?

During his talk, Wales brought up the problem of clickbait headlines: headlines that mislead audiences, play on our emotional, tribal, impulsive tendencies, and exploit a kind of shallow curiosity. So maybe WikiTribune curators could use a rough guiding principle: avoid posting clickbait headlines (or just rewrite them; often the stories are fine, but the headlines read like they were written by search engine optimizers). Obviously, clickbait is a term with a fuzzy definition, but that's nothing new, and it certainly doesn't stop websites, publishers, etc. from enforcing various kinds of vaguely defined content standards. Have multiple experienced coders rate each headline's clickbait-iness, and when they agree it crosses the line, don't post the story or rewrite the headline.
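The rate-and-agree idea could work something like the following minimal sketch. To be clear, this is a hypothetical illustration, not anything WikiTribune actually implements; the 0-to-3 scale, the score cutoff, and the 75% agreement threshold are all assumptions I've made up for the example.

```python
# Hypothetical sketch of majority-vote clickbait flagging.
# Each rater scores a headline from 0 (straight news) to 3 (pure clickbait);
# the headline is flagged only when most raters agree it crosses the line.

def is_clickbait(ratings, score_cutoff=2, agreement=0.75):
    """Return True if at least `agreement` of raters scored the
    headline at or above `score_cutoff`. Thresholds are illustrative."""
    if not ratings:
        return False
    flags = sum(1 for r in ratings if r >= score_cutoff)
    return flags / len(ratings) >= agreement

# Three of four raters call this headline clickbait, so it gets flagged
# for rewriting or rejection.
print(is_clickbait([3, 2, 2, 0]))  # True
# Raters mostly disagree, so the story runs with its original headline.
print(is_clickbait([3, 0, 1, 1]))  # False
```

The point of requiring agreement among several raters is exactly the hedge against fuzziness: no single curator's notion of "clickbait" decides the outcome.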

Wales also brought up ads and their effects on content, one of which is to essentially incentivize the creation of clickbait headlines. But I guess I'm a bit unclear on how to account for the fact that many of the sources to which WikiTribune links are ad supported. Related to this, how does WikiTribune work with paywalled news sites? Wales seemed to be pro-paywall, to endorse the idea that if more news sites were subscription-funded rather than ad-funded, the quality of information would improve. Would WikiTribune just give you a taste of the article, a kind of abstract or summary, something a bit more than a headline? A compromise like that would give the reader some value without totally substituting for the story itself, perhaps pushing users toward the full article in the way that Wikipedia pushes readers toward its source material. That seems reasonable to me.

He also brought up algorithms. Perhaps algorithms also, in some way, guide news consumers and creators toward more clickbait-y headlines. If you have humans in the loop, maybe it would be easier to slow or stop the prevalence of clickbait and false news.

He also brought up the fact that Wikipedia is not totally open and it's not a democracy, and I think that's a way of setting this apart from the primary news aggregator of our time, Reddit. Reddit is ostensibly open, and is a kind of democracy: registered users submit and vote on content, thereby increasing or decreasing its visibility. Over time, this has led to certain kinds of news stories ending up on the front page. You might call it a product of 'hivemind.' It has a certain narrow tonal range, and a certain focus on particular topics that reflects the interests and values of the voting public.

There has been recent talk of moderators playing a bigger role in directing what gets posted in subreddits, and of the inevitable push-back from those who value unfettered speech and a democratic public sphere above all else. I doubt that push-back will fully subside on Reddit, because voting on the noteworthiness of content is a defining characteristic of the platform. It's not ancillary the way 'likes' are to Instagram. But if you start a new site that simply doesn't have that as one of its defining characteristics, you don't necessarily have that problem. Sure, a lot of people believe strongly in a flat hierarchy, an open democratic sphere, but I think enough people have been repulsed by what that yielded to want something else.

Somewhat more controversially, we might consider the question of tone and emotion in news, and whether or not it needs 'fixing,' or could be fixed. Personally, I'm turned off by the abundance of outrage and fear that many news stories in 2019 evoke, but I recognize that an argument can be (and has been) made for the value of anger and fear in this sphere. This is more a matter of personal preference, and not everyone will agree, but some of us would sometimes like to see a newsfeed that features less fear and anger. That seems to be where a lot of popular democratic news feeds, like r/news on Reddit, end up: as a kind of distilled outrage and fear.

Here, we need to think about the purpose of news. Is it a kind of 'immune system' for societies, whose sole purpose is to detect threats and alert us to them? If so, it seems entirely appropriate that news would invoke anger and fear. Should news be broader, emotionally, than that? Should it invoke wonder, curiosity, gratitude? Subreddits like r/upliftingnews already cater to another point on the emotional spectrum, so it's not as if there isn't a place for that in many people's information diets.

Then there's the matter of filter bubbles/echo chambers, and I don't know that there's much WikiTribune could do about that. I think worries about filter bubbles and echo chambers are somewhat overstated, and/or that we'll never fully solve the problem; meanwhile, while we wait around for the perfect solution, we're making do with a pretty lousy news ecosystem run, by default, by impulsive clicks and a lack of accountability. Pure democracy - giving the power of vetting and curation to anyone and everyone - was one potential solution, but we've seen how well that went.

Wikipedia never solved the filter bubble problem when it came to creating an encyclopedia. The editors don't remotely reflect the readership, in terms of race, gender, education level, ideology, etc. Wikipedia isn't flawless, but it seems to be working well enough, and I gather that there is a sense within the organization that they should try to broaden the diversity of people who edit it to include more women and people of color. Should it also try to include more people who identify as politically conservative? Does it make sense to pursue intellectual diversity among information curators as well?

It looks like the current version of WikiTribune features a way to follow particular feeds curated by particular Wiki editors. If that's the case, then what's to stop a couple of ultra-liberal or ultra-conservative editors from setting up feeds full of clickbait and partisan vitriol? Is there some overseer that decides when editors have gone too far, similar to the way that admins on Reddit ultimately have control over volunteer moderators? Might that decision about what goes too far itself be motivated by one's ideology?

Yes. But I get the sense that Wikipedia has already dealt with similarly motivated people who have tried to turn Wikipedia into a more partisan information environment, and it has some sort of mechanism for dealing with them that seems to be largely effective. Is the mechanism entirely democratic and open? Probably not, but now might be the time that some of the public revisits the relationship between direct democracy and news curation and distribution.

Even if something similar to WikiTribune existed in the past (and I get the sense that that's the case) and ultimately failed, that does not determine whether WikiTribune will fail. There are many, many cases in which a creation didn't succeed because it was timed poorly. Maybe we had to see how badly open, democratic, free-for-all, algorithmic, impulsive curation of news would go before there would be enough demand for something like WikiTribune to be sustainable. I'm just happy to see someone trying something like this right now.
