Thursday, August 29, 2019

What's Wrong with News?

Since Jimmy Wales came to speak at the University of Alabama last weekend, I've been thinking more about WikiTribune, his new-ish news curation/creation venture. This seems like a rare opportunity to contribute to the building of a tool that has the backing of the creator of one of the most popular, influential, important websites in the world. Of course the venture might fail, because most new ventures fail and because news is a tricky thing to get right, maybe trickier in a lot of ways than building something akin to a library or encyclopedia. WikiTribune seems to be functioning right now as a collectively curated news aggregation website; it might evolve into something else in the future, but I'll assume that's what it is for now.

What might be the starting points for a platform like this? If news is broken (which seems a fairly uncontroversial take in 2019), what part of it could be fixed by a Wiki-type platform?

First, there's the issue of factual accuracy. A decent way to build a newsfeed of factually accurate stories is simply to link to stories from sources that can be held accountable, that have a reputation to protect, and that typically follow standard journalistic practices. You weed out the parody stories, the polarizing disinformation, the deliberate attempts to poison discourse with fake news. This seems like something a dedicated group of volunteers could do.

I feel like the factual accuracy problem isn't as widespread as some believe, and that networks of upvoting and sharing bots only make it seem as though a handful of untrue stories are being very widely read and believed. Of course, there are most likely relatively small pockets of people who actually believe false stories and act on those beliefs, and despite their relatively small numbers, they can be very harmful to society. And various characteristics of popular social media platforms such as Twitter, YouTube, and Facebook (algorithms that don't vet information, the ability for anyone to post and share anything, being too big and too fast to moderate) increase the footprint and influence of false stories beyond what they were in the pre-internet days.

I doubt WikiTribune would lure the type of folks who seek out false stories away from their favorite hyperpartisan sites, but who knows what might happen if it eventually became even a tenth as popular as Wikipedia. Maybe it would end up as a kind of standard system for vetting current-events information, a better middle layer between journalism and audiences than social media currently is. Most people would then recognize false news stories and sources the way most people did before the internet: as a kind of inevitable fringe to the information ecosphere, relegated to a recognizable outskirt rather than popping up in the midst of a feed of traditionally vetted journalism (as tends to happen with news via social media). It's important to know whether factually inaccurate news exists, how much of it exists, and who is sharing and reading it, but it's also important to think about where, in our information environment, it resides. Is it concentrated or distributed? In the center or on the periphery?

During his talk, Wales brought up the problem of clickbait headlines: headlines that mislead audiences, play on our emotional, tribal, impulsive tendencies, and exploit a kind of shallow curiosity. So maybe WikiTribune curators use a rough guiding principle: avoid posting clickbait headlines (or maybe just rewrite them; often the stories are fine, but the headlines seem like they were written by search engine optimizers). Obviously, clickbait is a term with a fuzzy definition, but that's nothing new, and it certainly doesn't stop websites, publishers, etc. from enforcing various kinds of vaguely defined content standards. Get multiple experienced coders to rate each headline's clickbait-iness, and when they agree that it's clickbait, don't post the story or rewrite the headline.
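
To make that rule concrete, here's a minimal sketch in Python of the kind of agreement threshold I have in mind. Everything here - the function name, the scoring scale, the thresholds - is invented for illustration, not anything WikiTribune actually does:

```python
def flag_clickbait(ratings, rater_threshold=0.7, min_agreement=0.75):
    """Flag a headline when most raters independently score it as clickbait.

    ratings: one score per rater, each in [0, 1], where 0 means
    'straight news' and 1 means 'pure clickbait' (hypothetical scale).
    rater_threshold: score at which a rater counts as a clickbait vote.
    min_agreement: fraction of raters who must agree before acting.
    """
    votes = [score >= rater_threshold for score in ratings]
    agreement = sum(votes) / len(votes)
    return agreement >= min_agreement

# Four of five raters call this headline clickbait, so it would be
# held back or rewritten rather than posted as-is.
print(flag_clickbait([0.9, 0.8, 0.75, 0.95, 0.3]))  # True
```

A real content-analysis workflow would use something more principled, like inter-rater reliability statistics, but the basic shape is the same: independent ratings, an agreement rule, then an editorial action.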

Wales also brought up ads and their effects on content, one of which is to essentially incentivize the creation of clickbait headlines. But I'm a bit unclear on how to account for the fact that many of the sources to which WikiTribune links are ad-supported. Relatedly, how does WikiTribune work with paywalled news sites? Wales seemed to be pro-paywall, to endorse the idea that if more news sites were subscription-funded rather than ad-funded, the quality of information would improve. Would WikiTribune just give you a taste of the article - a kind of abstract or summary, something a bit more than a headline - a compromise that gives the reader some value without totally substituting for the story itself, perhaps pushing users to the full article the way Wikipedia pushes users to its source material? That seems reasonable to me.

He also brought up algorithms. Perhaps algorithms also, in some way, guide news consumers and creators toward more clickbait-y headlines. With humans in the loop, maybe it would be easier to slow or stop the spread of clickbait and false news.

He also brought up the fact that Wikipedia is not totally open and it's not a democracy, and I think that's a way of setting this apart from the primary news aggregator of our time, Reddit. Reddit is ostensibly open, and is a kind of democracy: registered users submit and vote on content, thereby increasing or decreasing its visibility. Over time, this has led to certain kinds of news stories ending up on the front page. You might call it a product of 'hivemind.' It has a certain narrow tonal range, and a certain focus on particular topics that reflects the interests and values of the voting public.
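
For what it's worth, the visibility mechanism on Reddit isn't a black box: the site open-sourced its old 'hot' ranking, which combines net votes with submission time. Here's a rough Python sketch of that published formula (not necessarily what Reddit runs today):

```python
from datetime import datetime, timedelta, timezone
from math import log10

# Reference epoch from Reddit's open-sourced ranking code.
EPOCH = datetime(2005, 12, 8, 7, 46, 43, tzinfo=timezone.utc)

def hot(upvotes, downvotes, submitted):
    """Time-decayed vote score, after Reddit's open-sourced 'hot' ranking.

    Net votes count logarithmically, while a steady recency term means
    newer posts outrank older posts with similar vote totals.
    """
    score = upvotes - downvotes
    order = log10(max(abs(score), 1))
    sign = 1 if score > 0 else (-1 if score < 0 else 0)
    age_seconds = (submitted - EPOCH).total_seconds()
    return round(sign * order + age_seconds / 45000, 7)

# A day's head start is worth about 1.9 orders of magnitude in net votes
# (86400 / 45000), so this newer, lower-scoring post ranks higher.
now = datetime.now(timezone.utc)
print(hot(500, 50, now) > hot(5000, 50, now - timedelta(days=1)))  # True
```

The point is just that 'democratic' visibility is itself an algorithmic design choice; change the formula and you change the front page.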

There is recent talk of moderators playing a bigger role in directing what gets posted in subreddits, and the inevitable push-back from those who value unfettered speech and a democratic public sphere above all else. I doubt that push-back will fully subside on Reddit, because voting on the noteworthiness of content is a defining characteristic of the platform. It's not ancillary in the way that 'likes' are to Instagram. But if you start a new site that simply doesn't have voting as one of its defining characteristics, you don't necessarily have that problem. Sure, a lot of people believe strongly in a flat hierarchy, an open democratic sphere, but I think enough people have been repulsed by what that approach has yielded to want something else.

Somewhat more controversially, we might consider the question of tone and emotion in news, and whether it needs 'fixing,' or even could be fixed. Personally, I'm turned off by the abundance of outrage and fear that many news stories in 2019 evoke, but I recognize that an argument can be (and has been) made for the value of anger and fear in this sphere. This is more a matter of personal preference; not everyone would agree, but some of us would like, at least sometimes, to see a newsfeed featuring less fear and anger. I feel like that's where a lot of popular democratic news feeds, like r/news on Reddit, end up: a kind of distilled outrage and fear.

Here, we need to think about the purpose of news. Is it a kind of 'immune system' for societies, whose sole purpose is to detect threats and alert us to them? If so, it seems entirely appropriate that news would evoke anger and fear. Or should news be emotionally broader than that? Should it evoke wonder, curiosity, gratitude? You have subreddits like r/upliftingnews that cater to another point on the emotional spectrum, so it's not as if there isn't already a place for that in many people's information diets.

Then there's the matter of filter bubbles/echo chambers, and I don't know that there's much WikiTribune could do about that. I think worries about filter bubbles and echo chambers are somewhat overstated, or at least that we'll never fully solve that problem; while we wait around for the perfect solution, we're making do with a pretty lousy news ecosystem that's run, by default, by impulsive clicks and a lack of accountability. I think pure democracy was one potential solution - giving the power of vetting and curation to anyone and everyone - but we've seen how well that went.

Wikipedia never solved the filter bubble problem when it came to creating an encyclopedia. The editors don't remotely reflect the readership, in terms of race, gender, education level, ideology, etc. Wikipedia isn't flawless, but it seems to be working well enough, and I gather that there is a sense within the organization that they should try to broaden the diversity of people who edit it to include more women and people of color. Should it also try to include more people who identify as politically conservative? Does it make sense to pursue intellectual diversity among information curators as well?

It looks like the current version of WikiTribune features a way to follow particular feeds curated by particular Wiki editors. If that's the case, then what's to stop a couple of ultra-liberal or ultra-conservative editors from setting up feeds full of clickbait and partisan vitriol? Is there some overseer who decides when editors have gone too far, similar to the way that admins on Reddit ultimately have control over volunteer moderators? Might the decision as to what goes too far be motivated by one's ideology?

Yes. But I get the sense that Wikipedia has already dealt with similarly motivated people who have tried to turn Wikipedia into a more partisan information environment, and it has some sort of mechanism for dealing with them that seems to be largely effective. Is the mechanism entirely democratic and open? Probably not, but now might be the time that some of the public revisits the relationship between direct democracy and news curation and distribution.

Even if something similar to WikiTribune existed in the past (and I get the sense that that's the case) and ultimately failed, that doesn't determine whether WikiTribune will fail. There are many, many cases in which a creation didn't succeed because it was timed poorly. Maybe we had to wait to see how poorly open, democratic, free-for-all, algorithmic, impulsive curation of news would go before there would be enough demand for something like WikiTribune to be sustainable. I'm just happy to see someone trying something like this right now.


Saturday, August 24, 2019

A World Without Frames

I had the privilege of seeing Jimmy Wales, founder of Wikipedia, speak at the University of Alabama, courtesy of The Blackburn Institute. In anticipation of seeing him speak, I'd been reflecting on the value of Wikipedia in a post-2016 world. Since the 2016 U.S. Presidential election, it seems as though the discourse on social media has become more toxic and less fact-based, that traditional news outlets don't quite know what to do with Trump and his international equivalents, and that non-traditional news outlets are creating and disseminating biased and false information about our world. All the while, Wikipedia continues to churn away, largely free from the toxicity and contentiousness that has gripped the rest of the internet and, seemingly, the rest of the world. How has it pulled that off?

I went into the talk with a slightly more specific version of that question: is it the particular approach/model that Wikipedia uses that is responsible for its relative success vis-à-vis the truth, or is it the particular domain in which it operates - that of the encyclopedia? Wales didn't quite speak to this question, but he did talk about his relatively new passion project: WikiTribune. In a way, the fate of that venture will answer the question, as it applies the model and logic of Wikipedia to the world of news and current events. Wales' working theory seemed to be that ads were largely to blame for the degradation of news: the way the online ad economy works puts all websites on a level playing field, all competing against one another for attention. News sites do not just compete with other news sites; they compete with parody news sites, entertainment sites, gossip sites, etc. He praised the recent move toward the subscription model, noting that the New York Times has seen recent financial success pursuing it. Subscriptions prompt users to think, 'what is the overall value of this product, in the long term?' That's a key shift in thinking, from what you click on impulsively to what you value, a shift from short-term thinking to long-term thinking. So, if we get rid of the ads, do we improve the quality of news and discourse around news?

My suspicion is that there is another factor at play: whether the content pertains exclusively to current events. News must privilege certain stories over others in a way that an encyclopedia or a library does not, assuming it has a front page (can we conceive of a news site without a front page, regardless of whether that front page is personalized?). Here, the decades of research on framing and agenda setting are relevant: by virtue of editorial decisions about what to cover and what not to cover, news gets us to think about certain issues or events (or certain aspects of those issues/events) and ignore others. Encyclopedias do not direct attention in quite the same way. Sure, it could be argued that within a given entry, an encyclopedia (Wikipedia included) chooses to emphasize certain aspects of a subject while ignoring or downplaying others, and thus frames the subject in a way that shapes perception. But I'd argue that the way Wikipedia has incorporated different perspectives on contentious topics into its entries reduces this effect.

Can you do the same thing with news? Is that what WikiTribune will be?

I haven't a clue. But it was kind of thrilling to be in the room with someone who was taking a stab at solving a problem of this scope, someone who was uniquely positioned to stand a decent chance of succeeding.