Hooray, Facebook’s changed its algorithm again. Normally it doesn’t announce these shifts, leaving media organisations to quietly draw their own conclusions about why their page likes have quadrupled in a month or what’s going on with all that app traffic. This time it’s published a blog post on the topic.
“If people click on an article and spend time reading it, it suggests they clicked through to something valuable. If they click through to a link and then come straight back to Facebook, it suggests that they didn’t find something that they wanted. With this update we will start taking into account whether people tend to spend time away from Facebook after clicking a link, or whether they tend to come straight back to News Feed when we rank stories with links in them.”
This is an update aimed squarely at the curiosity gap, designed to take out clickbait (whatever that means). It isn’t going to touch Buzzfeed’s lists, for example, because their informational headlines give you exactly as much knowledge as you need to decide whether to click, and they’re geared around getting you to scroll all the way to the end. It won’t hurt any sites successfully getting second clicks from Facebook traffic, rare as those are. It might hurt Upworthy and its imitators, but not much, because of the method Facebook’s using to decide what’s valuable and what’s not. Tracking time on page is going to hurt thin, spammy sites where a user’s first response is to click back; Upworthy is very focussed on dwell time as part of its core engagement metric, and it’s certainly neither thin nor spammy.
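To make the mechanism concrete, here’s a minimal sketch of how a dwell-time signal like the one Facebook describes might feed into ranking. Facebook hasn’t published its actual logic, so the threshold, the weight and every name below are invented for illustration.

```python
# Minimal illustrative sketch -- Facebook has not published its real
# ranking logic, so the threshold, the 0.2 weight and all names here
# are invented for this example.

BOUNCE_THRESHOLD_SECONDS = 10.0  # assumed cut-off for "came straight back"

def click_signal(seconds_away_from_feed: float) -> float:
    """Score one click: a quick return to the feed reads as
    dissatisfaction, a longer absence as time spent reading."""
    if seconds_away_from_feed < BOUNCE_THRESHOLD_SECONDS:
        return -1.0  # likely a bounce: count against the story
    return 1.0       # likely a real read: count in its favour

def adjusted_rank(base_score: float, dwell_samples: list[float]) -> float:
    """Fold per-click signals into a link story's ranking score."""
    if not dwell_samples:
        return base_score
    average = sum(click_signal(s) for s in dwell_samples) / len(dwell_samples)
    return base_score * (1.0 + 0.2 * average)  # the weight is arbitrary
```

On a model like this, stories whose readers mostly bounce straight back get down-weighted, which is exactly why thin, spammy pages suffer and dwell-time-obsessed Upworthy largely doesn’t.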
But one unintended consequence of a focus on time away from the Facebook feed is a negative impact on breaking news. Facebook’s algorithm already struggles with news because posts propagate slowly through newsfeeds and lose their timeliness on the way; it’s fine for features, for comment, for heartwarming kitten videos, and all sorts of other less-timely reads, but if you’re seeing a 12-hour-old news post there’s every chance it’s no longer really news. Recent events in Ferguson have highlighted Facebook’s ongoing problems in this area, and this change risks adding another: news is fast, and Facebook is prioritising slow.
Time on site isn’t a particularly sensible metric to use for news: most people hunting for news want it quickly, and then they want to get on with the rest of their lives. The inverted pyramid of news writing is built around that principle – give the reader everything they need as quickly as possible, then layer in detail further down for those who want it.
Increasingly, news sites are using stub articles – a few sentences or shorter – to break fast-moving stories, atomising them into smaller and smaller pieces. Those pieces might take seconds to read. If they’re promoted on Facebook, how does a news reader who clicks through, reads the whole thing and backs out look any different from someone who clicks on a curiosity-gap headline and backs out because it wasn’t what they wanted?
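Under the same assumed threshold as the earlier sketch, the problem is easy to state: the feed observes time away, not satisfaction, so the two behaviours are indistinguishable.

```python
# Invented numbers, same assumed threshold as the earlier sketch. The
# feed can only observe time away from it, not reader satisfaction.

BOUNCE_THRESHOLD_SECONDS = 10.0

def looks_like_a_bounce(seconds_away: float) -> bool:
    return seconds_away < BOUNCE_THRESHOLD_SECONDS

stub_read_in_full = 8.0  # a 40-word breaking-news stub, read to the end
clickbait_bounce = 8.0   # a curiosity-gap page, abandoned in annoyance

# A satisfied stub reader and an annoyed clickbait victim produce
# exactly the same observable, and so the same penalty:
assert looks_like_a_bounce(stub_read_in_full)
assert looks_like_a_bounce(clickbait_bounce)
```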
One of the fundamental problems with a few large companies controlling the primary means of mass digital distribution is that media organisations that want to be widely read have to change their work to fit those distribution channels. Not just in terms of censorship – no naked female nipples in your Facebook images, no beheading videos on Twitter – but less obviously, and more integrally, in terms of form.
Online media has as many formal constraints as print, perhaps more, if you want to be widely read; they’re just trickier, more self-contradictory, and constantly shifting. Facebook’s changes are going to have an effect on what news looks like, just as Google’s algorithm did (and still does – Google News requires posts to be at least 50 words long to count as “news”, which still shapes decisions in newsrooms about how to break what, and where).
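That 50-word floor is the sort of constraint that ends up encoded directly in newsroom tooling. A hypothetical check – only the 50-word figure comes from Google News’s rules; the helper and its name are invented:

```python
# Hypothetical CMS-side check: the 50-word floor comes from Google
# News's inclusion rules, but this helper and its name are invented.

GOOGLE_NEWS_MIN_WORDS = 50

def eligible_for_google_news(body: str) -> bool:
    """The kind of crude word count a newsroom tool might run before
    deciding whether a story breaks as a stub or waits for more copy."""
    return len(body.split()) >= GOOGLE_NEWS_MIN_WORDS

print(eligible_for_google_news("Breaking: a stub of a dozen words will not clear the bar."))  # False
```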
If Facebook thinks fast, informative snippets are less important in its newsfeed than longer reads, then news is either going to keep losing out – or it’s going to change its shape to accommodate the algorithm.