Facebook’s ‘clickbait’ clampdown: more bad news for news?

Hooray, Facebook’s changed its algorithm again. Normally it doesn’t announce these shifts, leaving media organisations to quietly draw their own conclusions about why their page likes have quadrupled in a month or what’s going on with all that app traffic. This time it’s published a blog post on the topic.

“If people click on an article and spend time reading it, it suggests they clicked through to something valuable. If they click through to a link and then come straight back to Facebook, it suggests that they didn’t find something that they wanted. With this update we will start taking into account whether people tend to spend time away from Facebook after clicking a link, or whether they tend to come straight back to News Feed when we rank stories with links in them.”

This is an update aimed squarely at the curiosity gap, designed to take out clickbait (whatever that means). It isn’t going to touch Buzzfeed’s lists, for example, because their informational heads give you exactly as much knowledge as you need to decide whether to click, and they’re geared around getting you to scroll all the way to the end. It won’t hurt any sites successfully getting second clicks from Facebook traffic, rare as those are. It might hurt Upworthy and its imitators, but not much, because of the method Facebook’s using to decide what’s valuable and what’s not. Tracking time on page is going to hurt thin, spammy sites where a user’s first response is to click back; Upworthy is very focussed on dwell time as part of its core engagement metric, and it’s certainly neither thin nor spammy.

But one unintended consequence of a focus on time away from the Facebook feed is a negative impact on breaking news. Facebook’s algorithm already struggles with news because of its lack of timeliness and the slow way it propagates through newsfeeds; it’s fine for features, for comment, for heartwarming kitten videos, and all sorts of other less-timely reads, but if you’re seeing a 12-hour-old news post there’s every chance it’s no longer really news. Recent events in Ferguson have highlighted Facebook’s ongoing problems in this area, and this risks adding another issue: news is fast, and Facebook is prioritising slow.

Time on site isn’t a particularly sensible metric to use for news: most people hunting for news want it quickly, and then they want to get on with the rest of their lives. The inverted pyramid of news writing is built around that principle – give the reader all they need as quickly as possible, then build in detail later for those who want it.

Increasingly, news sites are using stub articles – a few sentences or shorter – to break fast-moving stories, atomising them into smaller and smaller pieces. Those pieces might take seconds to read. If they’re promoted on Facebook, how does a news reader clicking through, reading the whole thing then backing out look different from someone clicking on a curiosity-gap headline then backing out because it wasn’t what they wanted?
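To make the problem concrete, here is a minimal sketch of a purely time-based "bounce" heuristic. Everything in it is assumed for illustration: the threshold is invented and Facebook has never published its real signal. The point is that a fully-read news stub and an abandoned clickbait page are indistinguishable to it.

```python
# Hypothetical sketch: a dwell-time heuristic can't tell a satisfied
# stub-article reader from a disappointed clickbait bounce.
# The 15-second threshold is an assumption; Facebook's real signal is unpublished.

BOUNCE_THRESHOLD_SECONDS = 15

def looks_like_bounce(seconds_away_from_feed):
    """Classify a click purely by time spent away from the news feed."""
    return seconds_away_from_feed < BOUNCE_THRESHOLD_SECONDS

# A two-sentence breaking-news stub, read in full: 8 seconds.
# A curiosity-gap page, abandoned in disgust: 6 seconds.
# Both register as bounces; the metric can't see the difference.
stub_read = looks_like_bounce(8)         # True
clickbait_bounce = looks_like_bounce(6)  # True
```

However the threshold is tuned, any value low enough to excuse a well-written stub also excuses the clickbait bounce it was meant to catch.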

One of the fundamental problems with a few large companies controlling the primary means of mass digital distribution is that media organisations who want to be widely read have to change their work to fit those distribution channels. Not just in terms of censorship – no naked female nipples in your Facebook images, no beheading videos on Twitter – but less obviously, and more integrally, in terms of form.

Online media has as many formal constraints as print, perhaps more, if you want to be widely read; they’re just trickier, more self-contradictory, and constantly shifting. Facebook’s changes are going to have an effect on what news looks like, just as Google’s algorithm did (and still does – Google News requires posts to have a minimum of 50 words in order to count as “news”, which is still shaping decisions about how to break what where in newsrooms).

If Facebook thinks fast, informative snippets are less important in its newsfeed than longer reads, then news is either going to keep losing out – or change its shape to accommodate the algorithm.

Time vs the news

Jason Kint, in an interesting piece at Digiday, argues that page views are rubbish and we should use time-based metrics to measure online consumption.

Pageviews and clicks fuel everything that is wrong with a clicks-driven Web and advertising ecosystem. These metrics are perfectly suited to measure performance and direct-response-style conversion, but tactics to maximize them inversely correlate to great experiences and branding. If the goal is to measure true consumption of content, then the best measurement is represented by time. It’s hard to fake time as it requires consumer attention.

Some issues here. Time does not require attention: I can have several browser tabs open and also be making a cup of tea elsewhere. TV metrics have been plagued by the assumption that TV on === attentively watching, and it’s interesting to see that fallacy repeated on the web, where a branching pathway is as easy as ctrl+click to open in a new tab. It’s also easy to game time on site by simply forcing every external link to open in a new tab: it’s awful UX, but if the market moves to time as the primary measurement in the way that ad impressions are currently used, I guarantee you that will be widely used to game it, along with other tricks like design gimmicks at bailout points and autorefresh to extend the measured visit as long as possible. Time is just as game-able as a click.
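A minimal sketch of that gaming point, with invented event times: if "time on site" is naively measured as the span between the first and last recorded page events, a background autorefresh inflates it with zero reader attention involved.

```python
# Hypothetical sketch: a naive time-on-site measure, and how autorefresh games it.

def measured_session_seconds(event_times):
    """Naive metric: span between the first and last recorded page events."""
    return max(event_times) - min(event_times)

# A reader who actually reads for 45 seconds, then leaves:
attentive_visit = [0, 20, 45]

# A tab left open while its owner makes tea; the page pings itself every 30s:
tab_left_open = list(range(0, 601, 30))

measured_session_seconds(attentive_visit)  # 45
measured_session_seconds(tab_left_open)    # 600 - over 13x "better", no attention required
```

Any real measurement system would be more sophisticated than this, but the underlying incentive is the same: whatever keeps the clock running scores well, whether or not a human is reading.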

It’s worth noting that Kint is invested in selling this vision of time-based metrics to the market. That doesn’t invalidate what he says out of hand, of course, but it is important to remember that if someone is trying to sell you a hammer they are unlikely to admit that you might also need a screwdriver.

In a conversation on Twitter yesterday Dave Wylie pointed me to a Breaking News post which discusses another time-based metric – time saved. It’s a recognition that most news consumers don’t actually want to spend half an hour clicking around your site: they want the piece of information they came for, and then they want to get on with their lives. Like Google, which used to focus on getting people through the site as fast as possible to what they needed. Or like the inverted pyramid of news writing, which focusses on giving you all the information you need at the very top of the piece, so if you decide you don’t need all the details you can leave fully informed.

There’s a truism in newsroom analytics: the more newsy a day is, the more traffic you get from Google News or other breaking news sources, the less likely those readers are to click around. That doesn’t necessarily mean you’re failing those readers or that they’re leaving unsatisfied; it may in fact make them more likely to return later, if the Breaking News theory holds true for other newsrooms. Sometimes the best way to serve readers is by giving them less.

#jcarn: Dear Santa, please bring us all more time

Given the recent dearth of posts on here, my request in response to this month’s Carnival of Journalism prompt is probably not surprising, though it may be impossible.

Dear Santa, for journo-Christmas I would like more time. Not just for me, but for everyone.

I was lucky enough, recently, to be part of a Guardian hack day. As a result, some awesome tools got built, including three that I started using immediately. They’re still very much in beta, being improved and worked on occasionally, but I use them constantly. They’ve changed my job. Not by giving me new things to do, but by automating some repetitive, tricky, admin bits of the job and therefore making them require less time and attention – so I can spend more time and energy focussing on the bits that really need it.

That’s wonderful. It’s a gift of time. It means I can work smarter, not just harder. I wish, if I have to be limited to one Christmas wish, that every journalist and everyone involved in making journalism – including developers – could have at least one tool, in 2012, that makes the tedious admin bits of their jobs faster. And I hope that every tricky CMS that forces journalists through unnecessary, time-consuming admin processes gets an update in 2012 that strips those processes out.

And, because this isn’t a one-way process, I hope that every journalist takes the initiative to go find out where their techies live and actually talks to them, in person, about the problems they have. There’s no point griping only to each other about the difficult bits, or in keeping quiet and carrying on doing things that don’t make sense: tell developers what’s wrong, because otherwise they won’t know it needs fixing. Sometimes what looks like a tech problem is actually a communication issue, because the people who need to know that something’s broken haven’t been told.

These fixes often aren’t the big, sexy, exciting projects for devs. They’re the sort of thing that, if it exists, you very quickly take for granted. Things like, say, a spellchecker that also flags up common house style violations, or a geolocation module that understands when you type “Norwich” that you want the geographical area defined by the boundaries of the city of Norwich, not a point at the centre of its postcode area. They’re often small niggles that you’d only notice if you’re doing these processes day in, day out, many times a day.

In an age of cutting costs, one of the most precious resources we have left is our time. Anything that saves it, that means it can be spent doing journalism or making tools that journalists can use, instead of busywork, is a wonderful thing.

Oh, and if you work in a place that has admin staff, go say thank you to them. They deserve it.

#jcarn: Workflow hacking

For this month’s Carnival of Journalism, we’ve been challenged to write about life hacks, tips, tools and techniques that help us work smarter and more effectively.

It’s been an interesting one, because it’s forced me to quantify the things I do to try and work efficiently. The things I’m sharing here make me sound like some sort of uber robot journalist geek, which I’m not, really, but trying to follow these principles helps me pretend.

Your job is not your admin

  • Every job has a tedious admin phase you have to deal with every day. But that’s not your real job – it takes time away from doing what you need to do.
  • The most basic ways you can be more awesome involve cutting down on admin time and increasing the time you spend actually working.
  • I keep track of what I do to work out which tasks take up time without contributing anything meaningful. I’ve used RescueTime, Remember The Milk, Epic Win and custom Google Docs to track this in the past.
  • Once I’ve worked out where there’s time to be saved, I start working out how to save it. This is useful admin time.
  • It’s always worth learning keyboard shortcuts for any program I use daily. It saves small chunks of time over and over again.
  • I use a To Do list for big stuff that needs it rather than day-to-day routine things – I’m using Remember The Milk at the moment, but I tend to rotate list apps every few months because otherwise the novelty wears off and I stop using them. I’ve used 2Do, Google Tasks, Outlook Tasks, Doomi, enormous spreadsheets and Epic Win in the past.

Repeated tasks can be automated

  • It’s worth a day of my effort to automate something that takes me more than about 20 minutes a day to do. If it’s an interruption or a flow-breaking task or something I will have to do every day for a year, it’s probably worth more.
  • I think of certain tasks – finding sources on Twitter, for instance, or researching a topic for a story – as building a re-usable resource, not a one-off event. It takes much less effort to build a Twitter list or filter and aggregate a few RSS feeds the first time around, so you can go straight back to your sources if you’re doing a follow-up.
  • I use a lot of dashboards. The new Google Analytics beta lets me customise and keep half a dozen ways of slicing web data at my fingertips, so I can answer common business questions in seconds not hours. iGoogle combined with custom alerts by RSS lets me filter the entire web for certain subjects. Hootsuite and Tweetdeck let me monitor social networks in similar ways.
  • I use macros to automate tasks in Excel and Word. I use Google Docs with various APIs to build a few regular reports, occasionally combined with ScraperWiki. I build a lot of very specific spreadsheets where I can plug in data in a certain format and get back insights very quickly. I try to build things that can be re-used or re-purposed.
  • If there’s a boring repetitive task, there’s almost certainly a plugin or a script somewhere on the internet that’ll help you make it faster or easier. Sometimes those are more work to rewrite/implement than it would be just to get on with it. Other times they’re lifesaving.
  • Greasemonkey can be astonishingly helpful in saving little annoyances (and big ones, sometimes). For instance, I love this script that automatically pushes the “access analytics” button in Google Analytics. It saves one click – but it saves it three or four times every single day.
  • After all that – I do very little coding. I mostly borrow other people’s code and put it to use in new situations.
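The 20-minutes rule in the first bullet above is just break-even arithmetic; a rough sketch (the figures are illustrative, not a formula from any productivity methodology):

```python
# Hypothetical sketch: break-even arithmetic for automating a daily task.

def days_to_break_even(build_hours, minutes_saved_per_day):
    """Working days before an automation repays the time spent building it."""
    return build_hours * 60 / minutes_saved_per_day

# A full day (8 hours) spent automating a 20-minute daily chore:
days_to_break_even(8, 20)  # 24.0 - repaid within about five working weeks
```

Against a working year of roughly 250 days, a day’s build pays for itself many times over – which is why flow-breaking or interruption-heavy tasks justify even more up-front effort than the raw minutes suggest.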

All information can be filtered

  • Twitter lists, search operators and even individual users if they’re focussed on a specific topic of interest. The -RT search operator is fantastic. Topsy’s advanced search is also amazingly powerful. And it has an API, which I haven’t yet worked out how to use to best advantage.
  • RSS folders in Google Reader (or a similar reader service) and combinations and filters using Yahoo Pipes. Postrank is an awesome service that helps you filter popular and engaging content from feeds. Combining Postrank with Pipes gives you neat automatic filters.
  • Google alerts, especially using advanced search terms – you can use site:youtube.com with keywords to build a video alert service, for instance.
  • Google custom search – great for checking whether anyone’s covered a particular story, or for working out who on your beat is talking about a certain subject – just give it a list of links.

Interruptions can be limited

  • I use rules in Outlook to limit the number of times I see email alerts – I have several set up to filter out various levels of noise, including a white-list for emails most likely to need urgent responses. It was well worth the time spent setting these up – if every pop-up on-screen is only 5 seconds of attention, I’ve still saved more than 5 minutes a day.
  • I use rules in Gmail to sort incoming mail by priority, and use the email game to deal with it all in small bursts, quickly and efficiently, when it’s convenient rather than when a mail comes in.
  • I turn off email notifications for sites I visit every day anyway. I set up as much as possible to come via RSS (where I can filter it using Yahoo Pipes and categorise it in a sensible folder) or via Twitter (where its immediate impact is limited to 140 characters).
  • When I need to focus, I stay away from Tweetdeck completely. I have a 2-column view in Hootsuite with nothing but mentions and direct messages, so I can see anything requiring urgent responses at a glance. I turn my iPhone off.

Waiting kills productivity

  • If a task I do regularly is governed by a set of rules and involves waiting for something to happen, I do my best to automate it away. I win twice.
  • If I’ve got to do something that involves waiting, I plan for the wait: go take a break, stretch, do a simple time-limited task.
  • I have a folder of RSS feeds from folks who write short, and I read a couple while I wait. And I have Reeder on my iPhone, for long out-of-the-office waits (some people call them “commutes”).
  • I save up several stop-start tasks and use them as a “distraction loop” – taking each one in turn and switching when a wait starts.

What do you do to hack your workflow? What tools do you use to simplify the stuff that doesn’t matter and help you spend more time on the stuff that does?