Where is the commissioning editor for the Roger Ebert of games?

Warren Spector’s latest GI column asks: where is the Roger Ebert of gaming? He bemoans the lack of accessible, consistent writing about games in mainstream media, aimed at broad rather than specialist audiences. The key passages are a call to action:

I’m not saying reaching an audience that doesn’t know enough to take games seriously will be easy. I’m for sure not finding fault with people currently trying to accomplish this difficult goal. I’m just saying we need to continue working, and working harder, to bring more writers and thinkers into the area between Reviewers and Academia. We can’t be complacent and say, “Aw, what we got is good enough.”

Let’s inundate the bookshelves, magazine sections and the web with work that isn’t above (or below) the heads of readers. Only in that way will we achieve the level of respect I believe we deserve. Only in that way will we create an audience more demanding of the medium, which will inevitably lead to different and, I’d argue, better games.

This kind of treatment – as exemplified by the Times articles mentioned above – would do games a world of good. Establishing games in the public mind as something good and worthy and serious, and not just “fun for kids, but not for me” seems important to me. It’s important to developers, publishers, players and maybe even to – for want of a better word – enemies who might come to a more nuanced understanding of our medium.

Frankly, if games are not up to this sort of critical analysis then maybe they are just a way to provide some thrills and chills or some time away from real world problems, as our critics (in still another sense of the word) contend.

I agree, broadly, with the sentiment of the piece – that there is not enough mainstream game criticism of explanation, rather than of evaluation – but the issue is not that we do not have one or many Roger Eberts. It’s that we don’t have Roger Ebert’s editors. In the English-speaking world, we don’t have a mainstream press that commissions these pieces consistently from the many talented critics who are already doing this work. We have a mainstream press, for the most part, that commissions very short reviews with evaluative ratings on only the very biggest, most blockbusting titles, or that syndicates specialist content written for gamer audiences rather than for the general public. We have a mainstream media that doesn’t want to – or can’t – pay excellent writers properly to produce excellent work, or promote that work appropriately when it appears. (That’s a sweeping generalisation, of course. There are many exceptions and many outlets where this is changing. But there aren’t enough.)

Outside the specialist press, the enthusiast press and the academic press, accessible games criticism is not reaching the audience it deserves because it’s not being widely commissioned or published in mainstream publications. It’s not that there’s no demand from audiences – the proliferation of intelligent and accessible work on Tumblr, on personal blogs, and elsewhere is testament to the voracity of that demand. It’s not that there are no writers capable of such accessibility, insight and excellence; there are dozens.

It is, however, about the business and budgetary crises in mainstream media. It’s about the gradual shift away from gamer-as-identity to gaming-as-mainstream-pastime, as more people play games and fewer think of game-playing as a fundamental element of their personality. It’s about the youth of video games and video game writing alike as creative media, and their dual resistance to external critique. And it’s about the shift in thinking involved in situating video games as culture and entertainment when historically mainstream media has covered them as technology. These are all issues that time will solve, one way or the other.

This month’s Blogs of the Round Table topic over on Critical Distance asks what the future of video game blogging is, and this can serve as my response: the future of video game blogging is mainstream. At the moment – like it or not – the best, most accessible, most interesting games writers are freelancers, working for niche outlets and writing for themselves. The future for video game blogging is a mass audience. And hopefully better pay along with it.

[I can’t make the iframe link list to the other BoRT blogs work here – but you should go here and read the others too.]

News making money

Ryan McCarthy, at Reuters:

But if you’re working in media now you shouldn’t be worried about getting your website to hit 20 or 30 million uniques — if ad rates continue to fall, even websites of that size may not be economically viable. Instead, media companies should be doing everything they can to improve the economic value of their work (which may not mean more pageviews).

For those of us actually working in web journalism, this adds yet another layer of existential angst. Journalists certainly shouldn’t spend their time worrying about how to make their articles more attractive to advertisers.

The whole article is worth a read, but I don’t think its conclusions quite hold true. For one thing, there are more ways of selling ads than simple CPM, from more careful targeting to real-time bidding to TV-style channel takeovers at busy periods. Some of those share CPM’s oversupply problem when it comes to pure growth, but for others the size of the market matters enormously when combined with good user data.
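
For a sense of the stakes, here’s a back-of-envelope sketch of the CPM arithmetic behind McCarthy’s warning; every figure in it is an illustrative assumption, not a number from his piece.

```python
# Back-of-envelope display-ad revenue at falling CPMs. All figures assumed.
MONTHLY_UNIQUES = 25_000_000   # a "big" site on his 20-30 million measure
PAGES_PER_UNIQUE = 4           # assumed visit depth
ADS_PER_PAGE = 3               # assumed ad slots per page
SELL_THROUGH = 0.6             # assumed share of impressions actually sold

def monthly_revenue(cpm: float) -> float:
    """Revenue at a given CPM (price per thousand impressions)."""
    impressions = MONTHLY_UNIQUES * PAGES_PER_UNIQUE * ADS_PER_PAGE * SELL_THROUGH
    return impressions / 1000 * cpm

for cpm in (4.0, 2.0, 1.0):    # falling ad rates
    print(f"CPM ${cpm:.2f}: ${monthly_revenue(cpm):,.0f} a month")

# $720,000 -> $360,000 -> $180,000 a month, before serving costs, sales
# commission and payroll: the squeeze on even very large sites.
```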

Secondly, maybe journalists should think about the value of their articles, as well as their other attributes – or if not the journalists themselves, at least someone on the editorial side. The nature of journalism online is a fascinating crossover of popularity, importance, usefulness and financial value, and every news organisation weighs those criteria differently. But if you build your business only on the first three, and ignore the last one, then eventually you don’t have a business at all.

10 things I learned from a web traffic spike

[Screenshot: Look, Robot’s WordPress stats]

Last week, my other half wrote a rather amusing blog post about the Panasonic Toughpad press conference he went to in Munich. He published on Monday afternoon, and by the time he went out on Monday evening the post had had just over 600 views. I texted him to tell him when it passed 800, making it the best single day in his blog’s sporadic, year-long history.

The next day it hit 45,000 views, and broke our web hosting. Over 72 hours it got more than 100,000 views, garnered 120 comments, was syndicated on Gizmodo and brought Grant about 400 more followers on Twitter. Here’s what I learned.

1. Site speed matters

The biggest limit we faced during the real spike was CPU usage. We’re on Evohosting, which uses shared servers and allots a certain amount of usage per account. With about 180-210 concurrent visitors and 60-70 page views a minute, according to Google Analytics real-time stats, the site had slowed to a crawl and was taking about 20 seconds to respond.

WordPress is a great CMS, but it’s resource-heavy. Aside from single-serving static HTML sites, I was running Look Robot, this blog, Zombie LARP, and, when I checked, five other WordPress installations that were either test sites or dormant projects from the past and/or future. Some of them had caching on, some didn’t; Grant’s blog was one of the ones that didn’t.

So I fixed that. Excruciatingly slowly, of course, because everything took at least 20 seconds to load. Deleting five WordPress sites, deactivating about 15 or 20 non-essential plugins, and installing WP Super Cache sped things up to a load time between 7 and 10 seconds – still not ideal, but much better. The number of concurrent visitors on site jumped up to 350-400, at 120-140 page views a minute – no new incoming links, just more people bothering to wait until the site finished loading.
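
For the curious, here’s a minimal sketch of what a page cache like WP Super Cache does conceptually – serve a pre-rendered static copy rather than rebuilding the page on every hit. The plugin itself is PHP; this Python version and all its names and numbers are purely illustrative.

```python
import os
import time

CACHE_DIR = "/tmp/page-cache"  # illustrative cache location
TTL = 300                      # seconds before a cached page goes stale

def render_page(path: str) -> str:
    """Stand-in for the expensive CMS work: DB queries, templates, plugins."""
    time.sleep(2)  # simulate heavy rendering
    return f"<html><body>Rendered {path} at {time.ctime()}</body></html>"

def serve(path: str) -> str:
    """Serve from cache when fresh; only the first visitor pays full price."""
    name = path.strip("/").replace("/", "_") or "index"
    cache_file = os.path.join(CACHE_DIR, name)
    if os.path.exists(cache_file) and time.time() - os.path.getmtime(cache_file) < TTL:
        with open(cache_file) as f:   # fast path: static file, no CMS work
            return f.read()
    html = render_page(path)          # slow path: rebuild and store
    os.makedirs(CACHE_DIR, exist_ok=True)
    with open(cache_file, "w") as f:
        f.write(html)
    return html
```

With a cache like this in front of the CMS, a traffic spike mostly hits the fast path – which is why installing WP Super Cache made such a difference.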

2. Do your site maintenance before the massive traffic spike happens, not during

Should be obvious, really.

3. Things go viral in lots of places at once

Grant’s post started out on Twitter, but spread pretty quickly to Facebook off the back of people’s tweets. From there it went to Hacker News (where it didn’t do well), then Metafilter (where it did), then Reddit, then Fark, at the same time as sprouting lots of smaller referrers, mostly tech aggregators and forums. The big spike of traffic hit when it was doing well from Metafilter, Fark and Reddit simultaneously. Interestingly, the Fark spike seemed to have the longest half-life, with Metafilter traffic dropping off more quickly and Reddit more quickly still.

4. It’s easy to focus on activity you can see, and miss activity you can’t

Initially we were watching Twitter pretty closely, because we could see Grant’s tweet going viral. Being able to leave a tab open with a live search for a link meant we could watch the spread from person to person. Tweeters with large follower counts tended to be more likely to repost the link rather than retweet it, and often did so without attribution, making it hard to work out how and where they’d come across it. But it was possible to track back individual tweets based on the referrer string, thanks to the t.co URL wrapper. From some quick and dirty maths, it looks to me like the more followers you have, the smaller the click-through rate on your tweets – but the greater the likelihood of retweets, for obvious reasons.
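
As a rough illustration of those quick and dirty sums, here’s the shape of the calculation; the accounts and figures below are made up, and in practice the click counts would come from matching t.co referrer strings in your analytics.

```python
# Hypothetical per-tweet click-through rates from t.co referrer counts.
# (tweeter, follower count, clicks attributed to that tweet's t.co link)
tweets = [
    ("small_account", 400, 60),
    ("mid_account", 12_000, 900),
    ("big_account", 250_000, 7_500),
]

for name, followers, clicks in tweets:
    ctr = clicks / followers
    print(f"{name}: {followers:,} followers, {clicks:,} clicks, CTR {ctr:.1%}")

# Output: 15.0%, 7.5%, 3.0% - click-through falling as follower count
# rises, the pattern described above.
```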

Around midday, Facebook overtook Twitter as a direct referrer. We’d not been looking at Facebook at all. Compared to Twitter and Reddit, Facebook is a bit of a black box when it comes to analytics. Tonnes of traffic was coming in, but who from? I still haven’t been able to find out.

5. The more popular an article is, the higher the bounce rate

This doesn’t *always* hold true, but I can’t personally think of a time when I’ve seen it falsified. Reddit in particular is a very high-bounce referrer, due to its nature, and news as a category tends to see very high bounce rates, especially on article pages. Still, it does seem to hold that the more popular something is, the more likely people are to leave without reading further. Look, Robot’s bounce rate went from about 58% across the site to 94% overall in 24 hours.
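
That jump is mostly simple arithmetic: a flood of one-page visits swamps the baseline. Here’s a toy example, with session counts invented to fit the quoted rates:

```python
# Blended bounce rate when spike traffic swamps the baseline.
# The session counts are assumptions chosen to match the quoted percentages.
baseline_sessions = 2_000
baseline_bounce = 0.58       # the site's normal rate
spike_sessions = 45_000
spike_bounce = 0.956         # social/aggregator visitors: one page, then gone

total_sessions = baseline_sessions + spike_sessions
total_bounces = (baseline_sessions * baseline_bounce
                 + spike_sessions * spike_bounce)
print(f"Blended bounce rate: {total_bounces / total_sessions:.0%}")  # 94%
```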

My feeling is that this is down to the ways people come across links. Directed searching for information is one way: that’s fairly high-bounce, because a reader hits your site and either finds what they’re looking for or doesn’t. Second clicks are tricky to get. Then there’s social traffic, where a click tends to come in the form of a diversion from an existing path: people are reading Twitter, or Facebook, or Metafilter, they click to see what people are talking about, then they go straight back to what they were doing. Getting people to break that path and browse your site instead – distracting them, in effect – is a very, very difficult thing to do.

[Chart: Look, Robot’s referrers – the head of a rather long tail.]

6. Fark leaves a shadow 

Fark’s an odd one – not a site that features frequently in roundups of traffic drivers, but it can still be a big referrer for unusual, funny or plain daft content. It works like a sort of edited Reddit – registered users submit links, and editors decide what goes on the front page. Paying subscribers can see everything that’s submitted, not just the edited front. I realised Grant was about to get a link from their Geek front before it happened, when the referrer total.fark.com/greenlit started showing up in incoming traffic – that URL, behind the paywall, is where approved links are queued to go out on the fronts.

7. The front page of Digg is a sparsely populated place these days

I know that Grant’s post sat on the front page of Digg for at least eight hours. In total, it got just over 1,000 referrals. By contrast, the post didn’t make it to the front page of Reddit, but racked up more than 20,000 hits, mostly from r/technology.

8. Forums are everywhere

I am always astonished at the vast plethora of niche-interest forums on the internet, and the amount of traffic they get. Much like email, they’re not particularly sexy – no one is going to write excitable screeds about how forums are the next Twitter or how exciting phpBB technology is – but millions of people use them every day. They’re not often classified as ‘social’ referrers by analytics tools, despite their nature, because identifying what is and isn’t a forum is a pretty tricky task. But they’re everywhere, and while most have only a few users, in aggregate they drive a surprising amount of traffic.

Grant’s post got picked up on forums on Bad Science, RPG.net, Something Awful, the Motley Fool, a Habbo forum, Quarter to Three, XKCD and a double handful of more obscure and fascinating places. As with most long tail phenomena, each one individually isn’t a huge referrer, but the collection gets to be surprisingly big.

9. Timing is everything…

It’s hard to say what would have happened if that piece had gone up this week instead, but I don’t think it would have had the traffic it did. Grant’s post struck a chord – the ludicrous nature of tech events – and tapped into post-CES ennui and the utter daftness that was the Qualcomm keynote this year.

10. …but anything can go viral

Last year I was on a games journalism panel at the Guardian, and I suggested that it was a good idea for aspiring journalists to write on their own sites as though they were already writing for the people they wanted to be their audience. I said something along the lines of: you never know who’s going to pick it up. You never know how far something you put online is going to travel. You never know: one thing you write might take off and put you under the noses of the people you want to give you a job. It’s terrifying, because anything you write could explode – and it’s hugely exciting, too.

Why liveblogs almost certainly don’t outperform articles by 300%

In response to this study, linked to by journalism.co.uk among many others.

  1. The sample size is 28 pieces of content across 7 news stories – and that content includes liveblogs, articles and picture galleries. That’s a startlingly small number for a sample that’s meant to be representative.
  2. The study does not look at how these stories were promoted, or whether they were running stories (suited to live coverage), reaction blogs, or other things.
  3. The traffic sample is limited to news stories, and does not include sports, entertainment or other areas where liveblogs may be used and which may have different traffic profiles.
  4. The study compares liveblogs, which often take a significant amount of time and editorial resource, with individual articles and picture galleries, some of which may take much less time and resource. If a writer can create four articles in the time it takes to create a liveblog, then the better comparison is between a liveblog and the equivalent number of individual, stand-alone pieces.
  5. The study is limited to the Guardian. There’s no way to compare the numbers with other publications that might treat their live coverage differently, so no way to draw conclusions on how much of the traffic is due to the way the Guardian specifically handles liveblogs.
  6. The 300% figure refers to pageviews. Leaving aside the fact that pageviews are not necessarily the best metric for editorial success, the Guardian’s liveblogs autorefresh, inflating their pageview figures – see the sketch after this list.
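
To see how much autorefresh alone could distort a pageview comparison, here’s a toy calculation; the refresh interval and dwell time are invented assumptions, not Guardian figures.

```python
# Toy model of autorefresh pageview inflation. Both inputs are assumptions.
refresh_interval_min = 2   # assumed autorefresh period for a liveblog
avg_dwell_min = 14         # assumed time a reader keeps the liveblog open

article_views_per_visit = 1
liveblog_views_per_visit = 1 + avg_dwell_min // refresh_interval_min

print(liveblog_views_per_visit)  # 8 pageviews from a single reader
```

On those made-up numbers, one liveblog reader registers eight pageviews where an article reader registers one – more than enough headroom to account for a 300% gap on its own.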

All that shouldn’t diminish the study’s other findings, and of course it doesn’t mean that the headline figure is necessarily wrong. But I would take it with a hefty pinch of salt.

Requesting politely to stay in the dark will not serve journalism

At Salon, Richard constantly analyzed revenue per thousand page views vs. cost per thousand page views, unit by unit, story by story, author by author, and section by section. People didn’t want to look at this data because they were afraid that unprofitable pieces would be cut. It was the same pushback years before with basic traffic data. People in the newsroom didn’t want to consult it because they assumed you’d end up writing entirely for SEO. But this argument assumes that when we get data, we dispense with our wisdom. It doesn’t work that way. You can continue producing the important but unprofitable pieces, but as a business, you need to know what’s happening out there. Requesting politely to stay in the dark will not serve journalism.

– from Matt Stempeck’s liveblog of Richard Gingras’s Nieman Foundation speech

Aggregation – a substitute newspaper?

I’m not sure that I completely agree with Scott Fulton’s conclusion in this piece, but it’s well worth a read nonetheless. On the difference between Google and journalism:

News has always been a loss leader; it’s the thing publishers provide to make the real products they used to sell timely, interesting and competitive. It’s literally the sugar coating.

The Internet commandeered the services that newspapers once championed and delivered each of these services on an a la carte basis. In an earlier era, it made sense to bundle these services in a single package – the newspaper – and deliver it fully assembled. Today, the Web itself is the package, and each of the services now competes against other similar services in separate, often healthy, markets. And this is as it should be – this is not somehow wrong.

But it leaves local news providers with only the container, abandoning them with the task of making a living from the news alone. What’s worse, it thrusts them into a market with tens of thousands of journalistic ventures of all sizes, all of which have charged themselves with the same objective: building a business model around solely the news. What gives all these services a bit of a reprieve, albeit temporary, are Google News and the other aggregators in its category. Aggregators serve not only as front pages for a multitude of news services, but by bundling them together and giving them the illusion of plurality, aggregators substitute for the missing thunder of the press. The end product is not exactly editorial, but if you squint, there are moments when it reminds you of something that might have been editorial once.

Journalism online has a distribution problem. Unlike a road network, Google isn’t a neutral network through which news can be pushed; unlike hauliers and newsagents, social networks don’t exist primarily to distribute our news but have their own purposes and uses that sometimes conflict with ours. As the Mail Online prepares to turn its first profit, there is a wider argument playing out about whether journalism can or should be valued by how well and widely it is distributed – for display-ad-driven models this question is particularly acute. And Google, as a display ad provider, potentially profits twice, by being the primary distributor as well.

For news, Google is a distributor trying to make the product fit its network. (In other areas too – Schema.org microdata, authorship markup and other elements of Google+ spring to mind.) Though it’s certainly useful – I would argue vital to most news sites – it’s not the only way to distribute news, and for some sites it’s not the dominant method. Google is competing with email, social networks or even direct traffic to be the primary access method. Of course, then, it wants access to news and other content in a form that’s easy for it to parse and display. No wonder it fell out with Twitter and Facebook.

To my mind, this is the quote that gets to the heart of it:

Like it or not, aggregation is an interim solution. It’s a kludge that satisfies an immediate need in the short-term; it’s a substitute newspaper.

Google News is the best of what we’ve got now. It’s not necessarily what’s best for news. It’s certainly not where we’re going to end up.

Journalists and dickishness

Are journalists dicks? Lyra McKee wrote a rather interesting post on the subject, suggesting that many new journalists – and tech journalists in particular – are more about the ego than the story, and that while that can be good for their profiles, their work suffers as a result. I came across the post via John Thompson on Twitter, and it spawned a rather fascinating (if meta and navel-gazing) conversation on the subject, which I’ve Storified below.

My personal opinion has long been that being very good at anything creative and public (both of which journalism certainly is) tends to involve both a large ego and a deep well of insecurity. Going out in public and proclaiming that what you’re doing is worth someone’s time and attention – that your work is important – requires a certain brash self-confidence. But being ambitious and driven more often than not means being terrified that one day what you do won’t be worthy – and that means a constant anxiety and need to prove yourself, sometimes at the expense of niceties. The combination makes for fascinating, creative people who combine seemingly incompatible traits – thick skin and vulnerability to criticism – with deep insight, blinding intelligence, common sense, a work ethic that would make an ox blush and myriad other laudable traits. Sometimes that means a bit of dickishness, too.

Stop blaming the internet for rubbish news content

Newspapers and newsrooms generally have always striven to publish stories that are important, interesting, informative and entertaining. Not every newsroom puts those in the same order or gives them the same weight. But the internet hasn’t changed that much.

The unbundling effects of the net mean that instead of relying on the front page to sell the whole bundle, each piece has to sell itself. That can be hard; suddenly the relative market sizes for different sorts of content are much more starkly visible, and for people who care more about important/interesting/informative than entertaining, that’s been a depressing flood of data. But the internet didn’t create that demand – it just made it more obvious. Whether we should feed it or not is an editorial question. Personally, I think it’s fine to give people a little of what they want – as long as a newsroom is putting out informative and important stories, a few interesting and entertaining ones are good too, so long as they’re not lies, unethically acquired or vicious.

If you spend a lot of time online you will see a filter bubble effect, where stories from certain news organisations are not often shared by your friends and don’t often turn up in your sphere unless you actively go looking for them. That means the ones that break through will be those that outrage, titillate or carry such explosive revelations that they cannot be ignored. That does not mean those stories are the sum total output of a newsroom – any more than the 3AM Girls are the sum total of the Mirror in print – but those pieces attract a new audience and serve to put that wider smorgasbord of content in front of them (assuming the article pages are well designed).

Of course, some news organisations publish poor stories – false, misleading, purposefully aggravating or just badly written – in the name of chasing the trend. That’s also far from an internet-only phenomenon. The Express puts pictures of Diana on the front, and routinely lies for impact in its headlines. The Star splashes on Big Brother 10 weeks running. The editorial judgement about the biggest story for the front is about sales as much as it is newsworthiness. Sometimes those goals align. Sometimes they don’t, and editors make a choice.

It is ridiculous to blame the internet for the publishing of crap stories to chase search traffic or trend-based clicks – just as it’s ridiculous to blame the printing press for the existence of phone hacking. In both cases it’s the values and choices of the newsroom that should be questioned.

What is a blog, anyway?

This post by Andy Boyle seems to have struck a nerve on Twitter today. It exhorts news organisations to stop referring to things they produce as blogs just because they use a different CMS or are branded differently from regular content. While I don’t think it quite applies across the board – this, for instance, is definitely a blog – Andy makes some very good points.

Sadly, blogs brought along a stigma that people still use – which is wrong – that they’re done by people in their pajamas in a basement somewhere. Blogs are not the same as regular news content, some media folks thought, because they weren’t in your “main” CMS. They had a wall between them and they are different. They may even be branded differently, with a different header and logo. They weren’t the same as regular content because they were in a different system! Right?

Wrong.

It’s time to stop bifurcating your content as blogs and news because they run on separate systems. It is all content, so why not call it that? Even if you have outside people writing posts on your website that are unmoderated by your staff — that’s still content that’s part of your media outlet’s website. I don’t have any research proving this, but in my short journalism career many media outlets just slapped the name “blog” on something because it lived in a different CMS. We should stop this. Please.

While I don’t have any hard stats or user testing data on how readers react to the word “blog”, my gut instinct is that their reading of it is very different from the way news organisations tend to use the term. To a newsroom, the word blog might signify a lighter tone than news or feature. It might imply a home for specialised subject matter that might not fit with the rest of the site. It might be used to signify a linked, ongoing set of posts, like the word “series”. It might mean “something done through WordPress” or “something put online without subbing first” or “a side project we give the juniors to prove themselves”. To some, in some newsrooms, it almost certainly means “not proper journalism”, despite the (somehow, still ongoing) conversations about whether bloggers can be journalists.

The question is what it means to our readers. My fear is that for them it may have more resonance with the meanings towards the end of that little list than the ones at the start. Blog shouldn’t be a dirty word or one that’s used to put down the effort of the people creating something – but in the minds of many, at the moment it still is. It’s important to set readers’ expectations by what’s on the page, but we don’t need to distinguish web-only or web-first or even tone in this way – there are other words that might make just as much sense to us, and even more to readers.