For the past ten years or so, I’ve been working on a software project to assess stylistic similarity automatically, and at the same time, test different stylistic features to see how well they distinguish authors. De Morgan’s idea of average word lengths, for example, works – sort of. If you actually get a group of documents together and compare how different they are in average word length, you quickly learn two things. First, most people are average in word length, just as most people are average in height. Very few people actually write using loads of very long words, and few write with very short words, either. Second, you learn that average word length isn’t necessarily stable for a given author. A letter to your cousin will use a different vocabulary from a professional article to be published in Nature. So it works, but not necessarily well. A better approach is not to use average word length, but to look at the overall distribution of word lengths. Still better is to use other measures, such as the frequency of specific words or word stems (e.g., how often did Madison use “by”?), and better yet is to use a combination of features and analyses, essentially analyzing the same data with different methods and seeing what the most consistent findings are. That’s the approach I took.
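To make that concrete, here’s a toy sketch of what comparing those features might look like. This isn’t my actual software – just a minimal illustration in Python, with made-up snippets standing in for real documents – but it shows why a single average is blunter than a full distribution:

```python
# A toy stylometry comparison: not the real system, just a sketch.
from collections import Counter
import re

def word_features(text, target="by"):
    """Pull three simple stylistic features from a text:
    average word length, the distribution of word lengths,
    and the relative frequency of one target word."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)  # assumes the text contains at least one word
    lengths = Counter(len(w) for w in words)
    return {
        "avg_word_length": sum(len(w) for w in words) / total,
        # word-length proportions, e.g. {2: 0.18, 3: 0.25, ...}
        "length_dist": {k: v / total for k, v in lengths.items()},
        "target_freq": words.count(target) / total,
    }

def length_dist_distance(a, b):
    """Crude distance between two word-length distributions:
    the sum of absolute differences across all lengths seen."""
    keys = set(a) | set(b)
    return sum(abs(a.get(k, 0.0) - b.get(k, 0.0)) for k in keys)

doc_a = "The powers delegated by the proposed constitution are few and defined."
doc_b = "It was a bright cold day in April, and the clocks were striking thirteen."
fa, fb = word_features(doc_a), word_features(doc_b)

print(abs(fa["avg_word_length"] - fb["avg_word_length"]))         # one blunt number
print(length_dist_distance(fa["length_dist"], fb["length_dist"]))  # richer comparison
print(abs(fa["target_freq"] - fb["target_freq"]))                  # how often "by" turns up
```

Real stylometry piles on many more features and more principled distance measures, and cross-checks them against each other, but the shape of the comparison is the same.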
It’s interesting not just for its insight into a field that rarely comes into the public eye, but also for what’s written between the lines about how authors write. It suggests that, unless we really make an effort to disguise it, most writers have a linguistic fingerprint of sorts: a set of choices that we tend to make in roughly similar ways, often enough for a machine to notice when taken in aggregate. A writer’s voice goes beyond stylistic choices, genre and word choice, and comes down to the basic mechanics of the language they use.
I used to blog all the time. I used to have a serious writing work ethic. I’ve blogged in many formats under multiple names since 2004, or thereabouts, which makes me a bit of a youthful whippersnapper by the standards of some corners of the internet. But it’s nearly a decade now, and that’s too much waffling on the internet to throw away just because I’m busy.
I’m out of practice. I’m rusty. I used to write for a living; now I’m more on the production side, and my writing is suffering for lack of daily use. This is not a muscle I should allow to atrophy.
Side projects are brilliant, and I like to have at least six on the go at the same time, because there is something wrong with me. My blog hasn’t been on that list for far too long.
Writing things through is a superb way to refine an argument, distill an insight or open a debate. Writing makes me better at thinking.
I used to yammer on about how important it was for a digitally-savvy journalist to have a blog and get themselves out there on the wide wide interwebs. Just because I’m happy with where I’m at, and pouring a great deal of energy and inspiration and activity into my day job, doesn’t mean that advice doesn’t still hold true.
I know a whole bunch of stuff about some extremely esoteric internet subjects now. Maybe some people might find some of that useful. I should share.
I have a whole bunch of opinions and knowledge about games and other culture, about storytelling more broadly, about politics and events. I have a lot of experience of making game things in liminal spaces. Maybe some people might find some of that interesting. It can all share space together with the media stuff here and cross-pollinate, the way it does in my brain.
One of the things making Detritus has viscerally re-taught me (more on that in an upcoming blog post!) is that what actually matters in my personal work is making things. If they’re well received and widely read, that’s absolutely brilliant. But what matters more is that they exist at all.
I didn’t want to do ten days of blogging every day on my own, so I sort of challenged Grant. He’s a far, far better and more entertaining writer than I am, and I really enjoy reading what he writes. I’m basically just doing this so that he writes more. It’s entirely selfish.
Three years ago today I got married. We had a secular ceremony at the registry office in Norwich, and each wrote words for the moment when we exchanged rings. Both being writer types, it all got a bit competitive. This is what I ended up with.
I love you. Those three words have my whole life in them.
My eyes see through those words, and the world is changed and made wider and more beautiful, more precious, because everything is touched with that love.
My arms are full of those words and waiting to give them to you every morning, every evening, every day that I am lucky enough to hold you, for ever.
My heart sings those words every morning as it thumps in my chest, a triple beat greeting the morning with joy because you are in it.
My legs run home every night to those words. My feet pound the streets to those words.
Those three small words are a shield for my back and a shelter from the rain in hard times.
Those three small words are the snow falling on my upturned joyous face and the sun shining to wake my sleeping skin.
My tongue tastes those words, shapes my speech through those words. Every word I speak to you has those three words in it too.
My hands are shaped around those words. There is no gift greater than those words that I can give to you.
So I give you this ring which is not a circle but those three words made solid, those three words with my whole life in them.
Taylor Clark has a storming piece up on Kotaku today. He’s right: most popular video games are dumb. And that’s fine, so long as we don’t assume that’s the only thing games can do.
To accept childish dreck without protest – or worse, to defend the dreck’s obvious dreckiness just because the other parts of a game are cool – is to allow the form to languish forever.
Yes. Preach it. Preach it also to readers who love Dan Brown’s fiction in spite of the writing, and everyone who overlooks the hour-long goodbye scenes at the end of the Lord of the Rings films.
Most popular things are dumb, not just video games
Video games are not unique in being collaborative creations in which many elements are brought together to form a whole; nor are they alone in being often poorly integrated, with areas of brilliance marred by areas of dreck (or indeed whole areas of dreck occasionally elevated by moments of brilliance). All media have these problems.
But video gaming is such a small field at present. Our examples of brilliance and of dreck come from a depressingly limited pool of options, especially when we examine big-budget titles. Truly stand-out works in any field are rare. Most media play to the majority. In video gaming, it is the mindless that has proven to sell well – so mindless most games remain.
Sturgeon’s Law (90% of everything is crap) applies not just to things being bad, but also to things being dumb, crude, silly. It’s not just video games; it’s also everything else. There shouldn’t be any shame for gamers in saying: yes, a lot of games are dumb. A lot of everything is dumb. A lot of dumb things are fun.
But Clark’s right that by saying video games can only be dumb, we’re doing the medium a great disservice. In the 18th century there was a widely held perception that novels could only be dumb, until classics began to emerge and a canon formed. Video gaming has been around for a much shorter time and has much farther to go before it reaches maturity – technology is still not stable, barriers to entry are still falling rapidly, the business model is still all over the place, and all those things impact the kinds of games that are produced and the processes by which they’re made. But video games can, and should, aspire to greatness, both mechanically and narratively – and ideally, both at once.
it is extremely difficult – maybe impossible – to come up with a story and characters that, when placed within the context of most current video games, don’t feel inherently silly
Most current video games are inherently silly, therefore it’s impossible to put anything on top of the silliness to produce something that’s less silly. Well – yes. There’s an assumption here about the place of writing, story and characterisation in games – that it’s not an inherent part of the context of games, but rather something added on top. But if you start from the premise that your game is about hyperviolent destruction of mythical monsters, you’ve made a lot of decisions about the story and the characterisation already. Even the best writers won’t be capable of making a game deep, believable, complex or realistic if the gameplay is fighting against that narrative at every turn. See also: GTA4.
Gameplay and narrative shouldn’t simply inform each other. They should be inextricable from each other. Games that aspire to being well written can’t just plaster story on top of mechanic like wallpaper. It has to be mixed into the mortar, built into the foundations. It doesn’t matter whether you’re gunning for embedded or emergent story, froth or experiential narrative or whatever – you can’t slap it on top of gameplay like an afterthought, because gameplay mediates the entire experience.
If you’re playing a different story than the one you’re being told, then the game can’t attain that coveted, if ill-defined, goal of comprehensive intelligence. It’ll always be fractured; no matter how carefully the cracks are hidden, it won’t ring true.
And writers? Well, they need to find a use for what they do, I guess. Because a story for its own sake written from a single point of view – digital or otherwise – is increasingly looking like it isn’t enough.
Journalists are facing down this problem online, now, as well as creative writers and other sorts of digital storytellers. In a way, it’s comforting to remember it’s not just written news but all sorts of writing that’s wrestling with these questions. And it’s also comforting to remember that things like Instapaper, the Long Good Read, Longreads and a vast array of others are whirring away, proving that for many people, yes, a written story is enough.
Yesterday was Gamecamp 4, the first one I’ve been to, and I had a properly fantastic time. Some excellent sessions, some fascinating conversations, and some surprisingly forgiving zombies made it a great day.
Here’s what I took from the day.
We like stories in our games, and we like games in our stories, but not all games (or stories) need both.
Boss fights interrupt flow, but can be used to build interesting characters. They can be frustrating (Metal Gear Solid), but when they’re done well and foreshadowed properly, they can also be hugely satisfying (Limbo).
Free play without structure isn’t a game.
Digital games suck at relationships.
A lot of digital games writing sucks, full stop.
Romance and sex in games are two very different things with different problems to be solved.
Some problems being tackled by digital game folks have already been solved by live game folks, and vice versa.
When under attack, people seem to instinctively try to get to high ground. When high ground is not available, they use tables.
Lemon jousting is harder than it looks.
Mechanically, World of Warcraft and Farmville are (depressingly?) similar.
We like our extrinsic motivators without coercive social marketing practices.
Gamification isn’t particularly interesting to people who already make games.
My working definition of emergent stories – stories created by players interacting with game mechanics without a designer getting in the way – is flawed, hugely flawed, but works OK for demonstration purposes.
Emergent stories need space to emerge. People make up stories to fill gaps.
Story can be constructed after experience, collaboratively.
Someone has already run an art heist game in a museum. I really hope they do it again. Soon.
Museums, like news organisations, need help making good games with few resources.
The Keyworth building at London South Bank Uni would be an excellent venue for a full-scale game of Zombie.
The unconference format just works. No bit of my day was boring or slow or non-interactive. I went to half a dozen really interesting talks, and missed about a dozen more, and that’s fine.
If I hadn’t failed repeatedly, I wouldn’t be a journalist. This is all a bizarre accident.
See, I never wanted to be a journalist. (Blasphemy!) I remember deciding when I was about 9 that if I did become a journalist I would write for the Guardian or the Independent but definitely not the Daily Mail because it was rubbish, but all that was obviously only a back-up plan. I was going to be a Writer.
So I grew up a bit, wrote a lot, won at school, won at being homeless and failed at being sane, and eventually dealt with that enough to pack up and get to university for a literature and creative writing degree. I did my best to become a Writer by arranging words in attractive orders as much as humanly possible. I held down a part-time job designing books, copy editing, typesetting and occasionally redesigning the perspex plates on the front of all the postboxes in the UK, which at the very least meant that millions of people read my work every day.
And then came graduation, and the growing realisation that I had literally no idea how to be a Writer and still afford to eat. I applied to two post-grad courses, one in creative writing and one in literature, and failed at both. I went for editorial jobs at Oxford University Press and Taylor & Francis and loads of smaller places, and failed – in fact I failed at more than 50 job applications in three months, that summer.
Around this time I split up with my long-term partner, and moved out of the house we shared, and while sleeping on other people’s sofas I spotted a job ad for Trainee Journalists for the Eastern Daily Press in Norwich where I was living and I thought, well, at this point, the part-time job won’t pay the rent, let’s apply.
When I did the application test – an exam in a room with 100 other people – I was still on sofas and hadn’t seen the news in the best part of a week. That made writing a 200-word news story on a current news issue pretty difficult. Luckily, I blag well, and if nothing else the years of wanting to be a Writer meant I could write well. So I got the call back, and was sure I’d failed the interview (I wasn’t sure what a red top was), and then a few days before Christmas came the job offer. Paul Durrant – he of the most excellent moustache and Brummie accent – phoned me and said: “Got some good news for you: you’re going to be a journalist.”
Man. What a failure.
So that’s me. I failed at Writing and won at writing. I failed so hard I failed myself right into a career that’s perfect for me, right into work I love and an environment I thrive in. I failed so badly that I wake up every day excited about what I do; I failed so hard that if you didn’t look at what really happened you’d probably call it deliberate success.
Since then, of course, it’s been slog and hard graft and an awful lot of trying incredibly hard all the time. It’s been monstrously long days and never turning my phone off and learning stuff in my spare time and making things happen. It’s been – it is – hard, and joyous. And I’ve never regretted the failures that led me here.
That’s my lesson. Sometimes failure is better than success. Sometimes you get better opportunities through failing than you do through succeeding. Sometimes the only way to win is to fall.