Management structure

Posted in Industry Rants on June 28th, 2013 by MrCranky

Written in response to musing about whether or not Valve’s ‘cabal’ structures were useful, or just a quirk of the company.

Management isn’t generally the problem; the problem is that after a certain point the structure starts to exist to serve the structure, not the needs it was originally supposed to serve. All organisational structures, be they flat, tiered, cabals, whatever, are there to facilitate the business needs. Generally a games developer needs to make better games, faster and cheaper. When you spend all day in interminable meetings because your hierarchy is a bad fit for what actually needs done, and the communication overhead means more time is spent talking about what should be done than doing it, you’re not serving the business. When you spend a bunch of time flitting between tasks because it’s not clear whether you or someone else should be doing something, and end up duplicating someone else’s work while other vital things fall through the cracks, you’re not serving the business.

All different sorts of management can be fine, great even, as long as everyone remembers that at the end of the day it’s supposed to make the work go better, not worse. It doesn’t matter whether it’s top down, bottom up, side to side or shaken not stirred, as long as it’s making it easier for real, productive, money-making development to happen. Remember those Time and Motion studies? I think that’s what we need sometimes – someone from outside to point out when our structures are getting in the way rather than helping. It’s very hard to see when you’re in the thick of it; you get a sense that something is wrong, that this madness can’t be the best way to do things, but not how to fix it.

Maturity in fiction and games

Posted in Industry Rants on November 11th, 2012 by MrCranky

I’ve been attempting to thrash out some opinions in my head recently, and I think they’ve reached the stage where writing them down would help. I’m thinking about the sorts of games the industry tends to make, and seeing in them parallels with fiction in books. Specifically I’m thinking about the sorts of stories we tell, and the kind of writing involved. Looking back on the most notable games of the last twenty years, it seems to me that many if not most games use a science-fiction or fantasy (SF&F) setting. The ones which don’t (I’m thinking Call of Duty and the other modern FPS games, Uncharted, etc.) all tend to rely on the same tropes, which I’ll talk about later.

For some background, I’ve recently read the Game of Thrones series, which for those who haven’t read it is a gritty fantasy series set in an essentially medieval world. The characters are dark and flawed, and the line between the heroes and villains of the piece is most definitely blurred. ‘Good’ characters are not uniformly noble, and ‘evil’ characters are not unremittingly bad. The main characters are as vulnerable as everyone else in the world: they don’t have special skills, they’re not extraordinarily lucky. They die just like everyone else, and just being a main character is no guarantee they’ll even survive till the end of the book. Interesting stuff happens all over the place, not just where the main characters are. Fortuitous events are as often bad for the protagonists as they are good. I won’t say “just like real life” because real life doesn’t have a whole lot of dragons in it, but it’s certainly a lot more plausible than a lot of SF&F fiction.

I’m almost tempted to use the term ‘grown-up fiction’ here, but I think that’s doing a disservice to SF&F fiction, which can be as grown-up and compelling as regular fiction. But the tropes that I see in non-SF&F games are the same ones you come across in SF&F fiction and games. Here are a few:

  • The protagonist(s) turn out to have amazing powers that elevate them far above regular people, e.g. amazing strength, abilities with weapons or magic, or supernatural senses; or maybe they’re the one and only person who is the fulfilment of some ancient prophecy.
  • These powers are often previously undiscovered and the protagonists develop them through the course of the story, leading to the story’s climax where the full range of their abilities will be tested.
  • The antagonists have powers or a similar advantage that rival the protagonists’, but they will already be in full command of them at the start of the piece.
  • Alternatively the antagonists will be in control of the world situation (e.g. an evil government commanding an army of minions), and the protagonists are only safe because they are hidden, and achieve victory by using their superior abilities against ever-increasing numbers / strengths of minions.
  • Minions will be so staggeringly ineffective that their only purpose is to be cannon fodder for the developing protagonist. The unstoppable army that has supposedly swept away all resistance seems to be entirely staffed by soldiers unable to tie their own shoe-laces.
  • If the protagonists don’t have great abilities, then they are at least unnaturally lucky – other minor characters throw their lives away while the main characters are miraculously untouched, despite the antagonists being in a clearly superior position.
  • Alternatively they will be the rich and noble sons and daughters of the rulers of the land; uniquely placed to get involved in high adventure, without needing to ever sully themselves with something as hum-drum as a regular job, just to earn enough to put a roof over their heads.
  • The protagonists will always be in the right place at the right time for interesting stuff to happen. The village which has been ignored by the evil emperor for years is raided by the empire’s secret police only a day after the protagonists seek refuge there.

Sounding familiar? Star Wars, Lord of the Rings, Harry Potter, Eragon, The Belgariad; Call of Duty, Half-Life, Doom, heck – every FPS ever, Uncharted, GTA, Max Payne, Prototype, Ninja Gaiden, God of War. They’re not unique to SF&F settings, but SF&F does use them rather a lot.

This, I think, is where my feeling that these are immature stories comes from. They appeal to our sense of wanting to be special; we sympathise with a powerless character becoming powerful, and standing for all that’s good against a clearly evil villain. We don’t want them to have weaknesses because we’re putting ourselves in the protagonist’s place, and we don’t want to have weaknesses. But the notion of a super-powerful character who is only vulnerable because they don’t realise just how strong they are is the very definition of an adolescent fantasy of what a great character would be. He’s a ninja with super-strength, who can fly faster than the speed of sound, and can also stop time and can totally be invisible. Really? And he hasn’t conquered every enemy in the entire world yet why? Comic writers have realised this from the start, as the arms race of super-hero versus super-villain is a never-ending one. A super-hero who is invulnerable and superior to all his foes is a really boring character. They have to be vulnerable, both in their powers and in their characters, to be woven into an interesting story. And no, not being able to be everywhere at once isn’t a vulnerability, at least not a proper one. This isn’t limited to super-hero fiction either: if you’ve played Call of Duty or Wolfenstein but haven’t seen Band of Brothers, watch at least a couple of episodes. The brutality of fighting in WWII is inescapable: one man wouldn’t be mowing down dozens and dozens of Axis soldiers; he’d be lucky to kill half a dozen before luck meant he took a bullet himself.

Now flip it around the other way. There is a surfeit of fiction out there that doesn’t fall into these traps (classic literature such as Dickens, Austen, etc., not to mention crime novels, historical fiction and romance novels). But comparatively there are very few games which don’t. There are plenty of abstract games (e.g. Tetris) and simulation games (e.g. Gran Turismo, flight sims, or The Sims), but I think most people would struggle to name more than one or two high-profile games with narratives that don’t fall prey to these same easy tropes. From memory I’d call out “Hotel Dusk: Room 215” as a good story with compelling and believable characters and none of the tropes I’ve mentioned above. Similarly the LucasArts games did very well at telling a story without requiring the main characters to be super-special in any way. But these are the exceptions and hardly the norm.

Of course, there are reasons why the fiction in games is written the way it is. A Call of Duty game where you were dropped by a single bullet quite simply wouldn’t be fun. An RPG where you played a subsistence farmer, struggling to get by, wouldn’t keep any but the most masochistic player interested. However, I still think it’s important to recognise that there is a richness of narrative fiction out there that goes largely untapped because we are treading the safe road we’ve walked before. When accusations are levelled at the games industry that we only make games for kids, and that we’ll never make a game that will make people cry (a lie, I know), I look at the sorts of games that get made, and I can’t help but think that we’re not doing ourselves any favours.

That’s not to say that some games aren’t bucking the trend. “Dear Esther” sounded like a laudable attempt, although I’ve yet to play it. I’d love to see more crime fiction brought to life through games (L.A. Noire, for all its plodding repetitive game-play, was a great stab at this genre). I wish someone would tell a compelling ghost story in the form of a game. Heck, I’d even settle for romantic comedy. Just, you know, something that stretches our boundaries a bit, and not just another bullet-proof space marine or boy that finds he is actually an ultra-powerful magician.


Posted in Games, Tales from the grind-stone on October 25th, 2012 by MrCranky

Oh my, it has been a while, hasn’t it?

In my defence, it’s been a crazy summer, and I have been juggling many different balls. Thankfully, all the work we’ve been doing has finally come to fruition, and is all now out there in the world so we can talk about it. First off, the work I’ve been doing for the last year or so with Sumo Digital, on Nike+ Kinect Training.

This was mostly working on the localisation aspect: the game is translated into some 15 languages across 3 discs, so there was a lot of voice content to get in. I can’t take much credit for anything else, but I think the folks at Sumo did a great job on it – certainly when I’ve had to actually stand up in front of the Kinect and do some real exercise, I’ve felt the burn!

In-house however, we’ve had another big project that we’ve put our heart and soul into. Last year, Bliss Kiss Productions approached us with a pitch to re-make Daley Thompson’s Decathlon, for mobile devices. Of course, we loved the original game, I think anyone who had a Spectrum or Commodore 64 will have played it at some point: personally I abused my old rubber-keyed Spectrum 48K terribly to try and get a decent score. Thankfully I didn’t have a joystick at that point, otherwise I’m sure it would have been broken just as many others did theirs. So the chance to bring it to mobile was something we couldn’t pass up.

While we did some solid work on it in autumn last year, other commitments meant that it wasn’t until this summer that we could tackle it in earnest. Which, combined with all our other ongoing commitments, made for a lot of work. Dan’s been in pretty much the whole summer working flat out on it, and seems pretty chuffed with his first proper published title.

It’s a remake from the ground up, obviously. Looking back at the original version it was clear that the design was still fun (we spent more time playing than taking notes when researching), but the rose-tinted glasses of nostalgia had allowed us to forget just how dated the graphics looked. On the Spectrum version, Daley’s an all-white blocky sprite with only a few frames of animation! There were also a lot of design decisions that were clearly made due to technical limitations (such as the shot put taking place on a straight track, instead of in a throwing circle as it does in real life). Some of those decisions we revisited, but where there was a design case for it, we erred on the side of the original.

What was pretty clear, from even the first round of focus testing, was that the original was brutally hard in its learning curve. Running events like the 100m and hurdles are straightforward enough, but three events in particular stood out: the high jump, pole vault and discus throw. Instead of rewarding frantic tapping, they are games of timing. In the 80s it was fine to spring that sort of challenge on the player and expect them to learn it on their own, but modern players are nowhere near as understanding. With that in mind, we put in a practice mode that allowed players to master particular events without the added pressure of participating in the whole decathlon, and we added on-screen prompts and buttons to guide unfamiliar players through each event.

The controls themselves also needed to be wholly revisited. As a first principle we wanted to replicate the frantic button mashing / joystick waggling of the original; the user should have to break a sweat to get those high scores, especially in the 400m. At first glance the touch-screen controls seem obvious: alternate taps between the left and right sides of the screen to run. But finding a way to let the user throw and jump without a) accidentally jumping when they didn’t mean to, or b) hiding the on-screen feedback underneath the user’s fingers, was not a trivial task. Worse, once multi-touch was introduced, we had to find a way to handle input so that it was still physically hard to achieve the maximum speed. Later focus testing revealed that our use of an on-screen button for throwing / jumping wasn’t working; users were interpreting “HOLD” as a prompt, not a button, and simply holding their finger down wherever they last tapped. Based on that, we revised the controls to respond to exactly that action.

On the visuals and audio, we wanted to aim somewhere between modern and nostalgic. For the art side, we brought in Paul Helman to work on the graphics, and we feel he was right on the mark in his style – not blocky or restricted in colours, but also not trying to be too realistic. At first we were worried about how Daley Thompson would react to the stylised look we gave him, but all the feedback was positive.

For audio, we worked with Gavin Harrison, who did a great job experimenting on the audio we needed. Evoking the ‘old style’ in audio is somewhat harder; the audio chips of the 8-bit era had a very limited range, which just sounds silly nowadays. In the end, we went for a simple synth-sounding musical theme, and some very slightly distorted audio samples.

We finished our work at the end of September, and the game itself was released on iOS and Android on the 21st of September. The PR machine for the launch is in full swing, and we’re eagerly awaiting the public’s reception of it. When the dust has settled, I’ll try to write up a post-mortem of everything we’ve done, what worked and what didn’t, but right now I’ve been enjoying some well deserved time off!

Coding conventions

Posted in Coding on July 30th, 2012 by MrCranky

Another mini-rant on coding this week, originally composed as a response to someone who didn’t see why conforming to coding standards was such a big deal. In this case (roughly sorting header #include statements alphabetically) the defence was “it’s trivial to do that automatically, so why should you care whether a coder does it themselves?” That’s a pretty typical response, but the answer to it for me sums up exactly why following conventions is important, and it is nothing to do with the conventions themselves, and everything to do with how you work as a team.

First off, I’d agree that this case in particular is not a major issue. None of them (indenting conventions, spacing conventions, capitalisation conventions) are, but that isn’t why it gets people worked up when one coder decides to go ahead regardless. The problem is that you have a choice between:

  • Original coder does it as agreed first time

or:

  • Original coder decides to ignore the convention previously agreed on
  • Entire team endures negative effects of said change until either:
    • Another coder takes time out from whatever else they’re doing to fix it:
      • If they do it as part of a commit they’re already doing, it obscures the diffs for the ‘real’ changes they’re making.
      • If they do it as a separate commit, they’ve got to take the time to make sure that they’ve not accidentally broken something.
    • Everyone chooses to leave it as it is, and over time the entire codebase degenerates into a collection of such issues.
    • Somebody writes an automatic tool to fix the problem.

Fixing it after the fact is not a good solution, because it’s far more expensive than just doing it right the first time. If there’s a policy, then everyone should stick to it. If they don’t agree with the policy, then they should take that up amongst the team, not just ignore it because they don’t agree and they expect someone (or something) else to fix it later. If it’s a stupid policy, then the team can agree to get rid of it. If it has merit for others then they should respect that even if they don’t personally agree with it: because they’re working as part of a team, not just as individuals, and that should entail a certain amount of respect to your team-mates.

Most of us will have known ‘renegade’ coders, who go off into their own zone and implement some big bit of functionality without consulting with the rest of the team. Sometimes that works well, and other times they come back, throw the code over the fence at the rest of the team and act surprised when they have problems integrating it. That is no way to work, and not only will it lead to friction amongst the team, it also generally means a bunch of wasted effort that could have been avoided with better communication up front. Not respecting coding conventions isn’t nearly as bad as that, but I feel like it’s the first step down the road towards it.

When you’re working in a team, you don’t have the luxury of implementing things in a bubble: you have to work with other peoples’ code, and they have to work with yours. Coming to a common agreement as to how to work with each other is the most basic part of that, otherwise you’ll find yourself working at odds with each other. There can and should be compromises to get to that agreement, but ‘agreeing to disagree’ is generally not a viable option.

Conflicting ideas about the size of STL strings

Posted in Coding, Technical Guidance on July 18th, 2012 by MrCranky

This post is one of those “I couldn’t find it when I was Googling, so here’s a succinct description of the problem / solution so other people can avoid the same round-about research.”


The symptoms:

You have one bit of code (perhaps a library or a DLL) which thinks that sizeof(std::string) is 28 bytes, and another bit of code which thinks that it is 32 bytes. In Release mode they both agree that the size is 28 bytes. In our case it was actually std::wstring, but both string types are the same size and exhibit the same problem.


The cause:

You have a mismatch in your configuration between the two projects: essentially you’re trying to mix Debug code and Release code, which is just fundamentally not allowed. This much information is readily available on the Internet with some basic searching, but crucially most of those places don’t tell you the one piece of information you really need: exactly what setting is different? Which one of the dozens of settings that typically differ between Debug and Release is the STL code actually paying attention to?

The real answer lies in the details. It is not a Debug vs Release problem (well it is, but only indirectly). If you’re like me, the first thing you checked was the presence (or absence) of the _DEBUG or NDEBUG pre-processor directives. After all, they’re the defines most often used to get differing behaviour between the debug and release builds. You’ll find however that those definitions have no bearing at all on the size of std::string.

Now is probably a good time to visit this Stack Overflow question which links to good information on the subject.

In fact, the root cause is the presence and value of the preprocessor definitions _SECURE_SCL and/or _HAS_ITERATOR_DEBUGGING. If these are defined and set to 1, then sizeof(std::string) will be 32. If they are defined and set to 0, sizeof(std::string) will be 28.

More troubling is that even if those definitions aren’t explicitly listed in the set of pre-processor definitions, I believe the compiler (the Visual Studio compiler at least) will define them for you, based on its own internal logic. _SECURE_SCL will always be 1 for both debug and release builds, but _HAS_ITERATOR_DEBUGGING will be 1 for debug builds, 0 for release builds (as it has a tangible performance impact). You can explicitly turn off _SECURE_SCL to get more performance if you want, but you should understand the drawbacks before you do so.

I will update this post if I find out more about the internal setup of those definitions, but simply knowing that they are the cause of the size difference is usually enough to get to a resolution. I would certainly recommend adding some logging to both code modules that spits out the value of these two defines so it’s clear to you what the values are on both sides.


The solution:

For most, an immediate fix is to manually define iterator debugging to be on or off in both projects so that they are consistent. To do that, add _HAS_ITERATOR_DEBUGGING=1 (or 0) to your projects’ preprocessor definitions.

You may want to avoid setting it explicitly (ideally you’d simply rely on the compiler defaults), in which case you’ll need to figure out why iterator debugging is enabled for one module but not the other. For that I’m afraid you need more information about how the compiler decides to set those defines, but presumably another one of your project settings is indirectly making the compiler decide that iterator debugging should be enabled or not, and it is that setting which is different between your two modules.

The importance of (good) teachers

Posted in Industry Rants on June 25th, 2012 by MrCranky

I usually recommend that students looking to get into the games industry as coders stick with traditional, academic courses like Software Engineering or Computer Science. Not because those courses teach the content most appropriate to games development, but because they leave the students with a well-rounded education. With a well-rounded education, they can learn the practical / vocational skills needed for games development (a higher level of programming expertise, usually) on their own, plus they have the option of a career somewhere other than the games industry if they change their mind or find there is a shortage of employment available. If they specialise in a vocational course too early, they don’t get the more general education that would allow them to work anywhere other than games.

That’s not to say that I discount students from vocational games courses though, far from it. But the quality of those courses varies dramatically, and so it’s even more important to assess the quality of the education they’re receiving. The first and probably biggest alarm bell that rings is when courses employ lecturers without games industry experience. That to me is utter madness. They might have masters degrees or doctorates, they might be the most engaging lecturer in the world, but without industry experience, they are wholly unqualified to be teaching a vocational course. That’s like someone teaching others how to swim when they’ve only ever had a bath. There are many other warning signs of course, but to me an institution that thinks to staff their course with vocational teaching staff with no experience in that vocation is only ever going to produce sub-par graduates.

So my advice to those institutions is this: hire industry experienced people. Poach them away from the industry with better working conditions and less stress, even if you can’t offer them more money. Entice them with the notion of enthusing a new generation of games developers. Find the next big studio that gets shut down (there’s no shortage of those), and see if anyone wants to take a break from the industry proper to teach. But whatever you do, don’t hire academics who’ve never shipped a game in their life.

And don’t hire people who couldn’t get into the games industry on their own, but who want to pretend they’ve made games, and so get into teaching instead. Hint: you’re not a professional games designer until someone has paid you real money to design a game which has shipped. That doesn’t include:

  • designing games for your friends
  • designing your own game but never actually making or releasing it
  • writing books about other peoples’ game designs and how they are good or bad

If you’re going to teach games design, personally I think it should be compulsory to detail which games you designed (or part designed), and how well they did. Your students should be able to go find your games and judge for themselves how good your design chops really are, before they start taking your opinions on design as ‘the way things are.’

In defence of middleware

Posted in Industry Rants on May 31st, 2012 by MrCranky

This mini-rant sprang from a discussion on The Chaos Engine about middleware, in answer to the question: “even if it’s the best engine available is it really worth being locked in to anything other than in-house, license-uninhibited tech?”

That depends on whether you’re interested in building games or shipping games. You’re trading many man-months of effort on a new / unknown engine against a non-trivial licensing cost. How many games do you have to ship on your internal engine before the difference in cost becomes positive? And what do you do with all those engine developers you’re carrying once the engine is done? Because they’re part of your burn-rate now.

Making your own tech is simultaneously the risky option for the business, and the safe option for the developers. Why? Because as long as you can persuade someone to bankroll it, there’s a tonne of work to do, and it’s nice, tangible work with obvious goals and milestones. You know when you’re done. You know what you’re making. The customers are the other developers on your team, and they’re not nearly as fickle as the public. It’s a lot easier to find success in building your own engine than it is to find success making and shipping games.

That is super short-term thinking though. Because once you’ve succeeded in making the engine, you’ve still got to ship a successful game, and worse, you’ve probably got to ship several successful games before the engine development effort is paid back. Plus your engine will have a lifespan just like they all do: if you don’t profit enough from the games made on it in that lifespan, then it’s been a net loss.

It’s no wonder that individual developers don’t like middleware. It’s clunky, it rarely fits right with what you’re trying to make, and you’ve got little to no control over its development. But “it’s expensive” isn’t a great argument against it, because the alternative is expensive too. It’s not risk-free, but it’s certainly less risky than doing it yourself. It’s a known cost, and in most cases a known risk. Fundamentally, it frees your employers from having to take a gamble on the tech you build, and when the money they’re gambling on your tech is money that they could be gambling on your games, I don’t think that’s really very attractive.

I’d prefer to be working for a smaller company that can be more agile, more robust, and capable of shipping more games, over a company that’s carrying an engine development team, that has to build games based on tech that won’t be done till some future date, and which has less capital to work with because it’s invested a chunk of it in an engine that has yet to pay that money back.

What to expect from the games industry, and what it expects of you

Posted in Tales from the grind-stone on March 7th, 2012 by MrCranky

The folks from Edinburgh University Computing Society, who run the student TechMeetup, have asked me to give a brief talk on the games industry to one of their gatherings. As anyone who knows me will attest, I’m happy to waffle about the games industry at length, but I do have a few pet topics. Here are my discussion items for the talk, on which I’ll expand at the talk itself.

  • Hard but rewarding work – need talent and passion.
  • The feeling you get from seeing other people pick up the work you’ve made and get real entertainment from it is fantastic.
  • Making games is a business, but not a hugely lucrative business. If you want to get rich, look elsewhere.
  • Don’t expect a job for life, or gamble everything on one team.
  • Employers vary in quality. Good teams make good games. Business can still kill good teams.
  • Margins are much tighter; hiring people is a risk.
  • Show your talent: make a demo, work on mods. An academic CV is unlikely to be enough.
  • Passion should not equal crunch. Enjoying your work is not a licence for exploitation.

New hotness

Posted in Tales from the grind-stone on January 17th, 2012 by MrCranky

Is it bad to be compulsively checking the UPS tracking page for my new laptop? Or to be a little nervous because it’s currently in Kazakhstan, and all those Call of Duty games have left me wary of ex-Soviet republics? Is that over-protective? It’s not even here yet, and I’m clucking over it like a mother hen.

Whatever, as long as it gets here in one piece and is suitably shiny. We’re kicking off with our new client this week, and it was immediately apparent that my current 32-bit dual core laptop (now five and a half years old) really wouldn’t cut the mustard. It was okay, just, for building for 360, because the console does all the heavy lifting. But it won’t run a PC build of anything substantial, and compilation takes an age. Not to mention the graphics flashing and sporadic unexplained hard freezes. So the new Macbook Pro kills two birds with one stone – it’s modern and chunky enough that it should build and run the client’s title, and it means Tim and I no longer have to pass the older Macbook Pro whenever there’s iOS work needing done.

To put it in some context, Tim’s machine needed a new graphics card as well to bring it up to spec. His new graphics card scored ~1600 on the benchmarks. The new Macbook Pro’s graphics score ~1300. Tim’s old graphics scored ~500, and the old MBP ~270. My current laptop (and bear in mind I got the Dell Precision M65 with the graphics ‘upgrade’) scores 71. Yes, 71. I had to go three pages down on the benchmark list before I could even find it.

Of course, even the new MBP isn’t up to the level of the monster Alienware M17X that MGS bought for me, but on the flip side, it also won’t weigh 7 kilos and sound like a jet turbine taking off. While I do still miss the glowy lights and brushed aluminium body of the M17X, the added benefit of crotch-based heat sterilisation from the MBP is surely enough to seal the deal.

Pinnie the Who and the Blustery Day

Posted in Random Stuff, Tales from the grind-stone on January 3rd, 2012 by MrCranky

Happy New Year! Tim and I have actually been in the office since Monday, eschewing the traditional extra Scottish bank holiday in favour of getting cracking on our big stack o’ work. Today though we’re here in defiance of all the sensible advice to avoid travel! Trees down, tiles smashing onto the ground, signs being torn off buildings and thrown around the roads like crisp packets in the wind. There are a few nice things about being in a basement office, and shelter from the wind is one of them.

It’s been a while since the last blog post though, so I’ve missed the opportunity to post this gem from back in December (and #HurricaneBawBag):

The aerial on the building at the back of our office, bent and battered, trailing a polythene sheet in the awful wind

How to get poor reception

That is our back-yard neighbour’s TV and ham-radio antenna, trailing a big sheet of polythene. Note the mangled and bent spokes, the result of the polythene catching the wind like a sail and whipping around for hours, very nearly pulling the poor man’s chimney stack over. Not that last month’s winds can hold a candle to today’s storm though. It seems Mother Nature is angry with us this winter.

To other news: we’ve picked up a new client for the new year which promises to be very interesting – a variety of code support work on PC/360/PS3. In addition to our existing clients, that’ll mean our own projects will have to be put to the side for a little while.

After yet another acquaintance saw fit to share their mobile app idea with me last night, I realised that what we’re short on isn’t ideas, it’s time. What with all of our client work and flitting back and forth, we very rarely get a chance to get heads-down, all-out concentrated on our own apps. There’s nobody to blame for that but me really, but we are rather at the mercy of the paying work. Tim’s been doing a bang-up job in December of bringing our latest creation up to a releasable standard, but I fear it’s not going to reach the quality bar before we have to put it back on the shelf and concentrate on our clients’ needs.

In an ideal world, we’d be able to take our time, concentrate fully on bringing our ideas to fruition, and the money made from releasing them would pay for the next round of product-making. In practice it’s not as simple as that: client work is money in our pockets now, but app sales are money in our pockets later, maybe. Of course, that’s a vicious circle: without taking a punt on our own apps, we’ll never have the opportunity to win big and break out of the work-for-hire mould. But in the meantime we take the work that keeps a roof over our heads.

We’re coming up on the end of our 7th year in business now, which is no mean feat these days. I’ve just updated our entry in SDI’s Gaming Brochure list of Scottish developers, and it’s heartening to see all the small and large companies in there. Here’s to a bright and positive 2012, and to the opportunities it brings.

Black Company Studios Limited, The Melting Pot, 5 Rose Street, Edinburgh, EH2 2PR
Registered in Scotland (SC283017) VAT Reg. No.: 886 4592 64
Last modified: February 06 2020.