EVE Online, Hilmar Petursson

Posted in Conferences on August 14th, 2007 by MrCranky

Much more interesting was the talk from EVE Online’s Hilmar Petursson. Admittedly the content was very close to my heart – massively multi-player online worlds – but as I mentioned to some other delegates, sometimes it’s good to hear from a developer that made a good game, made it well, and has done well out of it. There are too many ‘fluff’ talks here: things which might be, projections, or spin on developers and markets that aren’t quite as good as they’re made out to be. And given the current disenchantment with MMOG developers in the games industry, it’s good to hear that, done well, it can be profitable.

Anyway, Hilmar told a story of a rocky start, with a long pre-production time for EVE, including a period of a good six months where they ran out of start-up capital and relied on the goodwill of their staff, who kept working essentially for free, based entirely on their faith in the product. After that rocky start, and some publishing woes (they sold the rights to a book publisher which then decided to neglect the game entirely in favour of its book business), they eventually started to self-publish EVE, based on an entirely digital download model (no boxed game = no distributors). With a solid core of 30,000 players, they’ve grown their player base continually (and sometimes exponentially) since then. That amazing growth is matched only by the players’ loyalty and immersion in the game: they now have over one hundred thousand subscribers, and are on track to hit 200,000 before the end of this year. Soon, it seems, they will be the first developer to have more subscribers than the population of their home country (Iceland only has 300,000 residents)!

It’s this massive player-base, all sharing a single virtual universe, which leads to the parts of the game-play which captivate the players. Simple but solid core mechanics, closely modelled on their real-world equivalents (markets with bid/buy/sell systems, corporations with management structure, etc.), provide a framework on top of which the players operate rich social structures. It is these social structures which provide the really interesting play in the game; very much an example of emergent game-play.

Hilmar differentiates between two different styles of MMOG – the ‘theme-park’ games (EverQuest, World of Warcraft, etc.), where play is rich but tightly scripted and given to the player in neat, measured doses; and sand-box games (EVE, Ultima Online), where a virtual world with some basic rules and mechanics is presented to the player, and they are free to create whatever interesting systems they like within that world.

My favourite part was some video and concepts of the upcoming revisions to the game, which extend it to allow users to walk around inside space stations as a 3D character avatar – a vast improvement over the user-is-a-ship mechanic currently employed. CCP feel that it is this limiting factor which is skewing the player-base massively in favour of men (95% of the players), and hope that the new focus on personalisation in the game will redress that balance. Certainly this means they are one step closer to my own vision of a virtual universe which scales from the personal scale to the universe scale, although I can see the technological challenges remain massive. It seems like they are feeling their way gently towards that system, with the personal avatars initially limited to wandering around some lovely interior environments and chatting. They hope eventually to allow indoor combat, but I think it is wise to work out the teething problems with the system first! They are also re-visiting the graphics of EVE – always one of its strongest points – to scale them up to current hardware levels.

Regardless, I think it may be time to re-visit EVE once again – the last time I played was a couple of years ago, with a much smaller player-base, and fewer features. Certainly the growth of the player-base makes me think that it is worth another try.

Virtual Reality TV, Peter Cowley

Posted in Conferences on August 14th, 2007 by MrCranky

This talk, from an executive at Endemol (they of Big Brother and Deal or No Deal fame), was mostly an acceptance of the fact that traditional TV production and development is an ageing dinosaur in today’s entertainment medium. The younger audience is playing increasing amounts of games, at the expense of the time they used to spend watching television. Good for us, not so good for the television producers.

Mostly this talk was telling us things that we already knew – that TV producers take a blinkered view of content production, and that kids and younger people prefer interactive media to non-interactive (TV and films). Web-based content, both games and social networking, is being used increasingly to maintain the reach of traditional platform holders such as the BBC or Channel 4.

Most interestingly, they claim to have done research showing that as users grow past 18 and school-leaving age, they tend to be less biased towards interactive media, and start using their mobiles and PCs in ‘more adult’ ways. Personally, I suspect this is an outgrowth of the free-time factor: the younger audience spends more time on interactive media because they have more time to spend! Once they have more interesting pressures on their time, the desire for interactive content drops away.

Still, a long talk for not very much reward.

What Am I Worth?, Ed Williams

Posted in Conferences on August 13th, 2007 by MrCranky

This talk was interrupted by the fire alarm last year, so it was good to get through it unmolested this time. Again, I’m not sure just how relevant it is to a small developer, but it’s interesting to get a feel for the current climate at the top, as it feeds down the chain a bit in terms of work available for us.

Certainly the last couple of weeks have been bad for the majority of the stock market, and in general terms that means that risky investments are the first to be dropped. Unfortunately, games development is always a risky investment! Even the big players have suffered in the last couple of months, with lots of volatility caused by a real lack of predictability. Not so good.

In Ed’s opinion, there are still opportunities, but they are in the low-cost markets which are up and coming: mobile, downloadable games, casual titles, etc. Again, this shows the risk aversion at play – big budget, big risk titles are not the sort of thing investors want to deal with. Small-scale, fast-turnover games can show predictable results, without the all-or-nothing issue a AAA title might have.

Also covered was the massive growth shown in 2006, certainly a big surprise to me, but good news. Income across the market is up 68% to $995m, of which subscriptions make up a tad more than two thirds. Forecasts for 2008 are for more than $1.3bn, which is optimistic but doesn’t feel like crazy numbers to me.

More interestingly for us, the massive growth of outsourcing we were talking about in 2005 has outstripped even those projections, $1.1bn in 2006 (40% of costs). For a lot of companies it’s still about off-shoring to cheaper countries, but even local outsourcing is shown to keep costs under control, and reduce the risk that the publishers/developers have to take on for any given title. More work for us then. 🙂

Keynote speech – Yves Guillemot, Ubisoft

Posted in Conferences on August 13th, 2007 by MrCranky

This keynote focused on the upcoming challenges for Ubisoft – the second biggest player in the industry after EA. It was almost entirely positive, which is a good thing for a keynote, but in my opinion it dwelt a little over-long on the things Ubisoft are doing, rather than taking a more general industry-wide view.

The overriding theme was that of large market growth (backed up in a later talk quoting 68% growth of income in 2006). Ubisoft see the growth as being driven by three things. First, the new generation of consoles (increased power -> improved immersion -> increased sales) – not sure about that, but my cynicism relating to the new generation is well known. Second, accessible games, as evidenced by the Nintendo DS and Wii. It definitely felt like Ubisoft view the Nintendo platforms as only good for family-friendly, casual fun and learning titles, with the traditional AAA blockbuster titles reserved for the 360 and PS3.

Finally, they echoed the sentiments from last year that user-generated content is driving growth too. Frankly, this rings hollow for me – where are the increased sales from this sort of content? Perhaps I’m too disconnected from the reality of mass market gaming these days, but I’m just not aware of where this obsession with user-generated content is coming from, or what evidence has appeared since last year to convince us that this new way of making games is actually here. I can understand wanting to build good community tools to improve the way people play their games and interact, especially in multi-player titles, but I’m not sure how that ties in to user-generated content.

Onto Ubisoft’s actual strategy. For accessible titles on the Wii and DS, they’re focusing massively on usability and polish, with much smaller teams and much smaller titles, developed quickly. For their AAA titles, they’re going for the big team, big cost approach (200+ experienced staff is their idea of a ‘good size’ for a team). They know they need to increase sales to amortise their costs, but I’m not sure that they have any real way of doing that effectively. However, they do have the economies of scale, and the intelligence to try to maximise re-use of tools and engines to minimise their development costs.

Crucially, they know they face recruitment issues with such massive teams, not to mention the cost implications. As such, they are building whole teams in places where costs are far lower (note, crucially, that they’re not outsourcing to independents – they’re building Ubisoft studios).

Well, it must be nice to be such a big fish, but I’m not sure just how relevant that sort of strategy is to us, the little fish in the pond.

Intro

Posted in Conferences on August 13th, 2007 by MrCranky

Chris Deering introed the conference (albeit with its old moniker, the Edinburgh Interactive Entertainment Festival, but we can forgive him that). Prior to introing the keynote speaker, there was a bit of quiz-the-audience with the fancy voting devices they’d issued us. The gist of the results:

  • Sector likely to experience most growth in the next year: spread results, favouring casual, handheld, mobile and MMO. Few people expected packaged console or PC titles to grow.
  • Genre likely to experience most growth: Virtual life (a la Second Life or Home) topped the poll, with music titles a strong second. Not sure if I agree with that assessment, but that’s probably reflective of the general sentiment that virtual online communities are going to grow in popularity generally.
  • Percentage of overall revenue to come from online (downloadables, subscription, ad-revenue, etc.): 20-40%. Fair enough – it’s definitely growing and becoming far more relevant, not just in our industry.

Coffee/Networking

Posted in Conferences on August 13th, 2007 by MrCranky

On a whim I checked my phone for available Bluetooth devices in the break-out area – an impressive 18 phones and PDAs which is I think the most I’ve ever seen. We do like our fancy toys. The Nokia N70 and N80 seemed popular, but the prize goes to the device cunningly labelled “Matt’s got AIDS”!

EIF 2007

Posted in Conferences on August 13th, 2007 by MrCranky

Aha! It seems that the Royal College of Physicians has joined the 21st century, and installed wi-fi in its lecture theatres. Of course, I’m at EIF, fuelling up on coffee to fight off the hangover-induced grumpiness. More posts/updates as I go through the day of lectures, probably depending on how compelling each lecture actually is! If I’d thought ahead, I would have brought my phone/USB cable, snapped a few pictures and added them to the posts; as it is, that will have to wait until after the fact.

Scottish Games promo

Posted in Industry Rants on August 8th, 2007 by MrCranky

So I had a brief moment trying to be telegenic yesterday, in front of the camera doing a bit for a promotional video for the EIF next week. I’m sure I will be edited down to some extra-small section as we don’t have much interesting stuff to say, but we shall have to see. Anyway, amongst the topics covered was “Why do you think VIS Entertainment went under?” – which is sort of a tricky question to answer.

Sure, over the many evenings in Milnes after work at VIS we laboured long and loud over what we (the grunts) thought the problems were – anyone for several tables around would be able to repeat them – but I think in the end it wasn’t as bad as it seemed then. Of course, we don’t have any insight into the real goings-on, either financial or managerial, so it’s all supposition. However, from where we were sitting it seemed to boil down to one thing: cashflow.

VIS was pretty big at the end – probably still over 100 employees. That makes for a lot of salary going out the door each month. We had two big and one small project on the go (State of Emergency 2, Brave, and NTRA: Breeders Cup), and those had been going for a while, so there was probably little sales revenue from previous titles, only publisher milestone payments. Then of course Brave completed, with nothing to take its place – suddenly more than a third of those payments are gone, with potential sales revenue from it not likely to appear for many months. That’s going to hurt any company’s books, and if the balance is already tight…

That’s not really a ‘why’ so much as a ‘how’ though. The ‘why’ is even more supposition, but I think is reflected in much of what I’ve said here before. Publishers were being hit by tighter margins due to increasing costs, and were responding by tightening down on the developers. Slice the margins thinner and thinner, and the developer becomes so fragile that they cannot long survive if a project finishes with no follow-on, or worse, is cancelled early. In that sense, VIS were just another amongst many studios which died – Visual Science, DC Studios, and so many more across the UK and beyond.

Arguably had projects been cancelled or different decisions been taken things would have played out differently, maybe better, maybe worse. But it seems to me that even if a studio played a perfect game and made no wrong moves, they would still be only a small amount of bad luck away from failure. That for me is a symptom of a troubled industry, and is something I hope will improve. Certainly we all need to work smarter, not harder, to keep costs low enough that making games is profitable.

Connectivity

Posted in Tales from the grind-stone on July 26th, 2007 by MrCranky

Interesting post here from Hypnos at Puzzle Pirates about the increasing dependency we have on our internet connections. I certainly know this one – every time I have to uproot and wait for a new DSL line to be installed I go a little bit stir crazy from the fact that I can’t check my email or do all of the things that normally make up my day to day work and life. Luckily this time round (less than a month away now), I have a laptop with wi-fi, and will most likely be haunting one of the free wi-fi coffee shops around Edinburgh to get my daily fix of Internet goodness.

Managed code and tools

Posted in Coding, Tools on July 23rd, 2007 by MrCranky

So I must admit to being a bit of a luddite when it comes to embracing the new languages and technologies available for developers now. Partly this is because I’ve read (and heartily agreed with) No Silver Bullet by Fred Brooks, and partly it is because I’m much more comfortable knowing exactly what is going on ‘under the hood’ when I write software.

That being said, having good tools is a vital part of games development, and to write good tools you have to build good user interfaces on top of functional software. No amount of clean, efficient and well structured code is going to get you past that final hurdle, which is to interact with the user. I have spent too much time on too many different projects faffing around with inadequate UI libraries to want to spend any more on it now. I would say I am comfortable with MFC based development, but I would never claim that it was easy or pleasant.

So I keep hearing other developers evangelising the merits of tools based on managed code (C#, etc.) and the .NET platform. Apparently it should take the pain out of making tools and user interfaces, and should let me concentrate on the important things instead. Well, that was enough to tempt me in, and to give it a try.

The thing is though, our engine and game code is all based on C++, simple and clean, and we’re not going to change that (no matter how much XNA and Microsoft try to tempt us otherwise). So any tools we build have to be able to leverage all that pre-written code, and play nice with the other parts of our engine. So we needed a way to make the managed tools work with the un-managed engine, and that was where my headaches began.

Straight out of the gate, building a tool application with Visual Studio 2005 was simple and easy, and it took less than 5 minutes to have a skeleton UI that had all the right hooks for exercising some of the engine functionality needed to pre-process our assets. But then I had to figure out how to link those hooks to our pre-existing code, and that wasn’t nearly so simple. The problem was this – a C++/CLI based application (i.e. our tool UI) needs to jump through a few hoops to talk to native (C++) code. The documentation rabbits on about mixed and pure assemblies, DLL building, COM linking and a whole ream of pages about how to build a managed application. All of which is total overkill for what we needed – a simple wrapper layer between the managed UI application, and the native core code.

Now that I’ve found out how, it’s not as hard as I thought. As I work through the details, I’m going to note them down and post them here, because it was immensely frustrating to continually search for tutorials and references (I gave up on the MSDN documentation), only to find lots of people talking about how simple it was, but no-one bothering to let on how it was done.

Anyway, in lieu of a later, better post, here is what I have so far:

  • Make a library DLL, making sure that it has managed (/clr) support enabled. This will form the wrapper layer.
  • The library DLL can statically link to the native libraries you have.
  • Build a wrapper class which uses pointers to your native classes to route commands/requests to the native code. Make sure this is a managed class (it should take the form “public ref class WrapperClass” if you’re using C++/CLI).
  • NB: you will have to follow the managed rules about not having native value types in your wrapper classes, but you are allowed to have pointers to native types and to use new/delete as normal.
  • In your fully managed UI application, use the Reference system in the Common Properties for the project to add a reference to the wrapper library. This will automagically allow use of your wrapper layer classes, with no need for headers or static linkage. [This is the one annoying step that really wasn’t clear from the documentation.]
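To make those steps concrete, here is a minimal sketch of what the wrapper layer might look like. The class and method names (AssetProcessor, PreProcess) are invented for illustration – they’re not from our actual engine – and since this is C++/CLI it only builds under Visual C++ with /clr enabled:

```cpp
// --- Native engine code, statically linked into the wrapper DLL ---
// (Illustrative only; stands in for whatever engine class the tool drives.)
class AssetProcessor
{
public:
    bool PreProcess(const char* sourcePath);
};

// --- Wrapper layer: lives in the /clr library DLL ---
// A managed ref class may not contain a native object by value, but it
// may hold a *pointer* to one, and use new/delete as normal.
public ref class AssetProcessorWrapper
{
public:
    AssetProcessorWrapper() : m_native(new AssetProcessor()) {}

    // Destructor (maps to IDisposable.Dispose) and finalizer both
    // release the native object; the null check makes this safe.
    ~AssetProcessorWrapper() { this->!AssetProcessorWrapper(); }
    !AssetProcessorWrapper() { delete m_native; m_native = nullptr; }

    bool PreProcess(System::String^ sourcePath)
    {
        using namespace System::Runtime::InteropServices;
        // Marshal the managed string down to a const char* for the
        // native call, then free the temporary buffer.
        System::IntPtr p = Marshal::StringToHGlobalAnsi(sourcePath);
        bool ok = m_native->PreProcess(
            static_cast<const char*>(p.ToPointer()));
        Marshal::FreeHGlobal(p);
        return ok;
    }

private:
    AssetProcessor* m_native; // pointer to native type: allowed in a ref class
};
```

The fully managed UI project then just adds a reference to the wrapper DLL and uses AssetProcessorWrapper like any other managed class (gcnew and all) – no headers, no static linkage.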

Anyway, a better picture should emerge from this experimentation, and I’m hopeful that once the basic pattern for managed/unmanaged tools emerges, that we’ll be massively more productive and be able to build up a nice tool-set using this new technology.


Email: info@blackcompanystudios.co.uk
Black Company Studios Limited, The Melting Pot, 5 Rose Street, Edinburgh, EH2 2PR
Registered in Scotland (SC283017) VAT Reg. No.: 886 4592 64
Last modified: February 06 2020.