I love Ruby. It’s a beautiful language, with an elegant and expressive syntax – perfect as a scripting language, and great for quickly prototyping new ideas… I use it every chance I get, and truly enjoy coding in it.
But is it suitable for game development?
Unfortunately the answer is a resounding no!
I say this as probably the biggest Ruby fan on the planet, but the simple truth is that if performance matters to you at all, you are going to find Ruby completely unsuitable.
It’s not because Ruby is “slow”. We all know that it’s slow, it is an interpreted language after all. Ruby 1.9 has made it quite a bit faster, and while there is still plenty of room for improvement, in most cases the performance is acceptable – it’s not too bad.
The problems with Ruby aren’t directly related to performance, but are more architectural…
Several years ago when I first started to tinker with Ruby (1.6.8 era), I felt it needed the following things before it could be ready to use in games:
- Bytecode interpreter
- Native threads (not green threads)
- Incremental garbage collector
With Ruby 1.9 we can now check off the first two, but unfortunately the garbage collector is still stuck in the stone age.
The problem, of course, is that Ruby uses a mark-and-sweep garbage collector. This is a two-pass algorithm: the first pass walks every object that can be reached from the top-level references (the “mark” step), and the second pass walks over all allocated objects and frees those that weren’t marked (the “sweep”).
This is of course exceptionally cute and Ruby-like. Reference counting (as used in Python and Obj-C) can be a pain in the ass to maintain, because if you mess up the increment/decrement of references you either leak memory or access a dead object. With mark-and-sweep nothing is required except a function that tells Ruby which objects are referenced by a given object, and of course Ruby can work this out itself (hint: it’s the instance variables…)
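To make the two passes concrete, here’s a toy sketch of mark-and-sweep over a hand-built object graph. Everything in it (ToyObject, the heap array, the roots) is invented purely for illustration – this is the shape of the algorithm, not how MRI actually implements it:

```ruby
# Toy illustration of mark-and-sweep (not MRI's real implementation).
ToyObject = Struct.new(:name, :refs, :marked)

heap  = (1..5).map { |i| ToyObject.new("obj#{i}", [], false) }
roots = [heap[0]]            # objects reachable from "top level"
heap[0].refs << heap[1]      # obj1 -> obj2
heap[1].refs << heap[2]      # obj2 -> obj3
# obj4 and obj5 are unreachable: garbage.

# Mark: walk everything reachable from the roots.
mark = lambda do |obj|
  next if obj.marked          # already visited
  obj.marked = true
  obj.refs.each { |child| mark.call(child) }
end
roots.each { |root| mark.call(root) }

# Sweep: walk the *entire* heap and discard whatever wasn't marked.
live, garbage = heap.partition(&:marked)
live.each { |obj| obj.marked = false }   # reset flags for the next cycle
puts "collected: #{garbage.map(&:name).join(', ')}"  # => collected: obj4, obj5
```

The property that matters for games is that both passes scale with the size of the heap, and all of that work happens in one stop-the-world pause.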
The problem with this is that it takes a long time to execute. In my simple game application, under normal use, I found that Ruby garbage collects roughly once every 5 seconds, and each collection caused a frame hitch of 35–100 milliseconds on my MacBook Air.
The problem is, at 60 FPS we only have 16.67 ms per frame… and the GC alone just blew the budget. This means we’re going to drop frames on a regular basis, and the game feels awful to play.
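If you want to see this for yourself, a minimal frame-loop sketch is enough – update_game and render_frame below are hypothetical stand-ins for the real per-frame work, and GC.count tells you whether a collection fired during a slow frame:

```ruby
def update_game;  end   # hypothetical stand-ins for the real per-frame work
def render_frame; end

FRAME_BUDGET = 1.0 / 60   # ~16.67 ms

last_gc_count = GC.count

loop do
  frame_start = Time.now

  update_game
  render_frame

  elapsed = Time.now - frame_start
  if elapsed > FRAME_BUDGET
    gc_ran = GC.count > last_gc_count   # did a collection fire this frame?
    printf("dropped frame: %.1f ms%s\n",
           elapsed * 1000, gc_ran ? " (GC ran)" : "")
  end
  last_gc_count = GC.count

  sleep(FRAME_BUDGET - elapsed) if elapsed < FRAME_BUDGET
end
```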
So what is the solution?
Right now the only option is to disable the GC entirely and avoid allocating memory on the fly: pre-allocate all your objects, call “GC.start” immediately after level load, then “GC.disable” until the end of the level. Of course you also have to avoid calling methods that allocate Ruby objects, which basically rules out using any 3rd-party Ruby libraries (they are allocation-happy), and eventually you end up writing your Ruby code in a basically C-like fashion.
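In code, the workaround looks something like this – a sketch where start_level, end_level and preallocate_level_objects! are made-up names for your own level lifecycle, while GC.start, GC.disable and GC.enable are the real Ruby calls:

```ruby
# Hypothetical level-lifecycle hooks illustrating the workaround.
def start_level(level)
  preallocate_level_objects!(level)  # build every object the level will need up front
  GC.start    # collect all the loading garbage in one final pause...
  GC.disable  # ...then no collections while the level is running
end

def end_level
  GC.enable
  GC.start    # pay the GC cost on a loading screen, not mid-gameplay
end
```

The catch is that between those two calls every allocation you didn’t pre-allocate is effectively a leak, because nothing is ever collected.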
After a month of this, you start asking yourself “hey, why exactly am I writing code in Ruby again?”. After all, if I wanted to pre-allocate everything and skip garbage collection, I could just use C++: the performance would be much better, and I wouldn’t have to write all this glue code between Ruby and C++…
So until somebody writes a real garbage collector for Ruby – one that either runs asynchronously on another thread, *or* one I can throttle once per frame (“OK, please garbage collect now, but you have a time limit of 1 ms to do everything you need”) – I have to conclude that Ruby is not suitable for game development, and you should not waste your time trying to use it!
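To be clear about what I’m wishing for, the call site would look something like this – entirely hypothetical, GC.step and its budget_ms argument do not exist in MRI, and update_game/render_frame are the same made-up stand-ins as in the earlier sketch:

```ruby
def update_game;  end   # same hypothetical stand-ins as above
def render_frame; end

# Hypothetical, wished-for API: nothing like GC.step exists in MRI today.
loop do
  update_game
  render_frame
  GC.step(budget_ms: 1.0)  # "do at most 1 ms of incremental GC work, then return"
end
```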
If you are set on embedding a scripting language in your game, I would say the main choices at this point are Lua and Python. I prefer the syntax of Ruby over Python considerably, and I’m worried that Lua won’t actually add enough on top of C++ to be worth integrating without a lot of customization… but clearly either of these options is fine, as both have shipped in many commercial games.
For now, I’m sticking with pure C++ for games.