Yesterday I published The Great Ruby Shootout and it quickly gathered a fair deal of attention. It was on the front page of Slashdot, Hacker News, Reddit, and so on. More than 15,000 people came by to read about the results of my comparison between Ruby implementations.
Those numbers looked good but something didn’t add up. Ever since I clicked the “Publish” button, I had an uneasy feeling about the main shootout figures. They just didn’t seem right. I had the chance, particularly while writing my book, to use Ruby extensively on Vista, and I can guarantee you that it’s visibly slower there than on GNU/Linux. The Phusion team had benchmarked their Ruby Enterprise Edition against Ruby 1.8.6 many times, and found it to be about 25% faster. Yet my results were showing it as twice as fast as Ruby 1.8.7, which is itself already faster than 1.8.6. To make matters worse, I’ve used Ruby 1.9 and found it to be faster than Ruby 1.8.7, but not 5 times as fast. For most programs that I tried, Rubinius didn’t seem faster than Ruby 1.8. And the more I pondered it, the more it began to feel like one too many things didn’t add up.
In the comments, Isaac Gouy reported a couple of issues with the Excel formulas, where a few unsuccessful tests were mistakenly added to the totals. This skewed the results slightly, particularly in terms of penalizing JRuby. However, this wasn’t really it. Sure, the totals were inaccurate, but not enough to fundamentally change the main outcome of those results.
As I was discussing this somewhat unexpected result with Hongli Lai (co-author of Ruby Enterprise Edition), he mentioned that he knew what might be causing this anomaly. I had run the initial test against Ruby installed through apt-get, because I’d made a couple of assumptions. The first was that most people would probably be using the Ruby version that was deployed by their OS’ packaging system in both development and production mode. The second was that the performance of this version would be roughly similar to the one built from scratch. This second assumption would turn out to be highly mistaken.
I decided to run a test using Ruby 1.8.7 built from source as the baseline, and added a column to the tables for Ruby 1.8.7 installed through apt-get. I also corrected the issue Isaac pointed out. I updated the original shootout with the correct data, and what you see below is a bar chart of the geometric mean of the ratios for the successful benchmarks.
Notice how everything makes much more sense now. Ruby 1.9 and JRuby are very close, respectively 2.5 and 1.9 times as fast as Ruby 1.8.7 (built from source) on these benchmarks. A less impressive result, sure, but I suspect a much more realistic one. The results for Ruby Enterprise Edition are in line with the 25% speed increase, if we consider that 1.8.7 is a bit faster than 1.8.6. Rubinius is still slower than MRI for most tests, but it’s improving. Ruby on Windows is slow. So slow, in fact, that Ruby on GNU/Linux is twice as fast.
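For readers curious about how a single headline number comes out of many benchmarks, here is a minimal Ruby sketch of the geometric mean of ratios used above. The benchmark names and timings are made up for illustration; they are not the shootout’s actual data.

```ruby
# Illustrative data only: baseline vs. candidate wall-clock times in seconds.
baseline  = { "fib" => 4.0, "sort" => 2.0, "io" => 1.0 }
candidate = { "fib" => 2.0, "sort" => 1.0, "io" => 0.5 }

# For each benchmark, how many times faster the candidate is.
ratios = baseline.keys.map { |name| baseline[name] / candidate[name] }

# Geometric mean: nth root of the product of the ratios. Unlike the
# arithmetic mean, it treats "2x faster here, 2x slower there" as a wash
# and isn't dominated by a single outlier benchmark.
geo_mean = ratios.reduce(:*) ** (1.0 / ratios.size)

puts geo_mean
```

With these made-up numbers every ratio is 2.0, so the geometric mean is 2.0 as well; with mixed results the geometric mean gives a fairer overall figure than simply summing the totals.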
The really big, flashing warning though is what happens when you install Ruby through apt-get. Compiling from source gives you double the speed, according to these tests. I expected a 10–20% increase, not 100%. The gist of it is that the prepackaged Ruby is compiled with the --enable-pthreads option, and there is also the whole issue of shared vs. static libraries. Whatever the reason, this is a significant difference. For production use, in light of these results, I feel that it would be foolish to use the slower version of Ruby provided by apt-get/aptitude.
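For the curious, building from source is not much work. The following is a rough sketch rather than an exact recipe: the download URL and the --prefix are illustrative, and your system may need different values. The relevant point is simply that a plain ./configure does not pass --enable-pthreads, unlike the Debian/Ubuntu package.

```shell
# Illustrative build sketch (URL and prefix are examples, not prescriptions).
# Note the plain ./configure: it does not enable --enable-pthreads,
# which these tests suggest carries a real performance cost.
wget ftp://ftp.ruby-lang.org/pub/ruby/1.8/ruby-1.8.7.tar.gz
tar xzf ruby-1.8.7.tar.gz
cd ruby-1.8.7
./configure --prefix=/usr/local
make
sudo make install
```

Afterwards, make sure /usr/local/bin comes before /usr/bin in your PATH, or you may still be running the packaged interpreter without realizing it.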
I rectified the results as soon as possible because the last thing I wanted was to mislead the Ruby community or, worse still, betray its trust. Major kudos to Isaac for spotting the calculation issue, and to Hongli for selflessly pointing out that the excellent Ruby Enterprise Edition results were probably due to the low performance of Ubuntu’s version of Ruby.