I love redis. It's blazingly fast and wonderfully atomic.

Back in December 2010, I converted the database behind World of Solitaire to redis 2.2.

It's currently holding over 21 million keys and handling over 1,100 commands per second.

Over the past 9 months, RAM usage has slowly been increasing as more and more keys are inserted. A few days ago I realized I was very short on available RAM and had to do something sooner rather than later.

Before throwing more RAM into the box, I decided to try updating to redis 2.4, as I had read a blog post saying it was more efficient at storing sorted sets.

I was SHOCKED at the reduction in RAM usage:

It was such a dramatic reduction that I questioned whether data had somehow been lost.

Comparing the same dump in both 2.2 and 2.4 yielded the exact same key count: 21,085,659
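If you want to double-check something like this yourself, a quick DBSIZE against each instance is enough. Here's a minimal sketch using the hiredis client library (my example, not part of the original setup; hosts and ports are placeholders for a 2.2 and a 2.4 instance loaded from the same dump):

```c
/* Sketch: compare key counts on two redis instances with DBSIZE.
 * Assumes hiredis; host/port values are placeholders. */
#include <stdio.h>
#include <stdlib.h>
#include <hiredis/hiredis.h>

static long long key_count(const char *host, int port) {
    redisContext *c = redisConnect(host, port);
    if (c == NULL || c->err) {
        fprintf(stderr, "connect failed: %s\n", c ? c->errstr : "out of memory");
        exit(1);
    }
    redisReply *r = redisCommand(c, "DBSIZE");
    long long n = (r && r->type == REDIS_REPLY_INTEGER) ? r->integer : -1;
    freeReplyObject(r);
    redisFree(c);
    return n;
}

int main(void) {
    /* e.g. the 2.2 instance on 6379, the 2.4 instance on 6380 */
    printf("2.2: %lld keys\n", key_count("127.0.0.1", 6379));
    printf("2.4: %lld keys\n", key_count("127.0.0.1", 6380));
    return 0;
}
```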

Am I really storing that many sorted sets? I decided to write some code to see how many keys of each type I was storing and what the average length of each type was.

My first attempt in node.js ended pretty quickly with a memory allocation fault, so I decided to code it in C.
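For the curious, one simple way to get this kind of breakdown is something like the sketch below. It samples keys with RANDOMKEY (SCAN doesn't exist yet in 2.4) and uses hiredis; this is my own simplified illustration, which may differ from the full hacky version at the end of the post:

```c
/* Sketch: tally sampled key counts and average lengths per redis data type.
 * Assumes hiredis and a local instance on the default port. */
#include <stdio.h>
#include <string.h>
#include <hiredis/hiredis.h>

#define SAMPLES 100000

int main(void) {
    const char *types[]  = { "string", "list", "set", "zset", "hash" };
    const char *lencmd[] = { "STRLEN", "LLEN", "SCARD", "ZCARD", "HLEN" };
    long long count[5] = {0}, total_len[5] = {0};

    redisContext *c = redisConnect("127.0.0.1", 6379);
    if (c == NULL || c->err) return 1;

    for (int i = 0; i < SAMPLES; i++) {
        redisReply *key = redisCommand(c, "RANDOMKEY");
        if (key == NULL) break;                     /* connection error */
        if (key->type != REDIS_REPLY_STRING) {      /* empty database */
            freeReplyObject(key);
            continue;
        }

        redisReply *type = redisCommand(c, "TYPE %b", key->str, (size_t)key->len);
        for (int t = 0; type && t < 5; t++) {
            if (type->type == REDIS_REPLY_STATUS && strcmp(type->str, types[t]) == 0) {
                /* Ask for the type-appropriate length/cardinality */
                redisReply *len = redisCommand(c, "%s %b", lencmd[t],
                                               key->str, (size_t)key->len);
                count[t]++;
                if (len && len->type == REDIS_REPLY_INTEGER)
                    total_len[t] += len->integer;
                if (len) freeReplyObject(len);
                break;
            }
        }
        if (type) freeReplyObject(type);
        freeReplyObject(key);
    }

    for (int t = 0; t < 5; t++)
        printf("%-6s sampled=%lld avg_len=%.2f\n", types[t], count[t],
               count[t] ? (double)total_len[t] / count[t] : 0.0);

    redisFree(c);
    return 0;
}
```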

Data type breakdown:

Average Length:

Over 10 million sorted sets, averaging only 1.5 entries per set.
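My guess (an assumption on my part, not something I've verified in the redis source) is that this is exactly the case redis 2.4's compact ziplist encoding for small sorted sets was built for. You can spot-check a key with OBJECT ENCODING; the key name below is just a placeholder:

```c
/* Sketch: check which internal encoding redis uses for one sorted set.
 * Assumes hiredis; "some:zset:key" is a placeholder key name. */
#include <stdio.h>
#include <hiredis/hiredis.h>

int main(void) {
    redisContext *c = redisConnect("127.0.0.1", 6379);
    if (c == NULL || c->err) return 1;

    redisReply *r = redisCommand(c, "OBJECT ENCODING %s", "some:zset:key");
    if (r && r->type == REDIS_REPLY_STRING)
        printf("encoding: %s\n", r->str);   /* small zsets report "ziplist" on 2.4 */
    freeReplyObject(r);
    redisFree(c);
    return 0;
}
```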

Thanks to redis 2.4, I won't have to worry about RAM for a while :)

Here is the hacky C code I wrote to gather the redis key type stats: