Assuming you have 32 GB of memory allocated to Redis (with additional memory left for the rest of the system), you can estimate the number of entries you can store. Keep in mind that this is a very rough generalization, since there is always some overhead. Looking purely at the arithmetic, though, you could store roughly a billion entries in 32 GB of RAM.
1 GB = 1024 MB
1 MB = 1024 KB
1 KB = 1024 B
1 GB = 1,073,741,824 B
32 GB = 34,359,738,368 B

size_of_entry = 12 B + 20 B = 32 B
number_of_entries = available_size / size_of_entry
                  = 34,359,738,368 B / 32 B
                  = 1,073,741,824
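The back-of-the-envelope arithmetic above can be checked with a few lines of Python. The 12 B key and 20 B value sizes are the assumed figures from this calculation, not anything Redis guarantees:

```python
# Naive capacity estimate: divide available RAM by the raw entry size.
# This deliberately ignores all per-key overhead (see the FAQ below).
GIB = 1024 ** 3            # bytes in 1 GiB

available_size = 32 * GIB  # 34,359,738,368 B
size_of_entry = 12 + 20    # 12 B key + 20 B value = 32 B

number_of_entries = available_size // size_of_entry
print(number_of_entries)   # 1073741824, i.e. roughly a billion
```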
As I said before, it is unwise to assume there is no overhead; see Redis's FAQ below. Without knowing more details of your environment, it is impossible to give more accurate numbers.
What's the Redis memory footprint?
To give you a few examples (all obtained using 64-bit instances):

- An empty instance uses ~1 MB of memory.
- 1 million small key -> string value pairs use ~100 MB of memory.
- 1 million keys -> hash values, each representing an object with 5 fields, use ~200 MB of memory.

Testing your use case is trivial: use the redis-benchmark utility to generate random data sets, then check the space used with the INFO memory command.

64-bit systems will use considerably more memory than 32-bit systems to store the same keys, especially if the keys and values are small. This is because pointers take 8 bytes on 64-bit systems. But of course the advantage is that you can have a lot of memory in 64-bit systems, so in order to run large Redis servers a 64-bit system is more or less required. The alternative is sharding.
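For contrast, the FAQ's measured figure of ~100 MB per million small key -> string value pairs (overhead included) yields a much lower ceiling than the raw arithmetic above. A rough sketch, treating 100 MiB per million entries as the assumed per-entry cost:

```python
# Overhead-aware estimate based on the FAQ's measurement:
# ~100 MB per 1 million small key -> string value pairs,
# i.e. roughly 105 bytes per entry with overhead included.
GIB = 1024 ** 3
MIB = 1024 ** 2

available_size = 32 * GIB
cost_per_million = 100 * MIB   # assumed reading of the FAQ figure

number_of_entries = available_size * 1_000_000 // cost_per_million
print(f"{number_of_entries:,}")  # 327,680,000 -- far below a billion
```

So once realistic overhead is factored in, the same 32 GB holds on the order of hundreds of millions of small entries, not a billion.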