There are multiple ways to figure out what takes up so much memory in your Redis instance. The simplest and quickest one is the following:
redis-cli --bigkeys
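If you are worried about the extra load this puts on a busy production instance, redis-cli can throttle the SCAN loop that --bigkeys runs; as far as I remember the -i option adds a short sleep between batches of SCAN commands (check redis-cli --help on your version, the exact behaviour here is an assumption on my part):

redis-cli --bigkeys -i 0.1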
But in some cases that's not enough. Then you can try Redis Memory Analyzer (requires Python 3.4). It does provide a lot of useful insights, but it didn't help me either. So here's a way that seems optimal in case you need to sort keys by the memory they use. It needs redis-rdb-tools installed:
pip install rdbtools
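As a side note, the rdbtools documentation suggests installing the optional python-lzf module alongside it so that compressed strings in the dump parse faster; whether you actually need it depends on your dump, so treat it as optional:

pip install rdbtools python-lzf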
then run:
rdb -c memory /var/lib/redis/dump.rdb -f memory.csv
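For reference, the report is a plain CSV. The columns are typically database, type, key, size_in_bytes, encoding, num_elements and len_largest_element (this can vary between rdbtools versions, so double-check your own file); something like this, with made-up values:

database,type,key,size_in_bytes,encoding,num_elements,len_largest_element
0,hash,bosons:higgs,5536,hashtable,42,64
0,string,leptons:electron,96,string,1,8

The script below relies on the key being the third column and size_in_bytes being the fourth, and it groups keys by the prefix before the first colon.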
rdb will go through the whole database, and in the end you will get an output file with the memory used by every key. Once you have it, it won't take more than a minute to complete the task:
OUT=./memory.csv

{
echo -n "TOTAL "
cat memory.csv | awk -F ',' '{sum += $4} END {print sum/1024/1024}'

for k in $( awk -F ',' '{print $3}' $OUT | awk -F ':' '{print $1}' | sort | uniq -c | sort -rnk1 | head -n 10 | awk '{print $2}' ) ; do
    echo -n "$k "
    grep $k memory.csv | awk -F ',' '{sum += $4} END {print sum/1024/1024}'
done
} | sort -rnk2 | column -t

[root@d1 ~]# /usr/local/sbin/redis_memory_report.sh
TOTAL    1027.42
bosons   291.925
leptons  238.798
quarks   170.58
[root@d1 ~]#
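The prompt above runs it as /usr/local/sbin/redis_memory_report.sh; if you want the same one-command convenience, put the snippet into that file (with a #!/bin/bash shebang on the first line; any location on your PATH works just as well) and make it executable:

chmod +x /usr/local/sbin/redis_memory_report.sh
/usr/local/sbin/redis_memory_report.sh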
There might be smarter ways to get this, but this one did the trick for me.