Author: Ren Zhongyu
A member of the 爱可生 DBA team, specializing in troubleshooting and performance optimization. Questions about the technical content of this article are welcome for discussion.
Source: original contribution
*Produced by the 爱可生 open source community. Original content may not be used without authorization; for reprints, please contact the editor and credit the source.
The problem occurred on a production Redis cluster (Redis 5.0.10, more than 30 shards): one shard showed abnormally high memory usage (above 70%, while the other shards were comparatively low). We reproduced the situation in a test environment, as shown in the monitoring chart below:
The title probably already gives away the conclusion; what I want to share here is the method for troubleshooting this kind of problem.
### Normal instance
```
redis-cli -p 6380 -h 10.186.62.28 info keyspace   ## key count
# Keyspace
db0:keys=637147,expires=0,avg_ttl=0

redis-cli -p 6380 -h 10.186.62.28 info memory | grep -w used_memory   ## memory usage
used_memory:104917416
```
### Abnormal instance
```
redis-cli -p 6382 -h 10.186.62.5 info keyspace   ## key count
# Keyspace
db0:keys=191433,expires=0,avg_ttl=0

redis-cli -p 6382 -h 10.186.62.56 info memory | grep -w used_memory   ## memory usage
used_memory:373672656

redis-cli -p 6382 -h 10.186.62.56 info memory | grep mem_fragmentation_ratio
mem_fragmentation_ratio:0.89   ## fragmentation ratio below 1
```
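As an aside, the fragmentation ratio reported by `INFO memory` is `used_memory_rss` divided by `used_memory`, so a value below 1, as here, usually means the OS has swapped part of the dataset out and RSS is smaller than the logical allocation. A minimal sketch of that arithmetic (the RSS value is hypothetical, back-derived from the 0.89 above, since the real `used_memory_rss` was not shown):

```python
def fragmentation_ratio(used_memory_rss: int, used_memory: int) -> float:
    """mem_fragmentation_ratio as INFO memory reports it: RSS / logical usage."""
    return used_memory_rss / used_memory

# Hypothetical RSS chosen to reproduce the 0.89 seen in the output above.
ratio = fragmentation_ratio(332_568_663, 373_672_656)
print(round(ratio, 2))  # 0.89
```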
```
# redis-cli -p 6382 -h 10.186.62.56 --bigkeys

# Scanning the entire keyspace to find biggest keys as well as
# average sizes per key type. You can use -i 0.1 to sleep 0.1 sec
# per 100 SCAN commands (not usually needed).

[00.00%] Biggest string found so far '"key:{06S}:000061157249"' with 3 bytes
[00.03%] Biggest string found so far '"key3691"' with 4 bytes
[40.93%] Biggest string found so far '"bigkkkkk:0"' with 102400000 bytes
[51.33%] Biggest string found so far '"bigk:0"' with 204800000 bytes

-------- summary -------

Sampled 191433 keys in the keyspace!
Total key length in bytes is 4161149 (avg len 21.74)

Biggest string found '"bigk:0"' has 204800000 bytes

0 lists with 0 items (00.00% of keys, avg size 0.00)
0 hashs with 0 fields (00.00% of keys, avg size 0.00)
191433 strings with 307777256 bytes (100.00% of keys, avg size 1607.75)
0 streams with 0 entries (00.00% of keys, avg size 0.00)
0 sets with 0 members (00.00% of keys, avg size 0.00)
0 zsets with 0 members (00.00% of keys, avg size 0.00)
```
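The summary averages hide how skewed this keyspace is. A quick check on the numbers above shows the two big strings account for nearly all of the string memory:

```python
# All numbers are taken from the --bigkeys summary above.
sampled_keys = 191_433
total_key_len = 4_161_149
total_string_bytes = 307_777_256
bigkey_bytes = 102_400_000 + 204_800_000  # bigkkkkk:0 + bigk:0

avg_key_len = total_key_len / sampled_keys
avg_string_size = total_string_bytes / sampled_keys
bigkey_share = bigkey_bytes / total_string_bytes

print(round(avg_key_len, 2))      # 21.74, matching the summary
print(round(avg_string_size, 2))  # 1607.75, matching the summary
print(round(bigkey_share, 3))     # 0.998 -- two keys hold ~99.8% of string memory
```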
The findings from the output are as above.
Note that this article uses a relatively simple self-built test environment. A real production environment is somewhat more complex and may contain keys of other types such as hash or set; for those, the --bigkeys analysis only reports the number of elements/members, not the memory footprint, so you also need another command to get the actual memory usage:
```
10.186.62.56:6382> memory usage bigkkkkk:0
(integer) 117440568
10.186.62.56:6382> memory usage bigk:0
(integer) 234881072
```
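For a production keyspace with many keys, running MEMORY USAGE by hand does not scale. A common pattern is to iterate with SCAN, call MEMORY USAGE on each key, and keep only the top N. Here is a minimal sketch of the top-N step; the (key, size) pairs are stand-ins for real SCAN + MEMORY USAGE results, since no live Redis is assumed:

```python
import heapq

def top_n_by_memory(key_sizes, n=10):
    """Keep the n keys with the largest MEMORY USAGE results."""
    return heapq.nlargest(n, key_sizes, key=lambda kv: kv[1])

sampled = [
    ("bigk:0", 234_881_072),      # value from `memory usage bigk:0` above
    ("bigkkkkk:0", 117_440_568),  # value from `memory usage bigkkkkk:0` above
    ("key3691", 56),              # hypothetical small key
]
top = top_n_by_memory(sampled, n=2)
print(top)  # [('bigk:0', 234881072), ('bigkkkkk:0', 117440568)]
```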
```
# Create 10 keys prefixed with renzy:id:, each 1024 bytes in size
127.0.0.1:9999> debug populate 10 renzy:id: 1024
OK
127.0.0.1:9999> keys renzy:id*
1) "renzy:id::8"
2) "renzy:id::2"
3) "renzy:id::4"
·····
```
```
## Create a blocking operation
127.0.0.1:9999> debug sleep 2   // block for 2 seconds
OK
```
or:
```
# redis-cli -p 9999 keys \* > /dev/null
// (if the dataset is large, running keys * directly is enough to block)
```
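KEYS * blocks because it walks the entire keyspace in one call on Redis's single-threaded event loop; SCAN instead returns a small batch plus a cursor on each call. A toy model of the cursor semantics (real Redis uses a reverse-binary hashtable cursor, not sorted offsets):

```python
def toy_scan(store, cursor, count=3):
    """Return (next_cursor, batch); cursor 0 means the iteration is complete."""
    keys = sorted(store)
    batch = keys[cursor:cursor + count]
    next_cursor = cursor + count
    if next_cursor >= len(keys):
        next_cursor = 0
    return next_cursor, batch

store = {f"renzy:id::{i}": "x" * 1024 for i in range(10)}
collected, cursor = [], 0
while True:
    cursor, batch = toy_scan(store, cursor)
    collected.extend(batch)  # each call only touches a small batch
    if cursor == 0:
        break
print(len(collected))  # 10 -- full keyspace covered without one long blocking pass
```

On the command line, `redis-cli --scan --pattern 'renzy:id*'` gives the same non-blocking iteration.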