Mirror of https://codeberg.org/redict/redict.git
Commit 3c23b5ffd0
Georadius works by computing the center and neighbor squares covering the whole area of the specified position and radius. A distance filter is then used to remove elements that are actually outside the range.

When a huge radius is used, 5000 km or more, adjacent neighbors may collide and be the same square, leading to the same element being reported multiple times. This only happens in the edge case of a huge radius, but it is not ideal. A robust but slow solution would be to qsort the results and remove the duplicates. However, since the collisions only occur between adjacent boxes, given the way they are ordered in the code, it is much faster to simply check whether the current box is the same as the previous one processed (see the sketch below).

This commit adds a regression test for the bug. Fixes #2767.
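A minimal sketch of the deduplication idea in C, under the assumption described above. The `GeoBox` struct, `boxes_equal()`, and `collect_members_in_box()` are hypothetical stand-ins for the geohash box type and the per-box sorted-set lookup used by the real code; the only point illustrated is the comparison against the previously processed box, which avoids a sort.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical stand-in for the geohash box type. */
typedef struct {
    uint64_t bits; /* interleaved geohash bits of the box */
    uint8_t step;  /* geohash precision (bit pairs used) */
} GeoBox;

static int boxes_equal(const GeoBox *a, const GeoBox *b) {
    return a->bits == b->bits && a->step == b->step;
}

/* Hypothetical stand-in for the per-box lookup: the real code scans the
 * sorted-set score range covered by the box and collects the members
 * whose distance is within the requested radius. */
static int collect_members_in_box(const GeoBox *box) {
    printf("scanning box bits=%llu step=%u\n",
           (unsigned long long)box->bits, (unsigned)box->step);
    return 0; /* members found in this box */
}

/* Scan the center box plus its eight neighbors. With a huge radius,
 * adjacent neighbor boxes can degenerate into the same box. Because such
 * collisions only happen between boxes processed one after the other,
 * comparing each box against the previously processed one is enough to
 * avoid reporting the same members twice, with no sorting required. */
static int members_of_all_neighbors(const GeoBox boxes[9]) {
    int count = 0;
    int last_processed = -1;
    for (int i = 0; i < 9; i++) {
        if (last_processed >= 0 &&
            boxes_equal(&boxes[i], &boxes[last_processed]))
            continue; /* same box as the previous one: skip it */
        count += collect_members_in_box(&boxes[i]);
        last_processed = i;
    }
    return count;
}

int main(void) {
    /* Two adjacent neighbors collide into the same box (indexes 3 and 4):
     * only one of them is scanned. */
    GeoBox boxes[9] = {
        {1, 4}, {2, 4}, {3, 4}, {4, 4}, {4, 4}, {5, 4}, {6, 4}, {7, 4}, {8, 4}
    };
    printf("total members: %d\n", members_of_all_neighbors(boxes));
    return 0;
}
```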
Test files in the directory at this commit:

- type
- aofrw.tcl
- auth.tcl
- bitops.tcl
- dump.tcl
- expire.tcl
- geo.tcl
- hyperloglog.tcl
- introspection.tcl
- keyspace.tcl
- latency-monitor.tcl
- limits.tcl
- maxmemory.tcl
- memefficiency.tcl
- multi.tcl
- obuf-limits.tcl
- other.tcl
- printver.tcl
- protocol.tcl
- pubsub.tcl
- quit.tcl
- scan.tcl
- scripting.tcl
- slowlog.tcl
- sort.tcl