Optimizing ORDER BY in PostgreSQL on a table with 40 million rows
Posted by: xmge
Hi all, I have a database optimization question:
select xxxx,xxx from tables where a > xx order by update limit 100; both a and update have indexes.
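For context, the setup presumably looks something like the sketch below. All table, column, and index names and types here are assumptions inferred from the query and the plan further down (the plan's Sort Key is updated, so that spelling is used), not the poster's actual DDL.

-- Hypothetical, self-contained sketch of the described setup:
-- two separate single-column indexes, one on the filter column, one on the sort column.
CREATE TABLE tables (
    id      bigserial PRIMARY KEY,
    a       bigint,
    updated timestamptz,
    payload text
);
CREATE INDEX idx_tables_a       ON tables (a);        -- supports WHERE a > xx
CREATE INDEX idx_tables_updated ON tables (updated);  -- supports ORDER BY updated

-- Shape of the query being tuned (placeholder values):
SELECT id, payload
FROM tables
WHERE a > 42
ORDER BY updated
LIMIT 100;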
The table has about 40 million rows. Running EXPLAIN ANALYZE on the query gives the output below; almost all of the time goes to the Bitmap Heap Scan node with its Recheck Cond. How can this be fixed?
Sort  (cost=2058.65..2059.93 rows=512 width=89) (actual time=513.218..513.274 rows=333 loops=1)
  Sort Key: updated
  Sort Method: quicksort  Memory: 74kB
  ->  Bitmap Heap Scan on Tables_XXX  (cost=20.53..2035.61 rows=512 width=89) (actual time=15.101..512.814 rows=333 loops=1)
        Recheck Cond: ((user_id)::text = 'XXXXXXX'::text)
        Heap Blocks: exact=318
        ->  Bitmap Index Scan on XXXX_INDEX  (cost=0.00..20.41 rows=512 width=0) (actual time=11.006..11.006 rows=333 loops=1)
              Index Cond: ((user_id)::text = 'XXXXXXX'::text)
Planning time: 0.089 ms
Execution time: 513.320 ms
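Note that this plan filters on an equality against user_id rather than the range predicate in the quoted query, and nearly all of the 513 ms is spent under the Bitmap Heap Scan fetching the 318 heap blocks. For that equality-plus-ORDER-BY-plus-LIMIT shape, one commonly suggested option is a composite index with the filter column first and the sort column second, so the planner can walk the index in updated order and stop after 100 rows without a bitmap scan or a separate sort. This is a hedged sketch, not something confirmed in the thread; the index and table names below are placeholders.

-- Hypothetical composite index; CONCURRENTLY avoids blocking writes while it builds.
-- With an equality condition on user_id, an index scan over (user_id, updated)
-- returns rows already ordered by updated, so ORDER BY ... LIMIT 100 can read
-- index order directly instead of collecting all matches and quicksorting them.
CREATE INDEX CONCURRENTLY idx_tables_xxx_user_id_updated
    ON "Tables_XXX" (user_id, updated);

Re-running EXPLAIN ANALYZE afterwards would show whether the planner actually switches to an Index Scan on the new index.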
Replies (3)