I create a table with 43 million rows and populate it with random values from 1..200, so each value occurs roughly 220 thousand times (43,000,000 / 200 ≈ 215,000), distributed randomly across the table.
create table foo (id integer primary key, val bigint);
insert into foo select i, random() * 200 from generate_series(1, 43000000) as i;
create index val_index on foo(val);
vacuum analyze foo;
explain analyze select id from foo where val = 55;
Result: http://explain.depesz.com/s/fdsm
I expect a total runtime of under 1 s; is this possible? I have an SSD, an i5 CPU (1.8 GHz), 4 GB RAM, and PostgreSQL 9.3.
If I use an Index Only Scan, it works very fast:
explain analyze select val from foo where val = 55;
http://explain.depesz.com/s/7hm
But I need to select id, not val, so an Index Only Scan is not suitable in my case.
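(My guess at a workaround, which I have not verified against the timings: a composite index that also covers id should let the planner satisfy this query with an Index Only Scan as well. The index name below is just illustrative.)

create index val_id_index on foo (val, id);
explain analyze select id from foo where val = 55;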
Thanks in advance!
Additional Information:
SELECT relname, relpages, reltuples::numeric, pg_size_pretty(pg_table_size(oid)) FROM pg_class WHERE oid='foo'::regclass;
Result:
"foo";236758;43800000;"1850 MB"
Config:
"cpu_index_tuple_cost";"0.005";"" "cpu_operator_cost";"0.0025";"" "cpu_tuple_cost";"0.01";"" "effective_cache_size";"16384";"8kB" "max_connections";"100";"" "max_stack_depth";"2048";"kB" "random_page_cost";"4";"" "seq_page_cost";"1";"" "shared_buffers";"16384";"8kB" "temp_buffers";"1024";"8kB" "work_mem";"204800";"kB"