
Optimizing queries while being limited in the number of indexes I can use

I have a query that involves a join across three tables; call them A, B and C. A and C each have a primary key, and each is joined to B through an equi-join on one of the two foreign keys that B holds. The optimal indexing strategy would be to index all four keys. However, I am constrained in the number of indexes I can build (for reasons I'd rather not go into).

The tables are sized as follows: B has 6 million rows, A has 10 thousand, and C has 100 thousand.
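
For concreteness, the structure is roughly as follows (a sketch only: the column names are taken from the query shown in the edit below, and the column types are assumed to be integer purely for illustration):

-- Illustrative schema sketch; only the join keys are known to be integers,
-- the remaining types are placeholders.
CREATE TABLE A (
    a3 integer PRIMARY KEY,   -- joined to B.b3
    a1 integer,
    a2 integer,
    a5 integer                -- filtered in the query
);

CREATE TABLE C (
    c4 integer PRIMARY KEY,   -- joined to B.b4
    c1 integer,
    c2 integer,
    c7 integer,               -- filtered in the query
    c8 integer                -- filtered in the query
);

CREATE TABLE B (
    b1 integer,
    b2 integer,
    b6 integer,                      -- filtered in the query
    b3 integer REFERENCES A (a3),    -- foreign key to A; no index is created automatically
    b4 integer REFERENCES C (c4)     -- foreign key to C; no index is created automatically
);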

My question is whether I should build separate indexes on each of B's two foreign keys (to table A and to table C), or whether I would benefit more from a single composite index on both foreign keys of B.
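
Concretely, the two alternatives would look something like this (the index names are made up; b3 and b4 are B's foreign keys to A and C, as in the query below):

-- Option 1: two single-column indexes, one per foreign key of B
CREATE INDEX b_b3_idx ON B (b3);
CREATE INDEX b_b4_idx ON B (b4);

-- Option 2: a single composite index covering both foreign keys
CREATE INDEX b_b3_b4_idx ON B (b3, b4);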

The size of an index is not a major issue, but the number of indexes has to be minimized while retaining performance. There are also some filter conditions on attributes of A, B and C.

Note: The database is PostgreSQL 12 and has a limited shared_buffers setting.

Edit: The query structure is:

SELECT A.a1, A.a2, B.b1, B.b2, C.c1, C.c2
FROM
   A INNER JOIN B ON A.a3 = B.b3 AND A.a5 = <some value> AND B.b6 = <some value>
     INNER JOIN C ON B.b4 = C.c4 AND C.c7 = <some value> AND C.c8 = <some other value>;

I was also wondering what other optimizations I could make to improve query performance, including but not limited to indexing and tuning the query planner. The joins are all on integers.
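
For reference, a minimal way to see what the planner currently does with this query, and whether the row estimates and buffer usage look reasonable, is something like the following (nothing beyond the query above is assumed):

-- Show the actual plan, per-node timings and buffer hits/reads (PostgreSQL 12):
EXPLAIN (ANALYZE, BUFFERS)
SELECT A.a1, A.a2, B.b1, B.b2, C.c1, C.c2
FROM
   A INNER JOIN B ON A.a3 = B.b3 AND A.a5 = <some value> AND B.b6 = <some value>
     INNER JOIN C ON B.b4 = C.c4 AND C.c7 = <some value> AND C.c8 = <some other value>;

-- Keep planner statistics fresh so the join row estimates stay reasonable:
ANALYZE A;
ANALYZE B;
ANALYZE C;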


Answer

If your query looks like:

select . . .
from b
     join a on a.a_id = b.a_id
     join c on c.c_id = b.c_id

Then no additional indexes are really necessary. The primary key indexes on a and c are sufficient.
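
To confirm what the primary keys already provide, a quick check along these lines works (a sketch; it assumes the tables are literally named a, b and c in the current schema):

-- List existing indexes, including the ones backing the primary keys:
SELECT tablename, indexname, indexdef
FROM pg_indexes
WHERE tablename IN ('a', 'b', 'c');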
