
Avoid ‘out of shared memory’ error in PostgreSQL function

I have two tables: companies_display, which holds information about publicly traded companies such as ticker symbol and market cap, and a partitioned table stock_prices with the historical stock prices for each company. I want to calculate the beta of each stock and write it into companies_display. For that I wrote the function calculate_beta(ticker):
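The function body wasn't preserved in the post, so the following is only a minimal sketch of what calculate_beta could look like. It assumes stock_prices has columns ticker, date and close, and that the benchmark index is stored in the same table under the ticker 'SPY'; those column names and the benchmark ticker are assumptions, not taken from the question.

    CREATE OR REPLACE FUNCTION calculate_beta(p_ticker text)
    RETURNS numeric
    LANGUAGE sql
    AS $$
        -- Daily returns of the stock and the benchmark, matched by date.
        WITH returns AS (
            SELECT s.close / lag(s.close) OVER (ORDER BY s.date) - 1 AS stock_ret,
                   m.close / lag(m.close) OVER (ORDER BY s.date) - 1 AS market_ret
            FROM stock_prices s
            JOIN stock_prices m ON m.date = s.date AND m.ticker = 'SPY'
            WHERE s.ticker = p_ticker
        )
        -- Beta = covariance(stock returns, market returns) / variance(market returns).
        SELECT covar_samp(stock_ret, market_ret) / var_samp(market_ret)
        FROM returns;
    $$;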

The function works for just a few companies, but when I tried to update the whole table
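The statement itself wasn't preserved; presumably it was a single UPDATE across all rows, along these lines (assuming a beta column on companies_display):

    UPDATE companies_display
    SET beta = calculate_beta(ticker);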

I get an ‘out of shared memory’ error with a hint that max_locks_per_transaction is too low. My guess is that each call takes locks on the stock_prices partitions and that those locks aren't released until the whole update has run to completion. What I tried was to create a loop:
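The loop isn't shown in the post either; a plausible reconstruction is a DO block that updates one company per iteration:

    DO $$
    DECLARE
        t text;
    BEGIN
        -- One UPDATE per company, hoping the locks get released in between.
        FOR t IN SELECT ticker FROM companies_display LOOP
            UPDATE companies_display
            SET beta = calculate_beta(t)
            WHERE ticker = t;
        END LOOP;
        -- Note: a DO block still executes as a single transaction,
        -- so the partition locks accumulate across iterations anyway.
    END $$;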

but I ran into the same problem. Is there a way to release the locks on stock_prices after each beta calculation, or to do the update in batches? With about 5 companies at a time my method works.


Answer

In case someone runs into the same problem: I avoided the error by calling my SQL function from the server I use to update my database, one call per company. That way I get one transaction per company and don't accumulate too many locks.
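The answer doesn't show the server-side code. The essential point is that each UPDATE is sent as a separate statement in autocommit mode, so every statement commits, and releases its partition locks, before the next one starts. The tickers below are purely illustrative:

    -- Each statement sent on its own in autocommit mode is its own
    -- transaction, so the locks on stock_prices are freed after every company.
    UPDATE companies_display SET beta = calculate_beta('AAPL') WHERE ticker = 'AAPL';
    UPDATE companies_display SET beta = calculate_beta('MSFT') WHERE ticker = 'MSFT';
    -- ... one statement per ticker, issued by the application loop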
