Forum OpenACS Q&A: Postgres deadlock & maximum execution time

Is it possible to set a maximum execution time on Postgres queries?
Sometimes my Postgres server is overloaded, probably because of a wrong query. Stopping the browser or even AOLserver does not help.
I had either to wait several minutes or restart Postgres in order to get back to a normal load.
Posted by Dave Bauer on
You should find and fix those long-running queries.

You can cancel a long running query without stopping AOLserver or PostgreSQL.
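
For the record, a long-running query can be cancelled from a separate psql session. This is a sketch, not from the original post; the pid 12345 is a placeholder, and the column names pid/query assume PostgreSQL 9.2 or later (older versions call them procpid and current_query):

-- find the backend running the slow query
SELECT pid, state, query FROM pg_stat_activity WHERE state = 'active';

-- cancel just that query; the session stays alive
SELECT pg_cancel_backend(12345);

-- as a last resort, terminate the whole backend session
SELECT pg_terminate_backend(12345);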


Posted by Marc Kalberer on
My problem is a bit different.
I had to set up an advanced search engine with a lot of criteria. Depending on which categories/options are chosen, the query can run for a very long time. I optimized and debugged it, but some cases remain. That's why I wanted to put a timeout on queries.
Is that possible?
Posted by Gustaf Neumann on

This is an old unanswered question, but maybe someone will find the following helpful: PostgreSQL supports a configuration variable named statement_timeout for limiting the maximum execution time of a query. This value can be changed per SQL session.

Since an SQL session in OpenACS is typically reused for several queries via the connection pool, one should sandwich the potentially slow query with code like this:

set old_statement_timeout [db_string _ {SHOW statement_timeout}]
db_dml _ {SET statement_timeout = '13s'}
# ... your potentially slow queries here
db_dml _ {SET statement_timeout = :old_statement_timeout}
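
If the slow queries already run inside a transaction, the save/restore dance can be avoided: PostgreSQL's SET LOCAL reverts the setting automatically at commit or rollback. A sketch using OpenACS's db_transaction (the '13s' value is just an example, as above):

db_transaction {
    # SET LOCAL lasts only until the end of the transaction
    db_dml _ {SET LOCAL statement_timeout = '13s'}
    # ... your potentially slow queries here
}
# statement_timeout is back to its previous value here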

... and probably wrap such code in a transaction. The value for statement_timeout can also be set in the configuration file, but then by default all queries are limited by it, unless one raises the limit per session as above.
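
For reference, the server-wide setting goes into postgresql.conf; the default of 0 means no timeout, and the 30s below is just an example value:

statement_timeout = '30s'    # abort any statement running longer than 30 seconds

After editing the file, reload the configuration, e.g. with pg_ctl reload or SELECT pg_reload_conf().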

In general, it is desirable to limit the maximum execution time of SQL queries issued from web pages to ensure the liveness of the system. In practice, developers should make sure that overlong queries do not happen in the first place.