Describe the Bug
If a page contains a very long table, the response time when saving increases to very high values (>= 30s). This leads to request timeouts with HTTP 500 server errors and thus to discarded changes.
As a temporary workaround, I added the following line to the public/.htaccess file, which increases the PHP request timeout:
# Increase PHP execution time to prevent timeouts on write
php_value max_execution_time 60
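
Note that php_value directives in .htaccess only take effect when PHP runs as an Apache module (mod_php). If PHP runs under PHP-FPM instead, an assumed equivalent would be raising the limit in the active php.ini or pool configuration:

max_execution_time = 60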
Steps to Reproduce
- Create a page with a long table (in our case: about 300 rows; 4 columns; with code formatting within cells); a generator sketch is included below.
- Save the page.
- Wait until the page has been saved (and measure the request time).
- If the webserver's timeout is exceeded, the request is discarded and the changes are lost.
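
For reference, a minimal sketch (not part of the original report) that generates test content of roughly this shape; the row/column counts and cell contents are placeholders. Paste the output into a page, or push it via the API:

<?php
// Sketch: generate a table similar to the one described above
// (~300 rows, 4 columns, <code> formatting inside cells).
$rows = 300;
$cols = 4;
$html = '<table>';
for ($r = 1; $r <= $rows; $r++) {
    $html .= '<tr>';
    for ($c = 1; $c <= $cols; $c++) {
        $html .= "<td><code>value_{$r}_{$c}</code></td>";
    }
    $html .= '</tr>';
}
$html .= '</table>';
echo $html;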
Expected Behaviour
Shorter response times, comparable to those of pages without long tables.
Screenshots or Additional Context
Interestingly, the timeout is always triggered by exactly the same line of code, which according to the logs is:
[2022-01-01 00:00:00] production.ERROR: Maximum execution time of 30 seconds exceeded (...) at /var/www/bookstack/app/Entities/Tools/PageContent.php:231)
The $xpath->query('//body//*//*[@href="' . $old . '"]') call seems to cause the problem in combination with a large number of HTML elements, which is exactly the situation when a page contains large HTML tables.
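
For illustration, here is a minimal standalone sketch (not the BookStack code) that reproduces the pattern: it builds a large table with DOMDocument and times the reported expression against a flatter, almost-equivalent one using a single descendant axis (which additionally matches direct children of body). The assumption is that the nested '//*//*' descendant axes are what scales badly with the element count; this is only a hypothesis, not the project's actual fix.

<?php
// Standalone sketch comparing the reported XPath expression with a flatter
// alternative on a document containing a large table.
$body = '';
for ($r = 0; $r < 300; $r++) {
    $body .= '<tr><td><code>a</code></td><td><code>b</code></td>'
           . '<td><code>c</code></td><td><code>d</code></td></tr>';
}
$html = '<html><body><table>' . $body . '</table><p><a href="/old-link">x</a></p></body></html>';

libxml_use_internal_errors(true);
$doc = new DOMDocument();
$doc->loadHTML($html);
$xpath = new DOMXPath($doc);
$old = '/old-link';

$start = microtime(true);
$xpath->query('//body//*//*[@href="' . $old . '"]');  // expression from the log
printf("nested descendant axes: %.4fs\n", microtime(true) - $start);

$start = microtime(true);
$xpath->query('//body//*[@href="' . $old . '"]');     // flatter near-equivalent (hypothetical)
printf("single descendant axis: %.4fs\n", microtime(true) - $start);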
Browser Details
No response
Exact BookStack Version
v22.11.1
PHP Version
No response
Hosting Environment
Debian 11 with Apache HTTPD (probably shouldn't matter much).