We are tracking down a problem that surfaced recently.
We have an Xowiki page with a very long content body, over 87,000 characters.
When attempting to edit and save this page, the POST body is dropped and never reaches the request processor or the xowiki package. Logging added to the rp_handler procedure and to xowiki's index.vuh shows no post data in [ns_conn query] or [ns_getform].
I also added logging to the ns_getform procedure to see whether I could figure out what was happening from that direction, but that did not reveal anything either.
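To see whether the body ever reaches the server at all, a log line like the one below can be added near the top of rp_handler. This is only a sketch, assuming NaviServer: [ns_conn contentlength] reports the Content-Length announced by the client, and [ns_conn contentfile] is non-empty when the driver has spooled the body to a temporary file instead of keeping it in memory.

# hypothetical debug line near the top of rp_handler
ns_log Notice "rp_handler [ns_conn url]:\
    content-length=[ns_conn contentlength]\
    spooled-to='[ns_conn contentfile]'"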
For example, this debugging line in xowiki's index.vuh shows that in most cases the form data is received, but when the form data is very long it comes back empty:
# ns_getform returns "" when no form data was received, and
# ns_set array errors on an empty set id, so guard the call:
set form [ns_getform]
::$package_id log "--starting... [ns_conn url] [ns_conn query]\
    form vars = [expr {$form ne "" ? [ns_set array $form] : "NO FORM DATA"}]"
I checked the maxupload setting, which controls whether post data is written to a temporary file. I tested values of 10000, 100000 (the original setting), and 1000000.
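For reference, maxupload is a driver parameter in the server config. A minimal sketch, assuming a NaviServer nssock section (the values are only illustrative; maxinput, the overall body-size cap, is shown alongside for context):

ns_section ns/server/${server}/module/nssock
# largest request body the driver accepts at all
ns_param maxinput 1048576
# bodies larger than this are spooled to a temporary file
ns_param maxupload 1000000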
Increasing the maxupload setting in the config file to 1 million bytes does solve the problem. Is this a problem we should try to fix? Should a large form upload (without a file attachment, just a textarea) fail when it is larger than this setting?