Forum OpenACS Development: Bug in xowiki page includelets resolution when viewing live revision

If you go to an xowiki page that contains an include of another xowiki page, like
{{ page_title }}
and you click "Revisions" and then click the magnifying glass icon next to the latest/live revision, you will get an infinite loop.

Here is what happens.

When the parser encounters the include tag, resolve_request is called, and then this bit of code runs:

if {$item_id != 0} {
  set revision_id [my query_parameter revision_id 0]
  # if a revision_id query parameter was given, clear item_id; otherwise clear revision_id
  set [expr {$revision_id ? "item_id" : "revision_id"}] 0
  #my log "--instantiate item_id $item_id revision_id $revision_id"
  set r [::Generic::CrItem instantiate -item_id $item_id -revision_id $revision_id]
  $r destroy_on_cleanup
  #my log "--instantiate done CONTENT\n[$r serialize]"
  $r set package_id [namespace tail [self]]
  return $r
} else {
  return ""
}

This reads the revision_id from the URL query parameters and passes it to instantiate, which uses that revision_id. But the query parameter refers to the parent page, so this instantiates the parent page's revision again, which is parsed again, reaches the include again, and so on.

I noticed that in a later version of xowiki the code was changed to use

set r [::xo::db::CrClass get_instance_from_db -item_id $item_id -revision_id $revision_id]

which might fix it. I am not sure of all the implications of this change, and the xo::db::CrClass call is not available in my version.

Any advice?

Dave,

as I see from my mails, I fixed this bug more than a year ago. I would certainly recommend upgrading to a newer version of xowiki. We have started to use the HEAD version of xotcl-core + xowiki on our production system (the major blocker on my side to releasing xowiki + xotcl-core from HEAD as stable is updating the documentation).

However, if for some reason you can't update, the following changes might help you:
http://cvs.openacs.org/cvs/openacs-4/packages/xotcl-core/tcl/context-procs.tcl?r1=1.21&r2=1.22
http://cvs.openacs.org/cvs/openacs-4/packages/xowiki/tcl/xowiki-procs.tcl?r1=1.122&r2=1.123
http://cvs.openacs.org/cvs/openacs-4/packages/xotcl-core/tcl/context-procs.tcl?r1=1.20&r2=1.21

-gustaf

Hi Gustaf,

The code from HEAD (I have version 1.39 of context-procs.tcl, for example) exhibits the same behavior. It appears the code has been completely rewritten, so the patches do not help.

I am using xotcl-core 0.89 released 2008-08-25.

Any ideas?

As far as I can tell, this problem does not exist in the HEAD version of xowiki + xotcl-core. Just now, I created two pages, where the second one includes the first one. Clicking on "view" under "revisions" on the including page works fine.

Note, as indicated before, an updated xotcl-core is not sufficient; xowiki should be updated as well. Note that the mentioned patch was between xowiki-procs 1.122 and 1.123; xowiki-procs is at version 1.306 in HEAD.

I am not sure what xowiki version you are using. It might be easiest to upgrade both xotcl-core and xowiki to HEAD.

-gustaf

I fixed this problem by increasing the stacksize to 1 MB from 512 KB.

I am not sure why it was only 512 KB; I think at least 1 MB is the minimum with the code we are using these days.
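For reference, the per-thread stack size is typically set in the AOLserver config file; this is a sketch of the relevant fragment (section and parameter names as in the standard sample config, with the value mentioned above):

```tcl
# in the server config file (e.g. config.tcl); stacksize is in bytes
ns_section "ns/threads"
ns_param   stacksize [expr {1024 * 1024}]  ;# 1 MB instead of 512 KB
```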

Gustaf, what stacksize do you use?

Note: you were right, there isn't any bug; it was just the depth of the call stack when processing a very long wiki page to generate a diff.

One issue that could still come up: if I get a document twice the size of the one I had a problem with, it might again run out of stack. There might be some place in the code that could be changed to nest the calls less deeply, but I haven't evaluated that.

512 KB is clearly too small. We use 2 MB on all our systems (including production, where we have about 100 connection threads configured). However, I don't see where the document size would have an influence on the stack size needed for the diff. As long as there is no recursion, the document size should not require a larger stack. Are you already using the version with ::util::diff? If you see a recursion, where does it happen?

I haven't evaluated it yet. I just saw a significant difference in the time it took to generate the diffs when the size of the content increased.

I will look into the code. It is using the standard xowiki diff, not util::diff, in this instance.

A difference in time depending on size is OK; different stack consumption would be bad. The Tcl-based diffs load both versions of the document into memory and build Tcl lists from them, for which the longest common subsequences are computed.
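For illustration, the core of such a diff is a longest-common-subsequence computation over the two token lists. A minimal iterative sketch (not the actual xowiki code; the proc name is made up) shows why it needs no recursion and hence constant stack depth:

```tcl
# LCS length over two Tcl lists via dynamic programming, keeping only
# one previous row; stack depth stays constant regardless of input size.
proc lcs_length {a b} {
    set n [llength $b]
    set prev [lrepeat [expr {$n + 1}] 0]
    foreach x $a {
        set cur [list 0]
        set j 0
        foreach y $b {
            if {$x eq $y} {
                lappend cur [expr {[lindex $prev $j] + 1}]
            } else {
                lappend cur [expr {max([lindex $prev [expr {$j + 1}]], [lindex $cur $j])}]
            }
            incr j
        }
        set prev $cur
    }
    return [lindex $prev end]
}

# lcs_length {a b c d} {a c d e}  ;# -> 3 (common subsequence "a c d")
```

The running time is proportional to the product of the two list lengths, which matches the observation that diff time grows with content size even though stack use does not.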

It is well established that scripted solutions are slow on large input files (see e.g. the comments on http://esw.w3.org/topic/HtmlDiff). Currently, the best free alternative seems to be daisydiff from Google Code (a GSoC project in 2007), and maybe the PHP implementation for MediaWiki based on the same algorithm (a GSoC project in 2008): http://code.google.com/p/daisydiff/ . If someone has the time, interest, and skills, it sounds like a nice little project to compare these in detail with the Tcl-based HTML diff and maybe port one of them to Tcl.

-gustaf neumann