Forum OpenACS Q&A: Experience with Online Tests?

Posted by Nima Mazloumi on
Hi everybody,

Does anyone have experience with creating online tests using OpenACS? We have the survey package, but as far as I can see it doesn't support marking correct answers so that the user gets short feedback on his input. This would be a further extension.

If someone knows of a tool for creating online tests, I would be grateful for more info.

Best regards,
Nima

Posted by Joel Aufrecht on
This thread: https://openacs.org/forums/message-view?message_id=155046 seems to summarize the state of online tests in OpenACS pretty well.  My impression from it is that we have a bunch of survey packages with some features, and that a full-featured system doesn't exist for OpenACS but is being ported - ETA unknown.

Meanwhile, I put together a fairly simple app which does online testing and shows correct answers (and lets test-takers argue with the correct answers) and uploaded it to the repository just today.  It's called "Vocabulary" and you can install it from the repository, or see it (with some sample data) at aufrecht.org/vocabulary.  If a standard solution ever shows up in OpenACS, I'll modify my package to use it instead.

Posted by Nima Mazloumi on
Dear Joel,

From what I now know, the evaluation system they are developing is an improvement of the homework component, not an assessment tool for students to test their own knowledge. What is needed is something like the survey package, extended to support this feature: it should collect entries on the one hand, but also send a reply back to the student as feedback. Thus each question type should also have an optional right/wrong checkbox and an optional textarea (*) for the right/wrong reply.

As soon as the user submits, all the entries are stored and evaluated; then a dynamic reply page, assembled as a combination of (*) depending on the user's entries, is created and sent back to the student.
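
To make that concrete, the evaluation step I have in mind looks roughly like this (only a sketch in Python to illustrate the idea; in OpenACS it would of course be Tcl/SQL, and all the names here are made up):

    # Sketch only: each question optionally carries a correct answer plus a
    # "right" and a "wrong" feedback text (*); the reply page is assembled
    # from those texts depending on what the user entered.
    def build_feedback_page(questions, responses):
        """Return (question_text, feedback_text) pairs for the reply page."""
        page = []
        for q in questions:
            if q.get("correct_answer") is None:
                continue  # question is not graded, nothing to report
            if responses.get(q["id"]) == q["correct_answer"]:
                page.append((q["text"], q.get("right_reply", "Correct.")))
            else:
                page.append((q["text"], q.get("wrong_reply", "Sorry, wrong.")))
        return page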

Does this make sense to you?

I will play a bit with the existing survey tool to see how much effort the implementation would take.

Regards,
Nima

Posted by Joel Aufrecht on
"Does this make sense to you?"

Do you mean something like this: http://aufrecht.org/pictures/base-photo?photo_id=14200

Posted by Nima Mazloumi on
Kind of. But there should be support for multiple choice, integers, natural numbers, free text. Similar to the survey package.
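
Just to illustrate, checking the different answer types could look roughly like this (again only a Python sketch with invented names, not actual survey code):

    # Sketch: type-specific checking; free text cannot be graded automatically.
    def check_answer(question_type, correct, given):
        """True/False for automatically gradable types, None for free text."""
        if question_type == "multiple_choice":
            return set(given) == set(correct)   # exact set of chosen options
        if question_type == "integer":
            return int(given) == int(correct)
        if question_type == "natural_number":
            return int(given) >= 0 and int(given) == int(correct)
        if question_type == "free_text":
            return None                         # needs manual review
        raise ValueError("unknown question type: %s" % question_type)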
Posted by Malte Sussdorff on
The new version of survey (a.k.a. assessment) will have these capabilities. If you need it *now*, we can certainly hack something together based on the design specs of assessment and complex survey. It would basically add as_item_checks (see the specs at https://openacs.org/projects/openacs/packages/assessment/design/metadata).

But as this work would be redundant, I'd advise not to go down that road but to wait the 4-6 months until assessment gets released. Mind me being very cautious here, but we are facing a complex system which we somehow have to integrate with the grade book functionality as well (if you have an assessment, you can add points to each question, and those points are aggregated and delivered to the grade system depending on the answers). That is a whole different story, though, and we need to be very careful about how best to integrate without reimplementing a lot of functionality (after all, if you have free-text answers that have to be graded manually, you need to add that functionality as well).
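
To make the grade book point concrete: the aggregation itself is simple enough, the tricky part is everything around it, such as free-text answers that need manual grading. A rough Python sketch (made-up names, not assessment code):

    # Sketch: aggregate per-question points into a total for the grade book.
    def aggregate_points(questions, responses, manual_grades):
        """Sum points; free-text questions take theirs from manual grading."""
        total = 0
        for q in questions:
            if q["type"] == "free_text":
                # graded by hand later; counts 0 until a grade is entered
                total += manual_grades.get(q["id"], 0)
            elif responses.get(q["id"]) == q["correct_answer"]:
                total += q["points"]
        return total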

Posted by Venkatesh Goteti on
One of the best people to ask about this would be Ernie Ghiglione, whose initial work on an online tests/exam package is being used on AIESEC.net; he was doing this kind of work on the project there. I think there have been improvements to it since then, and I am sure someone around here can give a more recent update on what has happened with it.
Posted by Ernie Ghiglione on
Nima,

Venky is referring to this post:

https://openacs.org/forums/message-view?message_id=83158

It is an Oracle-only test package that allows you to create tests. Each test has sections, and the sections have questions. Questions can have one or multiple answers. For each question, you can have a correct or wrong message to display to the user. You can also set a passing score for each test (a percentage of questions).

A cool feature it has is that it lets you set the order in which questions are rendered within a section, so you can choose to have X questions randomly picked from a larger pool of questions.

In addition, it has the killer-question concept: even if you get every other answer right, answering a killer-question wrong means you automatically fail (regardless of your previous answers).
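
From memory, the scoring logic was roughly along these lines (a Python sketch only; the package itself is Tcl/Oracle, and the names here are invented):

    import random

    # Sketch: pick X questions per section from a larger pool, then apply the
    # passing-score percentage and the killer-question rule described above.
    def pick_questions(pool, how_many):
        return random.sample(pool, how_many)

    def passed(questions, responses, passing_percentage):
        correct = 0
        for q in questions:
            is_right = responses.get(q["id"]) == q["correct_answer"]
            if q.get("killer") and not is_right:
                return False   # a wrong killer-question is an automatic fail
            if is_right:
                correct += 1
        return 100.0 * correct / len(questions) >= passing_percentage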

I can't really recall all the details, but it has a good bunch of features you might find interesting.

Malte is working on the assessment tool, and the specs sound very, very cool. Maybe it would be worth working with him and others on this.

Ernie