Forum .LRN Q&A: Assessment Status Report

Posted by Malte Sussdorff on
This is to give everyone a heads-up on what is happening with regard to assessment and what I see as necessary to make swift progress over the next couple of weeks.

First of all, the use cases are ready at https://openacs.org/projects/openacs/packages/assessment/requirements/use_cases. Please have a look at them and, in particular, add any missing ones as comments. The use cases are the primary drivers of future development and are therefore very important.

A list of attributes needed by the University of Heidelberg was compiled some time ago at https://openacs.org/projects/openacs/packages/assessment/rfc/. This is already a more technical approach to the functionality, as it defines the attributes and the question types.

The requirements derived from the use cases are detailed at https://openacs.org/projects/openacs/packages/assessment/requirements/, with an overview of what the assessment system aims to achieve and the list of item types we want to support. This is work in progress and needs an update based on the use cases.

As for the design, it is clear that we need to use the content repository for the benefits of categorisation, i18n of content, and versioning. The data model, along with explanations, has been beautifully written down by Stan at https://openacs.org/projects/openacs/packages/assessment/design/. Furthermore, an understanding has been reached to build as much display logic into the ACS core as possible, to make the code reusable for other packages as well (e.g. a multi-dimensional matrix with the special case of a Likert scale: https://openacs.org/forums/message-view?message_id=179439).

The current design challenge is to decide whether to make use of xoTCL and follow an OO paradigm. The benefit for development is the ability for an item to inherit additional attributes depending on its item_type and display_type. As there is not much experience with xoTCL outside Vienna University, it is a risky approach, but I think, given their experience so far, a worthwhile one.

With regard to development, the E-Lane project went ahead and implemented a first stab at the assessment system to import QTI data. This has to be amended to make correct use of the content repository, along with some other minor corrections, but it is great to know that we have a working QTI import.

Development happens solely on OpenACS CVS HEAD. The package has a maturity level of "0" (don't use, in development) and is located in the packages directory. Currently only E-Lane is developing there; my assumption is that I will start working there once the specs are more solidified, and I hope Stan can jump in soon as well.

One main reason for the delay in the continuing refinement of the specifications has been the urge to have the assessment system reuse as much existing (and planned) OpenACS technology as possible. Taking into account the discussion around an object-oriented approach for OpenACS 6 and the easy creation of objects with dynamic forms, this sounded like a worthwhile effort to wait for and watch. Sadly, nothing will happen in that direction soon, so it makes more sense for the assessment crew to go ahead and develop the dynamic form creation on its own.

What needs to happen in the near future:

- Come to a decision on how to store the additional attributes for each item (so development can start). Current favourite: XML in one column (though Dave Bauer disagrees). Alternative: create a different cr_type for each combination of display_type and item_type, which IMHO is a little too much work. Or use a single additional as_items_attributes table, as designed in the specifications at the moment, though this might prove to be a hassle to integrate with the CR.

- Develop the item_repository supporting all the necessary question types and integrate the work done on the QTI import.
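To make the "XML in one column" option above more concrete, here is a minimal sketch (not OpenACS code, and the element/attribute names are invented for illustration) of serializing an item's type-dependent attributes into a single XML string that would live in one content-repository column:

```python
# Hypothetical sketch of the "XML in one column" storage option:
# type-dependent item attributes are packed into a single XML string
# that would be stored in one CR column. All names here are invented.
import xml.etree.ElementTree as ET

def attrs_to_xml(attrs: dict) -> str:
    """Serialize type-dependent attributes into one XML string."""
    root = ET.Element("attributes")
    for name, value in attrs.items():
        node = ET.SubElement(root, "attribute", name=name)
        node.text = str(value)
    return ET.tostring(root, encoding="unicode")

def xml_to_attrs(xml: str) -> dict:
    """Parse the XML column back into a dict of attribute strings."""
    root = ET.fromstring(xml)
    return {a.get("name"): a.text for a in root.findall("attribute")}

# A multiple-choice item might carry extra attributes like these:
mc_attrs = {"num_choices": 4, "allow_multiple": "f", "shuffle": "t"}
xml_column = attrs_to_xml(mc_attrs)
assert xml_to_attrs(xml_column)["shuffle"] == "t"
```

The trade-off this illustrates: one column keeps the CR integration simple, but every value round-trips as a string, and the attributes are opaque to SQL queries, which is presumably the core of the disagreement.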

Next steps with regards to the specifications:

- Amend the requirements to incorporate the use cases and the needs of the University of Heidelberg.
- Decide on the storage of additional attributes.
- Flesh out the design for item_checks (aka testing).
- Define a strategy for internal grading and for exporting the grades to the evaluation package.

Posted by Malte Sussdorff on
After quite some talking with various people inside and outside the community, I have updated the design pages to incorporate the design decisions. I will not retouch these specifications again unless asked to for good reasons :).

A fundamental decision has been made, which makes its way through the whole design specification.

  • As many attributes as possible of an object (item, choice, section, assessment, ...) will go into a general cr_item_type.
  • Additional, type-dependent attributes will be added by linking an object of that type with the original object.
What does this mean? Let's take an item as an example (the item description is up at https://openacs.org/projects/openacs/packages/assessment/design/as_items).
  1. An item stores its core information in an object of the as_item type. Core information would be name, description, and the ADP snippet.
  2. Depending on what kind of item it is (e.g. multiple choice), a second object of the type associated with that kind (e.g. the as_item_multiple_choice type) will be created, and a relationship between the two objects will be established. If we copy an item, we might as well create a new relationship between the copied question object and the type object associated with the original object.
  3. Depending on how we want to display the item (let's say as a textarea), a new object of the as_display_textarea type will be created and associated with the item. For faster processing, we are going to store the widget type and display_code (for HTML) as attributes of the as_item object itself, but this can be overruled by the associated display_type.
The specification for the display functionality can be found at https://openacs.org/projects/openacs/packages/assessment/design/display.
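The linking scheme from the steps above can be sketched roughly as follows. This is not OpenACS code; the class and field names are illustrative stand-ins for the as_item, as_item_multiple_choice, and as_display_textarea types, and plain object references stand in for CR relationships:

```python
# Minimal sketch of the design: a core item object holds shared attributes,
# while kind-specific and display-specific attributes live in separate
# objects tied to the item by relationships. Names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class AsItem:
    """Core item: shared attributes plus cached display hints."""
    name: str
    description: str
    adp_snippet: str
    widget_type: str = ""   # cached on the item for fast rendering ...
    display_code: str = ""  # ... but overrulable by the display object
    related: list = field(default_factory=list)  # stands in for relationships

@dataclass
class AsItemMultipleChoice:
    """Kind-specific object, linked to the item rather than subclassing it."""
    choices: list

@dataclass
class AsDisplayTextarea:
    """Display-specific object; its settings overrule the cached hints."""
    rows: int = 5
    cols: int = 60

item = AsItem("capitals", "Pick the capital of France.", "<adp>...</adp>",
              widget_type="textarea")
item.related.append(AsItemMultipleChoice(["Paris", "Lyon", "Marseille"]))
item.related.append(AsDisplayTextarea(rows=3))

# Copying an item can re-link the copy to the same kind object instead of
# duplicating it, as suggested for the copy case above.
copy = AsItem(item.name, item.description, item.adp_snippet,
              related=list(item.related))
assert copy.related[0] is item.related[0]
```

The point of the sketch is the composition-over-inheritance choice: the item never needs to know the full attribute set of every kind/display combination up front, and a copy can share the linked type object with its original.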

The types of items have already been identified at https://openacs.org/projects/openacs/packages/assessment/requirements/item_types. To my knowledge these should cover all use cases, but I'm more than happy to incorporate more.

Internationalization of content will be done in line with the specs/ideas posted by Joel.

What does this mean for assessment:

  • We have all information ready to work on the item repository, store the information in the database and display it on one page.
  • The current import and data model written by Eduardo and Alvaro should be amended to make use of the description (and the CR).
  • Nearly all functionality asked for in the use cases is reflected in the specifications (somewhere); we now need to map this to the use cases.
  • We need feedback on whether the specifications are understandable enough for a user wanting to evaluate whether assessment is of good use (https://openacs.org/projects/openacs/packages/assessment/requirements/) and, if not, why and how we can improve them.
  • Feedback from developers on the design specifications would also be highly appreciated, especially from the E-Lane folks who already went ahead and wrote the QTI import.
Posted by Malte Sussdorff on
A new section on response handling and storage has been added. This should allow developers to extend the QTI import to also import answers to questions.