Forum .LRN Q&A: Re: Request for advice from the OCT

Posted by Ola Hansson on
Malte, your description of the integration between Assessment and Curriculum (well, the SS engine, anyway) goes pretty much hand in hand with the way I picture it.

The expectation that SS modifies a user's permissions on an assessment is interesting, and something I hadn't thought about. It sounds a bit tricky, but I'm sure we'll work something out.

Posted by Malte Sussdorff on
The reason I bring up modification of permissions is my understanding that packages don't need scheduling functionality of their own, as this can be solved with permissions.

Take Assessment. The assessment "bar" is available to user "foo" from 10am to 5pm. The scheduler will open the "write" permission for user "foo" on assessment "bar" at 10am and remove it again at 5pm.

The same is true for SS granting access to an assessment. Or take files: let's assume a student should only see the document on advanced mathematics if he passed the introduction test. SS would then take the grade from Assessment (either pull or push) and grant the "read" permission on the file "advanced mathematics".
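To make the idea concrete, here is a minimal sketch of scheduling-by-permissions. Everything in it (the `grant`/`revoke` helpers, the in-memory permission store) is invented for illustration; it is not the OpenACS permissions API:

```python
import sched
import time

# Hypothetical in-memory permission store: (user, object) -> set of privileges.
permissions: dict = {}

def grant(user, obj, privilege):
    permissions.setdefault((user, obj), set()).add(privilege)

def revoke(user, obj, privilege):
    permissions.get((user, obj), set()).discard(privilege)

def schedule_window(scheduler, user, obj, privilege, open_at, close_at):
    """Grant a privilege at open_at and revoke it again at close_at."""
    scheduler.enterabs(open_at, 1, grant, (user, obj, privilege))
    scheduler.enterabs(close_at, 1, revoke, (user, obj, privilege))

# Assessment "bar" is writable by user "foo" from 10am to 5pm; for the
# demo the window is compressed to a fraction of a second.
s = sched.scheduler(time.time, time.sleep)
now = time.time()
schedule_window(s, "foo", "bar", "write", now + 0.01, now + 0.02)
s.run()  # runs both events; the permission ends up revoked again
```

The point of the sketch is only that no package needs its own scheduling code: a single scheduler toggling permissions gives every permission-aware package time-windowed behaviour for free.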

The beauty of this approach is its simplicity, and the fact that most existing packages need not be modified to know about a sequencing module.

Posted by Ernie Ghiglione on
Hey Malte,

<blockquote> The same is true for SS granting access to an assessment. Or take
files: let's assume a student should only see the document on advanced
mathematics if he passed the introduction test. SS would then take the
grade from Assessment (either pull or push) and grant the "read"
permission on the file "advanced mathematics".

The beauty of this approach is its simplicity, and the fact that most
existing packages need not be modified to know about a sequencing
module.
</blockquote>

Well, in most cases that would give the SS engine a responsibility that might not be part of its job (it might instead fall to the Assessment package to do so).

[IMPORTANT: note that this example only covers how it should work if the Assessment package is meant (or is coded) to be compliant with IMS SS. If that's not the case, then the sequencing can be resolved simply by Assessment's internal mechanisms/functions.]

Taking your example, let me see if I can modify it a bit so the boundaries of both systems are clear and neatly defined:

When the assessment is created, the author defines certain rules that set the behaviours (and/or branching) a student will follow according to, for instance, the values he/she scores (or content previously viewed).

Once the assessment gets uploaded, then the fun part begins:

The Assessment package is responsible for managing and delivering the questions/tests/assessments to the students, gathering results, and some other admin tasks.

The SS engine, instead, sets the sequences and behaviours for any IMS SS information it is given.

Therefore, if the assessment comes with IMS SS information, the Assessment package passes it to the SS engine, which stores the sequencing information (only).

Now, when a student/learner goes about taking that particular assessment, the Assessment package asks the SS engine to deliver the correct sequence for this user. Once the SS engine returns the appropriate sequence, the Assessment package renders it accordingly to the user. However, there are going to be cases where the user's answer has to be passed to the SS engine so the Assessment package can get a new sequence according to the results.

Notice that the SS engine bears no responsibility for recording the scores of the assessment; it is only the fellow that has a bunch of rules set up and, according to what the student answers (and the sequence originally set by the author), determines what comes next. So the responsibility for recording the student's results lies with the Assessment package.

Otherwise the SS engine would have to record all results for all random questions and tests (not to mention page views for all sorts of courses and learning objects)... which, at the end of the day, is really not part of its job.

Now, the question you might want to ask: but then, how is the sequencing engine going to know where the student left off last time? Well, when you request a sequence, since the Assessment package is the one that tracked where the guy left off, it is the Assessment package that says "Hey SS mate, random striker is back. He left off in section 2, question 4, and the answer was 'The Beatles'... what should I show him next?"... and the IMS SS engine will give you a set of questions to render.
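The division of labour described above could be sketched roughly like this. The class and method names (`SSEngine`, `next_activity`, `submit_answer`) are invented for illustration and the rule store is deliberately naive; this is not the actual Assessment or SS engine API:

```python
class SSEngine:
    """Stores only sequencing rules; knows nothing about scores or rendering."""

    def __init__(self, rules):
        # rules: maps (current activity, answer) -> next activity.
        self.rules = rules

    def next_activity(self, current, answer):
        # Pure rule lookup: given where the learner is and what was answered,
        # return what comes next (None when the sequence is finished).
        return self.rules.get((current, answer))


class AssessmentPackage:
    """Renders questions, records answers, tracks where each learner left off."""

    def __init__(self, engine):
        self.engine = engine
        self.position = {}  # user -> current activity (tracking lives here)
        self.results = {}   # user -> list of (activity, answer); scores stay here

    def submit_answer(self, user, answer):
        current = self.position.get(user, "start")
        self.results.setdefault(user, []).append((current, answer))
        # "Hey SS mate, random striker is back; what should I show him next?"
        nxt = self.engine.next_activity(current, answer)
        if nxt is not None:
            self.position[user] = nxt
        return nxt


rules = {("start", "The Beatles"): "section2_q5",
         ("start", "The Stones"): "section3_q1"}
assessment = AssessmentPackage(SSEngine(rules))
print(assessment.submit_answer("striker", "The Beatles"))  # section2_q5
```

Note where the state lives: the SS engine holds only the rules, while positions and answers are recorded entirely inside the Assessment side, which matches the separation of responsibilities argued for above.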

Does that make sense?

This way, you very clearly separate the sequencing job from the rendering and tracking; those last two are the responsibility of the Assessment/LORSm packages. Determining the appropriate sequence of activities/questions/learning objects falls in the SS engine's lap... and that's the way you want to do it, so that neither your Assessment package nor LORSm has to understand anything about sequencing and behaviours. In addition, if IMS SS changes in the future and new behaviours are added, you keep using the same API and the SS engine will tell you what to render to the user and how.

I hope that helps,

Ernie

PS: Once again, this applies if and only if the sequence you want to work with is of the type IMS SS... if not, you are free to use whatever you want, from creating your own sequencer and behaviours to even the workflow package, if you find it useful.

PS2: Sorry to keep pushing the Carnegie Mellon paper on Simple Sequencing, but it does cover *all* relevant aspects of IMS SS, and it really is a very good, robust design.

http://www.lsal.cmu.edu/lsal/resources/standards/ssservices/services-v02.pdf

Posted by Malte Sussdorff on
Hi Ernie, I mentioned earlier:

<blockquote> Now, internally the assessment system is flexible enough to enable
sequencing on its own. This sequencing, though, has nothing to do with IMS
Simple Sequencing. The assessment sequencing is used for branching, displaying
multiple questions on a page, and then going on to the next one.
</blockquote>

I see a clear distinction between what an assessment does *internally* and how it is called in a learning context. If you talk about IMS sequences within an assessment, they have to be controlled by the assessment system using the functions provided by the assessment.

But this is not what the assessment is all about in a learning context. In a learning context an assessment is only *part* of the learning experience, and this learning experience includes other objects as well (e.g. LORSm content, grades given in oral exams, ...). The SS package will deal with the conditions and rules that govern how the sequence between these learning objects is created.

Let's try to get the distinction utterly clear, as I think this is the reason for the confusion.

  • Question four follows question two if the answer to question one was "bar". Otherwise display question three. Strictly Assessment package internal.
  • Display assessment higher mathematics if paper on mathematics has been read. SS functionality
  • Display questions a,b,f,g if paper on mathematics has been read, display question b,d,g,j otherwise. SS functionality. Footnote: a,b,f,g is one assessment, b,d,g,j is another assessment.

You asked where we store the grades. The assessment system *internally* stores percentages. The results of an assessment will be *pushed* to the Evaluation package. The SS system has to query the Evaluation package if it wants to create a rule based on grades, *not* the assessment system (though it could if it so pleases; I don't think it would make sense for it to do so).
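That push/query flow could be sketched like this. All names here (`Evaluation`, `record`, `grade_of`, the rule helper, the 50% threshold) are hypothetical, chosen only to show the direction of the data flow:

```python
class Evaluation:
    """Central grade store; results are *pushed* here by Assessment."""

    def __init__(self):
        self.grades = {}  # (user, assessment) -> percentage

    def record(self, user, assessment, percentage):
        self.grades[(user, assessment)] = percentage

    def grade_of(self, user, assessment):
        return self.grades.get((user, assessment))


def assessment_finished(evaluation, user, assessment, percentage):
    # The Assessment package pushes the internal percentage to Evaluation.
    evaluation.record(user, assessment, percentage)


def ss_rule_passed(evaluation, user, assessment, threshold=50):
    # The SS engine checks a grade-based rule against Evaluation,
    # never against the assessment system itself.
    grade = evaluation.grade_of(user, assessment)
    return grade is not None and grade >= threshold


ev = Evaluation()
assessment_finished(ev, "foo", "introduction test", 73)
print(ss_rule_passed(ev, "foo", "introduction test"))  # True
```

So the SS engine has exactly one place to look for grades, regardless of whether they came from an assessment, an oral exam, or anything else that pushes into Evaluation.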

<blockquote> Now, when a student/learner goes about taking that particular
assessment, the Assessment package asks the SS engine to deliver the correct
sequence for this user. Once the SS engine returns the appropriate sequence,
the Assessment package renders it accordingly to the user.
</blockquote>
No. This is not the case. The assessment knows on its own which sequence to use, as an assessment *internally* does not differentiate between items and sections depending on external conditions. If you want to modify an assessment based on external conditions, you should create two assessments; otherwise the results of *one* assessment are no longer comparable within the assessment. This might be the case where I run head-on into the "standards" wall, but unless I see a real use case where an assessment's display is governed by external conditions, I'm not keen on designing it that way from the beginning (you can always exchange the *internal* sequencing engine at a later stage, if utterly necessary).

If a student leaves an assessment in the middle, the assessment system knows where to continue. No need for the SS system to give the next questions. This is something the assessment does all by itself *internally*.

My whole point is that there is a clear distinction between how sequencing is done *internally* in a package and *externally*. You are not going to make the SS package responsible for the sequence of paragraphs in a document. Neither do you have to make it responsible for knowing the sequence *within* an assessment. But it is *very much* responsible for providing the sequence between the document and the assessment.

Can you see this distinction, and does it make sense to you?

P.S.: I do agree that it would be nice to use the API and storage capabilities of an SS package for handling sequences internally in an assessment. But until we have such a generic API and storage capabilities, we are stuck with the engine currently implemented in the design specifications. If someone (Ola, Ernie 😊) wants to take a look at it and modify it in a way that lets us split this out and make it into an SS API, that's fine with me. Please look at https://openacs.org/projects/openacs/packages/assessment/design/sequencing.

Posted by Ernie Ghiglione on
Hi Malte,

Thanks for taking the time to explain this a bit more clearly. It has been really helpful. We should have more of these discussions, as they really help to put everyone on the same page.

<blockquote> No. This is not the case. The assessment knows on it's own
the sequence which to use for the assessment as an assessment
*internally* does not differentiate between items and sections
depending on external conditions.
</blockquote>

But then, is it possible to say that the sequencing of QTI has nothing to do with IMS SS? For instance, can a sequence of activities not reach (for lack of a better word) one single, individual question in an assessment? If not, then we might need to figure out what we can do, as Simple Sequencing does not place any restrictions on what can be sequenced in such a tree. (http://www.imsglobal.org/simplesequencing/ssv1p0/imsss_bestv1p0.html#1500831)

Moreover, can a QTI assessment be sequenced using IMS SS?

<blockquote> My whole point is that there is a clear distinction between how
sequencing is done *internally* in a package and *externaly*. You are
  not going to make the SS package responsible for the sequence of
paragraphs in a document.
</blockquote>

That is true. However, that deals with the granularity of the activities defined in the sequence. It won't be able to sequence paragraphs, as they are part of a learning object, which I believe is the smallest unit that can be part of an activity, right?

However, I was under the assumption (from reading the specs) that it was possible to sequence individual questions, as they are the smallest units (atoms) of QTI that can be sequenced. But I'm not so sure any more 😊

<blockquote> Can you see this distinction and does it make sense to you ?
</blockquote>

Yes, I believe I do. Summarizing:

IMS SS = sequencing of activities (learning objects, entire assessments, etc)
IMS QTI (assessment) = internal sequence of questions given by the assessment creator before it was uploaded into the system

Right?

Ernie