test-item-procs.tcl

Added warning when not all of the requested forms could be loaded.

Reduced verbosity.

add question_info_block to documentation block

add explicit return to make the code easier to read

split out question_info_block

bugfix: pass "-revisions" to "render_proctor_images" since it is needed there

added attachments for text_interaction

whitespace changes

Refactored exam protocol renderers

The new code reduces the size of inclass-exam.wf significantly by moving the exam protocol into test-item-procs.tcl. In the same step, the largish function was split up, the rendering functions of submissions are now named consistently, and there are now different functions for rendering single items vs. many items, making the single-item rendering reusable. Additionally, the answer manager is documented in a more eye-friendly and modular way.

… 1 more file in changeset.
provide an analysis method for cleared input (callable for SWAs via the method blank-inputs)

… 1 more file in changeset.
provide means to show submissions of students per test item

… 2 more files in changeset.
added type per question to exam overview

provide a nicer exam-overview

… 2 more files in changeset.
provide more detailed test-item descriptions for exam/question overviews

added exam-overview

… 6 more files in changeset.
make grading checks configurable via URL, make calculations more robust


… 1 more file in changeset.
added policy for supporting view of revisions, used more detailed message key values for supporting rounding by points or revisions

… 3 more files in changeset.
added support for pagination buttons, visited buttons and flagged buttons

… 6 more files in changeset.
moved answer status into answer_panel (similar to downstream),

made the sensitivity of the inspect links update automatically (like downstream)

made templating easier and refactored code

… 2 more files in changeset.
Plug the proctoring-display include into the inclass-exam via a web-callable method

… 1 more file in changeset.
call 'next' to ensure file attachments are stored in the database and content repository

fix attachments for short_text_interaction

spelling improvements and whitespace changes

… 1 more file in changeset.
added support to "Answer_manager.get_answers" to also return non-exercise-specific attributes

added functionality to prevent opening the same exam in multiple tabs

… 1 more file in changeset.
Added substitution values for short text answers

This change adds the possibility to provide randomized substitution values to short-text questions via value sets.

Value sets are a means for a content developer to provide multiple sets of matching values, which are inserted into the text before an exercise is shown to the end user. One can e.g. provide several input and output values for a calculation exercise, such that different students get different calculation exercises. These values can also be used in the correct-when clauses.

The content developer can use percent-delimited elements when defining the exercise:

---

Assume you want to download a %x.what% with the size of %x.size% over a %x.type% connection with a rate of %x.rate% from %univ%.

---

and also in "correct when"

---

%x.secs%

---

The value sets can be provided via an extra field for the short-text questions and have the form of a Tcl dict:

---

univ {WU-Vienna TU-Vienna "University of Vienna"}
x {
   {type "ADSL" rate "256 kbit/s" size "235 MB" secs 5300 what "Powerpoint file"}
   {type "ADSL" rate "512 kbit/s" size "5.6 MB" secs 91 what "PDF file"}
   {type "4G" rate "80 Mbit/s" size "270 MB" secs 27 what "PDF file"}
   {type "4G" rate "40 Mbit/s" size "650 MB" secs 32 what "Lecturecast Video"}
   {type "5G" rate "1 Gbit/s" size "520 MB" secs 4 what "Powerpoint file"}
   {type "5G" rate "1 Gbit/s" size "1.5 GB" secs 12 what "Lecturecast Video"}
}

---

In this example, every student will get a randomly chosen value for the university (%univ%) and matching elements containing the answer (e.g. the download time of "270 MB" over "80 Mbit/s" is 27 seconds). The download time is used in the correct-when part, such that auto-correction can be applied.

When a student answers this exercise, the system provides random choices that are substituted into the text. For every variable ("univ", "x", ...), different random values are chosen per student, so other students will get other numbers and results.

Note that these value sets can be used for numeric and non-numeric exercises.

Current limitations:

- only defined for short-text questions (could in general also be applied to other question types)

- no elaborate user interface for entering value sets (or a thorough validator) is provided.
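
To make the mechanism described above more concrete, here is a minimal Tcl sketch of the substitution idea. The proc names (pick_substitution_values, substitute_placeholders) are hypothetical and not part of test-item-procs.tcl, and the dispatch between scalar values and attribute groups uses a crude even-length heuristic; the actual implementation may differ.

---

# Pick one random entry per variable of the value set.
proc pick_substitution_values {value_set} {
    set result {}
    dict for {var values} $value_set {
        dict set result $var \
            [lindex $values [expr {int(rand() * [llength $values])}]]
    }
    return $result
}

# Replace %var% and %var.attr% placeholders in the exercise text.
proc substitute_placeholders {text chosen} {
    dict for {var choice} $chosen {
        if {[llength $choice] % 2 == 0 && [llength $choice] > 1} {
            # crude heuristic for this sketch: even-length lists are
            # treated as attribute groups, substituted as %var.attr%
            dict for {attr value} $choice {
                set text [string map [list %${var}.${attr}% $value] $text]
            }
        } else {
            # scalar value, substituted as %var%
            set text [string map [list %${var}% $choice] $text]
        }
    }
    return $text
}

set value_set {
    univ {WU-Vienna TU-Vienna "University of Vienna"}
    x {
        {type "ADSL" rate "256 kbit/s" size "235 MB" secs 5300 what "Powerpoint file"}
        {type "4G" rate "80 Mbit/s" size "270 MB" secs 27 what "PDF file"}
    }
}
set chosen [pick_substitution_values $value_set]
puts [substitute_placeholders \
         "Assume you want to download a %x.what% with the size of %x.size% from %univ%." \
         $chosen]

---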

… 1 more file in changeset.
Added two types of grading schemes (in addition to "exact") to ordering exercises:

- "position": count elements as correct, when these are on the correct position

- "relative": count elements as correct, if the neighboring element is correctly before the actual element

The results are adjusted by the same guessing correction as in the "ggw" scheme for MC exercises.

Example:

- desired order: 1,2,3,4

- provided answer: 3,1,2,4

- scheme "exact": 0%

- scheme "position": 0 0 0 1

- scheme "relative": 0 1 1 (correctly ordered 1<2 and 2<4)

A minor refactoring was also performed to ease code reuse.
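
For illustration, the following Tcl sketch computes the raw scores of the two new schemes for the example above, before the guessing correction is applied; the proc names are hypothetical and do not correspond to the actual code in test-item-procs.tcl.

---

# "position": one point per element that sits on its correct position.
proc score_position {desired answer} {
    set correct 0
    foreach d $desired a $answer {
        if {$d eq $a} {incr correct}
    }
    return [expr {double($correct) / [llength $desired]}]
}

# "relative": one point per adjacent pair of the answer whose relative
# order matches the desired order.
proc score_relative {desired answer} {
    set correct 0
    set pairs [expr {[llength $answer] - 1}]
    for {set i 0} {$i < $pairs} {incr i} {
        set a [lindex $answer $i]
        set b [lindex $answer [expr {$i + 1}]]
        if {[lsearch -exact $desired $a] < [lsearch -exact $desired $b]} {
            incr correct
        }
    }
    return [expr {double($correct) / $pairs}]
}

puts [score_position {1 2 3 4} {3 1 2 4}] ;# 0.25 (only "4" is in place)
puts [score_relative {1 2 3 4} {3 1 2 4}] ;# ~0.67 (pairs 1<2 and 2<4 are ordered correctly)

---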

… 2 more files in changeset.
use absolute time for computing beeps

inclass-exam: Added configurability for time budget and display of minutes and/or points during exam

… 4 more files in changeset.
add "btwn" to precise correct-when specs

deactivate autocorrection for short_text questions when the correct-when spec is not precise
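
For illustration only: the actual correct-when syntax and evaluation live in test-item-procs.tcl, and the inclusive-bounds handling below is an assumption, but a "btwn" (between) check could be evaluated along these lines in Tcl:

---

# A between-bounds check as a "btwn" clause might perform it:
# inclusive bounds, and non-numeric input never matches.
proc btwn_p {value lower upper} {
    if {![string is double -strict $value]} {
        return 0
    }
    return [expr {$value >= $lower && $value <= $upper}]
}

puts [btwn_p 27 25 30]    ;# 1
puts [btwn_p 31 25 30]    ;# 0
puts [btwn_p "abc" 25 30] ;# 0

---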