Continuous Delivery Updates for May 2024

Students and Instructors

Grades – Feedback files are now available in Grades for Dropbox, Quizzes, and Discussions

This feature ensures that learners can easily review instructor feedback from within the Grades tool.

Instructor feedback added in Dropbox, Quizzes, and Discussions is now visible to learners under Grades.

Figure: Learners can review feedback from the Grades page.

Instructors Only

CourseLink Editor – Equation editors

The Graphical and Chemistry Equation editors used within Editor are updated from WIRIS version 4.13 to version 7.23.1.

Awards – Certificate ID for an issued award

To improve security when issuing certificates, instructors can now use a new replacement string field when creating certificate templates to add an automatically generated numeric ID. This ID uniquely identifies the issued award and can be verified using a new API call.

The feature introduces a new replacement string, {CertificateID}, that instructors can add to their PDF templates. Each time an award is released, a unique ID is generated and populated into the {CertificateID} replacement string.

The unique generated ID is 12 numeric characters long and is stored in the CourseLink database as CertificateID, along with other data for the issued award.

Note: The {CertificateID} replacement string follows the same rules as other replacement strings and is marked read-only.

A certificate ID validation tool is available in the OpenEd Toolbox under the Miscellaneous menu.
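To illustrate the workflow above, here is a minimal sketch in Python. The {CertificateID} substitution and the 12-numeric-character format are documented in this release note; the function names and the generation method are hypothetical illustrations, not the actual CourseLink implementation or API.

```python
import re
import secrets

def generate_certificate_id() -> str:
    """Illustrative only: produce a 12-character numeric ID in the
    documented format (the real generation method is not documented)."""
    return "".join(secrets.choice("0123456789") for _ in range(12))

def fill_template(template: str, certificate_id: str) -> str:
    """Substitute the {CertificateID} replacement string into a template."""
    return template.replace("{CertificateID}", certificate_id)

def is_valid_certificate_id(certificate_id: str) -> bool:
    """Check the documented format: exactly 12 numeric characters."""
    return re.fullmatch(r"\d{12}", certificate_id) is not None
```

A validation tool like the one in the OpenEd Toolbox would additionally check the ID against the issued-award records in the database, not just the format.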

Grades Mastery View – Improved Publish All and Retract All workflows

To improve user experience within the Grades tool’s Mastery View, this feature addresses user feedback regarding the Publish All and Retract All actions.

Previously, a time delay could occur between initiating Publish All or Retract All actions and the visible update of the action’s status in the Mastery View table. This delay sometimes made it unclear whether the action was successfully applied.

To provide immediate feedback that a publishing or retracting event is in progress, the Mastery View page now displays as pending or updating immediately after a user clicks Publish All or Retract All. Updates can take several minutes to complete. Users can navigate away from the Mastery View page while updates are ongoing to perform other tasks. Upon returning to Mastery View, the page clearly displays the status for the Publish All or Retract All actions.

Figure: After clicking Publish All, the screen updates and the Publishing in progress pop-up displays until the publishing action is completed.

Learning Outcomes – Learning Outcomes evaluation for group assignments

This feature adds learning outcomes assessment into the evaluation interface of group assignments. Instructors can now assess learning outcomes aligned directly to group assignments or through rubric alignments. Instructors can also apply a learning outcome score while evaluating a group assignment. All members of an assessed group receive the same learning outcomes score, ensuring consistency and fairness in the evaluation process.

Previously, learning outcomes could be aligned directly or through rubric criteria to group assignments but were not assessable within the evaluation interface. This limitation made it impossible to capture outcomes assessment data for group assignments. Now, instructors can easily assess learning outcomes in group assignments, and organizations can gain additional data on learner achievement.

Manage Files – Receive notification when zipping and unzipping files

As of this release, whenever users add files to (zip) or extract files from (unzip) a ZIP archive in Manage Files, a new dialog confirms that files are being zipped or unzipped in the background. Once the process is complete, a notification appears in Update alerts (the bell icon).

Previously, adding files to a ZIP archive was not a background process and thus there was no notification after zipping. In addition, the notification for unzipping files appeared in the Subscription alerts area.

Figure: A notification dialog appears when zipping or unzipping files. Click Close to resume work.

Dropbox – Advanced assessment including Co-Marking, Delegation, and Multi-Evaluator

Single Shared Evaluations and Ready for Review

Often, evaluators and publishers are not the same person. This feature gives users of Co-marking and Delegation assignments a way to signal to publishers when an evaluation is ready for review, reducing communication errors during evaluation. When evaluators complete their evaluation, they can set it to Ready for Review, which signals to the publisher that the evaluation is complete and can move on to the next phase.

To use single shared evaluations and ready for review

  1. Navigate to the Assignments tool.
  2. Create an assignment with a Single shared evaluation, ensuring that publishers and evaluators are not the same person.
  3. Click Save and Close.
Figure: Instructors must select Single shared evaluation in the Dropbox tool. The Publishers drop-down is set to manually select publishers.

Add evaluators to Assignments or edit existing evaluators

When creating a new assignment or editing an existing assignment, in the Evaluation & Feedback accordion of the Dropbox tool, the number of evaluators who can give feedback in the course is displayed.

Clicking Edit Evaluators opens an Evaluators dialog that displays all evaluators with the Give Feedback role permission in the course and begins the advanced assessment workflows.

You can add evaluators to the assignment by selecting evaluators in the Evaluators list dialog.

If you do not make any selections and click Cancel, the advanced assessment workflows are not enabled; you can continue creating the assignment as a non-advanced assessment assignment, and all evaluators with the Give Feedback permission will have access to the activity.

Clicking Save on this page triggers the advanced assessment workflows.

Figure: Assign evaluators in the Evaluators dialog, with 3 of 3 evaluators selected.

Selecting the Manage Evaluators option opens the Evaluators dialog to allow you to make changes to the evaluators for this assignment.

After the evaluators are assigned, there are now four new capabilities in the Evaluation & Feedback accordion:

  • Evaluators
  • Publishers
  • Matching learners to evaluators
  • Multiple Evaluator Mode
Figure: After assigning evaluators, the Evaluators, Publishers, Matching learners to evaluators, and Multiple Evaluator Mode sections appear in the Evaluation & Feedback accordion.

Note: When enrollment changes are made for evaluators in a course, it could take up to two minutes for this enrollment change to be processed and reflected in the Edit Evaluators list and in the Evaluators section.

Assign publishers

While creating or editing an assignment with assigned evaluators, you can assign publishers based on the options listed below.

  • All evaluators can publish any evaluations
  • Evaluators can only publish their evaluation: Only the evaluator assigned to evaluate a specific learner submission can publish that evaluation. You can review the Advanced Evaluation dialog to see which evaluator can publish each learner submission.
  • Manually select publishers: Select one or more users with the Give Feedback permission to act as publishers; the selected users publish evaluations to learners.

Depending on the option selected, a Publish button will appear on the evaluation screen for users who meet the criteria of that option.

Allocate evaluators to learners (Manage Matching)

Click the Manage Matching link to open the evaluator mapping dialog. Options in the drop-down list determine how learners are matched to evaluators. The system can randomly match each learner with up to three evaluators, or you can customize the mapping to choose more specifically who evaluates whom by selecting or clearing the check boxes in the learner-evaluator mapping table.
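The random allocation option described above can be sketched as follows. This is a hypothetical illustration of "match each learner with up to three evaluators," not the actual CourseLink matching logic; the function and parameter names are invented for this example.

```python
import random

def randomly_match(learners, evaluators, max_evaluators=3):
    """Illustrative sketch: randomly assign each learner up to three
    evaluators, drawn without repetition from the evaluator list."""
    return {
        learner: random.sample(evaluators, k=min(max_evaluators, len(evaluators)))
        for learner in learners
    }
```

With fewer than three evaluators in the course, each learner simply receives all of them.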

Figure: Choose a learner-to-evaluator mapping scenario in the Allocation Options drop-down menu; All evaluators assigned to all learners is selected.

Two options are available for evaluation modes that allow you to assign evaluators quickly:

  • One shared evaluation: All evaluators contribute to the same shared evaluation of the learner’s submission. This is often referred to as co-marking.
  • Multiple separate evaluation: Each evaluator completes their own individual evaluation separately. The publisher must aggregate those independent evaluations before they are released to the learner. This is often referred to as independent multi-marking.

If a new user enrolls in the course after the learner-evaluator mapping is complete, the system automatically allocates the new user based on the assignment's allocation options.

Evaluator mapping can also be assigned by groups and/or sections using the Groups/Sections drop-down menu. In this menu, the desired group or section can be selected to display those learners. Then those learners can be mapped to an evaluator.

Synchronize multi-evaluator grades between Grade Book and Dropbox

When a group of instructors uses delegation or co-marking to evaluate an assignment, entered grades synchronize with the grade book. Likewise, when an instructor enters a grade for an assignment in the grade book, the entered grade synchronizes back to the assignment.

Grade book inputs populate the Aggregated tab for Assignment Evaluation when a user chooses to sync from the grade book.

Other features include the following:

  • Any saved draft or published evaluation in the Aggregated tab is overridden by grade book inputs when the grade book sync starts.
  • Any evaluation not marked as ready to aggregate is switched to ready to aggregate when the grade book sync starts.
  • Any evaluation already marked as ready to aggregate remains unchanged when the grade book sync starts.
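The sync rules above can be sketched as a small state update. The class and function names here are hypothetical, chosen only to make the documented rules concrete; they do not reflect CourseLink's internal data model.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Evaluation:
    score: Optional[float] = None
    ready_to_aggregate: bool = False

@dataclass
class Assignment:
    evaluations: List[Evaluation] = field(default_factory=list)
    aggregated_score: Optional[float] = None  # the Aggregated tab value

def sync_from_grade_book(assignment: Assignment, grade_book_score: float) -> None:
    """Hypothetical sketch of the three documented sync rules."""
    # Rule 1: the grade book input overrides any saved draft or
    # published value in the Aggregated tab.
    assignment.aggregated_score = grade_book_score
    # Rules 2 and 3: evaluations not yet ready to aggregate are switched
    # to ready; those already marked ready stay unchanged (setting the
    # flag again is a no-op).
    for evaluation in assignment.evaluations:
        evaluation.ready_to_aggregate = True
```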

Respondus Monitor – Sensitivity levels

The flagging system in Respondus Monitor now supports three sensitivity levels: Strict, Standard, and Relaxed. The default setting is Standard, which produces results comparable to previous versions. The Strict setting detects when portions of the face have moved outside the video frame, or when both of the test taker's eyes aren't visible; when this occurs, a Partial Missing flag appears in the proctoring results.


With the Relaxed setting, what you’ll notice most is that it reduces the sensitivity of Missing flags. If the face or eyes are partially missing from the webcam video but Respondus Monitor can still detect the student is there, the Missing flag is less likely to trigger.

Which of the three settings should an instructor choose? It depends on the testing scenario. The relaxed setting might be preferred with open-book exams since students often turn away from the webcam to look at notes or books. Strict sensitivity might be appropriate for tests where students aren’t using outside resources like books, calculators, or scratch paper.

One of the best aspects of the new sensitivity setting is that an instructor can select it after students have completed their exams. This is possible because the setting appears on the proctoring results page. When the sensitivity level is changed, the Review Priority scores and the rankings of the proctoring sessions are immediately updated for the instructor.


Please be sure to follow University guidelines for enabling Respondus Monitor.


If you have any questions about the updates, please contact CourseLink Support at:
519-824-4120 ext. 56939