The Latitude LMS Assessment Authoring Tool enables portal administrators to create, deliver and analyze a range of assessments, from simple quizzes to sophisticated, multi-objective examinations. This document describes how key features allow you to:
Define multiple test objectives and sub-objectives
Shuffle assessment questions in real time
Shuffle answer choices in real time
Create pools of questions for each test objective, allowing for randomization in each attempt
Control the number of test attempts per participant
Control session time limits
Define pass/fail requirements
This toolset also allows you to supply images and videos as part of question or answer content, making the assessment more interactive for students. To learn more about these graphic customization options, please see the related document:
Embed an Image or Video in Assessment Course Content
Step one: Create an Assessment Course
From the left navigation menu, click
to create a course with the functionality you want. For the Delivery Method, select
. For more detailed instructions on adding courses, please refer to related document:
Add a New Course in the LMS
After creating the Assessment course, click
Continue Test Setup
. Return to the Course Details page and click this button to review or edit your Assessment at any time.
Step two: Test Setup
In the “Testing Administration” window, click the
link under the test Name to define key Test settings. When reviewing these instructions, it is important to understand that the term “Objective” refers to the topic or category of questions in each section of the test.
Please refer to the screenshot and field descriptions below for guidance on selecting Test settings. You will need to revisit some of these settings once your assessment content has been finalized. Such settings will be explained in greater detail near the end of this document.
: Defaults to Course Code.
: Describe the expected result in terms of student performance. This field appears at the Objective level as well.
: If you randomize objectives, the questions within each Objective will remain grouped, while the Objectives will be presented in a random order for each launch of the assessment.
Sort by Questions
: There is a Sort Order option for each question you create. Selecting “Yes” for the Sort by Questions field indicates that you want to dictate the order in which questions are presented. If you want the questions randomized, select “Yes” for the randomization field instead.
: Questions are sorted, by default, in the order in which they were created. Selecting “No” for both fields results in the default order being applied. Selecting “Yes” for both fields indicates that you want some questions sorted, while others are randomized.
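Taken together, these settings control how Objectives and their questions are ordered at launch. The grouped-shuffle behavior described above can be sketched as follows (illustrative Python only; the function name and data shapes are assumptions, not the LMS's internal code):

```python
import random

def order_questions(objectives, randomize_objectives, rng=None):
    """Sketch of launch-time ordering: Objective blocks may be shuffled,
    but each Objective's questions stay grouped together.
    Illustrative only -- not the LMS's actual implementation."""
    rng = rng or random.Random()
    blocks = list(objectives)        # each item: (objective_name, [questions])
    if randomize_objectives:
        rng.shuffle(blocks)          # reorder the sections, not their contents
    ordered = []
    for _name, questions in blocks:
        ordered.extend(questions)    # questions remain grouped per Objective
    return ordered
```

However the Objectives land, a question never leaves its own section, which matches the behavior described for the randomize-objectives setting above.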
First Page on Return
: Choose where, within the assessment, users are directed when relaunching the course. This setting is only applicable for assessments that have more than one page of questions presented.
: You can restrict the number of attempts and the time allowed for a user to pass the test. Time Allowed will appear after you click Update Test.
: We suggest that you set these fields to “Unlimited” while creating and testing your assessment to avoid being locked out for too many launch attempts or time spent. You can adjust these settings as appropriate once the content is finalized.
button to save changes.
field displays once the test is saved. Either now or later, enter a number representing the minimum passing score, and indicate whether that score is based on Percent or Questions.
Return to Tests
to continue authoring.
Once you have configured settings at the Test level, click the
link under the test Name to set assessment functionality at the Objective level.
Please refer to the screenshot and field descriptions below for guidance on setting up Objectives. You will need to revisit some of these settings once your assessment content has been finalized. Such settings will be explained in greater detail near the end of this document.
: Describe the overall goal of the Objective.
Load All Questions on One Page
: Indicate whether or not you want all questions within this Objective loaded on one page. It may be best to create the questions first, and finalize this selection later.
Objective Sort Order
: Default sort order is 99. Objectives, and their associated questions, display in the order they are created. If you want the Objectives to appear in a different order, you can specify the numeric order here.
: This setting is overridden if you chose to randomize objectives at the Test level.
: Default is Active. You can deactivate and reactivate Objectives as necessary.
: Indicate how many questions from the pool will be presented when the assessment is launched. You will need to create questions first, then return to Objective settings to determine the number for this field.
to save entry.
Once the Objective has been created, additional links display at the bottom of the page, allowing you to add a Sub Objective or start adding Questions. Click the desired link, or click
Return to Test
to continue adding Objectives.
Add Sub Objectives
If your assessment requires a more detailed setup, you can click
Add Sub Objective
under a parent Objective. Sub Objectives allow elements of the parent-level objective to be split into finer skill or knowledge sets. You can nest Sub Objectives under multiple levels by selecting the Parent Objective from a list of existing objectives. Click
to save the Sub Objective when your entries are complete.
Once you have designed a Test with sections broken into the Objectives (or Sub Objectives) you want students to achieve, you can create the assessment questions and content. When reviewing the instructions below, it is important to understand that the term “Distractor” refers to a possible response to the question or prompt. All correct and incorrect answers are called Distractors.
Types of Questions
Fill in the blank
– Responses are scored by text match
– There is one correct choice in a list of possible answers. True/False questions are often created with this setting.
– The participant can select none, one, or more of the choices offered
– Cross-relate responses to multiple questions from a single exam item. This question type presents several multiple-choice questions together, where the participant selects one choice for each statement or question presented. The correct answer requires that all responses be correct.
– Responses are not scored. Often used for opinion or survey questions.
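For the Fill in the Blank type, scoring is a text match against the accepted answer. One plausible matching rule is sketched below (an assumption for illustration; the LMS may apply a stricter rule, such as an exact match):

```python
def text_match(response, accepted_answers):
    """Sketch of a lenient text match: ignore case and surrounding spaces.
    The actual matching rule used by the LMS may differ."""
    normalized = response.strip().lower()
    return any(normalized == a.strip().lower() for a in accepted_answers)
```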
The following example shows a Multiple Choice question setup with numbered field descriptions.
to save the entry.
The window will refresh, showing at the top of the page an outline of where the question resides within your assessment, and at the bottom a preview of the new question. This is the “Edit Question” page, which you can revisit and revise any time. In addition to the standard return links, note the linked options to Delete Question, Add Question, Update Question and add Images as part of question or distractor content.
The following screenshot shows an example of a Matrix Question, which requires the participant to select one choice for each statement or question presented. This type of question requires that all student selections be correct to grant a passing credit.
: This setup works best when you enter all of the text and sort order components first, click
, then indicate the correct responses with the radio buttons from the “Preview Question” area. Click
to save and verify changes.
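As noted above, a Matrix Question grants credit only when every row is answered correctly. That all-or-nothing rule can be sketched as follows (illustrative Python; the field names are hypothetical, not the LMS's schema):

```python
def score_matrix_question(selections, answer_key):
    """A matrix item passes only if every statement's selection matches the
    key. Sketch only -- names and data shapes are assumptions."""
    if set(selections) != set(answer_key):
        return False                 # an unanswered or extra row fails the item
    return all(selections[row] == answer_key[row] for row in answer_key)
```

A single wrong row is enough to fail the whole item, which is why this type is best reserved for tightly related statements.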
Finalize Settings that Make the Assessment Functional
Finalize Objective Settings
Once you have created all of your question and response content, it is important to return to each Objective and finalize settings. Please refer to the screenshot and field descriptions below for guidance on defining assessment behavior.
to save changes. Click
Return to Test
to continue updating all active objectives.
Finalize Test Settings
Verify that all of your questions and objectives are in working order, and that you have test launched the new course. If you are satisfied with the results, it is time to revisit your Test settings to ensure that all of your desired sorting and restriction selections are in place. Please refer to the screenshot and field descriptions below for guidance on finalizing assessment behavior.
The Student Experience
Students can launch assessments, like all self-paced courses, from their homepage or the Course Details page. Once a user is enrolled, clicking the
button will open the assessment in a new window and begin timing. A user should complete the assessment, then click
to submit his or her transcript and view the results instantly. The pass/fail status and score will also be reflected under the user’s training history.
In the example below, a student responds to a Fill in the Blank question.
Once a student clicks Score Test, the LMS displays test results and feedback text for each objective.
A student who has completed the assessment can always review their Detailed Test Results (correct or incorrect score per question) by returning to the Course Details page and clicking
View Test Results
: Configure the number of Distractors (possible answers) before entering any information. Deleting or adding a row before saving the question will clear all field selections and text.
: Enter the question or prompt text.
: Select which objective, or sub objective, the question falls under. Then choose the Type of question from the options listed above.
: Indicate if the question is Mandatory, meaning that it will be included with every assessment launch. The default setting “No” implies that you will be creating a pool of questions larger than the number presented for the Objective. Selecting “Yes” confirms that, whether or not you choose to randomize questions in your Test settings, this question will always appear in the assessment.
Question Sort Order
: Questions, and their associated distractors, display in the order they are created. If you decide to Randomize Distractors, the possible responses for each Question will remain grouped, while the Distractors will be presented randomly for each launch of the assessment.
Under Distractors, use the radio button to indicate the Correct Answer from the list of distractors.
: If using the Multiple Answer question type, you must first click
to change the radio buttons into checkboxes, then check multiple correct answers.
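The Mandatory setting interacts with the question pool described above: Mandatory questions appear in every launch, and the rest of the presented questions are drawn from the remaining pool. A minimal sketch of that draw (illustrative Python; the `mandatory` flag and data shapes are assumptions):

```python
import random

def draw_questions(pool, num_to_present, rng=None):
    """Sketch of one launch's draw for an Objective: Mandatory questions
    always appear; the remainder are sampled at random from the rest of
    the pool. Not the LMS's actual code."""
    rng = rng or random.Random()
    mandatory = [q for q in pool if q.get("mandatory")]
    optional = [q for q in pool if not q.get("mandatory")]
    fill = rng.sample(optional, num_to_present - len(mandatory))
    return mandatory + fill
```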
Load All Questions on One Page
: Indicate whether or not you want all questions loaded on one page. Take into account the number of questions in each objective, the number of answers per question, if there are graphics involved, etc.
: The system considers your selection for the “Questions per Page” setting at the Test level as well.
Objective Feedback Text
: This optional free text field allows you to provide users brief feedback notes once they complete the assessment and click Score Test.
: Indicate how many questions from the pool will be presented when the assessment is launched. This number must be equal to or higher than the number of Mandatory questions, but not more than the Current Count. If the number falls outside of these parameters, the assessment will generate an error.
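The constraint just described can be stated as a simple check (a sketch of the rule, not the LMS's validation code):

```python
def pool_size_is_valid(num_to_present, num_mandatory, current_count):
    """The number of questions presented must be at least the Mandatory
    count and no more than the Current Count, per the rule above."""
    return num_mandatory <= num_to_present <= current_count
```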
Ensure that your
selections match your settings at the Objective and Question level. If there are discrepancies, Test-level settings will override all others. The same concept applies to the “Questions per Page” field.
: Establish pass/fail requirements. Enter into this field a number that represents the minimum passing score, and indicate whether that score is based on Percent or Questions.
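The two scoring bases work differently: Percent compares the score percentage to the minimum, while Questions compares the raw number of correct answers. A sketch of that check (illustrative Python; parameter names are assumptions):

```python
def passed(correct, total, minimum, basis):
    """Sketch of the pass/fail rule described above. 'Percent' checks the
    percentage of correct answers; 'Questions' checks the raw count.
    Not the LMS's actual implementation."""
    if basis == "Percent":
        return (correct / total) * 100 >= minimum
    return correct >= minimum        # basis == "Questions"
```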
restrictions, if desired. If a user attempts to launch the assessment more times than allowed, they will receive an error message saying so. If a user reaches the time limit before completing the assessment, the attempt will be submitted and scored instantly.