
To begin validating items, click the Validate Questions button to open the Validate Questions page.

If an item is marked View-only in the project (i.e., it is marked as "Editable" in another project), you cannot add validation ratings to the item; you can only add comments to it.

Subject Matter Experts should be given some training about the ratings before they provide them, including the caution that low ratings should be used sparingly. A poor item should be "Modified" in an effort to improve it, or another item should be written to replace it; otherwise, the poorly rated items have to be re-reviewed.

If a Validator is assigned with the search criterion Belongs to case, the cases in that assignment cannot be deleted. To delete such cases, the Project Manager must first delete the Validator assignment that contains them.

How are "Star Ratings" calculated?

The "Star Ratings" for items are created using a combination of the validation ratings of Importance, Criticality, and Frequency provided by the SMEs for an item. Instead of just averaging the three I, C, and F ratings together (which would tend to “water down” the results with regression toward the mean), we use a multiplicative approach which provides a bit more weight to extremely low ratings. This is valuable because (like a jury) an organization wants to know if there is anyone on the panel who feels strongly that the item is unworthy of being placed on the exam. The range is then converted to a 5-star metric, and items rated poorly by an SME are given a comparatively lower star rating.
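The exact formula is not published here, but the multiplicative idea can be sketched as follows. This is a hypothetical illustration only: the shift-by-one, the normalization, and the geometric-mean scaling are all assumptions chosen to show why a multiplicative combination penalizes a single low rating more than a simple average would, not ExamDeveloper's actual calculation.

```python
def star_rating(importance, criticality, frequency, max_rating=4):
    """Hypothetical multiplicative star rating (illustration only).

    Each 0-4 rating is shifted by 1 and normalized to (0, 1], so a
    single low rating drags the combined score down more sharply
    than averaging the three ratings would.
    """
    factors = [(r + 1) / (max_rating + 1)
               for r in (importance, criticality, frequency)]
    product = 1.0
    for f in factors:
        product *= f
    # Geometric mean of the factors, mapped onto a 0-5 star scale.
    return round(5 * product ** (1 / 3), 2)

print(star_rating(4, 4, 4))  # top marks across the board -> 5.0
print(star_rating(4, 4, 0))  # one SME rates Frequency 0: stars drop
```

Compare the second call with a plain average of the same three normalized ratings (which would give about 3.67 stars): the multiplicative version lands noticeably lower, reflecting the "jury" intuition in the text.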

These ratings are useful to:

  1. Provide documentation that all of the items on an examination meet a standard of quality as determined by subject matter experts.
  2. Assist users in the exam assembly process because items can be selected according to quality when more items are available in a particular blueprint area than are needed to meet exam specifications.


  1. Rate the item's Importance, Criticality, and Frequency, and select the Difficulty Rating from the drop-down list.

    Ratings and their descriptions:

    Importance

    Importance refers to how important the contents of this item are to the candidate. It is to be rated as:
    0 (Not Important)
    1 (Minimally Important)
    2 (Moderately Important)
    3 (Quite Important)
    4 (Extremely Important)

    Criticality

    Criticality refers to how much "harm" may potentially be caused if the candidate were not familiar with the content covered by this item. It is to be rated as:
    0 (Not Critical)
    1 (Minimally Critical)
    2 (Moderately Critical)
    3 (Quite Critical)
    4 (Extremely Critical)

    The “harm” may be physical, financial, organizational, emotional, etc.

    Frequency

    Frequency refers to how often an individual may use the content of this item when performing his or her tasks. It is to be rated as:
    0 (Never)
    1 (Rarely)
    2 (Sometimes)
    3 (Often)
    4 (Repetitively)

    Is there only one correct answer?

    Specify Yes if you have verified that there is only one correct answer for the item.

    Is there at least one verifiable reference?

    Specify Yes if you have verified that there is at least one provided reference.

    Is the Question classified to the correct Exam Blueprint section?

    Specify Yes if you have verified that the item belongs to the blueprint category under which it is currently classified.

    Difficulty Rating

    Estimate how difficult it is to answer this item: If 100 people were asked the question, how many could answer it correctly?



  2. Click Submit.

 

Why does the range of difficulty ratings change for different questions?

The range of possible difficulty ratings changes based on the number of options in the item. ExamDeveloper adjusts the range to account for random guessing. For example, if the item has 4 options, the values range from 25 to 100: a candidate guessing at random has a 25% chance of answering correctly, so a rating lower than the chance level would not make sense.
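The rule above can be sketched in a few lines. Only the chance-level lower bound is described in the text; the rounding to a whole percent is an assumption for illustration.

```python
def difficulty_range(num_options):
    """Lower and upper bounds for an item's difficulty rating.

    The lower bound is the chance level: the percentage of people
    expected to answer correctly by guessing at random among the
    options. Rounding to a whole percent is assumed, not documented.
    """
    chance = round(100 / num_options)
    return chance, 100

print(difficulty_range(4))  # 4 options -> (25, 100)
print(difficulty_range(2))  # true/false item -> (50, 100)
```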

The "Returned By" text in User Tracking and the Return Comment title text for previously returned items that are now in the Validate state on the Validate Questions page may vary, depending on the text provided in the Return State Text setting of the Item Bank's configuration.

Prior to the 1809 release, the default text was Reject. If you are working with items in an Item Bank created before the 1809 release, the text remains Reject unless it has been renamed.

If time permits, Project Managers can search for items with low ratings and change the state to "Review" to send them back through the review process.

 

 
