Defect-free question papers: Modern tech makes it possible to reconcile apparently conflicting needs

Published: June 5, 2017 2:20 AM

Venguswamy Ramaswamy

There are three basic quality expectations of a modern-day multiple-choice-based competitive exam:

– The questions asked should be correct, i.e. each should have exactly one correct answer among the provided options;

– The questions should be unambiguous, i.e. should be open to only a single possible interpretation;

– The questions should be within the syllabus, i.e. from content matter that the candidate has studied to be eligible for the exam.

The general focus on transparency in public dealings has ensured that most examination bodies voluntarily publish question papers and answer keys, and invite comments from the candidates. This has shifted the relentless scrutiny of candidates, coaches and media from implementation aspects of exams to matters relating to content. To anyone following the discussions on this topic, it would seem that examination administrators, with all the resources and managerial competency at their disposal, are unable to deliver exams with questions meeting the minimal quality standards of being correct, unambiguous and within the syllabus.

The fact that such instances are reported even from premier examinations—with huge implementation budgets and with the best of resources at their command—indicates a cause more deep-rooted than mere incompetence. The primary cause of errors in framing questions is that, while planning the exam implementation programme, there is greater focus on confidentiality and security issues than on content quality.


Recognising that a single person will seldom be the master of all concepts within a subject, the most crucial aspect of creating quality questions is on-boarding the best possible group of relevant subject matter experts (SMEs). The output of each SME needs to be reviewed for correctness, from both a content and a language perspective. Ideally, each question should be reviewed by multiple SMEs, and a set of checkpoints defining question quality should be signed off.

This process, if rigorously followed, would result in a high quality ‘question bank’, which could then be used for constructing defect-free question papers. The immediate inference from the above is that construction of defect-free question papers takes time and requires inputs from multiple experts.
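To make the sign-off discipline concrete, here is a minimal sketch in Python; the names used (Question, CHECKPOINTS, bank_ready) are purely illustrative and do not describe any examination body's actual system. The idea is simply that a question enters the bank only after every quality checkpoint has been independently signed off by the required number of distinct SMEs.

```python
from dataclasses import dataclass, field

# The three quality expectations listed earlier, expressed as explicit checkpoints.
CHECKPOINTS = ("single_correct_answer", "unambiguous_wording", "within_syllabus")


@dataclass
class Question:
    text: str
    options: list[str]
    answer_index: int
    # checkpoint name -> ids of the SMEs who have signed it off
    sign_offs: dict[str, set[str]] = field(
        default_factory=lambda: {c: set() for c in CHECKPOINTS}
    )

    def sign_off(self, sme_id: str, checkpoint: str) -> None:
        """Record that one SME has independently verified one checkpoint."""
        self.sign_offs[checkpoint].add(sme_id)

    def bank_ready(self, reviews_needed: int = 2) -> bool:
        """Admit a question to the bank only when every checkpoint has been
        signed off by the required number of distinct SMEs."""
        return all(len(smes) >= reviews_needed for smes in self.sign_offs.values())
```

In this sketch, a question reviewed by only one SME, or reviewed on only two of the three checkpoints, stays out of the bank, which is exactly why the process needs both time and a pool of experts.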

Now, ‘time’ and ‘exposure’ are two things that can create potential security and confidentiality vulnerabilities.

In a typical testing project, content development activity is flagged off 4-5 weeks before the test date. This is done to ensure that the content is not prepared too far in advance and to eliminate the need for special arrangements to securely archive and retrieve content. Given the short development window, SMEs are selected based on availability rather than expertise, and the best possible SMEs may well be left out of the development effort. The few selected ones entrusted with creating questions typically work in a workshop mode to eliminate potential leakages through the communication process. Writing test questions is a creative process, and the workshop mode, with its demand for a particular level of productivity to meet number targets, encourages the adoption of short-cuts, which lead to errors. SMEs create questions on topics and concepts of which they have only a tenuous grasp, merely to cover the requirements of the syllabus. The limited time and expert pool also severely compromise the quality of the review process, if such a process is implemented at all.

The entire content development activity is designed to be completed in the least possible time with the least possible exposure, while maximising security and content confidentiality. However, the unintended consequence of this process design is that the quality of questions gets short shrift.

It is only through technology adoption that examination administrators can reconcile the apparently conflicting needs of quality question paper development and content security and confidentiality. A well-designed ‘content authoring engine’ could enable administrators to on-board SMEs well in advance of the exam schedule. SMEs could log in to the authoring platform from their individual work locations at a time of their convenience and enter questions in their given area of expertise. This would ensure utilisation of the best SME rather than the best available one. The submitted questions could be put through multiple peer reviews, with system-assisted prompts to ensure that reviews are of high quality. Automated tools could be deployed to safeguard against plagiarism. Multiple levels of authentication and authorisation, combined with data encryption, would secure the content and address all requirements of secure storage, archival and sharing.
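By way of illustration only, the sketch below shows how such a platform might chain two of the safeguards mentioned here: an automated duplicate/plagiarism check and encryption of content at rest. The class and function names are hypothetical and do not describe the API of any real product; a crude hash-based fingerprint stands in for the fuzzier similarity checks a production plagiarism tool would use.

```python
import hashlib

from cryptography.fernet import Fernet  # widely used symmetric-encryption API


def fingerprint(text: str) -> str:
    """Crude duplicate detector: hash of whitespace- and case-normalised text."""
    return hashlib.sha256(" ".join(text.lower().split()).encode()).hexdigest()


class AuthoringEngine:
    """Hypothetical store for submitted questions: rejects duplicated content
    and keeps only ciphertext, so questions stay confidential at rest."""

    def __init__(self, key: bytes):
        self._cipher = Fernet(key)
        self._seen: set[str] = set()
        self._store: dict[str, tuple[str, bytes]] = {}  # id -> (sme, ciphertext)

    def submit(self, sme_id: str, question_id: str, text: str) -> bool:
        fp = fingerprint(text)
        if fp in self._seen:  # automated check against duplicated content
            return False
        self._seen.add(fp)
        # Authentication, authorisation and audit logging would sit in front of
        # this call in a real deployment; only encryption at rest is shown here.
        self._store[question_id] = (sme_id, self._cipher.encrypt(text.encode()))
        return True


# Example (hypothetical usage): engine = AuthoringEngine(Fernet.generate_key())
```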

It’s time examination administrators redesigned their implementation plans to address content quality with as much emphasis as they place on security and confidentiality.

The author is global head, iON, a unit of TCS focused on manufacturing industries, educational institutions and examination boards
