Building an Efficient and Effective Test Management System in an ODL Institution

Open University Malaysia (OUM) is progressively moving towards implementing assessment on demand and online assessment. This move is deemed necessary for OUM to continue to be the leading provider of flexible learning. OUM serves a very large number of students each semester, and these students are widely distributed throughout the country. As the number of learners keeps growing, the task of managing and administering examinations every semester has become increasingly laborious, time-consuming and costly. To deal with this situation and improve the assessment processes, OUM has embarked on the development and deployment of a test management system named OUM QBank. The initial objectives of QBank development were to enable the systematic classification and storage of test items, as well as the auto-generation of test papers based on required criteria. However, it was later agreed that QBank should be a more comprehensive test management system that not only manages all assessment items but also includes features to facilitate quality control and flexibility of use, including the functionality to perform item analyses and conduct online examinations. This paper identifies the key elements and the important theoretical basis for ensuring the design and development of an effective and efficient system.


Introduction
One key feature of ODL institutions is the provision of flexible learning. The flexibility to learn in terms of time and locality is probably one main reason why ODL institutions are a preferred choice for working people and adult learners. There has been a rapid increase in learner numbers in many such institutions. With this growth, the task of administering formal assessments, such as developing items, maintaining item quality and conducting tests and examinations, becomes tedious and laborious. In fact, the administrative processes of assessment and evaluation can become a nightmare for ODL institutions (Okonkwo, 2008). The same issue is faced by Open University Malaysia (OUM), which started operations with only a few hundred learners but in just over a decade has seen its accumulated learner population surpass 15,000. As the learner population keeps increasing, administering examinations every semester has become increasingly laborious, time-consuming and costly. Every semester, the assessment department of OUM faces the challenge of managing and conducting examinations for more than 25,000 students at 37 OUM learning centres throughout the country. New sets of examination papers are set each semester. The process involves identifying and engaging qualified subject matter experts to prepare examination questions and marking schemes. This creates a challenge for OUM in ensuring consistency in the quality of the examination papers prepared. The task of reviewing and moderating these examination questions and marking schemes is time-consuming as well. Printing and delivering examination papers to the various examination centres, and collecting and returning the answer scripts to the examination department, is also costly and challenging.
Additional measures have to be taken to ensure security of the delivery of the examination papers as well as the answer scripts, to avoid any form of leakage. Scheduling and administering the examinations must be done carefully and efficiently as well. There is thus a need to leverage on technology to minimize the issues and challenges related to assessment management and administration. The University requires a test management system that not only helps minimize the manual processes involved in administering assessments, but also ensures the quality of the examination papers generated.
This paper provides a detailed description of the design and development of OUM's test management system, named OUM QBank.

Why a Test Management System
The test management system is not a new concept. It has long been advocated as a possible tool for managing tests and examinations effectively and efficiently (Choppin, 1976). Nevertheless, traditional item banking systems were essentially basic storage systems for test items. According to Estes (1985), these systems support the mass storage and easy selection and retrieval of items used as examination questions. There was little emphasis on automating test generation or on the test quality control process. With the advancement of technology related to item banking, it is now possible for learning institutions to develop more comprehensive test management systems with much additional functionality beyond basic systematic storage and retrieval. Besides process automation, the important functionalities should include the capability of the system to ensure the quality and consistency of the test papers generated.

OUM QBank
OUM QBank was designed with the main objective of reducing the laborious manual processes of examination item preparation and administration, and of ensuring the quality of the examination papers prepared. To achieve this objective, an effective and efficient test management system should have the following features:

i) Item Storage Structure
Three kinds of assessment items are stored in OUM QBank: essay-type test items, multiple-choice question (MCQ) items, and items in the form of assignment tasks. For the essay-type and MCQ items, the storage is structured to categorize items based on subject, topic and cognitive level of difficulty. Figure 1 illustrates the basic structure of QBank item storage. The storage consists of 60 storage cells, each cell specified by topic and cognitive level.
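The cell structure described above can be sketched in code as follows. This is an illustrative model only, not the actual QBank implementation: the names `ItemBank`, `deposit` and the assumption of ten topics crossed with the six Bloom levels (yielding the 60 cells of Figure 1) are the author's hypothetical choices.

```python
from collections import defaultdict

# Six cognitive levels of Bloom's Taxonomy, as used by QBank's storage.
LEVELS = ["knowledge", "comprehension", "application",
          "analysis", "synthesis", "evaluation"]

class ItemBank:
    """Hypothetical sketch of cell-based item storage keyed by (topic, level)."""

    def __init__(self, topics, levels=LEVELS):
        self.cells = defaultdict(list)   # (topic, level) -> list of items
        self.topics = list(topics)
        self.levels = list(levels)

    def deposit(self, topic, level, item_text, item_type="essay"):
        """Store an item in the cell for its topic and cognitive level."""
        assert topic in self.topics and level in self.levels
        self.cells[(topic, level)].append({"text": item_text, "type": item_type})

    def cell(self, topic, level):
        """Return all items stored in one (topic, level) cell."""
        return self.cells[(topic, level)]

# Ten topics x six levels gives the 60 storage cells of Figure 1.
bank = ItemBank(topics=[f"Topic {i}" for i in range(1, 11)])
bank.deposit("Topic 1", "knowledge", "Define open and distance learning.")
print(len(bank.cell("Topic 1", "knowledge")))   # -> 1
```

Keying storage by the (topic, cognitive level) pair is what later makes test-paper generation a simple per-cell random draw.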

ii) Item Entry Interface
OUM QBank is designed to provide a user-friendly interface for easy entry of items. Figure 2 shows a screenshot of the item-entry interface. The interface allows the user to type items directly into the system to be saved. Alternatively, the user may prepare the items in Microsoft Word and use the normal copy-paste method to deposit them into the system.

iii) Test Specification Table
A test specification table serves as a blueprint that guides the preparation of examination questions. It helps ensure the consistency and level of difficulty of a paper, as well as the appropriate distribution of a test paper in terms of topics. Therefore, the use of a test specification table is an important step in the preparation of an examination paper. In QBank, the test specification table is known as the Item Distribution Table (IDT). Basically, a test specification table displays the distribution of examination questions for a given subject according to the topics to be tested and the cognitive level of the questions. The test specification table is prepared based on the content of the learning module. This ensures that the test items are representative of the content covered in the module. Having a good distribution of questions that represent the whole module also helps ensure content validity (Jandaghi & Shaterian, 2008). Another important dimension to be considered when building the table of specification is the distribution of items according to the different levels of cognition. The levels are based on Bloom's Taxonomy, which specifies six levels of cognition: knowledge, comprehension, application, analysis, synthesis and evaluation.
OUM QBank is designed to allow the generation of a test specification table based on user-set criteria. Figure 3a illustrates a generated test specification table.

Figure 3a: A system-generated Test Specification Table
It is easy to observe that the structure of the test specification table is similar to the QBank item storage structure. Therefore, once a test specification table is generated, it is easy to programme the system to randomly select the required items from the required storage cells. In QBank, the test specification table is represented as shown in Figure 3b.

Figure 3b: The Item Distribution Table (IDT)

iv) Test Paper Generation
To minimize laborious manual tasks, QBank has all the examination paper templates formatted into the system. Once a test specification table is generated, the system is able to generate the test paper in the required print-ready format.
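Because the specification table and the storage share the same cell structure, generation reduces to a per-cell random draw. The following sketch illustrates this idea under the author's own assumptions (the function name `generate_paper` and the dictionary representation are hypothetical, not QBank's actual code):

```python
import random

def generate_paper(bank, spec_table, seed=None):
    """Draw items at random from storage cells according to a spec table.

    bank:       {(topic, level): [items]}  -- the item storage cells
    spec_table: {(topic, level): count}    -- the IDT, cells to counts
    """
    rng = random.Random(seed)
    paper = []
    for cell, count in spec_table.items():
        pool = bank.get(cell, [])
        if len(pool) < count:
            raise ValueError(f"Cell {cell} has only {len(pool)} items, needs {count}")
        paper.extend(rng.sample(pool, count))   # random draw without replacement
    return paper

# Toy storage and a toy IDT requesting three items across two cells.
bank = {
    ("Topic 1", "knowledge"): ["Q1", "Q2", "Q3"],
    ("Topic 2", "application"): ["Q4", "Q5"],
}
spec = {("Topic 1", "knowledge"): 2, ("Topic 2", "application"): 1}
paper = generate_paper(bank, spec, seed=42)
print(len(paper))   # -> 3
```

Sampling without replacement within each cell guarantees that no question is duplicated on a paper while still honouring the topic-by-level distribution the IDT prescribes.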

v) Item Analyses
After an examination, the examination results can be imported into the QBank system to enable item analyses. A difficulty index and a discrimination index can be generated for each and every item. The discrimination index describes the extent to which a particular test item is able to differentiate the higher-scoring students from the lower-scoring students. The item difficulty index shows the proportion of the total group answering the item correctly. This information serves as a reference for the user on the quality of the items that have been developed and supports decisions about how each item is functioning. This then helps the faculty to identify poor items that need to be reviewed, enhanced or discarded. Figure 4 provides a visual representation of the QBank system framework in relation to the complete process of item preparation, test generation, item analysis and item review.
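The two statistics can be computed as sketched below. This is a hedged illustration: the upper/lower 27% split used for the discrimination index is a common textbook convention, but the paper does not specify which grouping rule QBank actually applies, and all function names here are the author's.

```python
def difficulty_index(responses):
    """Proportion of all examinees answering the item correctly (0..1)."""
    return sum(responses) / len(responses)

def discrimination_index(scores, responses, fraction=0.27):
    """D = p_upper - p_lower, comparing high and low total-score groups.

    scores:    total test scores, one per examinee
    responses: 1/0 correctness on this single item, aligned with scores
    fraction:  share of examinees in each extreme group (27% is customary)
    """
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    k = max(1, int(len(scores) * fraction))
    lower = [responses[i] for i in order[:k]]    # lowest-scoring group
    upper = [responses[i] for i in order[-k:]]   # highest-scoring group
    return sum(upper) / k - sum(lower) / k

# Eight examinees: total scores and their correctness on one item.
scores    = [90, 85, 80, 60, 55, 40, 35, 30]
responses = [1,  1,  1,  1,  0,  0,  0,  1]
print(round(difficulty_index(responses), 2))    # -> 0.62
print(discrimination_index(scores, responses))  # -> 0.5
```

A difficulty index near 0 or 1 flags an item that is too hard or too easy, while a low or negative discrimination index flags an item for review, matching the review loop described in the framework.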
As a test management system, QBank deals with quality management and with examination paper generation. The quality management component includes two sub-systems: the item banking system and the examination item analysis system. The item banking system stores and categorizes examination questions prepared by SMEs and reviewed by the reviewers. It also stratifies questions based on their respective difficulty and discrimination indices, which are generated by the examination item analysis system. The examination item analysis system analyses the items based on students' scores collected from the Online Marks Entry System currently used at OUM. It generates reports on problematic questions, which are reviewed by the reviewers before being deposited into the item bank again. The examination paper generation component also has sub-systems. The first sub-system automates the generation of test specification tables. Based on the information in the generated test specification table, another sub-system randomly retrieves the examination questions that meet the criteria. The questions are then formatted into printable form.

Figure 4: Framework for OUM's Test Management System
OUM QBank has been developed based on the framework and design detailed above and is ready for faculty to start entering items. Figure 5 shows the login screen and Figure 6 the dashboard for QBank users. The functions displayed vary according to the role of the user logged into QBank. The functions for each role are clearly defined, and each role has a different level of security and access. The user roles include the super administrator, item entry operator, item entry reviewer, chief reviewer, faculty administrator and the faculty dean.

Online Examination Implementation
Another unique sub-feature of QBank system which needs to be highlighted is the Online Examination System. Figure 7 shows the online testing system main screen.
The features and framework described earlier are meant for 'offline' delivery of examination papers; that is, test papers generated from the system are printed as hard-copy examination papers to be administered at the various learning centres. The online component of the examination paper generation system is basically an extension of the main QBank system. It provides an additional option for a generated test paper to be displayed online for the purpose of conducting the test in an online environment. To support examination papers created for online delivery, the QBank system needs to include another online sub-system through which students log in to take their examination at a pre-scheduled date and time.

Figure 7: OUM Online Testing System
The online sub-system is designed to integrate with OUM's campus student management system to perform the following two important functions:
i) Authenticate student access and confirm whether there is a pre-scheduled examination. Once the student ID is authenticated and the system confirms that an examination is scheduled, the designated examination paper is made available to the student.
ii) Send the student's examination results back to the campus student management system once the student has completed the examination.
Since the examination is taken online, both examination results and item analyses can be processed in real time.
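The two integration functions above can be sketched as follows. Everything here is hypothetical: the dictionaries `SCHEDULE` and `RESULTS` stand in for the campus student management system, and the function names, paper identifiers and the two-hour exam window are the author's illustrative assumptions.

```python
from datetime import datetime, timedelta

# Stand-ins for the campus student management system's records.
SCHEDULE = {
    ("S1001", "MATH101"): datetime(2024, 5, 20, 9, 0),   # scheduled start
}
RESULTS = {}

def authenticate_and_fetch(student_id, course, now, window_minutes=120):
    """Function (i): release the designated paper only if this student
    has an examination scheduled for the current time window."""
    start = SCHEDULE.get((student_id, course))
    if start is None:
        return None                                       # no scheduled exam
    if not (start <= now <= start + timedelta(minutes=window_minutes)):
        return None                                       # outside the window
    return f"PAPER-{course}"                              # designated paper

def submit_results(student_id, course, score):
    """Function (ii): send the completed exam score back to the
    student management system."""
    RESULTS[(student_id, course)] = score

paper = authenticate_and_fetch("S1001", "MATH101", datetime(2024, 5, 20, 9, 30))
print(paper)   # -> PAPER-MATH101
submit_results("S1001", "MATH101", 78)
```

Gating paper release on both identity and schedule means a student who logs in without a scheduled examination, or outside the examination window, never sees the paper, which supports the security goals discussed next.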

Security Consideration
Security was a major consideration throughout the design of the system infrastructure and architecture, to ensure that data in the system is protected from unauthorized access, which could result in theft, loss, misuse or modification, as well as from attackers, hackers and crackers.
Various methods of user authentication, including biometrics and facial recognition technologies, were explored. For cost-effectiveness and efficiency, the familiar two-factor authentication, similar to that used by banks, is implemented for all users at all levels accessing the system, protecting each account with a password and the user's personal mobile phone. Two-factor authentication can drastically reduce the probability of online identity theft, phishing and other online fraud, and thus helps secure the complete system.
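One common way to realise a mobile-phone second factor is a time-based one-time password (TOTP, RFC 6238). The paper does not specify which mechanism QBank uses, so the sketch below is purely illustrative of the general technique, using only the Python standard library.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, step=30, digits=6):
    """Derive the one-time code for the current 30-second time step (RFC 6238)."""
    key = base64.b32decode(secret_b32)
    counter = int((at if at is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return f"{code % 10**digits:0{digits}d}"

def verify(secret_b32, submitted, at=None):
    """Check a submitted code against the current time step. A production
    system would also accept adjacent steps to tolerate clock drift."""
    return hmac.compare_digest(totp(secret_b32, at), submitted)

# Hypothetical shared secret, provisioned to the user's phone app.
secret = base64.b32encode(b"qbank-demo-secret").decode()
code = totp(secret, at=1_700_000_000)                     # fixed time for the demo
print(verify(secret, code, at=1_700_000_000))             # -> True
```

The password remains the first factor; the phone-derived code is the second, so a stolen password alone is not enough to access the examination system.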

Conclusion
A good QBank system facilitates the assessment processes. The system also helps to identify items that may not meet the quality standard. But the system by itself does not ensure the quality of the assessment items deposited; that still requires considerable knowledge, skill and effort from the subject matter experts.
Throughout the design and development of OUM's QBank system, which took about six months, the Institute for Teaching and Learning Advancement at OUM worked closely with the system developers to ensure that the system is user-friendly and meets the needs of the University. A great deal of effort went into the different stages of development, such as system requirement analysis, user-interface design, functionality design and user-acceptance testing. The team also gathered feedback and constructive suggestions from faculty academics and administrators to ensure the system's usability. It is hoped that with the implementation of the QBank system, assessment can be conducted not only in a more efficient and effective manner, but also in a more flexible way, paving the way towards flexible entry and exit of learners.