Indian Journal of Pharmacology
Medknow Publications on behalf of Indian Pharmacological Society
ISSN: 0253-7613 EISSN: 1998-3751
Vol. 36, No. 6, November-December 2004, pp. 388-389

Correspondence

The other side of OSPE

Department of Pharmacology, JIPMER, Pondicherry
Correspondence Address: Department of Pharmacology, JIPMER, Pondicherry. gitanjali@jipmer.edu

Sir,

Pharmacologists in India have been using the Objective Structured Practical Examination (OSPE) as a method of evaluation since the early nineties.[1] It has often been touted as a good substitute for the conventional practical examination, since it is more objective. At the Department of Pharmacology, Jawaharlal Institute of Postgraduate Medical Education and Research (JIPMER), Pondicherry, we have used this method for evaluating medical laboratory technicians[2] and subsequently, in 1999, introduced it as a method of evaluation for the formative examinations conducted for undergraduate medical students in pharmacology. With the passing of time, I have become progressively disenchanted with this method, for the reasons outlined below.

1. Time constraint: At JIPMER we have to examine a batch of approximately 30 students over a period of two and a half hours. Owing to space constraints, we are able to set up a maximum of 10 stations only. If we allot each station 5 min, a group of 10 students takes 50 min to complete all stations; a batch of 30 therefore means three successive groups, or about three hours in all. Cutting the time spent at each station was not an option for us, since some of the tasks (such as giving an I.V. injection, or communicating with patients on the proper use of oral contraceptive pills) required this amount of time. If we gave students less time, it would amount to testing how fast they could do the task rather than how well they could perform it.
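The arithmetic behind this constraint is worth setting out explicitly. The following is an illustrative back-of-envelope sketch in Python, using only the figures quoted above:

STATIONS = 10          # maximum stations the lab can accommodate
MIN_PER_STATION = 5    # minutes allotted per station
GROUP_SIZE = STATIONS  # one student occupies each station at a time
BATCH_SIZE = 30        # students to be examined in one session

# One group of 10 rotates through all 10 stations in 10 * 5 = 50 min.
minutes_per_group = STATIONS * MIN_PER_STATION

# A batch of 30 students means three successive groups.
groups = BATCH_SIZE // GROUP_SIZE
total_minutes = groups * minutes_per_group

print(f"{groups} groups x {minutes_per_group} min = {total_minutes} min")
# 3 groups x 50 min = 150 min before any changeover time is counted,
# which is why a two-and-a-half-hour slot stretches to about three hours.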

2. Logistical problems: There were technical difficulties in setting up several sets of apparatus for each station. For example, we used to have a station where an isolated frog rectus tissue was mounted (by us) and the student had to inject a given volume of acetylcholine and record the response. We had to put up three tissues (in three separate baths) for this single station so that each tissue had sufficient time to relax before the next student came to add the drug. We also needed one technician standing by to fill the reservoirs, rinse the syringes and flush the baths. If, in the course of the examination, one of the tissues stopped responding or a technical fault developed, there was no time to repair the set-up, since the student had to move on to the next station or would hold up the entire batch.

3. Maintaining uniform difficulty levels: Thirty students (half the class) would have the same set of ten stations. The next thirty would come after a gap of one week and had to be given another set of stations. Maintaining a uniform level of difficulty between the two sets is a very difficult undertaking.

4. Shortage of observers: In order to include more procedural stations we needed a sufficient number of observers. All faculty and residents had to be present as observers whenever an OSPE was conducted. This left us with few residents to handle the organization, such as escorting the students into the lab and giving instructions. Including technicians as observers was not feasible, since the laboratory staff were needed at the procedural stations to assist the observers (mopping up spills, topping up test tubes with solutions, ringing the bell to keep time, etc.). Unless there was full attendance among faculty and residents, it became difficult to conduct an OSPE with many procedural stations.

5. Observer fatigue: After observing ten or more students at the same station, observers get tired, bored and careless. On one occasion we found that the students who had a particular observer for a task scored between 3 and 5 out of 10, whereas those who went to another observer for the same task scored 8 to 10. We then learnt that the observer awarding the higher marks had been giving the students tips on completing the task, having grown bored and wanting to interact with them.

6. Time-consuming preparation and tabulation: Preparations are time consuming, especially drawing up the checklists, printing them, and so on; a great deal of planning is needed. After the OSPE, tabulation of marks takes a long time, especially if many stations are arranged. Given that during six months of the year we have three batches of students, and that for each batch we conduct five notified tests (all of which include practical examinations), these cumbersome procedures soon became tiresome.

7. Problems with electricity: At times, just before the OSPE was due to start, the electricity would fail. This threw the entire session out of gear unless an alternative station could be set up or another power source located.

8. Miscellaneous:

(a) As the years went by, there was an increasing trend to include more response stations in a bid to "save" on observers. This amounted to students answering short-answer questions, even though we included calculation of drug dosages, statistical problems and the like, which could be described as essential intellectual skills for pharmacologists. This does not alter the fact that such skills can be tested in a theory paper too and do not need to be labelled OSPE.

(b) After two years, some of the questions had to be repeated. It then became largely a matter of testing recall, since students already knew the answers to most of the problems. In trying to set up "new" stations, there was a tendency to make the tasks either more difficult or trivial.

(c) Even though the checklists were modified each time, taking into consideration the comments of the faculty manning a particular station, the loopholes were many. For example, one of the points in the checklist on communication skills was that the student had to greet the patient. Many students forgot to do so and remembered only at the end of the interaction. However, since they knew they would get one mark for greeting the patient, they went ahead and greeted the patient anyway. The observer awarded these students the mark, as the checklist did not specify that the student had to greet the patient first and not in between or last!
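One way to close this particular loophole is to make the checklist order-sensitive. The following is a purely hypothetical Python sketch; the item names and the scoring rule are illustrative, not the checklist we actually used:

# Hypothetical order-sensitive checklist: the greeting earns its mark
# only if it opens the interaction.
checklist = {
    "greet patient": 1,
    "take history": 1,
    "explain pill schedule": 1,
}

def score(observed_actions):
    total = 0
    for item, mark in checklist.items():
        if item not in observed_actions:
            continue
        # Award the greeting mark only when it came first.
        if item == "greet patient" and observed_actions[0] != "greet patient":
            continue
        total += mark
    return total

print(score(["greet patient", "take history", "explain pill schedule"]))  # 3
print(score(["take history", "explain pill schedule", "greet patient"]))  # 2

Encoding such ordering rules explicitly, rather than leaving them to the observer's discretion, removes ambiguity in borderline cases.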

Due to these problems, we decided to use a method that I would prefer to call SOSPE (Semi-Objective Structured Practical Examination), for want of a better name. It is an amalgam of OSPE and the older (conventional) method of practical examination, in which the student is asked to conduct one experiment and is questioned on it. In this method, the student is asked to conduct one experiment and the examiner gives marks based on a structured checklist. Fifteen marks are allotted for this exercise, which has five parts with an equal distribution of marks (three each): (a) procedure/methodology, (b) tabulation of results/graph, (c) demonstration of a skill, (d) interpretation of results, and (e) seat viva. For each part the examiner awards marks, which are totalled at the end. Under tabulation of results (part b), the examiner checks whether the observations have been presented properly in a table and also looks for a correct title (heading), inclusion of units, the number of animals, and the name, dose and route of administration of the drug. The examiner also asks the student to perform a skill (part c) connected with the experiment, such as loading a particular volume of a drug into a syringe or injecting subcutaneously. There may not be much difference from the previous method of global assessment, except that students need to pay attention to all parts of the practical experiment, and faculty have to observe a skill being performed and give marks for the various components of the exercise.

We follow this up with an exercise on communication skills, wherein the student is given a drug/device/dosage form and asked what instructions he/she would give the patient. This exercise is marked out of 10. For another 50 marks we give 5-6 problems on various aspects of clinical pharmacology, such as prescription writing, ADR monitoring, the essential drugs list, therapeutic drug monitoring, critical appraisal of drug advertisements and so on. For each problem, the evaluation is structured.
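To set the scheme out at a glance, the sketch below simply tabulates the marks just described (the component labels paraphrase the text; the 75-mark total is the sum of the stated parts):

# Mark distribution for the SOSPE scheme described above.
sospe_experiment = {            # 15 marks: five parts of 3 marks each
    "procedure/methodology": 3,
    "tabulation of results/graph": 3,
    "demonstration of skill": 3,
    "interpretation of results": 3,
    "seat viva": 3,
}
communication_exercise = 10     # instructions on a drug/device/dosage form
clinical_problems = 50          # 5-6 structured clinical pharmacology problems

total = sum(sospe_experiment.values()) + communication_exercise + clinical_problems
print(total)  # 75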

The method described above is easier to plan, set up and conduct, and does not need as much manpower as OSPE. Even though there is an element of subjective bias, the checklist reduces it to some extent. There is also a dialogue between the examiner and the student, which permits the examiner to distinguish the very good from the good and also to point out mistakes. This element of student-examiner interaction was greatly missed by many of our faculty during OSPE. However, each examiner needs to spend at least ten minutes with a student; hence the number of students who can be examined by one examiner in a single session is 6-8. The method outlined above is the one currently employed for most of the formative exams at JIPMER. We still use OSPE for one or two exams. Meanwhile, I feel my brief honeymoon with OSPE is over, and the search for better methods of evaluating students in practicals goes on.

REFERENCES

1. Natu MV, Singh T. Objective structured practical examination (OSPE) in pharmacology - students' point of view. Indian J Pharmacol 1994;26:188-9.
2. Batmanabane G, Raveendran R, Shashindran CH. Objective structured practical examination in pharmacology for medical laboratory technicians. Indian J Physiol Pharmacol 1999;43:242-6.

Copyright 2004 - Indian Journal of Pharmacology
