Shiken: JALT Testing & Evaluation SIG Newsletter
Vol. 13 No. 1. January 2009. (p. 14 - 21) [ISSN 1881-5537]

Innovative School-Based Oral Testing in Asia

by Tim Murphey   (Kanda University of International Studies)

This article describes recent changes in assessment in several Asian countries and proposes that more Asian-based modeling of testing alternatives might help institutions in other countries break out of their "entrance exam hell" mentality and shift educational assessment holistically toward a more performance-based and formative style. It ends with a proposed shift in university entrance exams toward literacy-based oral interviews, which could have a major impact on junior and senior high school (JHS and HS) teaching and may actually produce "Asians with English abilities."

Keywords: oral testing, entrance exams, wash-back, backwash, school-based assessment

Criticizing secondary school English education without criticizing typical university entrance exams is like criticizing the horrific impact of guns and knives on society without criticizing the arms makers, distributors, and sellers. The university entrance exams are indeed the "tail that wags the dog" – the dog being education, including the cram school industry.
For example, in Japan the majority of students accepted at universities have to take paper-and-pencil multiple-choice exams concerned mostly with discrete-point grammar and vocabulary problems (Obunsha, 2008). The washback on secondary schools is such that rote learning remains ubiquitous. In 2008, 35.4% of students in Japan were accepted on the basis of some type of recommendation (MEXT, 2008), most of them recommended by their high schools. At previous universities where I taught, when everyone was given a placement test prior to class, recommended students often scored much lower than the others on the same paper-and-pencil tests. This may have been because high school teachers knew that recommended students are usually accepted: instead of sending candidates who tend to perform well on standard paper-and-pencil exams, they likely recommended those who seemed to have merit in other regards.
"quick-'n-dirty perfunctory [university entrance] interviews are in fact common for screening incoming recommended students"

Some 97.6% of the universities in Japan have a short, perfunctory oral interview exam [suisen nyushi] in which applicants meet with faculty for interviews lasting five minutes or so (Sundai Yobiko, 2008). Most students know in advance what questions to expect and have already rehearsed their answers. Essentially, these oral interviews are nothing but a ritual formality that students are expected to pass through; only students who demonstrate unusual shyness or mental instability fail them. Well-constructed and valid oral interviews are rare (Murphey & Park, in press).


However, quick-'n-dirty perfunctory interviews are in fact common for screening incoming recommended students (Keeandoaaru Suisennyuushi Taisaku Inkai, 2001). Generally speaking, high school records and application essays are considered along with the oral interviews when making admission decisions (Youyou Inc., 2008). In 2008, 66% of all recommended applicants to private universities in Japan were accepted; prefectural universities took in 45% of their recommended candidates, and national universities accepted just 38% (JS Nihon no Gakkou, 2008).
At the Kanda University of International Studies, where I now teach, 60% of our approximately 1,000 new students are accepted through interview exams. This acceptance is not automatic: about 40% actually fail their interviews. To qualify for the interview exam, students must have achieved a certain level on a standardized exam and/or have an excellent GPA.
"To make oral interviews an important component of the admission process surely enhances the validity of the decisions that are made."

The general consensus in the testing field today is that validity is enhanced if multiple forms of evidence are gathered (Cronbach & Meehl, 1955; AERA, 1999). To make oral interviews an important component of the admission process surely enhances the validity of the decisions that are made. There is also widespread agreement that decisions based on multiple raters tend to be more reliable than those based on single raters (Gamaroff, 2000). For this reason, I recommend employing at least two raters for interview exams. At my university we pair native speakers with proficient Japanese EFL instructors. The native/non-native combination, however, is not essential: some proficient non-native speakers of English are fully qualified to rate incoming students, and not all native speakers are necessarily good raters. Having two teachers work together to rate students is labor intensive, but I maintain that it is more effective in getting the kinds of students we want – active learners enthusiastic about learning. (A note to grammarians and vocabulary specialists: everything we do with language has grammar and vocabulary embedded in it; we are using grammar and vocabulary when we listen and talk. However, as Reed (2006) suggests, we do not usually exercise our understanding of talk and discourse with discrete-point grammar and vocabulary questions on paper-and-pencil tests.)
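The claim that paired raters yield more dependable decisions can itself be checked empirically. As a minimal illustration (not part of any procedure described in this article, and using made-up ratings), the Python sketch below computes Cohen's kappa, a standard chance-corrected measure of how closely two interviewers' pass/fail decisions agree:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters' category labels."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Proportion of candidates on whom the two raters agree outright.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal label counts.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical pass/fail decisions from two interviewers on ten candidates.
a = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail", "pass", "pass"]
b = ["pass", "pass", "fail", "fail", "fail", "pass", "pass", "fail", "pass", "pass"]
print(round(cohens_kappa(a, b), 2))  # → 0.78
```

A kappa near 1.0 indicates strong agreement beyond chance; values near zero suggest the raters need calibration training before their joint scores can be trusted for high-stakes decisions.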
As I was reading my first- and second-year students' language learning histories (Murphey, 1999), I was curious to find out whether high school teachers were changing their teaching to help students with these oral interviews. The results suggest they are not. Most students admit in their histories that their high school classes were not helpful for entry to our university and that they had to go to private conversation schools and/or hire private tutors to train them in English conversation (Murphey, 2008). It will probably take many more universities daring enough to promote oral use through extended interview exams before secondary school pedagogy follows suit.


Most universities still depend mostly on their paper-and-pencil exams to make money, with hundreds of applicants sitting in crowded testing rooms (Murphey, 2004), rather than having two teachers interview one student for ten minutes at a time. When other universities tell me oral interviews are not practical, I suspect what they really mean is, "We can't make the same amount of money. Oral interviews are not cost efficient."

Oral Testing in Malaysia and Hong Kong

And yet, in some other Asian countries things seem to be changing faster. Not only are some schools doing more oral testing, they are putting the testing in the hands of the teachers working with the students. In May 2008, after my JALT Pan-SIG plenary in Kyoto, I gave a plenary at the 7th Malaysia International Conference on English Language Teaching, at which Hamp-Lyons presented on "School based assessments: Why and some thoughts on how". Her abstract in the program read:
Around the world the movement towards alternatives to formal tests of language is building momentum, at the same time that more and more traditional and high-stakes tests are appearing. These alternatives tend to be lower stakes, more participatory, more closely aligned with curriculum goals, and to emphasize performance assessment. In several countries school-based assessment has become one of the most influential alternatives. In this paper I describe the SBA approach being taken in Hong Kong, and illustrate it with examples.
At the same conference Mostafa and Othman discussed the backwash of a school-based oral English test in Malaysia and their summary stated:
This qualitative study investigated the backwash effect of the Oral English Test (OET) conducted at school-based level in selected secondary schools in the Batang Padang district of Perak. More specifically, the study looked at how English as a Second Language (ESL) teachers prepared students for the test in school, how they tested the students in a school-based context, and examined the backwash effects of the test conducted on classroom instructions and student performance. Findings indicated that the School-based Oral English Test produced beneficial backwash on the ESL teachers’ classroom instructions and on the performance of the students in the test.
Both of these papers describe oral exams that were originally generated by the respective education ministries and then turned over to local teachers, and both contend that this changes teaching. It stands to reason that if teachers are in charge of testing a certain way, many will gradually shift their teaching to match what they test. However, the solution probably involves more than just changing tests; teacher education is also important. Cheng (1999) has aptly pointed out that it is much easier to change what we teach than how we teach: though curricular content is often influenced by tests, to impact teaching in a deep way teacher training is also essential.


At present in Japan, the vast majority of high school curricula seem to be geared toward paper-and-pencil university entrance exams. I wish Japan and other Asian countries would model themselves on Malaysia and Hong Kong in this regard, and even take things a step further toward literacy-based oral exams (described below).

Dreaming of a different type of exam

I would love to see a more holistic change in assessment, something that would include more aspects of competence and performance. My suggestion would be to give students an authentic newspaper or magazine, such as Reader's Digest or Time, or even an EFL/ESL student magazine or a graded reader, with a reading passage that the testers had carefully chosen for the students' approximate level marked with a Post-it. Students would have 20 to 30 minutes before their interviews to read the passage quickly, write a short summary, and note down words and points they did not understand. The ensuing conversation would thus be based on a short reading and a student writing sample. Students could be assessed not just on their level of comprehension, but on their ability to ask intelligent questions and to learn from their interlocutors. Instructions accompanying the authentic material might say something like:
It is normal not to understand some words and expressions when we read. You are encouraged to ask your interviewers questions about the passage or vocabulary that you do not understand. The goal is to come to a good collaborative understanding by the end of the interview.
Appendix A offers one possible sample procedure.
"If secondary school teachers are going to teach to the exams (which most will inevitably do) we can tap into the washback effect and promote the use of real world material by using them on our tests."

This procedure would also allow us to see to what extent learners know how to learn from others and dare to ask questions. The fact that the testing material is authentic (with a title and an author, from a real publication that may include pictures or graphics) often makes it easier to grasp than the many entrance exam reading passages that have been artificially reworked and plagiarized by a committee into bland nonsense stripped of title, author, and soul. In real life we practically never read anything without knowing where it comes from, its title, or its author; only in exams do we make our readings so unnatural. This is not to say there are no excellent passages in some entrance exams, but the point is that if secondary school teachers are going to teach to the exams (which most will inevitably do), we can tap into the washback effect and promote the use of real-world material by putting such material on our tests.


Imagine students returning to their schools and cram schools saying they talked about a passage from a magazine such as Reader's Digest or Time at the entrance exams, and asking their teachers to train them more with these authentic materials. I am not on the payroll of either of these magazines, by the way, and I would suggest regularly rotating the types of authentic material used, which should nevertheless be accessible to people in the real world and not too demanding for students.

Hand Over

Have I finished dreaming yet? No. The last step would be for universities to turn much of the testing over to high schools and let teachers conduct the interviews and give students scores, as has indeed been done in Malaysia and Hong Kong. Universities would still do oral interviews for entrance, but would also make available the types of authentic material they use and the questions after every entrance exam (as they do now), except that these would be interview questions and discussion topics based on real materials, which could easily become materials for actual use in the schools. High school teaching would then take on a "reality" that would motivate students to participate more holistically in their learning, while teachers would become more assessment literate. Newfields (2006, p. 10) notes that "assessment literacy is an important aspect of overall teacher development. All teachers wishing to develop professionally should also learn more about assessment." What better way to do so than by putting the testing and assessing in their hands? There are, of course, challenges in implementing authentic interviews for high-stakes decision making, such as developing valid, reliable, and usable rating scales and training teachers in the interview procedures. But considering the amount of positive washback that is likely, these challenges should be considered worth facing.
I feel strongly that Asian educational institutions have much to learn from one another, and that those in the testing associations should edit a book on innovations in Asian ELT testing so that more countries and institutions could model each other. I know that at the individual level positive near peer role modeling is a very active force (Murphey & Arao, 2001); I think such modeling could happen at the institutional, district, and country levels as well. For this reason, Siwon Park and I are putting out a call for potential chapters for a book on innovations in Asian testing, which is described on page 21 of this newsletter.

Thanks very much to Tim Newfields and Siwon Park for comments on earlier drafts of this piece.



References

American Educational Research Association. (1999). Standards for educational and psychological testing. Washington, D.C.: American Educational Research Association.

Cheng, L. (1999). Changing assessment: washback on teacher perceptions and actions. Teaching and Teacher Education, 15, 253-271.

Cronbach, L. J. & Meehl, P. E. (1955). Construct validity in psychological tests. Psychological Bulletin, 52, 281-302. Retrieved November 28, 2008 from http://

Gamaroff, R. (2000, March). Rater reliability in language assessment: The bug of all bears. System, 28 (1) 31-53.

Hamp-Lyons, L. (2008). "School based assessments: Why and some thoughts on how" MICELT 2008 Conference Presentation, May 12-14, 2008.

Keeandoaaru Suisennyuushi Taisaku Inkai. (Ed.). (2001). Suisen Nyuushi, AO Nyuushi no Mensetsu, Shoronbun Bunkei-hen. [Interviews and Essays for Recommended Exams and AO Liberal Arts Exams.] Tokyo: Kirihara Shoten.

Ministry of Education, Culture, Sports, Science and Technology - Japan. (2008). Heisei 20 Nendo koku-kou-watakushi-ritsu daigaku tanki-daigaku nyuugakushi senbatsu jichi joukyou no gaiyou. [Summary of 2008 public and private university and jr. college placement exams]. Retrieved December 28, 2008 from 09/08092911.htm

Mostafa, N. A. B. & Othman, N. B. (2008). "School Based Oral English Test; The Backwash Effect" MICELT 2008 Conference Presentation, May 12-14, 2008

Murphey, T. (1999). Publishing Students' Language Learning Histories: For them, their peers, and their teachers. Between the Keys: Newsletter of the JALT Materials Writers SIG, 7 (2) 8-11, 14.

Murphey, T. (2004, Winter). Participation, (Dis-)Identification, and Japanese University Entrance Exams. TESOL Quarterly, 38 (4) 700-710.

Murphey, T. (2008). The collaborative wisdom of student voice. In T. Newfields, P. Wanner, & M. Kawate-Mierzejewska (Eds.). Divergence and Convergence, Educating with Integrity: Proceedings of the 7th Annual JALT Pan-SIG Conference. Retrieved November 22, 2008 from Murphey.htm

Murphey, T. & Arao, H. (2001). Changing Reported Beliefs through Near Peer Role Modeling. TESL-EJ, 5(3)1-15. Retrieved October 1, 2008 from

Newfields, T. (2006). Teacher development and assessment literacy. Proceedings of the 5th Annual JALT Pan-SIG Conference "Authentic Communication." May 13-14 Shizuoka Japan. Retrieved October 1, 2008 from


Obunsha (Ed.). (2008). 2008 Nen jukenyou – Zenkoku daigaku nyuushi mondai kaitou: Eigo (Watakushiritsu hen). [For 2008 examinees – University entrance exam questions and answers nationwide: English (private universities edition)]. Tokyo: Obunsha.

Reed, D. (2006). Diagnostic assessment in Language Teaching and Learning. Clear News, 10 (2) 1-5. Retrieved October 1, 2008 from

JS Nihon no Gakkou. (2008, November 28). Daigaku no suisen nyuugaku jichi jokyou: Heisei 18-20 nendo no nyuugaku senbatsu yori. [University recommended examination conditions: 2006-2008 selection information]. Retrieved December 7, 2008 from

Sundai Yobiko. (2008). Topics > Suisen wa 97% no daigaku de jichi. [Topics: 97% of universities have recommended exams]. Retrieved December 8, 2008 from 2008news/vol8/index.htm#a

Youyou Inc. (2008). AO Nyushi, Suisen nyuushi to wa? [What are the AO and recommended entrance exams?]. Retrieved December 8, 2008 from



The purpose of this volume is to bring to light innovative testing practices from many Asian countries so that other testers might model and adapt them to their own contexts. The logic is that testing procedures already situated in Asian contexts may often be more appropriately adapted to other Asian contexts than practices imported from afar. We are looking for innovative practices in wide-scale national or international testing, entrance exam testing for schools and universities, and school-based testing. We are also interested in innovative classroom-based testing that might lend itself to wider use among schools and institutions.

FIRST SUBMISSION OF PROPOSALS: deadline April 2009. Please send a short abstract and an outline describing your innovative testing idea, what would be included in your chapter, what has been done, and a brief history of the innovation (listing any previous publications about it). Let us know what remains to be done, with any supporting documents (1,000 to 2,000 words). Proposals may be submitted to Tim Murphey ( or Siwon Park ( by email or sent to Address: Kanda University of International Studies,

TIMELINE: Notification of tentative acceptance of chapters June 2009. Finished chapters by October 2009. Book to be published in 2010.

