Thoughts on Online Testing

There will eventually come a time when you need to test online if you are teaching online.

There are a number of possible solutions to this. You could prepare online versions of traditional tests, e.g. gap fills to check grammar, or multiple choice checks of reading and listening tasks. However, these only check how well the learners can cope with ‘other people’s language’, and they are limited in the information they provide: a multiple choice test really tells you how good the learners are at doing multiple choice tests. These kinds of tests also take a huge amount of time to set up and check properly, so even if they are marked automatically, the cost/benefit analysis tends towards not using them. Limited useful information vs a lot of preparation time = do not bother.

Also, one of the key problems is how to prevent cheating on such tests. Unless you can control the test-takers’ environment (so you can check their ID before they log on to do the test), it is very difficult to prevent cheating.

 

Many teachers can tell of learners who were at the lower end of the marking scale for most of the year (2 or 3 out of 10) but who then turned in very highly rated work and had to be given 10 out of 10 for the final test. Obviously they cheated, but how do you prove it? Yes, there are ways, but it is much better to minimize the opportunities for cheating.

So, bearing in mind all these factors, we can reach a number of conclusions.

  1. The creation of online versions of traditional paper tests such as use of English, reading and listening takes a long time.
  2. These kinds of tests test how the learners cope with ‘other people’s language’ and give us limited information about the learners’ real abilities in English.
  3. Learners can easily cheat in these kinds of tasks.
  4. So we shouldn’t bother.

A better way of testing is to test the learners’ own production of spoken and written language. This is a much more informative way of testing their language. You could ask them to complete one or more writing tasks and a speaking task. The speaking task could be live, through video conferencing software, or it could involve producing a recording which is then evaluated.

With writing tasks, though, we have the same cheating problem. Unless we can check the learners’ ID and monitor them during the writing task (e.g. watch them through their web cameras!), we cannot safely say that they will not be able to cheat. So, if this is a critical issue, we need to cross writing tasks off our potential list of tests.

This leaves us with speaking tasks. As long as we know what the learners look like (or sound like) then we can safely test them. I suggest three ways to do this.

  1. Spoken Performance (Recorded): Script a text which covers the grammar and lexis the learners have been studying and which you want to test. Ask them to prepare it for a spoken performance: they chunk the text (that is, mark where they will pause), mark which words they will stress most, and mark the intonation (rise or fall). They then record it (re-recording until they are happy) and send the completed recording to you. You then rate their spoken performance. Obviously, this task only tells you how well they can produce rehearsed speech, but that is quite interesting information, and the recording process is a useful one anyway. It is also not a task which can (easily) be cheated on.
  2. Spoken Interview 1 (Dice): Before the interviews, prepare 12 questions you want to use in the tests. Using video conferencing software, do a short (5-10 minute) oral interview with each learner. The first two questions are generic (How are you? How are you feeling today?); the next two are decided by throwing two dice (for example, one die choosing from questions 1-6 and the other from questions 7-12). You throw the dice for each candidate and let the dice decide which questions they will answer. This is to prevent candidates cheating by telling other candidates what questions they had. The randomizing of the questions by throwing the dice is just for you: don’t tell the candidates how many questions there are or which numbers they answered, though you could explain that you are throwing dice to make the choice.
  3. Spoken Interview 2 (Cards): This is a similar activity, but instead of 12 questions you prepare 54 and use a pack of cards (52 cards plus two jokers) to choose which two questions you use, drawing two cards for each candidate. You could reuse cards/questions or discard them after each interview. Obviously it takes more work to prepare 54 questions than 12, so your choice (12 or 54) will be determined by how many candidates you have.
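If you are running many interviews, the dice throw or card draw in options 2 and 3 can also be replicated with a short script. This is just an illustrative sketch (the question texts and the function name are invented for the example); the point is that each candidate gets two distinct questions chosen at random from the bank:

```python
import random

def draw_questions(questions, per_candidate=2):
    """Choose distinct questions at random for one candidate,
    like throwing dice or drawing cards from a pack."""
    return random.sample(questions, per_candidate)

# Example: a bank of 12 prepared questions, two drawn per candidate.
question_bank = [f"Prepared question {n}" for n in range(1, 13)]

for candidate in ["Candidate A", "Candidate B"]:
    drawn = draw_questions(question_bank)
    print(candidate, drawn)
```

As with the dice, the random choice is for the tester’s benefit only; the candidates need never see the list or the numbers.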

These three ways of testing spoken English give you useful information about your learners’ abilities in English. The first, the recording of prepared speech, gives you information about their confidence, fluency and general pronunciation. The second and third give you a broader picture of their productive capabilities. Of course, you could ask learners to do a recording and interview them as well; it all depends on the time available and the information you want. All three approaches are relatively cheat-proof, and this might be a key factor in your choice of test. Online testing can be developed further into more complicated scenario-based tests, but these three ways give you useful information with limited preparation time.