Adrian Tennant considers 'diagnostic tests', sometimes referred to as 'level tests' or 'placement tests', comparing them to other forms of assessment as well as giving some practical suggestions.
What are diagnostic tests?
Diagnostic tests are used to determine a student's proficiency level in English before they begin a course. These kinds of tests are meant to help pinpoint what students know and don't know. Students with similar scores on a diagnostic test will often be placed in the same class according to language level. However, in a school where students are grouped according to their age, diagnostic tests can be used to help choose an appropriate coursebook and also to help the teacher know what to teach.
So, diagnostic tests differ from other forms of assessment in two key ways. Firstly, they are conducted at the start of a course (before any teaching has taken place on that particular course). Secondly, the content of the test is not based on what has been taught in the preceding classes (as often there haven’t been any). The aim of diagnostic tests is also fundamentally different as they are often designed to encompass a wide range of levels, making it possible to give the same test to all the students and, from the results, place them in the appropriate level.
What is included in a diagnostic test?
The majority of diagnostic tests are quite limited in their scope. This is generally because language schools and other institutions want quick, short tests that can be marked easily (i.e. objectively, with simple right-or-wrong answers).
Many diagnostic tests contain fifty or so multiple choice questions focusing mainly on vocabulary and grammar. As the tests are designed to cover a wide range of levels, the questions will range from things beginners might be expected to know all the way up to what advanced students might know.
Here is a typical example:
1) What ___ your name?
a) am b) are c) is
2) Where are you _____?
a) come b) from c) live
27) Have you _____ been to London?
a) ever b) never c) yet
28) Can you pass me the _____ control? I want to change the channel.
a) distant b) remote c) self
49) _____ spoken to John earlier, I’m sure everything will be OK.
a) Has b) Have c) Having
50) I’m sorry, but I think that’s really ____ -fetched.
a) far b) high c) long
Occasionally, diagnostic tests include other language skills such as reading, writing, listening and speaking. However, these areas of the test can often be flawed (see the article on Assessing Skills). When they are included, they are generally designed to be as easy to mark as possible. Including areas such as speaking and writing clearly adds a level of subjectivity to the test, but for a realistic overview of a student's level they are actually very important.
Are there any problems with diagnostic tests?
In many cases the answer is ‘Yes’. Firstly, there are often big differences between the methodology, content and question/task types employed in diagnostic tests and the methodology employed in the classroom. For example, many diagnostic tests rely predominantly on multiple choice questions and focus on grammatical accuracy, whereas the teaching itself focuses on speaking and communication – ‘getting the message across’ – without worrying so much about the accuracy of the language used. Quite clearly there is a potential problem here. A student might not score very well on the diagnostic test but may be very good at communicating, whereas another student might do well on the test but find speaking quite challenging. These two students might then be placed in inappropriate classes or levels because their test scores don’t truly reflect their proficiency in the type of language and skills being taught.
Secondly, many diagnostic tests are limited in their scope. In other words, they do not contain a listening or speaking component. If a student is placed in a class simply based on their ability to answer grammar, reading and writing tasks, it may well be that they are strong in these skills but are weak in listening or speaking. Not only does this lead to the same problems as mentioned above, but it also gives a false idea of the student's overall language ability.
Thirdly, if the exit test does not match the diagnostic test, this can also be problematic, as there is no consistent benchmark with which to gauge a student's progress. In many ways there needs to be alignment between the entry and exit tests and what actually takes place during the lessons.
Finally, a major issue with most diagnostic tests that use multiple choice questions is that it’s difficult to distinguish between students who actually know the correct answer and those who simply make a lucky guess. In some instances it is possible to eliminate one answer quite easily, making a correct guess even more likely. If multiple choice questions are used, it is important to pay attention to patterns in order to discern when students are merely guessing. For example, if a student gets only one out of three questions right that focus on a particular area of language, it is more than likely that the one they got right was a guess. On the other hand, someone who gets all three questions right probably knows that particular area of grammar. Reliability is quite clearly one of the main problems surrounding multiple choice questions and yet, because of the ease of marking, they are the most popular way of assessing students in diagnostic tests.
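The guessing problem above is simple binomial arithmetic. As a rough sketch (the function name and set-up here are illustrative, not part of any real marking scheme), with three options per question a blind guess succeeds one time in three:

```python
from math import comb

def p_correct_by_guessing(n_questions: int, n_correct: int, n_options: int = 3) -> float:
    """Probability of getting exactly n_correct out of n_questions
    right by pure guessing, with n_options choices per question."""
    p = 1 / n_options
    return comb(n_questions, n_correct) * p**n_correct * (1 - p)**(n_questions - n_correct)

# Three questions on one grammar point, three options each:
print(round(p_correct_by_guessing(3, 1), 2))  # exactly one right by chance: 0.44
print(round(p_correct_by_guessing(3, 3), 2))  # all three right by chance: 0.04
```

This matches the article's rule of thumb: one right answer out of three on the same language point is quite plausibly luck (roughly a 44% chance from guessing alone), while all three right by chance is under 4%.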
What makes a good diagnostic test?
Balance – if you want a diagnostic test that will actually do what it is supposed to do, then it needs to be balanced and to match the types of tasks that will be part of the lessons. In other words, if the focus of the classes is on speaking, the diagnostic test should focus on speaking.
Of course, it is important that any diagnostic test doesn’t take too long. It is unlikely that students will be happy if they are asked to take a test that lasts three or four hours. This means that compromises might need to be made in order to make the tests effective, valid and reliable. There is a lot of talk about validity and reliability, and these are certainly two important qualities of any good test. However, for diagnostic tests, and in fact for most tests, another important factor is that they are easy to manage (including marking).
How are diagnostic tests marked?
This brings us on to the topic of marking diagnostic tests. For the most part, such tests are objective (one reason why they often don’t include a spoken or written component). The most typical simply use a quick answer key (often one that can be placed over the test to show which answers are correct), or are marked by computer (sometimes even taken on a computer), with each answer scored as correct or incorrect.
These results are then ‘banded’. In other words, a score between x and y is assigned to a certain level, for example 1–10 = Beginner, 11–20 = Elementary … 51–60 = Upper-intermediate, etc. Of course, this is a very arbitrary way of marking a test, even if the test is fairly extensive in the range of skills it covers. For example, how big a difference is there between a student who scores 49 and one who scores 51? Less than between a student who scores 41 and one who scores 49. Yet the two who score in the 40s are likely to end up in the same class level, despite the gap in their scores being wider!
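The banding logic just described can be sketched in a few lines of code. This is a minimal illustration assuming six fixed-width bands of ten points each; the level names and cut-offs are assumptions for the example, not taken from any particular school's scheme:

```python
# Illustrative ladder of levels; the names and the ten-point band width
# are assumptions, not a standard scheme.
LEVELS = ["Beginner", "Elementary", "Pre-intermediate",
          "Intermediate", "Upper-intermediate", "Advanced"]

def band(score: int, band_width: int = 10) -> str:
    """Map a raw diagnostic score (1-60) to a level by fixed-width banding."""
    if not 1 <= score <= band_width * len(LEVELS):
        raise ValueError("score out of range")
    return LEVELS[(score - 1) // band_width]

# The arbitrariness described in the text: a 2-point gap (49 vs 51)
# crosses a band boundary, while an 8-point gap (41 vs 49) does not.
print(band(49), band(51))  # two different levels
print(band(41), band(49))  # the same level
```

Any scheme of hard cut-offs behaves this way: students one point apart can land in different classes while students nine points apart share one, which is exactly the arbitrariness the article points out.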
- Don’t compromise the validity or reliability of your diagnostic test simply for ease of marking or administration. Although these last two factors are important, a diagnostic test that doesn’t actually do the job it is supposed to do is worse than useless.
- A diagnostic test (in fact, all tests), should reflect the content and task types in the course. A diagnostic test that doesn’t do this can be neither valid nor reliable.
- Finally, the purpose of a diagnostic test is to inform both the teacher and the student of the student's level and, hopefully, of the areas of language that need to be focused on.
If you feel your diagnostic tests don’t meet the criteria for a good diagnostic test as outlined in this article, then maybe you should be looking for (or designing) a better option.