Introduction to Language Assessment Literacy Part 6:
Positive washback from constructive alignment
In the final part of this series on Language Assessment Literacy, we will look at the ways in which careful assessment design can positively affect the quality of teaching and learning that happens as students prepare for their tests and exams. This is the notion of positive washback, an aspect of testing which is built into many contemporary language assessments.
Positive washback can be enabled through assessment design by considering the communicative competencies at work in the questions on a language exam. This allows the test construct, the items that test-takers meet on the test, and the teaching activities needed to prepare students for the exam effectively to be brought into alignment with one another. This cohesion between test construct, items, and required skills and knowledge is known as constructive alignment. This article presents three constructively aligned ways of encouraging more authentic language performance in a test environment, along with examples of how this freer type of testing can enable washback into teaching before the assessment itself:
1) Facilitate specific student responses by providing broader contexts and genres rather than specific sentences and paragraphs.
Communicative situations often involve quite formulaic exchanges containing specific language forms, which can be tested freely based on student choice rather than through fixed sentences or gapped dialogues. For example, present the test-taker with a picture of a shop full of food, toys, pets, etc., and ask them to frame a dialogue between the shopkeeper and a customer based on a specific situation:
Peter lives in a small flat and goes out quite often. He wants a pet, and can see two animals he is interested in. Help him find the right pet by talking to the shopkeeper.
This item requires the test-taker not just to use the functional language of requesting, enquiring, purchasing, etc., but also to evaluate which animal would be most appropriate for Peter’s situation, and to generate focused questions accordingly. Ordering the questions to get to the required information efficiently demands strategic processing, another important competence in holistic language use. This is a much freer way of assessing competence in a shopping environment than simply having test-takers complete, ‘correctly’, sentences which could appear in any situation.
Knowing that this kind of question will test for contextualised, evaluative and interactional skills, it will naturally benefit students to learn through situations, to be equipped with a range of question types for different scenarios, and to engage with the needs of the participants in a situation rather than simply hunting for the ‘correct answer’ sought by the question. It would be difficult for a teacher to ‘prepare’ their students for a question like this without practising a range of sociocognitive and strategic competencies along the way.
2) Prompt for functions rather than language, and give the test-taker free rein over how they fulfil the communicative function required.
A test item which shows a situational picture and has the prompt ‘advice’ would require the test-taker to use a phrase which matches the situation, along with relevant vocabulary, and can be gauged for politeness, complexity, directness, strength etc. according to the criteria that you set on the item. Moving from function to structure in this way maximises student output based on a minimal prompt, thus enabling broader assessment of a wider range of communicative skills applied by the test-taker.
Again, this kind of task requires the test-taker to evaluate the relationship between the participants in the dialogue (a policeman giving advice to a child would differ greatly from a teenager giving advice to her best friend, or a coach giving advice to a football team, for example). They would have to consider the sociolinguistic aspects of the interaction and modify their response accordingly. The washback effect of this type of question is that, to be adequately prepared for the test, learners would need to work with register, politeness strategies, direct and indirect language, and softening language. Again, these competencies are of real-world value, and can help students to fine-tune their language and make quite sophisticated language choices rather than simply choosing from a set of closed alternatives.
3) Ask students to summarise and paraphrase texts for specific purposes.
Providing students with a set of texts of different genres, but covering the same subject matter, gives a range of linguistic stimuli for analysis, evaluation and re-processing according to the test item. As with the function-to-structure approach above, prompting with a purpose and leaving the student to select appropriate information from the different texts and create a suitable summary requires higher-order skills and competencies which are relevant to the world of work and study, and which involve language use operating at a level above comprehension or the simple reformulation of ideas.
Example: Test items include a blog article, a text-based advert, a cartoon and a letter published in a newspaper, all on the topic of social media use. The texts contain both biased and balanced information on the issue of social media addiction, and the task is for the student to summarise their reading for a report on people’s views on social media. Test-takers can choose their stance, and the aspect of social media use they present, in order to convince the reader that social media addiction is / is not an issue for teenagers.
The prompt requires students to understand the features of different genres, to select information and interpret visual prompts (the cartoon), to integrate summary-based and persuasive writing, and to formulate and communicate their point of view effectively for the reader. The competencies involved in this task are workable in the classroom, though again they cannot be ‘rehearsed’ for the test situation, meaning that test results will be a more reliable reflection of test-takers’ performance and language proficiency.
As the three examples above show, the notion of constructive alignment is essential for performance-based testing of communicative competence beyond the purely linguistic-knowledge level. It is up to test designers to harness this alignment by considering the roles of language, skills and communicative competence in test performance, and to design tasks that require student-centred practice of these skills in the classroom rather than rote-learnt preparation. In this way, positive washback can be a valuable tool for teachers, test designers and students alike, leading to greater skills development which carries the impact of language learning beyond the test event and into learners’ futures.
Tom Garside is an international education developer and founder and Director of Teacher Training at Language Point. He has published TESOL: A Gateway Guide, a methodology e-guide for teachers of ESOL, a Pronunciation activity book centred on pronunciation card games, and will be speaking on ways of ensuring sustained development for English Language Teachers at the Future of ELT conference at Regents University, London on June 15th.