It's that magical time of year again: June 21st marks the day high school students shed classrooms for exam rooms and prepare to write the dreaded finals.
This year in BC, tenth grade students have been added to the provincial exam equation in addition to their senior peers. Not to be left out, grade eleven provincial exams in Social Studies begin this June.
It just doesn't make sense.
In the interest of full disclosure, I am a part-time public school educator and I teach courses that have provincial exam requirements. To some, this makes me a special interest group, but look, you're just going to have to take a leap of faith here, okay?
To put skeptics' minds at ease, I'll make another confession: I am not opposed to provincial exams, per se; I'm opposed to these ones and how they're run.
For the moment, I'll put aside the argument of whether tenth and eleventh grade provincial exams are a sound idea, except to note that an independent study at Queen's University in Ontario found that, after several years of similar exams in the Ontario school system, graduation rates declined by roughly 15 percent. For now, let's focus on the administration of BC's exams.
Where's the consistency?
One of the principal purposes of having students write the same exam province wide is to "ensure that Grade 10, 11 and 12 students meet consistent provincial standards of achievement in academic subjects," according to the B.C. Ministry of Education Examinations Handbook. That is, are students taking classes in Vancouver getting the same opportunities to learn skills to the same standard as students in Quesnel and vice versa? Provincial exams are the great equalizer: students know they've met or mastered the necessary objectives and teachers can see they're meeting the curricular needs of their students.
Except that with the new grade ten and eleven provincial exams, consistency controls are not provided. Historically, grade twelve provincial exams have been evaluated by an assigned group of teacher-evaluators, working together in special sessions during which their assessments are calibrated for consistency through training and the marking of exemplars.
Our new exams, at least at the English 10 and soon-to-be-implemented Social Studies 11 levels, are stored permanently at the local school and graded by the very classroom teachers who taught the course.
Which is not to suggest that local teachers are not capable of assessing to a provincially determined standard, nor that teachers at any one school would artificially inflate their students' scores, for example, in order for their school to rank more favourably in published statistics.
But if evaluation is to be done by the instructors who taught the course, having those teachers submit the grades from their own assessments done in class offers the exact same degree of consistency, with far less unnecessary stress on students. It would cost much less too.
Who else allows self-reporting?
This isn't simply an educational issue. Any first-year researcher would concur that a measurement instrument without proper data management yields seriously flawed results. What CEO would permit company departments to self-report their profits and losses without mechanisms in place to ensure the accuracy of those figures? If consistency and reliability are the goal of provincials, this system does not provide them.
Worse is that new regulations implemented by the Ministry this year have made the examination process even less pedagogically sound. Assessment consistency is a laudable goal but surely opportunities for student achievement are at least as important, if not more so.
The principal function of assessment, be it assignments, projects, quizzes, or exams, is to measure student understanding of objectives and to help students learn from those achievement indicators. Yet new Ministry rules prohibit teachers from using past exams as learning tools for current and future students. What better way for students to understand expectations than to see, discuss and learn from the assessment indicators used in previous sessions?
Blurry brass ring
Students are over a barrel here, needing their exam results to graduate and apply to post-secondary institutions. But that doesn't justify such a heavy-handed, secrecy-filled approach to evaluation.
When it comes to finding a balance between programs that best support student achievement and those that satisfy the desire for quantifiable data, it is imperative to err on the side of the former.
More than anything we want our students to receive the best learning opportunities possible. The Ministry needs to keep that brass ring in clearer focus when it administers province-wide evaluations.
David Russell is a freelance writer, former talk-show host and part-time educator living in Coquitlam. He has written for Maclean's, The Vancouver Sun, The Province and others.