Alberta Education recently announced that, as of September 1, 2015, school marks will count for 70 per cent and diploma exam marks for 30 per cent of a student’s final mark in Alberta diploma courses. This is a change from the 50/50 weighting that has been used since the diploma exams were introduced in the 1980s.

The immediate impact of this change is that final marks will go up across the province, because school marks tend to be 7 to 10 percentage points higher than diploma marks. Higher final marks mean more students will graduate, so the graduation rate will rise. More students will also reach the standard of excellence (80%), which means demand at post-secondary institutions will increase.

A second impact is that a school with a smaller discrepancy between its school and diploma marks (generally, a school with higher diploma marks) will see less of a bump in its final marks than a school with a larger discrepancy (generally, a school with lower diploma marks). A third impact is that, because the variance of school marks is lower than the variance of diploma marks, the overall variance of final marks will be reduced, and schools will come closer together in their overall academic results as measured by final marks. Finally, we anticipate that schools and departments with higher academic expectations will review their assessment practices to determine whether their higher standards will end up penalizing their students under the new scoring formula.
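As a quick illustration, the effect of the new weighting on a single final mark can be sketched as follows. The student below is hypothetical; the 8-point gap between school and diploma marks simply falls in the 7-to-10-point range cited above.

```python
def final_mark(school_mark, diploma_mark, school_weight):
    """Blend a school mark and a diploma exam mark into a final mark.

    school_weight is the fraction of the final mark that comes from the
    school-awarded mark; the diploma exam supplies the remainder.
    """
    return school_weight * school_mark + (1 - school_weight) * diploma_mark

# Hypothetical student: 75% school mark, 67% diploma mark (an 8-point gap).
old = final_mark(75, 67, 0.5)  # 50/50 weighting used since the 1980s
new = final_mark(75, 67, 0.7)  # 70/30 weighting as of September 1, 2015
print(old, new)
```

Under the new weighting this student’s final mark rises by 1.6 percentage points, and the wider a school’s gap between school and diploma marks, the bigger the bump.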
We put together the following visualization using Alberta Education’s 2015-2016 diploma exam data to help high school administrators and educators (as well as parents and students) determine where their schools’ internal assessment practices fall with respect to their peers.
Each circle represents an Alberta high school. The circle’s colour represents its region (Calgary, Edmonton, Other) and its size reflects the number of diploma exams written by students enrolled in the school.
You can use the filtering controls in the top right to change the visualization. For example, you can compare school results in individual diploma courses or across different years, compare the results of schools within a school authority by filtering on “Authority Name”, or compare schools of similar size using the “Exams Written” slider. Once you’ve set the filters to include only the schools in your comparison group, the visualization will update.
The middle line is the best-fit line (computed by linear regression) for the schools selected through the filters. The two lines on either side of the best-fit line are the 95% confidence intervals. Schools that fall outside the confidence intervals are grading their students in a manner that isn’t consistent with that of their peers.
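For readers curious about the mechanics, here is a minimal sketch of how a best-fit line and an approximate 95% confidence band can be computed. The six data points are invented for illustration; the actual visualization uses Alberta Education’s published school-level results, and its bands may be computed somewhat differently (for example, with a t distribution rather than the normal approximation used here).

```python
import numpy as np

def fit_with_band(x, y, z=1.96):
    """Ordinary least-squares fit of y on x, plus an approximate 95%
    confidence band for the fitted line (normal approximation)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    slope, intercept = np.polyfit(x, y, 1)      # least-squares line
    fitted = slope * x + intercept
    s = np.sqrt(np.sum((y - fitted) ** 2) / (n - 2))   # residual std. error
    half_width = z * s * np.sqrt(1 / n + (x - x.mean()) ** 2
                                 / np.sum((x - x.mean()) ** 2))
    return fitted, fitted - half_width, fitted + half_width

# Invented data: diploma exam averages (x) vs school-awarded averages (y)
# for six hypothetical schools.
diploma = [60, 62, 65, 68, 70, 73]
school = [70, 71, 74, 75, 78, 80]
fitted, lower, upper = fit_with_band(diploma, school)
```

A school plotted above `upper` is awarding noticeably higher school marks than peers with similar diploma results; a school below `lower` is awarding noticeably lower marks.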
Diploma exams are designed to have an average of 65%: the halfway point between the acceptable standard (50%) and the standard of excellence (80%). They’re mostly multiple choice, because the theory of this testing format is well developed, and multiple choice is consistently used to produce reliable, high-stakes testing scales. The drawback is that the multiple choice format can’t detect or reward partial understanding; answers are either right or wrong. Diploma exams are also usually written in one sitting that lasts a couple of hours. School marks, by contrast, are awarded through quizzes, assignments, and exams administered over the full course of instruction. They incorporate more information, and they can draw on many forms of assessment beyond multiple choice. Teachers, in short, give students more scope and opportunity to demonstrate understanding and, consequently, to improve their marks.
Diploma marks and school marks are both measures of student achievement. Because they’re on a percentage scale, we tend to think of them as having an absolute significance: the percentage reflects how much of the course’s curriculum a student got “right.” We look at the thresholds for passing (50%) and achieving the standard of excellence (80%) as reliable indicators and checks on our education system. Higher school grades will push up the number and percentage of students able to meet both thresholds. With higher (and increasing) grades, schools are making different claims than diploma exams about how well Alberta students are prepared for the next stage of their lives. Rather than getting into the debate over which set of marks is more accurate, we should focus on how the two sets of marks relate to each other: where do students and schools fall relative to their peers? If there’s good agreement between the two measures, there’s good reason to feel confident that they’re measuring the same thing. The fact that diploma and school marks have different distributions (diploma averages are around 65% and school averages are around 75%) doesn’t mean we can infer that school marks are inflated. We could only infer that if we knew the diploma exams were perfectly accurate measures of student achievement. They’re not. But they are well designed, with generally high-quality items that have been carefully edited, field-tested, and calibrated prior to their use. And they’re administered to all students taking diploma courses. As such, they’re good, informative tests, and they provide a chance for schools to compare their assessment practices against their peers’. In this view, a school’s marks are inflated only if they are well outside the range set by peers that achieve similar results on the diploma exams. They are definitely inflated if this pattern holds up over a number of years.
You’ll probably want to know whether the grades your child is getting at school are going to be consistent with his or her future diploma exam results. Grade 12 marks determine admission into post-secondary institutions. Marks generally drop on diploma exams, just as marks generally drop for students in their first year of university. If your child intends to pursue post-secondary education, and there’s a relatively wide gap between school and diploma exam marks at your school, you might want to look into how well the school is preparing your child for higher education. By “look into,” we mean talk to the teachers and administrators at your child’s school.
You can use this visualization to benchmark your assessment practices against those used in similar schools. Every classroom is unique, so you can expect some variance in outcomes. Small differences won’t matter, but if there’s a big difference between you and your peers at similar schools, it may be worthwhile to understand (and learn from) the reasons why. Schools that consistently grade outside the range set by their peers (whether they’re awarding higher or lower marks than their students deserve relative to their cohort) aren’t doing their students any favours. In the long term, they’re only making it tougher for them to succeed.