Measuring Performance in the NHL
Posted by Sridhar Mutyala at 09:42 PM

(I wrote this at the start of 2011 for a project we were running at the time. Corsi has evolved beyond what I discuss here, but I still like goal value as an alternative.)

If you’re interested in the NHL and statistics (like us), you may have heard of something called the Corsi number. It’s one of several new player-performance metrics tracked by analytics bloggers (like Behind the Net). Perhaps because it’s new, it’s not popular with traditionalists; witness Don Cherry’s rant against Corsi on Coach’s Corner last season. Despite my misgivings about siding with guys who go with their gut, I agree with Cherry: the Corsi number is flawed. But it takes a better effort than Cherry’s to appreciate just why.
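For readers unfamiliar with the metric: Corsi is typically computed as shot attempts for minus shot attempts against (goals, shots on goal, missed shots, and blocked shots) while a player is on the ice. A minimal sketch of that calculation, using a hypothetical event-record format (the field names here are illustrative, not any real data feed's schema):

```python
# Shot-attempt event types that count toward Corsi.
ATTEMPTS = {"goal", "shot", "missed_shot", "blocked_shot"}

def corsi(events, player):
    """Raw on-ice Corsi for `player`: attempts for minus attempts against.

    Each event is a dict like:
        {"type": "shot", "team": "A",
         "on_ice": {"A": ["p1", ...], "B": ["p2", ...]}}
    where "team" is the team that took the shot attempt and "on_ice"
    lists the skaters on the ice for each team when it happened.
    """
    attempts_for = attempts_against = 0
    for ev in events:
        if ev["type"] not in ATTEMPTS:
            continue
        for team, skaters in ev["on_ice"].items():
            if player in skaters:
                if ev["team"] == team:
                    attempts_for += 1
                else:
                    attempts_against += 1
    return attempts_for - attempts_against

events = [
    {"type": "shot", "team": "A", "on_ice": {"A": ["p1"], "B": ["p2"]}},
    {"type": "missed_shot", "team": "B", "on_ice": {"A": ["p1"], "B": ["p2"]}},
    {"type": "goal", "team": "A", "on_ice": {"A": ["p1"], "B": ["p2"]}},
    {"type": "faceoff", "team": "A", "on_ice": {"A": ["p1"], "B": ["p2"]}},
]
print(corsi(events, "p1"))  # 2 attempts for, 1 against → 1
```

Note that every attempt while a player is on the ice counts toward his number, whether or not he was involved in the play; that team-level framing is part of what the critique below is about.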


Students at Calgary Board of Education (CBE) high schools outperformed their Edmonton Public School Board (EPSB) counterparts on the 2014 Alberta diploma exams. This isn’t news. CBE high school students have maintained a significant and consistent edge across the core academic diploma exam subjects for a number of years. What’s interesting is that while CBE high school students are coming out ahead, CBE grade 9 students are consistently trailing EPSB students in Provincial Achievement Test (PAT) results. So we have a situation where students enter EPSB high schools with an academic edge over their CBE peers, but by the time they finish high school three (or more) years later, they’ve fallen behind. Why?


In Part 1 of this post, I looked at a few criticisms of the methodology used in the Fraser Institute’s high school rankings. Here, I’m going to explain what I think is the real problem with the rankings: they’re not necessary.

The Fraser rankings, released annually and regularly reported by the media, have largely shaped public perception of school performance. The authors have stated they want to make it possible for parents and educators to easily compare and monitor the academic performance of Alberta high schools. They could have done this by creating a better interface for Alberta diploma exam data. Comparisons made with that data would be easy to understand and evaluate. Instead, they came up with a complicated (and arbitrary) scoring formula to rate and rank schools that essentially shifted the conversation from how schools are doing academically to how schools are doing in the Fraser rankings.


In Part 1, I’ll take a closer look at a few criticisms of the Fraser Institute’s Alberta high school rankings, an annual attempt to compare the academic performance of secondary schools across the province. I’ll then explain in Part 2 what I think is the real problem with the rankings: they’re not necessary. Alberta Education achievement data can already be used to monitor academic performance at individual schools. Direct comparisons made with that data would be easy to understand and evaluate. The Fraser ratings, which combine diploma test results and other variables into a single score using an ad hoc formula, are needlessly complicated and misleading, both for parents and for administrators.
