Students at Calgary Board of Education (CBE) high schools outperformed their Edmonton Public School Board (EPSB) counterparts on the 2014 Alberta diploma exams. This isn’t news. CBE high school students have maintained a significant and consistent edge across the core academic diploma exam subjects for a number of years. What’s interesting is that while CBE high school students are coming out ahead, CBE grade 9 students are consistently trailing EPSB students in Provincial Achievement Test (PAT) results. So we have a situation where students enter EPSB high schools with an academic edge over their CBE peers, but by the time they finish high school three (or more) years later, they’ve fallen behind. Why?
As always, let’s start with the data: the PAT and diploma exam results released by Alberta Education. The chart below shows the trends for the two boards (along with the provincial averages) on the two sets of standardized exams. I’ve excluded the PAT results for 2013 as the CBE schools had very low participation in that year due to the June flood.
(Click on the image below to see a larger view.)
You can see in the top graph that, excluding ELA 9, grade 9 EPSB students posted PAT results that were typically 2 to 3 percentage points higher than CBE students (who were performing at about the provincial average). The bottom graph shows the performance gap reversed, with CBE students maintaining a significant and consistent edge (sometimes more than 4 percentage points) in the core diploma exam subjects and EPSB students now hovering around the provincial average.
To make things more concrete, let’s look at the cohort of grade 9 students that wrote the Math 9 PAT in 2011. EPSB students in this group achieved a 67.3 average and CBE students achieved a 64.7 average, a 2.6 percentage-point difference in favour of the EPSB. When this cohort sat down to write the Math 30-1 diploma exam three years later in 2014, the CBE students posted a 67.9 average and the EPSB students managed a 64.2 average, a 3.7 percentage-point difference now in favour of the CBE. Over three years of high school math in the two boards, the CBE average rose while the EPSB average fell.
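The total swing in this cohort can be tallied directly. A minimal sketch (the averages are the ones quoted above; Python is used only for the arithmetic):

```python
# Math 9 PAT (2011) and Math 30-1 diploma (2014) averages from the post.
# Gaps are in percentage points, expressed as EPSB minus CBE.
pat_2011 = {"EPSB": 67.3, "CBE": 64.7}
diploma_2014 = {"EPSB": 64.2, "CBE": 67.9}

pat_gap = pat_2011["EPSB"] - pat_2011["CBE"]          # +2.6: EPSB ahead
diploma_gap = diploma_2014["EPSB"] - diploma_2014["CBE"]  # -3.7: CBE ahead

# Total swing toward the CBE over the three years of high school.
swing = pat_gap - diploma_gap
print(round(pat_gap, 1), round(diploma_gap, 1), round(swing, 1))  # 2.6 -3.7 6.3
```

A 6.3 percentage-point swing in a single cohort is the size of the effect this post is trying to explain.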
Now let’s look at the performance of individual schools in the two boards on the 2014 diploma exams. The table below shows the school’s overall diploma average (the average score for all diploma exams written in the six core academic subjects: Biology 30, Chemistry 30, English 30-1, Math 30-1, Physics 30, and Social Studies 30-1), its ranking among all high schools in the province, the change in rank from the previous year, the number of diploma exams written by students at the school, and the percentage change in the overall diploma average from 2013.
(Click on the image below to see a larger view. You can also check out our full list at Alberta High School Rankings or see how individual schools perform with our Alberta High School Dashboard.)
Old Scona is, as always, at the top of the list. Each year, hundreds of students compete for the roughly 120 grade 10 spots available at the academically focused Edmonton high school. The school ranks these applicants on their academic performance to date and other factors and offers admission to the top candidates. The school’s diploma results (an 85.1 average in 2014) are always impressive, but the Old Scona deck is clearly stacked. (In fact, assuming that the typical entrance average for Old Scona students is over 90, it’s puzzling that the school doesn’t do even better on the diploma exams than it does.)
Other public schools in Edmonton and Calgary play the hand they’re dealt; they’re called on to educate the broad population of students who arrive at their doors with diverse backgrounds, interests, motivations, and abilities. Here’s where the CBE schools shine, taking the remaining 9 spots in the top 10. Edmonton’s next best performing high school after Old Scona is Harry Ainlay which places just out of the top 10. Even if Harry Ainlay absorbed Old Scona and all its students, it would still place behind the top 3 CBE schools (Sir Winston Churchill, Henry Wise Wood, and Western Canada) on overall diploma exam average.
Does the diploma exam performance gap between the two boards really matter? Differences in marks matter wherever marks matter. Marks matter if you’re a student who’s planning to go to university. Society pounds home that post-secondary education is the right choice: higher education equals higher earnings, higher growth in earnings, greater assets and savings, even higher life satisfaction. There may be many roads to a successful, productive, and happy life, but the widest and surest roads pass through post-secondary institutions. But you need to get in, and to get in, you need good, bordering on great, marks. These institutions use marks as a proxy for academic potential, and because demand for spots exceeds supply, the entrance process is highly competitive.
In Alberta, final marks in grade 12 are an average of school-awarded marks and diploma exam marks. The diploma exams are the last tests that students take in the Alberta K-12 system. They’re high-stakes, standardized tests designed to be statistically fair, reliable, and valid assessments of course curriculum. Because all students taking 30-level courses write the same (or statistically equivalent) final exams, the diploma exams make it possible to compare students, schools, and school authorities over time.
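The blending is a simple weighted average. A quick illustration (the 50/50 split is the weighting in effect for the 2014 exams discussed here; the sample marks are made up):

```python
def final_mark(school_mark, diploma_mark, diploma_weight=0.5):
    """Blend a school-awarded mark with a diploma exam mark.

    diploma_weight=0.5 reflects the 50/50 split used for the
    2014 exams discussed in this post.
    """
    return (1 - diploma_weight) * school_mark + diploma_weight * diploma_mark

# A hypothetical student with an 85 school mark and a 77 diploma mark:
print(final_mark(85, 77))  # 81.0
```

Because half the final mark comes from a standardized exam, board-level differences on that exam flow straight into the transcripts universities see.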
You can argue the limitations of standardized tests in measuring student learning or argue against education’s fixation on grades and test scores. You can also put forward a good argument that standardized tests shouldn’t be used to measure educational quality.
But learning can’t simply be asserted; it has to be demonstrated. And, at all levels of our educational system, students demonstrate learning by writing tests and doing well on those tests. When two comparable populations of students (public high school students in Edmonton and Calgary) write the same statistically designed tests and one group consistently achieves significantly higher results, I think we’re justified in looking for reasons for these differences, such as their level of preparation or their learning experiences leading up to the assessments.
In our public school system, we try for the best possible educational outcomes for all students. Students should have similar academic opportunities and support regardless of where they choose to go to school. This is the ideal. But something seems to be off in Edmonton public high schools, and it’s leaving EPSB students at a disadvantage in relation to their CBE peers.
I don’t have an answer to the question I posed in the title of this post, but I can offer some hypotheses.
1. The differences in diploma results between the two boards are small and due to chance. This hypothesis can be rejected: a comparison-of-means test shows that the differences are statistically significant. What’s more, Calgary public students outperform Edmonton public students year in, year out, again suggesting systematic rather than chance factors at play.
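To see why a 2-3 point gap clears the significance bar, consider the sample sizes involved. The sketch below uses a standard two-sample z statistic with hypothetical standard deviations and exam counts (the means are the Math 30-1 averages from the cohort example; the SDs and n’s are illustrative assumptions, not Alberta Education figures):

```python
import math

def two_sample_z(mean1, sd1, n1, mean2, sd2, n2):
    """Welch-style z statistic for the difference of two independent means."""
    se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)
    return (mean1 - mean2) / se

# Illustrative inputs: 2014 Math 30-1 board averages, with assumed
# standard deviations (15) and exam counts in the thousands.
z = two_sample_z(67.9, 15.0, 8000, 64.2, 15.0, 7000)
print(round(z, 1))  # ~15.1, far beyond the ~1.96 threshold at the 5% level
```

With thousands of exams written per board, even small mean differences produce very large test statistics, so chance is not a plausible explanation.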
2. CBE high schools have lower participation rates in the core academic subjects. This could mean that CBE schools are more effective at steering students who may perform poorly in the core diploma subjects into alternatives (e.g., from Math 30-1 to Math 30-2). If these students are being consistently factored out, CBE averages would tend to rise.
The participation rate data published by Alberta Education doesn’t support this hypothesis. There are differences in participation rates between the two boards across subjects (e.g., in 2014, CBE had a higher participation rate for Chemistry 30, English 30-1, and Math 30-1 while EPSB had a higher participation rate for Biology 30, Physics 30, and Social Studies 30-1), but the CBE rates aren’t consistently lower.
3. Calgary has many private and charter schools. Edmonton doesn’t. According to Jason Clemens of the Fraser Institute, “when parents are empowered to choose their schools for their kids and we force schools to compete with one another, we get better school performance.” The diploma exam results provide support to this conservative mantra, at least when performance is viewed narrowly in terms of standardized tests. Calgary public high schools may pay closer attention to diploma exam results because their private and charter school neighbours continue to draw parents’ attention to academic performance. CBE schools that are “forced to compete” have managed to achieve better academic performance than EPSB schools that face no similar competition in their region.
4. Calgary is a more affluent city and everyone knows that results on standardized tests are highly correlated with household income. I looked at this relationship in my critique of the Fraser Institute’s high school rankings. Using Fraser’s data, I found a positive correlation between diploma exam results and household income for Calgary and Edmonton high schools. I also found that nearly 70% of the variation in diploma exam marks was left unexplained by household income. Maybe affluence is enough to tilt the diploma results in Calgary’s favour, but then you have to wonder why the affluence effect doesn’t play out similarly in the PAT results.
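The “70% unexplained” claim has a simple translation into correlation terms. A minimal sketch (the ~30% figure is the approximate share of variance explained from the earlier Fraser Institute critique):

```python
import math

# If household income explains roughly 30% of the variation in
# diploma averages, then R-squared is about 0.30 ...
r_squared = 0.30

# ... which implies a correlation coefficient of only about 0.55,
r = math.sqrt(r_squared)

# ... and leaves about 70% of the variation to other factors.
unexplained = 1 - r_squared

print(round(r, 2), round(unexplained, 2))  # 0.55 0.7
```

A correlation of 0.55 is real but modest: income tilts the playing field without coming close to determining the results.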
5. Calgary is a white collar town. Edmonton is a blue collar town. Folks in Calgary are more likely to have a university degree than folks in Edmonton. This fact comes straight from Statistics Canada’s 2011 National Household Survey. The proportion of Calgary adults aged 25 to 64 with a university degree is around 35%. In Edmonton, the proportion is around 25%. Students whose parents hold a degree are much more likely to complete university. We can assume the academic expectations and academic support in these households are also likely higher. So the difference in marks between the two boards may reflect a difference in household environment rather than school environment. As to why this household advantage isn’t reflected in the PAT results, I’m not certain. Perhaps university-educated parents increase their academic expectations and focus when they sense it really matters — sort of like cramming for finals.
6. EPSB high schools appear to have a lower 3-year completion rate than CBE high schools: a greater proportion of EPSB diploma exams are written by students who are upgrading or completing high school outside the traditional 3-year route. These students tend to score below the provincial average on the diploma exams. What’s more, they’re upgrading to improve an earlier diploma mark, which we can safely assume was lower than what they needed to move on to the next stage in their lives. This population of students therefore affects a district’s diploma exam results twice, bringing down the overall average each time.
EPSB serves these students through Metro Continuing Education and Centre High (as well as other smaller alternatives) while CBE serves these students through Chinook Learning Services. The number of diploma exams written at Centre High (described on its website as a dynamic high school for fourth and fifth year students) has risen sharply since 2008 while the number of exams written at Chinook Learning Services has risen more modestly. CBE high schools and their students seem more focused on completing high school (successfully) in 3 years and this focus has resulted in better diploma exam results for the district.
Here are the links to the EPSB high school completion rates and the CBE high school completion rates. Note that EPSB doesn’t report its 3-year completion rate but CBE does. For comparison’s sake, in 2012-13, the EPSB 5-year completion rate was 77.3% and the CBE 5-year completion rate was 80.7% (with a 3-year completion rate of 74.0%).
7. The Old Scona Effect. EPSB high schools do face competition, not from private or charter schools (as with the CBE) but from one of their own.
Edmonton junior high students compete for the limited (about 120) grade 10 spots available at this small academic high school that consistently finishes at the top of any rankings derived from diploma exam results. The school receives far more applicants than it can accept. Applicants who make the cut based on the school’s scoring criteria — a weighted mix of academic achievement in junior high, a written test of cognitive ability, and letters of recommendation from junior high principals — enter a crucible of high academic expectations and competition for the next 3 years. Applicants who don’t make the cut enrol in other EPSB high schools — where it appears expectations are lowered.
Now these other EPSB high schools don’t measure themselves academically against Old Scona; the school is an outlier. They measure themselves against their peers, the schools that serve the general student population. But Old Scona does affect the EPSB ecosystem in an important way. When it enrols the city’s top academic achievers, it removes them (and their parents) as an influence on the culture of other Edmonton high schools, and it removes the positive effect of this high-achieving sub-population from their diploma exam results. When the other Edmonton high schools compare themselves to each other academically and set benchmarks, they end up treating as normal a lower range of academic achievement than they would if Old Scona didn’t exist or wasn’t allowed to pick the cream of the crop.