Tuesday, May 27, 2008

Ranking high schools

I received this email from a person who graduated from Mercer Island High School in the 1970s. (He can identify himself more fully in the comments if he likes -- I don't mention people's names or identities unless I have their explicit permission.)

Was having lunch and reading the paper at the Roanoke earlier this week, and the topic of conversation at the bar was the latest Newsweek high school poll which did not mention MIHS. Went to the Newsweek site, couldn’t find MIHS anywhere. Then Googled, and found this (http://168.99.1.74/MISD/schools/hs/sitecouncil/default.html):

Newsweek Best Schools Report
This poll (The nation’s most challenging schools) has been ongoing for a number of years. The Education Reporter for the Washington Post makes the claim that you can define the nation’s top institutions by identifying the number of students who take AP tests divided by the total number of senior students.
Mercer Island is not on this list because we elected not to participate. Schools must submit this information themselves. Our superintendent asked the school board whether they wanted to use this as one of our indicators. The board did not want to do so. Therefore the decision was made not to participate. John has looked at our student AP testing numbers and discovered that, indeed, if we had participated, we would have been on the list (but not in the top 100). In addition, our students’ AP scores are among the highest. Our WASL scores are also the highest in the state. We have also been nominated as a National Blue Ribbon School, a much more significant recognition.

So…am I reading this right? The board thought that since MIHS wouldn’t rank in the top 100, they wouldn’t participate? I don’t know if that rating would hold true today, but it’s interesting to note that International School, Newport, Interlake, and Bellevue are all in the top 100 on that poll this year. Might be worth a public discussion, don’t you think? We – the residents – are paying the taxes that fund this school program, and we deserve to know how it compares nationally – and locally. At least, that’s how I see it. <smile>
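For what it's worth, the ranking the excerpt describes boils down to a single ratio (the Washington Post reporter's "Challenge Index": AP tests taken divided by graduating seniors). Here's a minimal sketch of that arithmetic; the school names and numbers are made up for illustration:

```python
# Hypothetical illustration of the Newsweek/Washington Post "Challenge
# Index" described above: AP tests taken divided by graduating seniors.
# All names and figures below are invented.

def challenge_index(ap_tests_taken, graduating_seniors):
    """The single ratio the ranking is built on."""
    return ap_tests_taken / graduating_seniors

schools = {
    "School A": (450, 300),   # 450 AP tests taken, 300 seniors
    "School B": (600, 350),
    "School C": (200, 280),
}

# Rank highest ratio first, the way the published list does.
ranked = sorted(schools, key=lambda s: challenge_index(*schools[s]), reverse=True)
for name in ranked:
    tests, seniors = schools[name]
    print(f"{name}: {challenge_index(tests, seniors):.2f}")
```

Which is exactly why a school that opts out simply vanishes from the list -- there's nothing to the ranking beyond two self-reported numbers.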

I agree that a single statistic is misleading, and maybe even counter-productive if it causes schools to change policies to conform to the rating.  That’s the argument against WASL, NCLB, etc. etc. 

I also don’t blame schools (even the ones who rank well) for resisting such rankings, just like I resist when my boss at work ranks my team against others.  I tell him he’s comparing apples to oranges and it’s not only unfair, it’s counter-productive.  But on the other hand, all world-class companies do that, so this can't be all bad.  And although yes it can be hard and unfair sometimes, in the aggregate and over time, I think it works pretty well.

Competition is a good thing and I don’t know how you can improve without a way to keep score.  I don’t believe Mercer Island is anywhere near as good a school district as it could be – as it has to be if my kids are going to compete with China and India someday.

When I discussed this recently with an influential Islander (again, sorry not to mention him by name -- he's welcome to speak up in the comments), he said this:

I think the best way to compare schools is by measuring the way they prepare students for the next step.


In districts like ours, where 95%+ go on to higher education, that means measuring how well the kids do in college.   In fact, the Newsweek list says they chose AP/IB tests as the measurement because it’s a way to gauge readiness for college.  Much better would be seeing how well kids from the different schools actually do once they get there.


Newsweek’s way is measuring inputs when we really care about the output.


So, go to colleges and ask them how kids from various high schools do relative to each other.  Locally, the Seattle Times has done this a few times (by measuring the difference between students’ high school GPAs and their freshman-year GPAs at various Washington State colleges and universities).  While that output measure isn’t ideal (it ignores strength of schedule and the sophomore-through-senior years), it’s a start. 


PS Some argue for measuring how many kids get into their first college choice.  The problem with this... is that many kids don’t have a realistic view when it comes to their first choice.


PPS  I completely agree with your comment “I don’t believe Mercer Island is anywhere near as good a school district as it could be” – we could have a very long discussion about that on many levels.  One quick note is that’s the primary reason the whole “big idea” effort is underway led by some pretty impressive folks and energy.  Here’s hoping that takes us closer to the goal.
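The Seattle Times-style "output" measure he mentions is also simple arithmetic: for each high school, average the change between a student's high-school GPA and their freshman-year college GPA. A rough sketch, with entirely invented records:

```python
# A sketch of the GPA-shift "output" measure described above: per high
# school, the mean of (college freshman GPA - high school GPA).
# All records below are hypothetical.
from statistics import mean

# (high_school, hs_gpa, college_freshman_gpa)
records = [
    ("School A", 3.8, 3.4),
    ("School A", 3.5, 3.3),
    ("School B", 3.9, 3.2),
    ("School B", 3.6, 3.1),
]

def gpa_shift_by_school(rows):
    """Average freshman-year GPA change, grouped by high school."""
    by_school = {}
    for school, hs_gpa, college_gpa in rows:
        by_school.setdefault(school, []).append(college_gpa - hs_gpa)
    return {school: mean(diffs) for school, diffs in by_school.items()}

print(gpa_shift_by_school(records))
```

A school whose graduates' GPAs drop less in freshman year would score better on this measure -- a crude proxy for "how well the school prepared them," with all the caveats noted above.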

What do you think?  Please leave comments (anonymously if you prefer) or email me (if you're too shy) and I'll summarize in a later post.

3 comments:

Anonymous said...

Here's an interesting letter asking not to be included in the Newsweek list -- signed by many superintendents, including those from high schools considered among the nation's elite: http://www.sfschools.org/2008/04/school-supes-boycott-newsweek-hs.html.

Anonymous said...

Hi Richard - this is Keith Pleas (who sent you that original pointer).

>> Competition is a good thing and I don’t know how you can improve without a way to keep score.

Absolutely. More to my original point though, I don't know how you can make a meaningful investment decision without some measure of the costs AND benefits. In my experience, if you can't quantify the benefits you leave yourself open to emotional (and often counter-productive) decisions.

And, yes, any statistical measure has flaws. But if we allow those being judged to pick which measurement THEY want to use, we're just introducing bias into the process - everyone will naturally play the game that justifies their contribution. IMHO, if a school has pretensions to being "elite", then it should participate in any widely used measurement system. Then, if they feel the results are not "fair" to them, they can explain their rationale.

>> Much better would be seeing how well kids from the different schools actually do in college.

Not so fast. If there's an inherent advantage that a district has - proximity to universities, percentage of residents with college degrees, financial resources, whatever - then the relevant measurement would be how much better their kids did solely because of the education system. Meaning, you need to measure enough variables across enough schools that you can factor out income level, distance, et cetera and still have a statistically valid measurement.

My personal MI education highlights an additional factor that may be difficult to measure - opportunity. In particular, I was able to take advantage of (at the time) different start times between the schools to enable me to take an early class at MIHS and then walk the short distance to what was at that time North Mercer Junior High. And in high school, I was able to take non-matriculated classes at the UW in the afternoon that were counted towards high school attendance. I was able to do these things because the school system supported me.

So - for me - the value of the MI school system wouldn't have been visible in any national averages.

My brother (two years behind me) was a National Merit Finalist and Presidential Scholar candidate. Out of curiosity, I just Googled "presidential scholar" and "mercer island", and found the 2007 list contained 3 students from MIHS (and two from MI who went to Lakeside) compared to 4 from all of Bellevue (one each at Issaquah, Bellevue, Lakeside, and Newport). However, Google didn't return any mention of this in Mercer Island related sites. A search for "presidential scholar" on the Mercer Island School District site returned "Service is too busy" at 5:25pm on Saturday evening. Sheesh.

Richard Sprague said...

Keith, those are excellent points, good reasons why it would be silly to choose a school based on one, simplistic number.

But why not trust parents and their kids to make up their own minds? Give them all the statistics, including the Newsweek rankings, and let them decide for themselves. I don't buy a car based solely on the Consumer Reports recommendations -- why would I buy a school system based purely on Newsweek? Wouldn't it always be better to have more ratings than fewer?