
11 October 2024

Why don’t parents value school effectiveness? (because they think LeBron’s coach is a genius)


A new NBER study exploits the NYC centralised school admissions database to understand how parents choose which schools to apply to, and finds (shock!) that parents choose schools based on easily observable things (test scores) rather than things that are very difficult to observe (actual school quality, as estimated (noisily!) by value-added).

Value-added models are great — they’re a much fairer way of judging schools than just looking at test scores. Whilst test scores conflate what the student’s home background does with what the school does, value-added models (attempt to) control for a student’s starting level (and therefore all the home inputs up to that point), and look only at the progress that students make whilst at a school.
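To make the idea concrete, here is a minimal sketch of the simplest possible value-added regression in Python. The file name and column names (students.csv, school_id, baseline_score, endline_score) are made up for illustration, and real value-added models add many more controls and corrections, so treat this as the core idea rather than the actual methodology used in any of the studies discussed here.

```python
# Minimal value-added sketch (hypothetical data and column names).
# Idea: predict end-of-school scores from baseline scores, and treat each
# school's fixed effect as its estimated value-added, i.e. the progress its
# students make beyond what their starting point would predict.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("students.csv")  # assumed columns: school_id, baseline_score, endline_score

# School fixed effects, controlling for each student's starting level.
model = smf.ols("endline_score ~ baseline_score + C(school_id)", data=df).fit()

school_effects = model.params.filter(like="school_id")
print(school_effects.sort_values(ascending=False).head(10))  # estimated top schools
```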

David Leonhardt put it well:
“For the most part, though, identifying a good school is hard for parents. Conventional wisdom usually defines a good school as one attended by high-achieving students, which is easy to measure. But that’s akin to concluding that all of LeBron James’s coaches have been geniuses.”
Whilst value-added models are fairer on average, they’re typically pretty noisy for any individual school, with large and overlapping confidence intervals. Here’s the distribution of school value-added estimates for Uganda (below). There are some schools at the top and bottom that are clearly a lot better or worse than average (0), but there are also a lot of schools around the middle that are pretty hard to distinguish from each other, and that is with an econometric model analysing hundreds of thousands of data points. A researcher or policymaker who can observe the test score of every student in the country can’t distinguish between the actual quality of many pairs of schools, and we expect parents to be able to do so on the basis of just a handful of data points and some kind of econometric model in their head??
[Figure: distribution of school value-added estimates for Uganda]

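To see how noisy these estimates are in a concrete way, here is a continuation of the hypothetical sketch above (it assumes the fitted model from that snippet): pull the 95% confidence interval around each school’s estimated effect and count how many pairs of schools have overlapping intervals, i.e. pairs you could not confidently rank against each other.

```python
# Continuation of the hypothetical sketch above: how many pairs of schools
# have overlapping 95% confidence intervals for their value-added estimate?
import itertools

effects = model.params.filter(like="school_id")
ci = model.conf_int().filter(like="school_id", axis=0)  # columns: 0 = lower bound, 1 = upper bound

overlapping = sum(
    1
    for a, b in itertools.combinations(effects.index, 2)
    if ci.loc[a, 0] <= ci.loc[b, 1] and ci.loc[b, 0] <= ci.loc[a, 1]
)
total_pairs = len(effects) * (len(effects) - 1) // 2
print(f"{overlapping} of {total_pairs} school pairs have overlapping 95% CIs")
```

With noisy data, many of those pairs will overlap, which is exactly the “hard to distinguish from each other” problem in the middle of the distribution.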
Making school quality visible

If parents don’t value school effectiveness when it is invisible, what happens if we make it visible by publishing data on value-added? There are now several studies looking at the effect of providing parents with information on test score levels, finding that parents do indeed change their behaviour, but there are far fewer studies directly comparing the provision of value-added information with test score levels.

One study from LA did do this, looking at the effect on local house prices of releasing value-added data compared to just test score levels, and finding no additional effect of providing the value-added data. But this might just be because not enough of the right people accessed the online database (indeed, another study from North Carolina found that providing parents with a one-page sheet of information that had already been online for ages still caused a change in school choice).

It is still possible that publishing and properly targeting information on school effectiveness might change parent behaviour.

Ultimately though, we’re going to keep working on generating value-added data with our partners, because even if parents don’t end up valuing the value-added data, there are two other important actors who perhaps might — the government, when it is considering how to manage school performance, and the schools themselves.

28 September 2024

The best teachers usually don’t know who they are


"Nobody tells me that I'm a strong teacher”.

That’s what the best teacher in Los Angeles, Zinaida Tan, said in 2010 after the LA Times published the first ranking of teachers based on student progress. As the Guardian reports:

"Tan taught at Morningside Elementary, a decent if unremarkable school with an intake of mainly poor students, many of whom struggled with English. Year after year, students were entering Tan’s class with below-average ability in maths and English, and leaving it with above-average scores. You might imagine that before the Los Angeles Times published its rankings, Tan would have already been celebrated for her ability by her peers - that her brilliance would be well-known to fellow teachers eager to learn her secrets. You would be wrong on all counts.

When the Los Angeles Times sent a correspondent to interview Tan, they found her quietly carrying out her work, unheralded except by those who had taken her class and knew what a difference it had made to their lives. “Nobody tells me that I’m a strong teacher,” Tan told the reporter. She guessed that her colleagues thought her “strict, even mean”. On a recent evaluation, her headmaster noted she had been late to pick up her students from recess three times. It was as if Lionel Messi’s teammates considered him a useful midfielder who needed to work on his tackling."


We just found the exact same thing in Uganda. Ark Education Partnerships Group came up with the idea of ranking secondary schools based on value-added (adjusting for students’ starting points), rather than the current system of just looking at test scores at the end of school, which tends to reward the schools that are able to select the best students rather than the schools that teach them the most.

The Daily Monitor reports:

"Some of the 200 schools ranked best in the country yesterday, reacted to the news with shock and awe"

One of these surprise top-performing headteachers seems to have almost accidentally stumbled across a fundamental principle of modern education thinking: that all students can make progress if you teach to their level instead of focusing on just the brightest kids in the class.

"Mr Rajab Nsubuga, the head teacher of Hope Boarding Secondary School Lutembe, off Entebbe Road, the best Secondary School in the country, also said the ranking shocked him, adding that the school works on a philosophy that every student is a learner only that they accommodate slow, average and fast learners at their school."

Phil Elks and I wrote up a paper with the methodology here. One thing we note in particular: a quick count found at least 24 other countries with official national primary and secondary exams that could be used for similar analysis. And for all the technical flaws of value-added models, they’re a pretty clear improvement on what currently happens, which is ranking schools on raw test scores.
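For readers who want to see mechanically how a value-added ranking differs from a raw-score ranking, here is a small, self-contained sketch using the same hypothetical student-level file as earlier. It ranks schools once on raw end-of-school averages and once on average progress beyond what baseline scores predict — a simplification of a proper value-added model, not the methodology in the paper itself.

```python
# Hypothetical sketch: raw-score ranking vs a simple value-added-style ranking.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("students.csv")  # assumed columns: school_id, baseline_score, endline_score

# Student-level "progress": how much better (or worse) each student scored
# than other students who started from the same baseline score.
df["progress"] = smf.ols("endline_score ~ baseline_score", data=df).fit().resid

raw_rank = df.groupby("school_id")["endline_score"].mean().rank(ascending=False)
va_rank = df.groupby("school_id")["progress"].mean().rank(ascending=False)

comparison = pd.DataFrame({"raw_rank": raw_rank, "va_rank": va_rank})
print(comparison.sort_values("va_rank").head(10))  # schools whose students progress most
```

Selective schools tend to sit near the top of the first column simply because they admit strong students; the second column is closer to rewarding the schools where students actually improve.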