The launch, the crash and the recovery of My School : Comments
By Chris Bonnor, published 1/2/2010
When you get into the business of comparing schools, with all this entails, there can be little margin for error - too much is at stake.
The problem is, as the article demonstrates, that only 30% of the difference between schools is explained by what schools do (as distinct from who they enrol). This isn't just my opinion; the chief executive of ACARA himself regularly cites this figure.
So even before you start, using NAPLAN to explain the differences between schools and judge their relative worth is seriously flawed. It means you must factor in all the other differences between the schools, otherwise you end up comparing apples and oranges. ACARA knows this, hence the ICSEA index, which falls short of the mark for reasons I explained.
What Sniggid needs to show is evidence, from anywhere, that more testing and high-stakes testing raise the quality of learning and the quality of schools. I wish you luck!
Candide’s suggestion (re-introduce school inspectors) is a partial answer to vanna’s plea (where do we get this information?). Independent professional assessment of schools is expensive but others do it (e.g. New Zealand) – why can’t we? And they use student test data as a valuable source, but certainly not the only source, of information.
The post from Queensland reminds me of another problem. Year 7 in Qld is at the end of primary school – the kids have been at the school for years. In NSW the Year 7 test is given after the kids are in their new high school for just over three months. And then we compare NSW and Qld Year 7 NAPLAN results! Someone needs to explain to me why this is valid.