Educational sexism in Queensland : Comments
By John Ridd, published 26/4/2013
Comparing Core Skills Test results with OP scores by gender suggests that Queensland boys are being shafted.
The trend towards verbosity in the sciences is evident in the National Curriculum. Elucidating the problem amongst the contextual verbiage is as much the challenge as solving it. The trend away from tests/exams in assessment weighting in schools and towards research assignments (annotated, referenced) generally suits the temperament of girls over boys, and rewards the skills suited to public service "busy-work" roles.

There is also a trend towards incorporating into science what should be left to social studies departments. The science curriculum has been imbued with beliefs and values: students must be able to wax lyrical about the social impact of scientific advancement as much as understand the science itself, and this is reflected in assessment weightings. This suits the female temperament over the male, IMO, at least until a few years beyond school-leaving age. I do not believe boys are now less intelligent than girls. Something has changed in the curriculum and assessment of the sciences, and whether it really reflects the needs of the external scientific world is a moot point.

Posted by Luciferase, Saturday, 27 April 2013 9:54:34 AM
In case anyone’s still following this:
Dear CARFAX

(1) See "Validity of High-School Grades in Predicting Student Success Beyond the Freshman Year: High-School Record vs. Standardized Tests as Indicators of Four-Year College Outcomes", http://cshe.berkeley.edu/publications/docs/ROPS.GEISER._SAT_6.12.07.pdf

(2) To prove that no cherry-picking is involved, I offer the following quotes from the ACER summary of TIMSS 2011:

"Australia's average Year 4 mathematics score in TIMSS 2011 was not significantly different to the achieved score in TIMSS 2007, but Australia's 2011 score was a significant 21 points higher than in TIMSS 1995." (p. 10)

"Australia's average Year 4 science score in TIMSS 2011 was significantly lower than the achieved score in TIMSS 2007, but Australia's 2011 score was not significantly different to the score in TIMSS 1995." (p. 12)

"Australia's average Year 8 mathematics score in TIMSS 2011 was not significantly different to the achieved score in TIMSS 1995, although there have been some small fluctuations over the 16 years." (p. 14)

"Australia's average Year 8 science score in TIMSS 2011 was not significantly different to the achieved score in TIMSS 1995, although there have been some fluctuations over the 16 years." (p. 16)

Thomson, S. et al., Highlights from TIMSS & PIRLS 2011 from Australia's Perspective, Melbourne: ACER.

Dear eyejaw

In future, if you wish to refute what someone says, I suggest you address what they said rather than change the subject. If I had time to follow this exchange any further, I'd look forward to your response to the evidence I offered, which is direct from the publication quoted above. I won't be looking at this exchange any longer, so don't bother replying. I have other dragons to slay :-)

Posted by Godo, Saturday, 27 April 2013 7:06:05 PM
| |
It’s far too simplistic to assume that there is sex discrimination being practised against boys, based on a gender breakdown of QCST and OP scores.
Firstly, 41,330 students completed Grade 12 in 2012, but this essay concentrates only on the 25,760 students who sat for the QCST/OP, so nearly 38% of the cohort is missing from the data. Unlike females, many Grade 12 males opt for VET and apprenticeship training, so they don't need a high OP, or an OP at all. Almost certainly, these future apprentices comprise a large portion of that missing 38%. The males who remain to sit the QCST are therefore likely to be more academically inclined and, thus, to perform proportionally better on the 'A' score.

Secondly, QSA research has found that there is almost no deviation between QCST results and WSM (within-school marking for OP assessment) in male-only and female-only schools. However, there is a significant deviation in co-ed schools. [http://www.qsa.qld.edu.au/downloads/publications/report_qcs_test_review_2012.pdf, p.26] If the move to 'social-based' Science/Maths teaching were negatively affecting boys, then this would show up in the male-only schools' results – but no such deviation exists.

So the problem seems to lie in school-based factors, e.g. location, degree of remoteness, socio-economic conditions. A much higher proportion of co-ed schools are government schools, whereas most independent schools are single-sex schools. Also, a much higher proportion of government schools are located in lower socio-economic areas. It would be better to look into these factors rather than create an unnecessary and inappropriate gender backlash.

Posted by Killarney, Saturday, 27 April 2013 7:33:03 PM
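For anyone who wants to check the participation figure, here is a minimal Python sketch using only the two totals quoted in the comment above (41,330 completers and 25,760 QCST/OP candidates); the variable names are illustrative, not from any QSA dataset.

# Rough check of the share of the 2012 Queensland Year 12 cohort that did
# not sit the QCST/OP, using the two figures quoted in the comment above.
completed_year12 = 41_330   # Grade 12 completers in 2012 (quoted above)
sat_qcst_op = 25_760        # students who sat the QCST/OP (quoted above)

not_in_op_pool = completed_year12 - sat_qcst_op
share = not_in_op_pool / completed_year12
print(f"{not_in_op_pool} students ({share:.1%}) are outside the OP pool")
# prints: 15570 students (37.7%) are outside the OP pool

That 37.7% is the "nearly 38%" referred to above.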
Hi Killarney,
Your point that the males who do the QCS are self-selected to be better than average, because lots of less academically minded males don't sit for the QCS, is true, but it does not affect the analysis in this article. If the school assessment were fair, we should see the boys out-performing the girls on the school assessment and thus the OP score – because they have self-selected to be better than the girls (on average). But we don't see that. In fact they do worse than the girls.

And your point that there is "… no deviation between QCST results and WSM (within school marking for OP assessment) in male-only and female-only schools. However, there is a significant deviation in co-ed schools" supports the analysis in this article. An all-male or all-female school will by definition not be affected by a gender bias in either the QCS or the school assessment. However, one would expect to see a deviation in co-ed schools if a gender bias is occurring.

Perhaps I am misunderstanding your point, but I think your first point is true but not relevant here, and the second supports the argument in this article.

Peter Ridd, JCU

Posted by Ridd, Saturday, 27 April 2013 8:26:49 PM
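The self-selection point above can be illustrated with a small simulation. This is only a sketch under assumed, illustrative parameters (identical ability distributions for both sexes, gender-neutral marking, and 40% of boys leaving for VET pathways); none of the numbers come from QSA data.

import random
import statistics

random.seed(0)
N = 100_000

# Illustrative assumption: boys and girls draw ability from the same distribution.
boys = sorted(random.gauss(0.0, 1.0) for _ in range(N))
girls = [random.gauss(0.0, 1.0) for _ in range(N)]

# Self-selection: the least academic 40% of boys take VET/apprenticeship
# pathways and never enter the OP pool; for simplicity all girls stay in.
op_boys = boys[int(0.4 * N):]

def school_mark(ability):
    # A gender-neutral school assessment: ability plus marking noise.
    return ability + random.gauss(0.0, 0.5)

print("mean school mark, OP-pool boys:",
      round(statistics.mean(school_mark(a) for a in op_boys), 2))
print("mean school mark, all girls:   ",
      round(statistics.mean(school_mark(a) for a in girls), 2))
# Typical output: about 0.64 for the self-selected boys vs about 0.00 for
# the girls. A fair assessment of a positively selected group should show
# them ahead, not behind, which is the point made in the reply above.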
The current system was obviously introduced for 4 reasons.
1/ It may even have been a legitimate attempt to help those who suffer exam nerves. The fact that learning to perform under a little pressure should itself be part of education is totally ignored.
2/ In-school assessment allows teachers to favour some students without sanction.
3/ It favours girls who are more likely to apply themselves to home assignments.
4/ It allows totally incompetent teachers to hide this fact virtually indefinitely.
It allows those good at Google to get away with almost no knowledge of a subject, & allows the teacher's pet to sail into higher education knowing almost nothing. Year 10 high school maths being taught in Environmental Science BSc courses is a testament to that.
Not only is it a bad system, it is totally immoral, & virtually designed to be rorted.
An externally set & marked final exam is the only fair way to assess students, & it also allows the home-schooled & self-schooled to access the system. Of course, it does not offer the same number of well-paid jobs for teachers who don't like the classroom.