AARE blog

AERO responds to James Ladwig’s critique

AERO’s response is below, with additional comments from Associate Professor Ladwig. For more information about the statistical issues discussed, a more detailed Technical Note is available at AERO.

On Monday, EduResearch Matters published a post by Associate Professor James Ladwig which critiqued the Australian Education Research Office’s Writing development: what does a decade of NAPLAN data reveal? 

AERO: This article makes three key criticisms of the analysis presented in the AERO report, all of which are inaccurate.

Ladwig claims that the report lacks consideration of sampling error and measurement error in its analysis of the trends of the writing scores. In fact, those errors were accounted for in the complex statistical method applied. AERO’s analysis used both simple and complex statistical methods to examine the trends. While the simple method did not consider error, the more complex statistical method (referred to as the ‘Differential Item Analysis’) explicitly considered a range of errors (including measurement error, and cohort and prompt effects).

Associate Professor Ladwig: AERO did not include any of that in its report nor in any of the technical papers. There is no over-time DIF analysis of the full score – and I wouldn’t expect one. All of the DIF analyses rely on data that itself carries error (more below). There is no way for the educated reader to verify these claims without expanded and detailed reporting of the technical work underpinning this report. This is lacking in transparency, falls short of the standards we should expect from AERO and makes it impossible for AERO to be held accountable for its specific interpretation of its own results.

AERO: Criticism of the perceived lack of consideration of ‘ceiling effects’ in AERO’s analysis of the trends of high-performing students’ results omits the fact that AERO’s analysis focused on the criteria scores (not the scaled measurement scores). AERO used the proportion of students achieving the top 2 scores (not the top score) for each criterion as the metric to examine the trends. Given only a small proportion of students achieved a top score for any criterion (as shown in the report statistics), there is no ‘ceiling effect’ that could have biased the interpretation of the trends.

Associate Professor Ladwig made his ‘ceiling effect’ comments while explaining how the NAPLAN writing scores are designed, not in relation to the AERO analysis.

AERO: The third major inaccuracy relates to the comments made about the ‘measurement error’ around the NAPLAN bands and the use of adaptive testing to reduce error. These are irrelevant to AERO’s analysis because the main analysis did not use scaled scores, it did not use bands, and adaptive testing is not applicable to the writing assessment.

Associate Professor Ladwig’s comment was about the scaling in relation to explaining the score development, not about the AERO analysis.

In relation to the AERO use of NAPLAN criterion score data in the writing analysis, however, please note that those scores are created either through scorer moderation processes or (increasingly where possible) text interpretative algorithms. Here again the reliability of these raw scores was not addressed, apart from one declared limitation, in AERO’s own terms:

Another key assumption underlying most of the interpretation of results in this report is that marker effects (that is, marking inconsistency across years) are small and therefore they do not impact on the comparability of raw scores over time. (p. 66)

This is where AERO has taken another short cut, with an assumption that should not be made. ACARA has reported reliability estimates that could be included in the score analysis. It is readily possible to report those and use them for trend analyses.

AERO: A final point: the mixed-methods design of the research was not recognised in the article. AERO’s analysis examined the skills students were able to achieve at the criterion level against curriculum documents. Given the assessment is underpinned by a theory of language, we were able to complement the quantitative analysis with a qualitative analysis that specifically highlighted the features of language students were able to achieve. This was validated by analysis of student writing scripts.

Associate Professor Ladwig says this is irrelevant to his analysis. The logic of this is also a concern: using multiple methods and methodologies does not correct for any that are technically lacking. In relation to the overall point of concern, we have a clear example of an agency reporting statistical results in a manner that evades external scrutiny, accompanied by an extreme media positioning. The qualitative insights into the minutiae these numbers represent will probably be very useful for teachers of writing – but whether or not the patterns are generalisable, big, or shifting depends on those statistical analyses themselves.

AERO’s writing report is causing panic. It’s wrong. Here’s why.

If ever there was a time to question public investment in developing reports using  ‘data’ generated by the National Assessment Program, it is now with the release of the Australian Educational Research Organisation’s report ‘Writing development: What does a decade of NAPLAN data reveal?’ 

I am sure the report was meant to provide reliable diagnostic analysis for improving the function of schools. 

It doesn’t. Here’s why.

There are deeply concerning technical questions about both the testing regime which generated the data used in the current report, and the functioning of the newly created (and arguably redundant) office which produced this report.

There are two lines of technical concern which need to be noted. These concerns reveal reasons why this report should be disregarded – and why the media response is a beat-up.

The first technical concern for all reports of NAPLAN data (and any large scale survey or testing data) is how to represent the inherent fuzziness of estimates generated by this testing apparatus.  

Politicians and almost anyone outside of the very narrow fields reliant on educational measurement would like to talk about these numbers as if they are definitive and certain.

They are not. They are just estimates – and all of the summary statistics reported are just estimates.

The fact these are estimates is not apparent in the current report.  There is NO presentation of any of the estimates of error in the data used in this report. 

Sampling error is important, and, as ACARA itself has noted, (see, eg, the 2018 NAPLAN technical report) must be taken into account when comparing the different samples used for analyses of NAPLAN.  This form of error is the estimate used to generate confidence intervals and calculations of ‘statistical difference’.  

Readers who recall seeing survey results or polling estimates being represented with a ‘plus or minus’ range will recognise sampling error. 

Sampling error is a measure of the probability of getting a similar result if the same analyses were done again, with a new sample of the same size, with the same instruments, and so on. (I probably should point out that the very common way of expressing statistical confidence often gets this wrong – when we say we have X level of statistical confidence, that isn’t a percentage of how confident you can be in that number, but rather the likelihood of getting a similar result if you did it again.)
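
To make this concrete, here is a minimal sketch (in Python, with invented numbers – these are not figures from NAPLAN or the AERO report) of how sampling error turns into that ‘plus or minus’ range:

```python
import math

# A 95% confidence interval for a mean is roughly mean ± 1.96 * SE,
# where the standard error SE = sd / sqrt(n).
def confidence_interval(mean, sd, n, z=1.96):
    se = sd / math.sqrt(n)
    return (mean - z * se, mean + z * se)

# With a NAPLAN-sized cohort the interval is tight (but never zero)...
lo, hi = confidence_interval(mean=550.0, sd=70.0, n=160_000)
print(f"n=160,000: {lo:.2f} to {hi:.2f}")   # width well under one point

# ...while for a school-sized group it spans dozens of points.
lo, hi = confidence_interval(mean=550.0, sd=70.0, n=60)
print(f"n=60:      {lo:.2f} to {hi:.2f}")
```

The point is not the particular numbers but that every reported mean carries such a range, and comparisons between groups should report it.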

In this case, we know about 10% of the population do not sit the NAPLAN writing exam, so we already know there is sampling error.  

This is also the case when trying to infer something about an entire school from the results of a couple of year levels. The problem here is that we know the sampling error introduced by test absences is not random, and accounting for it can very much change trend analyses, especially for sub-populations. So, what does this persuasive writing report say about sampling error?

Nothing. Nada. Zilch. Zero. 

Anyone who knows basic statistics knows that when you have very large samples, the amount of error is far less than with smaller samples.  In fact, with samples as large as we get in NAPLAN reports, it would take only a very small difference to create enough ripples in the data to show up as being statistically significant.  That doesn’t mean, however, the error introduced is zero – and THAT error must be reported when representing mean differences between different groups (or different measures of the same group).

Given the size of the sampling here, you might think it ok to let that slide. However, that isn’t the only short cut taken in the report. The second most obvious measure ignored in this report is measurement error. Measurement error exists any time we create some instrument to estimate a ‘latent’ variable – i.e. something you can’t see directly. We can’t SEE achievement directly – it is an inference based on measuring several things we can theoretically argue are valid indicators of the thing we want to measure.

Measurement error is by no means a simple issue, but it directly impacts the validity of any one individual student’s NAPLAN score and of any aggregate based on those individual results. In ‘classical test theory’ a measured score is made up of what is called a ‘true score’ plus error (+/-). In more modern measurement theories error can become much more complicated to estimate, but the general conception remains the same. Any parent who has looked at NAPLAN results for their child and queried whether or not the test is accurate is implicitly questioning measurement error.
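
For readers who prefer to see the idea rather than the algebra, here is a toy simulation of the classical model (all numbers invented for illustration; nothing here is a NAPLAN parameter):

```python
import random
import statistics

random.seed(0)

# Classical test theory: observed score = true score + error.
N = 10_000
true_scores = [random.gauss(500, 60) for _ in range(N)]
errors = [random.gauss(0, 30) for _ in range(N)]
observed = [t + e for t, e in zip(true_scores, errors)]

# Reliability is the share of observed-score variance that is
# true-score variance; theoretically 60^2 / (60^2 + 30^2) = 0.80 here.
reliability = statistics.pvariance(true_scores) / statistics.pvariance(observed)
print(f"reliability ≈ {reliability:.2f}")
```

Even a test this ‘reliable’ leaves around a fifth of the score variance as noise – which is exactly the error a trend analysis needs to account for.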

Educational testing advocates have developed many very mathematically complicated ways of dealing with measurement error – and have developed new testing techniques for improving their tests. The current push for adaptive testing is precisely one of those developments, rationalised in the local case on the grounds that adaptive testing (where the specific test items asked of the person being tested change depending on prior answers) does a better job of differentiating those at the top and bottom ends of the scoring range (see the 2019 NAPLAN technical report for this analysis).

This bottom/top of the range problem is referred to as a floor or ceiling effect. When a large proportion of students either don’t score anything or get everything correct, there is no way to differentiate those students from each other – adaptive testing is a way of dealing with floor and ceiling effects better than a predetermined set of test items. This adaptive testing has been included in the newer deliveries of the online form of the NAPLAN test.

Two important things to note. 

One, the current report claims the scores of high-performing students have shifted down – despite new adaptive testing regimes obtaining very different patterns of ceiling effect. Second, the test is not identical for all students (the tests never have been).

The process used for selecting test items is based on ‘credit models’ generated by testers. Test items are determined to have particular levels of ‘difficulty’ based on the probability of correct answers being given by different populations and samples, after assuming population-level equivalence in prior ‘ability’ AND creating difficulty scores for items while assuming individual student ‘ability’ measures are stable from one time period to the next. That’s how they can create these 800-point scales that are designed for comparing different year levels.

So what does this report say about any measurement error that may impact the comparisons they are making?  Nothing.

One of the ways ACARA and politicians have settled their worries about such technical concerns as accurately interpreting statistical reports is by introducing the reporting of test results in ‘Bands’. Now these bands are crucial for qualitatively describing rough ranges of what the number might mean in curriculum terms – but they come with a big consequence. Using ‘Band’ scores is known as ‘coarsening’ data – taking a more detailed scale and summarising it in a smaller set of ordered categories – and that process is known to increase estimates of error. This latter problem has received much attention in the statistical literature, with new procedures being recommended for how to adjust estimates to account for that error when conducting group comparisons using such data.
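
The effect of coarsening can also be shown with a toy simulation (scale parameters and band width invented for illustration only):

```python
import random
import statistics

random.seed(1)

# Draw a fine-grained 'scale score' and collapse it into wide bands.
N = 20_000
scores = [random.gauss(500, 70) for _ in range(N)]
banded = [s // 52 for s in scores]   # ~52-point-wide ordered bands

def pearson(xs, ys):
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (statistics.pstdev(xs) * statistics.pstdev(ys))

# The banded version tracks the fine scale closely but not perfectly;
# the shortfall from r = 1 is error introduced purely by coarsening.
print(f"fine scale vs bands: r = {pearson(scores, banded):.3f}")
```

Any group comparison run on the bands inherits that extra error, which is why the statistical literature recommends adjusting for it.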

As before, the amount of reporting of that error issue? Nada.

 This measurement problem is not something you can ignore – and yet the current report is worse than careless on this question.

It takes advantage of readers not knowing about it. 

When the report attempts to diagnose which components of the persuasive writing tasks were of most concern, it does not bother reporting that each of the separate measures of those ten dimensions of writing carries far more error than the total writing score, simply because the number of marks for each is a fraction of the total. The smaller the number of indicators, the more error (and the less reliability).
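
The relationship between test length and reliability has a standard textbook form, the Spearman–Brown formula; the 0.90 starting value below is a hypothetical figure, not a reported NAPLAN reliability:

```python
# Spearman–Brown: reliability of a test scaled to k times its length.
def spearman_brown(reliability, k):
    return (k * reliability) / (1 + (k - 1) * reliability)

full_score = 0.90                                      # hypothetical 48-mark total
one_criterion = spearman_brown(full_score, k=1 / 10)   # a tenth of the marks
print(f"full score: {full_score:.2f}, single criterion: {one_criterion:.2f}")
```

On these illustrative numbers, a criterion carrying a tenth of the marks drops from 0.90 to below 0.50 reliability – which is why per-criterion trends need their error reported, not assumed away.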

Now all of these technical concerns simply raise the question of whether or not the overall findings of the report will hold up to robust tests and rigorous analysis – there is no way to assess that from this report. But there is an even bigger reason to question why it was given as much attention as it was. That is, for any statistician, there is always a challenge in translating the numeric conclusions into some form of ‘real life’ scenario.

To explain why AERO has significantly dropped the ball on this last point, consider its headline claim that Year 9 students have had declining persuasive writing scores, and its representation of that as a major new concern.

First note that the ONLY reporting of this using the actual scale values is a vaguely labelled line graph showing scores from 2011 until 2018 – skipping 2016, since the writing task that year wasn’t for persuasive writing (p 26 of the report has this graph). Of those year-to-year shifts, the only two that may be statistically significant, and are readily visible, are from 2011 to 2012, and then again from 2017 to 2018. Why speak so vaguely? Because, from the report, we can’t tell you the numeric value of that drop: there is no reporting of the actual numbers represented in that line graph.

Here is where the final reality check comes in.  

If this data matches the data reported in the national reports from 2011 and 2018, the reported mean values on the writing scale were 565.9 and 542.9 respectively. So that is a drop between those two time points of 23 points. That may sound like a concern, but recall those scores are based on 48 marks given for writing. In other words, that 23-point difference is no more than one mark difference (it could be far less, since each different mark carries a different weighting in the formulation of that 800-point scale).
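
The back-of-envelope arithmetic runs as follows (the points-per-mark figure is a crude uniform average; because marks are weighted differently on the scale, the real difference in marks may be even smaller than this average suggests):

```python
# Reported national mean writing scores cited above
mean_2011 = 565.9
mean_2018 = 542.9
drop = mean_2011 - mean_2018
print(f"drop on the scale: {drop:.1f} points")   # 23.0 points

# Crude uniform averaging: 800 scale points across 48 raw marks
points_per_mark = 800 / 48
print(f"equivalent raw marks: ~{drop / points_per_mark:.1f}")
```

On this crude averaging the drop is on the order of a single raw mark out of 48.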

Consequently, even if all the technical concerns are sufficiently addressed and the pattern still holds, the realistic version of the Year 9 claim would be: ‘Year 9 students in the 2018 NAPLAN writing test scored one less mark than the Year 9 students of 2011.’

Now, assuming that 23-point difference has anything to do with the students at all, start thinking about all the plausible reasons why students in that last year of NAPLAN may not have been as attentive to details as they were when NAPLAN was first getting started. I can think of several, not least being the way my own kids did everything possible to ignore the Year 9 test – since it had zero consequences for them.

Personally, these reports are troubling for many reasons, including the use of statistics to assert certainty without good justification, but also because saying student writing has declined belies the obvious fact that it hasn’t been all that great for decades. This is where I am totally sympathetic to the issues raised by the report – we do need better writing among the general population. But using national data to produce a report of this calibre, by an agency beholden to government, really does little more than provide click-bait and knee-jerk diagnosis from all sides of a debate we don’t really need to have.

James Ladwig is Associate Professor in the School of Education at the University of Newcastle.  He is internationally recognised for his expertise in educational research and school reform.  Find James’ latest work in Limits to Evidence-Based Learning of Educational Science, in Hall, Quinn and Gollnick (Eds) The Wiley Handbook of Teaching and Learning published by Wiley-Blackwell, New York. James is on Twitter @jgladwig

AERO’s response to this post

ADDITIONAL COMMENTS FROM AERO provided on November 9: For more information about the statistical issues discussed, a more detailed Technical Note is available at AERO.

It’s so dramatic: what new play Chalkface gets very right about being a teacher

Chalkface is a new play about teachers, currently being staged by the Sydney Theatre Company and playing at the Sydney Opera House.

Written by Angela Betzien, Chalkface is advertised as a black comedy in which an ‘old school’ teacher clashes with a bright-eyed newbie. As a researcher of teachers’ work and a Head Teacher Science, we hoped the play would be a fresh take on the profession we live every day; that it would make us laugh, and maybe even have something insightful to say.

We went to see Chalkface last Saturday evening and overall, we think that the play gets a lot right.

From the peeling paint, to the old mis-matched chairs, out-of-service hot water tap and enormous tin of Nescafe Blend 43, the set is without question the quintessential public school staffroom in Australia. There is even a resident rat* eating through the precious limited stock of coloured paper in the supply cupboard. Meanwhile, one teacher tells the newbie teacher to note the school’s general scent, which he describes as “old fart”.

Between us, we’ve spent a lot of time in a lot of schools. While some are newer or better-resourced than others, we can tell you that this is generally a pretty accurate representation of public education in NSW. (The “old fart” smell in particular seems, curiously, universal.)

What the accuracy of the play’s set and its jokes about lack of resources reflect, however, is a systemic underfunding and lack of maintenance of government educational facilities, which will not be news to any local audience. It is well known that Australia has a problem with educational equity, and the play takes frequent jabs at wealthy private sector schools. While the teachers in the play guzzle Blend 43 and rotate cleaning shifts, for example, the private school up the road has apparently just hired a full-time barista for its staff. The contrast here is stark, and while not all private schools are hugely wealthy, some of them certainly are; despite years of debate about developing a ‘needs-based’ funding system, we aren’t there yet.

Chalkface doesn’t end its commentary on education policy with resourcing, however. The school principal wants teachers to focus on NAPLAN preparation at the expense of richer learning activities, as he angsts about the possibility of losing students to other schools. This experience has a sound basis in research; the marketisation of education through the ongoing encouragement of parental ‘choice’ and the displaying of NAPLAN results on the My School website has had well-documented flow-on effects of ‘teaching to the test’.

Nicknamed ‘Thatcher’, the principal is renowned within the school for his austerity, even stealing kids’ lunchboxes from the lost and found. His anxiety about the school’s budget reflects not only an overall lack of funding, but also the current positioning of principals as school ‘business managers’, having a larger share of financial responsibility for the running of the school. Our bright-eyed newbie teacher, for instance, is on a temporary contract, which she is told is because she is cheaper; the rise in fixed-term contract work in teaching is also a current issue for the profession.

The relationship between ‘Thatcher’ and the rest of the teachers in the school is, indeed, fractious. The principal establishes a ‘suggestions’ box which is derided, by everyone other than him, as a “black hole”. Again we see resonance with current themes in policy and research: under autonomous schooling conditions, principals in NSW have been described as chronically over-worked, their attention diverted from engaging with staff perspectives and working conditions.

As for the teachers themselves, the divide between the ‘old’/experienced, and young/‘new’ teacher may seem stereotypical, yet also raises important questions around teacher burnout. One of the discomforts we felt while watching Chalkface was the way in which the teachers, especially those more experienced, talked about their students. Usually these were jokes but they were always disparaging, and not always funny. ‘Deficit’ talk – where students and/or their families are described as lacking, either in intelligence or desirable social norms – is indeed rife in teaching and probably does rear its ugly head most frequently in school staffrooms. It can serve to support cycles of poor academic outcomes for populations of students experiencing forms of educational disadvantage. It can also be linked to burnout, as indeed the younger teacher in the play identifies: one of the three dimensions of burnout is ‘depersonalisation’, and we see much of this in the staffroom talk of Chalkface.

Also of concern is the raft of rather alarming health conditions the teachers experience, caused by their jobs. One has a damaged coccyx from having a student pull a chair out from underneath him; another has spent the summer holidays in a psychiatric ward after being locked in a cupboard overnight by a student. These are extreme examples. They are funny, but they are also not funny, reflecting genuine, current concern with teacher wellbeing.   

There are some positive outcomes in Chalkface. The two women teachers who are the main characters learn and grow from each other, and it’s genuinely enjoyable to see them do so. But that is about it. Ultimately, nothing is done about the inequity the school faces and the difficulty of these teachers’ jobs. In fact, most of the teacher characters leave this under-resourced school by the end of the play.

Chalkface lands on a description of pedagogy as the “art and science of hope”. Generally, the play feels authentic. It made us laugh, and sometimes grimace in frustrated recognition. But ultimately, its ending portrays a bleak situation for public school education. We hope this part isn’t accurate, although we worry that it is.

*Spoiler alert: it turns out, it’s not a rat.

Meghan Stacey is a senior lecturer in the UNSW School of Education, researching in the fields of the sociology of education and education policy. Taking a particular interest in teachers, her research considers how teachers’ work is framed by policy, as well as the effects of such policy for those who work with, within and against it.


Nigel Kuan is Head Teacher Science at Inner Sydney High School. He holds a Bachelor of Science with Honours in Physics Education and a Master of Teaching. Nigel has presented at Teach Meets and other practitioner forums, and takes a particular interest in student engagement and scientific literacy.

Header images from the Sydney Theatre Company. Photo: Prudence Upton From left to right: Susan Prior, Stephanie Somerville, Catherine McClements and Nathan O’Keefe in Sydney Theatre Company’s Chalkface, 2022.

Open access: why we must break down the paywalls now

Publication via open access increases visibility, reach and impact of educational research, so why are we still hitting our heads against paywalls?

Why do we publish? For attention, for funding and for impact. The quality research undertaken by Australian education researchers has the biggest chance of impact if it is readily accessible to educators and policymakers. So why is the important voice of research left out of policy and practice? 

There is no question that one major obstacle is the lack of timely access to published research. Unless you are a researcher with an academic library login, then paywalls, restrictive terms of access, and the time to get hold of legally published research, shut out the very people who need this information.

Traditional publishing model 

Why are our organisations paying excessive amounts to both publish and access (often publicly funded) research? In case anyone needs a refresher on the traditional academic journal publishing model:

1. Academics receive funding via grants to conduct research (taxpayer-funded).
2. Research findings, in the form of academic papers and book chapters, are reviewed (mainly for free) by their academic peers.
3. Authors rely on their institutions to pay publication costs.
4. Once published, institutions pay again to access these works via library subscriptions.

So, who benefits from this model? Certainly not those outside the academic ecosystem, nor those inside it. The costs are borne primarily by researchers and their institutions (twice over), while publishers make a significant profit (for example, a major academic publisher now has a profit margin heading towards 40%, higher than companies like Microsoft and Google).

OA benefits all

Next week is Open Access Week (24-30 October) which provides an opportunity for the academic and research community to stop and rethink why we do what we do. So, what is open access (OA)? 

Open Access is the free, immediate, online availability of research articles combined with the rights to use these articles fully in the digital environment (SPARC).

The momentum towards open access publishing is not without challenges and misrepresentation, and fair and equitable access to Australian education research remains challenging. 

The quality question 

Persistent misconceptions remain around OA, which is often conflated with predatory publishing. Quality discussions in 2022 are more nuanced. In order to be indexed by the Australian Education Index, ERIC, Scopus, et al., all journals (no matter their publishing model) are evaluated for adherence to quality standards, including the application of peer review. The most comprehensive index of OA journals, DOAJ, follows rigorous principles for transparency and best practice in scholarly publishing that must be met for inclusion. Quality and credibility should be vested in the research itself, not simply in the place of publication. While business models are very much in flux, scepticism is healthy when applied to both not-for-profit and big-name commercial players.

Re-route around the paywall

As members of educational organisations and academia we are not the only consumers of research. OA ensures that those outside of the institutional bubble, including those responsible for policy and information provision, have access to timely research. It also supports collaborative research without restrictions on a global scale. 

Internationally, there is a lot in play around research access and discoverability. The emphasis on OA is significant in light of the White House (OSTP) memo on equitable access to research, and the release of the UNESCO Recommendation on Open Science.

Nationally, the OA policy revision from Australia’s leading health and medical research body removes the current 12-month embargo on the release of NHMRC-funded research from 2023 and all research will be openly licensed (ensuring and clarifying re-usability). If open access is good for science and medicine, it’s just as crucial for education. 

If governments and research organisations are mandating equitable and timely access, the writing is on the 'paywall'. It's feasible to expect more favourable changes in terms of access to all research.

That’s not to say that for-profit publishers are not embracing OA – many actually are. But while they might be removing paywalls to access the research, they have generally adopted an author-pays model via article processing charges (APCs), again contributing to inequitable publishing opportunities for authors. Some have noted that major publishers are actually now leveraging the OA movement for their own benefit and time will tell as to the success of transformative agreements between institutions and publishers. 

Taking back control of the education research landscape

Improving access to education research requires an informed approach to navigating the OA landscape. Put simply, it comes down to the purpose of education research to affect and improve educational practice and policy. Journals are not the only vehicle for sharing research. Institutional and subject-specific repositories are valuable in terms of discoverability and expanding the use of research to a wider and diverse community. Repositories are central locations for institutional content and “have a critical role in archiving, preserving and sharing the diverse content produced by universities so it can be used by others and have the greatest impact on our society”.

Disciplinary repositories offer a bespoke OA alternative, and contribute to the visibility of discipline-specific research. We have recently been involved with a project that highlights actions that improve access to tertiary education research. The Universities Australia Learning and Teaching Repository (LTR) collates higher education learning and teaching research. Initially LTR archived the learning and teaching project reports and outputs funded by the Australian Government from 1994-2018. Repository content is openly licensed and in 2021 LTR undertook a pilot activity to index selected articles from the education-focused, OA Student Success journal. This pilot activity provides two obvious benefits in terms of educational research: 

  1. LTR, as a resource, is sustained by contemporary scholarly research filling the void left by the removal of consistent education research funding in Australia
  2. As an OA tool, educational research is curated and amplified to an educational community that extends further than our institutions. 

What can researchers do?

As education researchers, there is a lot we can do to make the OA model work for us.

Publish and review in OA journals

When thinking about where to donate your time as a reviewer or author, consider open access options. Read your publishing contract: if you transfer your copyright to a publisher, can your own institution re-use your published content in its teaching? Most OA journals apply Creative Commons licenses that articulate rights to reuse, meaning immediate and free access to your work.

Deposit in relevant institutional and education repositories

Give your institution's repository plenty of love. Depositing your research in an institutional repository improves discoverability and impact. Most recently, the Curtin Open Knowledge Initiative examined citation activity in the OA landscape and noted that "Open access through disciplinary or institutional repositories showed a stronger effect than open access via publisher platforms".

Advocate for Open Access

There is value in an active (and collective) academic voice when it comes to advocating for OA to educational research. First and foremost, academic libraries and the information professionals working in this space are well placed to help researchers navigate the changing publishing landscape. Secondly, keep abreast of international and national OA organisations and their activities. Open Access Australasia is a 'go-to' central source for current information and resources; get your institution on board if it is not already a member. Finally, if you are an editorial member or a reviewer for a paywalled educational research publication, start a conversation around the value of OA in your own community – what have you got to lose?

Tracy Creagh is Journal Manager, Academic Journals in the Office for Scholarly Communication at Queensland University of Technology (QUT). She manages three of QUT's five open access, peer-reviewed scholarly journals and leads the institution's Open Access (OA) Journals Community of Practice, dedicated to sharing and contributing to best practice in open scholarship. She is also Managing Editor of the Student Success journal. Tracy is on Twitter @creaght

Pru Mitchell is Manager, Information Services at the Australian Council for Educational Research and adjunct lecturer, School of Information and Communication Studies at Charles Sturt University. She is a long-term advocate for open education resources and her research interests include metadata standards for digital collections that enhance the discoverability of open access content. Pru is on Twitter @pru

‘My attitude COMPLETELY changed’: why universities should move new teachers from resentment to respect for Indigenous Australia as we vote on the Voice


Editor’s note: One of the biggest challenges in Australian education is how we embed an understanding of Indigenous cultures and knowledges. As Australia approaches a vote on The Voice, universities have a responsibility to change Initial Teacher Education (ITE) to incorporate cultures and knowledges appropriately. Students enrolled in ITE already have views on what they will learn in their compulsory courses – and those views are confronting. How can educators move students from uncomfortable and scared to bold and prepared? This is a longer blog post than usual – but in it, Quandamooka scholar Dr Mitchell Rom explores how we might produce a teaching workforce that places sufficient value on Indigenous knowledges and perspectives.

In June this year, the Australian Institute for Teaching and School Leadership (AITSL) released its final report Building a culturally responsive Australian teaching workforce as part of its Indigenous cultural competency project. This national report is a progressive step in the right direction towards raising awareness and understanding of how to better support Indigenous students in schools. The term “cultural competency” is defined in the report as, “When organisations and individuals…expand their cultural knowledge and resources in order to better meet the needs of minority populations” (Cross et al., 1989, as cited in AITSL, 2022, p. 35). The term “cultural responsiveness” is also used in the report which stated that “Being ‘culturally responsive’, in the context of Australian schools, is the ability to respond to the diverse knowledges, skills and cultural identities of Aboriginal and Torres Strait Islander students” (AITSL, 2022, p. 9). 

Prepared over three years, the report suggested that ITE should play a key role in developing the cultural competency and responsiveness levels of pre-service teachers or future teachers in Indigenous education. It stated “It is critical that ITE programs prepare teachers for the wide range of students they may teach, including Aboriginal and Torres Strait Islander students” (AITSL, 2022, p. 17). The report further recommended that “Aboriginal and Torres Strait Islander content should be included as a mandatory unit of study or indeed, mandatory cross-curricula focus, within ITE programs” (AITSL, 2022, p. 6). Having recently completed my PhD in ITE and compulsory Indigenous education, I agree with these statements and the importance of ITE programs. However, the recent report unfortunately does not acknowledge that the Indigenous education space at university is filled with colonising and complex challenges for academic teaching teams and pre-service teachers. Before looking to ITE programs as one of the answers to improving the cultural competency of teachers, it is important to consider the varied challenges linked to Indigenous education courses in these programs.

Study findings

My PhD study was grounded in the context of the Australian Professional Standards for Teachers (APST), Graduate Standards 1.4 and 2.4, which were introduced by AITSL in 2011. These two national standards emphasise teaching Indigenous students (1.4) and teacher skills and knowledge around reconciliation in schools (2.4) (AITSL, 2011). Universities have now turned their attention to preparing our future teachers to be able to meet these national standards and develop cultural competency in this area through offering Indigenous education courses.

The study specifically focused on the key learning, teaching and education policy challenges situated in contemporary Indigenous education courses at university. The study involved 174 non-Indigenous pre-service teachers from an elite Queensland university who were studying a compulsory Indigenous education course. It also involved five academic teaching staff from the same course, as well as myself as a Quandamooka teacher who has taught in three Indigenous education courses across two Queensland universities. The research identified, through a storying methodological approach (Phillips & Bunda, 2018), a total of 11 key challenges across three interrelated areas of the university. These three areas included key challenges within the university classroom space (lecture or tutorial context), the broader university institution, as well as with education policy, namely APST 1.4 and 2.4 by AITSL.

Pre-service teacher journeys in Indigenous education

Pre-service teachers can have varied experiences of studying Indigenous education at university. The study found that some pre-service teachers were willing to engage with Indigenous education from the initial commencement of the course. For example, one pre-service teacher shared “At the beginning of the course, I felt excited and ready to expand my views and knowledge”. Another pre-service teacher noted “I felt increasingly comfortable with the way everything was taught and became more understanding, appreciative and open minded about Indigenous ways of knowing, being & doing”. Some pre-service teachers began the course displaying resistance towards Indigenous education and then were able to change their attitude and position as the course progressed. In addition to this, some pre-service teachers remained resistant learners throughout the entirety of the course, despite the efforts of teaching teams and national policy agendas such as APST 1.4 and 2.4.  Overall, the research found that many Queensland pre-service teachers experienced challenges navigating a compulsory Indigenous education course within their ITE program.

Stepping into the course, 126 pre-service teachers shared that they had mixed initial views towards learning Indigenous education. One pre-service teacher stated “I was wary [the course] would be wrapped in anti-Western rhetoric and would focus on demonizing Western culture”. Another pre-service teacher shared “I didn’t have a high opinion or high expectation from the course title alone and felt apprehensive going into this course mainly due to what people had said about previous semesters (most people told me this course sucked)”. Other words used to describe student feelings towards beginning Indigenous education included “uncomfortable”, “borderline apathetic”, “unimpressed”, “confused”, “confronted”, “dreading it”, “pointless”, “guilty”, “very hesitant” and “unprepared”. Moreover, 66 pre-service teachers shared that they had limited engagement, knowledge and understanding with regards to Indigenous education prior to university. In the study, pre-service teachers shared that “I’d never had much to do with Indigenous studies in my schooling so I wasn’t sure what to expect in this course” and “My experiences [with Indigenous studies] at school were mostly tokenistic”.

Within the university classroom, nearly 40 pre-service teachers showed a level of resistance towards studying the course. One pre-service teacher commented on the compulsory nature of the course and stated “I was not looking forward to beginning the course and was extremely unhappy it was compulsory since I knew I’d be thought of as a middle-class white male that oppressed everyone”. In relation to being introduced to the concept of white privilege in class, another pre-service teacher shared “I didn’t like how the tutorials made me feel. It felt like the teaching staff would make activities that addressed how white privileged I was and almost make me feel shit about being white”.

As the course progressed during the semester, a number of pre-service teachers were able to shift their attitudes regarding Indigenous peoples and education. For example, one pre-service teacher wrote “My perspectives, understandings and attitude around Indigenous education have COMPLETELY changed but I also believe that this was attributed to the study habits and attitudes I brought into tutorials and lectures”. Another pre-service teacher shared “In class, I was continually faced with situations where I would think ‘But that’s not my fault’, but was able to stop and transform my understanding so that I could use my white privilege to ensure the deserved respect is given to the first peoples of Australia”. These student experiences demonstrate learning, understanding and growth in this contested learning and teaching space. I am confident these non-Indigenous pre-service teachers will continue to work to strengthen Indigenous education as allies.

Unfortunately, the study also highlighted various inconsistencies in relation to pre-service teacher development. In the final week of studying the course, one pre-service teacher mentioned “I would say that the nature of this course has made me look at Indigenous education more negatively”. Another pre-service teacher shared “Right now, I feel confused and this course has left me more scared of teaching Aboriginal and Torres Strait Islander students than I was before”. On finishing the course, one hesitant pre-service teacher wrote “While I write this in the final week of the course, I still do not feel like I’m prepared to meet [APST] 1.4 & 2.4”.

Moving forward

Some pre-service teacher experiences shared above highlight an education system that is gradually shifting towards a greater respect for Indigenous education matters. While this is positive, the findings also reflect a current system (particularly in relation to some schools in Queensland), that does not place sufficient value on Indigenous knowledges and perspectives. This is reflected in many pre-service teacher comments around their previous learning and their own ill-preparedness to commence the Indigenous ITE course. In light of this, and the broader study findings, education stakeholders including AITSL need to be aware that improving cultural competency requires an understanding of the complex challenges situated in compulsory Indigenous education. It also requires a recognition that there are key challenges that sit external to the control of academic teaching teams including, in particular, pre-service teachers arriving at university ill-prepared from schools and remaining resistant to studying Indigenous education. Broadly speaking, for these challenges to improve, educational institutions at all levels (from primary schools to universities), and those who lead and work in these social institutions need to continue to shift and change. This is needed so that by the time our pre-service teachers commence Indigenous education studies at university, they are more equipped to navigate these spaces and become culturally competent and responsive practitioners. One way of doing so is by placing greater value on Indigenous knowledges and perspectives in schools. This includes ways of thinking that seek to challenge the colonial status quo. By doing so, this will support our future teachers to be more effectively prepared when working with Indigenous students in our schools and therefore will continue to strengthen Indigenous education.

Mitchell is a postdoctoral researcher interested in Decolonial studies, Education and Health. His PhD focused on the key learning, teaching and education policy (Australian Professional Standards for Teachers 1.4 and 2.4) challenges situated in the contemporary Indigenous Australian education space at university. He initially trained as a secondary school teacher in the disciplines of English and History in Queensland and has studied Education for over a decade. He has also taught at university, published and worked across various levels of education. As a Quandamooka researcher, Mitchell is interested in discussing social matters with like-minded scholars for positive community change. Mitchell is an advocate of strength-based thinking and decoloniality. Contact him on LinkedIn.

Image in header is Ren Perkins, a PhD student undertaking research with and about Aboriginal and Torres Strait Islander teachers, in action.

Beyond koalas, UN organisations work to put climate change education on the agenda


Climate literacy is the biggest educational challenge of our time. Even as the Australian government’s recent action signals productive steps toward climate action, there is still much work to be done in addressing climate change and climate literacy can help with this. 

What is climate literacy? It is an understanding of how climate change works, but it also includes feeling motivated and able to participate in meaningful change for the planet. Where climate literacy is established, political will for action will follow.

We can achieve this through education – as we rapidly approach catastrophic ecological thresholds and tipping points, education is increasingly being considered a key component in our response to climate change. Young people are acutely aware of—and are taking action against—the unfolding climate crisis, with research capturing their experience and its emotional toll.

Intergovernmental bodies, such as the United Nations (UN), play an important role in progressing social causes through education, and action on climate change education is an ongoing focus for UN organisations. This includes the development of UN policy programs with a focus on climate change education, which impact national education policies globally. 

The work of these UN organisations is impacted by a lack of resourcing and isolated organisational structures. Observing the ways that UN organisations work together to advance their common agendas is of crucial importance to understanding how they meet their climate action goals. 

The UN & climate action

A growing number of UN international organisations (IOs) are developing programs relating to climate change education (CCE), including UNESCO's Education for Sustainable Development (ESD) program and the UNFCCC's Action for Climate Empowerment (ACE) agenda.

These UN policy programs aim to guide policy decision-making by UN member states until 2030. 

These initiatives are promising, given the urgent need for escalated action on climate change globally, and considering the impact previous UN policy programs have had on national and regional education systems, as well as other development goals. However, there is concern about the influences on policy programs and their effects on program quality. 

A lack of transparent decision-making weakens the work of the UN and impacts the quality and effects of work intended to address climate change. The UN’s policy programs are also up against other political forces that challenge their ability to fulfil their goals.

In light of these factors, it is important to consider the work that those who work for and within policy programs (policy actors) do across networks and through relationship-building in order to deliver the UN's agenda.

How UN networks advance the climate agenda

Environmental concerns have not always been seen as a core policy agenda in the education sector, just as education has not previously been significant on the environmental policy agenda.

The UN IOs introduced above bridge this gap with their dual focus on environment and education. However, both UNESCO's ESD and the UNFCCC's ACE are small subunits doing important and far-reaching work with minimal resourcing.

The lack of focus on climate change education creates an incentive for UN organisations to seek support across programs and networks. These relationships help to raise the profile of the climate work being done and strengthen each policy program.

UNESCO’s ESD and UNFCCC’s ACE coordinate activities that bring together government and non-government representatives to do climate policy work. Our study finds that there are several points of connection between ESD and ACE, including writing joint reports, and meeting formally and informally to make connections.

The joint writing of policy documents, such as the Action for Climate Empowerment: Guidelines for accelerating solutions through education, training and public awareness and Integrating ACE into Nationally Determined Contributions: A Short Guide for Countries provided opportunities to work together on similar aims.

Meetings are also a key aspect of the shared work of UN organisations. Both ACE and ESD host regular high level events, attended by national governments and other intergovernmental staff, as well as staff who work across these UN IOs. Meetings often provide opportunities for cross-organisational work to unite in the interest of shared climate agendas. 

These co-network policy arrangements help UN organisations to overcome their limitations (such as limited resources) and link up with another organisation that does have the mandate and resources for an outcome they both would like to achieve (e.g. producing/publishing climate change education reports).

This tells us that the work involved in achieving the aims of climate change-oriented UN programs relies on networks of shared interest, and relationships bridging across organisations.

What does this tell us about the UN’s work on climate change education?

Climate change is a matter of critical importance to present and future generations around the globe. Our research indicates that advancing climate change education is not limited to siloed organisations or policy writers; it is a responsibility that is shared across multiple organisations and groups.

It also tells us that two of the UN's international organisations tasked with addressing climate change, ACE and ESD, do not necessarily have the mandates or resources required to advance climate change education.

By better understanding the interactions across these policy programs, this research helps our understanding of how organisations might work together, outside and across their scope and jurisdictions, in order to gather adequate resources to promote and fulfil climate agendas. 

Marcia McKenzie is Professor in Global Studies and International Education in the Melbourne Graduate School of Education, University of Melbourne. Her research includes both theoretical and applied components at the intersections of comparative and international education, global education policy research, and climate and sustainability education, including in relation to policy mobility, place and land, affect, and other areas of social and geographic study. This research will be the focus of Marcia’s forthcoming papers at the 2022 AARE conference in Adelaide. 

Stephanie Wescott is a Postdoctoral Research Fellow in the Melbourne Graduate School of Education, University of Melbourne. Her postdoctoral research studies the network relationships between UN policy programs and climate change education.

Why teacher unions matter now more than ever

Teachers are striking. Not just in NSW, Australia, where the NSW Teachers’ Federation took to the streets in late 2021. Teachers in Washington, Ohio and Seattle in the United States also took strike action this year in response to pressures similar to those Australian teachers are facing. They are demanding smaller class sizes, more specialist support for teachers, higher wages, and better conditions to prevent teacher burnout.

My research has focused on school education mainly in NSW where market-driven agendas have entrenched competitiveness in education systems, contributed to the rise of precarious work in the teaching profession, slowed the growth of teacher salaries, and increased the workload and administrative burden on teachers and school leaders. For the last 40 years, neoliberal policy agendas in education have threatened to undermine the democratic foundations of public schooling and weaken education unions that represent the voice of thousands of teachers.

Research on teacher unions is lacking

Although public education is an issue at the forefront of society, what is lacking in the conversation about neoliberal education reform agendas is how teacher unions attempt to challenge such agendas. Teacher unions are important civic and economic associations that articulate teachers’ collective and professional voice.

As an interdisciplinary researcher spanning the fields of industrial relations and education, my research focuses on the interrelations between employees and employers and their representatives, and the state. For nearly 10 years, I’ve examined the complex contexts in which teacher unions organise and campaign in an effort to understand the strategies they use to resist neoliberal agendas.  

My chapter, recently published in Empowering Teachers and Democratising Schooling: Perspectives from Australia, contributes to understanding how teacher unions build grassroots activism and shape campaign strategies to resist neoliberalism and inspire action towards a more democratic future. The chapter is nested in a broader conversation in the book, alongside contributions from teachers and researchers, which is focused on giving primacy to teachers’ voices in education scholarship and public debates. The chapter draws upon insights from my doctoral thesis which examined how one teacher union in Australia has campaigned over the last 40 years in response to various education reforms and threats to teachers’ working conditions.

Lessons in building union power

There are contemporary reports of a teacher shortage crisis in New South Wales. Compounding this is an ageing teaching workforce. While the education and training industry has the highest proportion of employees who are trade union members, revitalising how unions recruit and engage the next generation of activists is a key concern of unions today, not only for teacher unions. According to the latest Australian Bureau of Statistics union membership data from August 2020, only 5% of employees aged 15-19 years are trade union members; this is only marginally higher at 6% for those aged 20-24.

In addition to the profile of unionism changing, the social, political, economic and cultural environment of organising is also evolving. One former Assistant General Secretary of the teachers’ union I spoke to in my research reflected on this: “[when] you started teaching, you joined the superannuation scheme, you joined the health fund, and you joined the union, and you were just active in the union”.

Renewing strategies to engage an incoming generation of teachers into the profession has been an important task for teacher unions. Strategies have included organising beginning teacher conferences for teachers new to the profession, establishing networks to connect young activist teachers, and offering training and professional development opportunities for members.

Campaigning to advance alternative ideas in public education

The concepts of governance, accountability, efficiency and competition are also changing the way we think about public institutions and public services. Freedom within education is being constrained and democracy is being threatened by neoliberal logics. Such ideology has challenged the fundamental values on which public education systems have been built.

Research shows that teacher unions have responded to this in various ways, including ‘organising around ideas’, connecting with community and campaigning for social justice. This means presenting alternatives to dominant (neoliberal) ideas and campaigning for a vision of quality public education based on the values and principles of democracy and social justice.

Framing campaign messages in response to different contexts enables unions to set the agenda for public education and articulate the voice of the profession.

For instance, using ‘local stories’ can be a powerful way to appeal to parents and community members during campaigns. In one education funding campaign I researched, a Union Organiser from a teacher union spoke about how their campaign was framed around:

“not talking about the billions of dollars and talking about macro level, but just saying to the community this is what it means to you, this is what it means to Billy in kindy when he arrives at the school and he can’t speak a word of English, he’s able to get access to support . . . Those stories can’t be refuted and they can’t be talked down . . . [i]f you’re actually talking about a real human from a real place in a real situation.”

Empowering teachers has also been important in the face of threats to their core industrial and professional conditions of work, as well as the strong criticism and blame that has been placed on teachers over many recent years.

Teachers and their unions are working in challenging times. Continuing to foster a sense of empowerment in teachers and placing the voice of teachers at the centre of education debates is crucial in order to protect and advance the conditions of one of the largest occupations in the world.

Mihajla Gavin is a lecturer in the Business School at the University of Technology Sydney, and has worked as a senior officer in the public sector in Australia across various workplace relations advisory, policy and project roles. Mihajla’s research is concerned with analysing the response of teacher unions to neoliberal education reform that has affected teachers’ conditions of work. Mihajla is on Twitter @Mihajla_Gavin

Anonymous writes: I became a better teacher during COVID. I didn’t yet know I had cancer

Remember the COVID shutdowns? Remember the months of remote teaching?

As a middle school teacher, I thought I handled the remote phases pretty well. We had two big ones in the ACT – week after week of innovating lessons.

So, when I felt pretty tired at the end of last year, I chalked it up to being middle aged, to having had a few years of classroom challenge, even to the research and writing required to finish my Master of Education.

Heck, I even wondered if I was experiencing long COVID.

But over the summer holidays I didn’t really recharge properly. Not like I had in previous years.

The news was then full of stories about teacher burnout. About teachers getting tired. Yeah – that’s me, I thought.

As the weeks of Term One passed, it was getting clear I was more tired. Not the just-have-a-nap tired. By May, when I came back from a conference, I slept for 14 hours straight.

By June, I went to the doctor. Tests for all sorts of things. Test after test. Samples and vials and specimen jars. Machines I had never seen before, probing parts I had never really thought about.

The tests raised a few red flags. So then there were different tests in July. Then a biopsy. Then a cancer diagnosis in August.

Looking back, my hardcore tiredness was the result of a billion blood cells fighting a battle against tumours each day.

But (and here is the funny thing) I became a better teacher this year. I refined my approach. I focussed on my classroom craft and was ruthless about tackling my workload.

To deal with the tiredness – remember, I thought this was just a COVID-shaped hangover for most of the year – I went back to my toolbox this year to consider what works. What works for me and what works every time, just so I didn’t have to think so hard to be a good classroom operator. 

I found my auto-pilot with the following approaches.

  1. If it doesn’t work, don’t do it. This one sounds simple, but I wanted to make sure my limited energy was going into what works. I had long been a fan of the Victoria Department of Education’s High Impact Teaching Strategies. These provide a drive-through summary of work from Hattie, Lemov, Marzano, and the Teaching and Learning Toolkit from Evidence For Learning. Yes, they all use different methodologies to measure effect size and identify what works well in a classroom. The HITS set aside the variation in approaches and terms and provide 10 powerful strategies, and meant I needed less brain energy to plan my teaching approaches.
  2. Get the students to do the work. As I started to get more and more tired, I plugged into the energy of my students by flipping the classroom. I had tinkered with this during remote schooling, but I went full flip with a lot of guidance from Catlin Tucker’s work on blended learning. I also tuned into effective classroom preparation and talk using TQE from Marisa Thomson. This meant students came in with the ideas and the questions for the lesson. It wasn’t up to me to light up a lesson. I had started the process with the students doing the work, even before I got to school.
  3. I do, we do, you do. Every lesson finished with a modelled writing from me, then a cooperative writing task, then student writing. This worked in my early-career classrooms with support from First Steps/Stepping Out and it still works today. Every lesson. The routine became my friend. The students knew what was coming.  They got to ask “why did you do that?”, they heard my think-alouds while writing and they tried the ideas themselves. They got better at their writing. They knew they were getting better. You want energy in a classroom? Tap into the confidence of students on the move. Bandura has been saying this stuff for half a century and it is a powerful ride.
  4. But what about the marking? Yep – more writing would normally mean more marking, right? If you don’t mark it, how do the students know where they are going? It’s all about formative assessment, yeah? Well, kinda. I tapped into some marking techniques from Jennifer Gonzalez via Cult of Pedagogy. She writes “It’s important for students to get lots of practice, and to get credit for their effort, but not everything needs careful grading.” These techniques changed my classroom and helped my students progress every week.

Every classroom is different. Every class is different. Every teacher is different. I am not writing this to tell teachers how to do their job – you might be surprised how much your ego gets trimmed when doctors poke and prod at you for a few months.

But, there may be something here to help teachers use their energy budget effectively. There may be a way here that helps a tired teacher achieve more and feel good about themselves and their classes, and not burn out.

(One more thing. I have had a level of support from a partner, a principal and a head of department that is beyond words. The spoken and unspoken support often made the difference between hope and despair. If teaching is about relationships, then it is these relationships that will rescue you on the days when you feel about to drown.)

Teaching is hard. 

Teaching in a pandemic is hard. 

Teaching in a pandemic with cancer is hard.

Let’s be kind. Let’s be smarter. Let’s do the absolute best we can do to support each other and build the skills of our students.

Note from the editor: This piece is written by a former contributor to the AARE blog. That person wanted to remain anonymous. I agreed to the request and published the piece as it is.

We build submarines and the defence force. Now we must support the families who work in them


The Federal Government has plans to expand the Australian Defence Force (ADF) to a 40-year high. They hope to increase the forces by 30% (18,500 extra personnel by 2040), the biggest increase since the Vietnam War. This will inevitably lead to an increase in the number of children and parents impacted by military service. 

It won’t just be enough to recruit new soldiers, sailors and aviators – retention will also be critical and we know that Defence families play a key role here. Defence families are depended on to provide a crucial service to the ADF, often at significant cost to their own wellbeing. Defence families are mostly ‘invisible’ in our communities, and struggle to get access to the support and understanding they need.  

Our PhDs explored the experiences of young children and partners in defence families and shed light on some of the factors affecting the ADF, military members, and their spouses, children and loved ones.

Dutiful housewife and children model

One of the major challenges is attracting and retaining staff because of the high demands of the job. The military is a ‘greedy institution’, demanding great sacrifice from the defence member and their family.

Most Defence families are expected to relocate at least every two years. Frequent relocations, and absences from home, make it incredibly challenging for Defence families to have their own careers and supportive relationships within education settings, as the former Minister for Defence, Peter Dutton, highlighted in comments earlier this year. In July, the new federal government announced a funding boost to 48 community-based organisations that provide value to defence families and build connections.

As Defence is recognising, the expectation that partners will sacrifice their own careers to support the career of the ADF member is out of step with the vast majority of modern families, which have dual careers. It is also out of step with the lives of children, who are connected to peers, educators and the wider community.

Over 73% of Australian couple families have two sources of income and women make up 19% of the ADF. The ADF seeks to be an employer of choice. 

Children are often quite connected to their extended family, and their community through extracurricular activities. Additionally, many build a sense of belonging and a sense of place within their education communities.

Perfect female partners and perfect children

There is pressure on partners of Defence members to perform a ‘perfect spouse’ role, which is at odds with modern society.

The model assumes ‘perfect partners’ will sacrifice not only their career, but will also dutifully perform a ‘perfect spouse’ role. They will not complain about the inconvenience of Defence life. For example, participants said they felt pressure to ‘suck it up and deal with it’ when they were having trouble during deployments. 

The model often requires families to give up access to sources of support which provide a protective buffer. These include extended family, friends within their community, educators, health care professionals and community groups. Additionally, access to specialist services may not be available where they are posted, or those services might not understand the experience of being a military family. 

Incorrect or outdated information about the support Defence families receive can have negative impacts, such as the perception that families receive free housing, as well as some more outlandish claims. For example, one participant said some of her friends thought she travelled on Air Force planes every time they went on holiday.

Children can also experience a lack of empathy from peers, and even teasing if they attend early childhood services or schools that have little experience with military families.

When families don’t receive the support and understanding they need from their communities, it can impact their willingness to stay associated with the military. 
Retention of highly trained members is difficult, with many personnel citing ‘family reasons’ when they leave. As one family explained:

We had never planned for it to be Caleb’s career forever. In the end we chose to leave much earlier because of the promotion they offered him. This meant he was going to be away more often for training. When Jess turned 3 we realised Caleb had only been there 1 year of her life…(a) big issue for us. Caleb had missed the first soccer games and other big events in the children’s lives.

The military also makes enormous demands of spouses and families. Defence families have the impossible task of keeping each ‘institution’ (military and family) satisfied. 

This is especially the case when military members work away for months on deployment or lengthy training sessions. This leaves the partner to manage their own career, meet the needs of the children and run the household alone. 

This is especially stressful when the children are younger and are less able to understand the sudden disappearance of a parent. Partners are dealing with their own responses, and the responses of their children which can sometimes feed off each other. Children’s responses vary, and can include a regression in physical, social, emotional and cognitive (learning) skills.

While time apart is challenging, reintegration is often harder, as the defence member tries to fit back into family life. The children and family have adapted and grown while they were away. 

He was really tired and tried sleeping during the day …. The kids … made really loud noises suddenly and he would be angry… it is hard because when you are on base you are with adults for 9 months…adults who are good at following orders. When he came home, he was dealing with a toddler and a pre-schooler.

… the kids were up to different stages so he was often babying them and they didn’t want to be babied. Nine months is a long time in a young child’s life and they changed a lot. He was also really upset by some of the parenting decisions I had made in his absence.

Some children emotionally protect themselves by not getting close to the parent who has been away. 

Sam had a rebellion against me …There was some nervousness about coming home and trying to fit back in with the children, especially after Sam’s episodes of not wanting to have anything to do with me.

Educators reported children were very clingy when their parent deployed, often reluctant to play with peers at first. They were also less able to cope with small moments of tension in play episodes and were likely to react emotionally.

Support for young children

Until recently, there was also a lack of Australian resources to help young children understand the transitions and stresses they face within defence families. This showed a lack of understanding and acknowledgement of the sacrifices young children make for the defence forces.

Even though very young children may not be able to say why they are upset, it matters to them when a parent is no longer available. Fortunately, funding has enabled free research-based resources to be created to help parents, educators and family/social workers better support young children. 

Apart from frequent relocations and parental deployment, some children can also experience a parent having service-related physical injuries, medical and mental health conditions. This has been highlighted in the Royal Commission into Defence and Veteran Suicide, which has also identified these issues as barriers to recruitment.

Where to from here?

Effective recruitment and retention will need policy changes. To address attrition, this Recommendation Report called for policies guaranteeing that families with children could be asked to relocate a maximum of three times from birth to age 18. The report also recommended using a flexible model for deployment where parents deploy for longer but less often. In this model, training episodes can be built into the deployment to reduce transitions at home, reducing stress for children. 

This will also assist children to build strong and supportive relationships with their educators, peers and community. This builds stronger, more resilient communities that have a greater capacity to support children from defence communities.

Additionally, greater awareness of modern military experiences in the community will benefit current and future families. This means better understanding for families as they access community services, including GPs and early childhood educators, who might not appreciate the challenges of deployment and frequent relocations.

Bios  

Marg Rogers is a senior lecturer in the Early Childhood Education and Care program at the University of New England and the lead researcher for the funded Early Childhood Defence Program project (ECDP). This team, along with their Steering Committee of stakeholders has developed research-based, free, online resources for early childhood educators, parents and family/social workers to better support young children from Australian military families. She tweets at @MargRogers11 and you can find her on LinkedIn.

Amy Johnson is a lecturer in journalism and public relations at CQ University. Her current research projects include the Early Childhood Defence Project, which develops research-based, free online resources for educators and parents to better support young children from Australian military families as well as projects which enhance veterans and family’s wellbeing. Amy has lived experience of military service as an officer in the Royal Australian Navy (Reserve) and the partner of an ADF veteran. She tweets at @AmyJohnsonPhD and you can find her on LinkedIn.

What we must do now to rescue Australian schools

We expect education to be a catalyst for more equitable and inclusive societies yet too often governments and systems deploy one-stop solutions without detailed plans for how exactly improvements will be achieved or at what costs.

The Building Education Systems for Equity and Inclusion report comes from an Academy of Social Sciences of Australia workshop I hosted at the Gonski Institute for Education at UNSW. Working with representatives from school systems, academia, professional associations, industry, and teachers, the report offers recommendations aimed at addressing inequities in the school system.

Recommendations centre on five key issues: intergenerational policy failure; the need to look beyond the school gate; raising the voice of the profession; data, evidence and research; and ensuring a focus on teaching and learning.

Intergenerational policy failure

While the Australian Government is spending more on education than at any point in history, disparity gaps endure for various equity groups on a range of outcomes. Needs-based funding tied to the implementation of evidence-based reforms has been distorted courtesy of the unique policy architecture of Australian federalism. School systems have limited resources with which to pursue their objectives, and the design of school funding policies plays a key role in ensuring that resources are directed to where they can make the most difference. 

Australian federalism means there is neither a national system nor a state/territory system of school-based education. Common critiques focus on overlapping responsibilities and duplication. Achieving uniformity is difficult, time-consuming, and frequently limited to the lowest common denominator. However, education is a complex policy domain whose actions impact well beyond state or territory borders. Currently, no jurisdiction wants to be the first to admit there are problems, meaning systems can deteriorate substantially before action is taken. Asserting jurisdictional independence and sovereignty surrenders some of the strengths of federalism and removes important failsafe mechanisms targeting the overall health of the system.

A significant policy problem for education is the current teacher shortage. Substantial attention has been directed at Initial Teacher Education programs, and the attraction and retention of educators. Less focus has been granted to the affordability of housing for teachers. With housing (ownership and rental) costs rising, servicing commitments on a teacher’s salary can be difficult – particularly in major cities. The ability to live near where one works, or the availability of commuting infrastructure, means that workforce planning needs to take a multi-dimensional approach built on more than just raising the public profile of the profession.

Beyond the school gate

Australian Early Development Census (AEDC) data indicates that 22 per cent of children in the first year of formal schooling are vulnerable in at least one domain (e.g., physical, social, emotional, language, and communication), and 11 per cent in two. Early data indicates that the AEDC is a predictor of NAPLAN performance nine years later. With 8.1 per cent of early childhood providers operating with a staffing waiver due to a lack of qualified staff, early intervention is a difficult task.

School-based education exhibits many layers of segregation and stratification. Students from socio-educationally disadvantaged backgrounds, or those requiring adjustments due to disability, are not evenly distributed between sectors (government, Catholic, and independent). Peer effects can influence outcomes as much as individual socio-economic status. Cultural context has a large effect (between 33 and 50 per cent) on student performance, and the further a school is located from major cities, the lower the level of student outcomes. Failure to control for segregation and stratification makes it impossible to identify the drivers of school improvement in different locations and to better design interventions aimed at equity and inclusion.

Voice of the profession

Education is seen as ‘a’ if not ‘the’ solution to most social issues and the result is that schools are constantly being asked to do more without having anything removed. Many of the decisions to add things to schooling take place without any engagement or consultation with educators – not education bureaucracies but the educators who work in schools. The result is frequent changes in curriculum documents, additional mandatory training programs, shifting accreditation requirements, and updated and expansive administrative requirements, all with negligible impact on student outcomes. This has not just intensified teachers’ work but also de-democratised the profession. TALIS data indicates that only 28.7 per cent of Australian teachers feel that their views are valued by policy makers. With declining educator well-being and in the context of a teacher shortage, it is timely to establish a forum for representatives from the profession to have a voice in decisions regarding the form, objectives, targets, and outcomes of schooling as articulated in the national agenda.

Data, evidence, research

Improving the equity of education is not possible without data and evidence. You cannot improve that which you do not measure and monitor. An effective school education system needs sufficient data points and appropriate data linkage to understand how well it is performing, and robust evidence to identify priority areas for planning, intervention, and policy. While the Measurement Framework for Schooling in Australia details nationally agreed performance indicators, inconsistencies across state and territory datasets mean that crucial insights for informing policy at a national level are being lost. Data linkage is an urgent task for understanding the relationships between multiple factors and their impact on education and social outcomes to inform effective policy making, program design and research at a national scale.

Systems and schools that embed data-driven evaluation as a core professional responsibility have a greater impact on student outcomes. This has led to schools increasingly being asked to provide evidence of their impact. At the same time, despite an impressive track record, education research is under-funded. Although the establishment of the Australian Education Research Organisation (AERO) seeks to position Australian educators at the forefront of education research, without an increase in total available funding it is unlikely that research of the scale and scope necessary to effectively inform policy can be conducted. A promising avenue for increasing the quality of evidence and data use in schools and systems is co-design. However, it requires strategic leadership and matching incentives (including funding mechanisms) to better enable a systemic approach to research use, knowledge translation and breaking down boundaries between stakeholders.

Focus on teaching and learning

Pedagogical reform is a low-cost high-return approach to addressing distortions in a school system. Australian research (for example, Quality Teaching Rounds) has demonstrated that targeted and tailored interventions can positively impact student outcomes and teacher well-being. Yet, 76 per cent of teachers describe their workload as unmanageable. Australian schools have more instructional hours (828) than the OECD average (713), with teachers engaged in far more administration and school management than higher performing systems (e.g., Finland, Estonia). Attempts to recognise quality teaching through accreditation have received little uptake with only 0.33 per cent of the workforce certified at Highly Accomplished or Lead. Addressing equity and inclusion requires attention to how systems are designed to focus on the instructional core of schooling and making sure that resources (human, physical, and financial) are targeted towards achieving the highest quality of teaching in every classroom.

Summary

As the world resets to life with the pandemic, the internal tensions for differentiation and the external pressures for standardisation on education policy have never been greater. With the costs of public services increasing at the same time as government revenue and household incomes are falling, issues of educational equity, inclusion and excellence are amplified. The pressure to consolidate resources and pursue cost efficiencies will be felt most significantly by the poorest and most marginalised children and communities throughout the country. The stakes are high. Education is critical to human welfare, especially in times of rapid economic and social change.

Ensuring that resourcing and oversight focus on the health of the system, with wraparound services that support the workforce to have a voice and the means for high-quality instruction, gives Australian school systems the best chance of delivering equitable outcomes for all. 

Participants in the workshop

Professor Scott Eacott, Gonski Institute for Education, UNSW Sydney

Professor Eileen Baldry, UNSW Sydney 

Laureate Professor Jenny Gore, Teachers and Teaching Research Centre, University of Newcastle 

Professor Chris Pettit, City Futures Research Centre, UNSW Sydney

Professor Suzanne Carrington, Centre for Inclusive Education, QUT 

Dr Goran Lazendic, Australian Council for Educational Research (ACER)

Dr Virginia Moller, Steiner Education Australia 

Dr Rachel Perry, NSW AIS Evidence Institute

Dr Bala Soundararaj, City Futures Research Centre, UNSW Sydney 

Rebecca Birch, Teacher, Independent School 

Cecilia Bradley, Australasian Democratic Education Community 

Zeina Chalich, Principal, Catholic Education

Mark Breckenridge, Australian Secondary School Principals’ Association 

Elizabeth Goor, Montessori Australia 

Alice Leung, Head Teacher, Concord High School

Alex Ioannou, Montessori Australia 

Matthew Johnson, Australian Special Education Leaders and Principals’ Association  

Maura Manning, Catholic Education Parramatta 

Andrew Pierpoint, Australian Secondary School Principals’ Association 

Daniel Pinchas, Australian Institute for Teaching and School Leadership (AITSL)  

Diane Robertson, Principal, NSW Department of Education 

Michael Sciffer, PhD Candidate, Murdoch University

Scott Eacott PhD, is deputy director of the Gonski Institute for Education, and professor of education in the School of Education at UNSW Sydney and adjunct professor in the Department of Educational Administration at the University of Saskatchewan.