Local Schools, Local Decisions – A Lost Decade

Maurie Mulheron gives us all an insight into the effects that Local Schools, Local Decisions has had on education in NSW…

In a choreographed media conference outside a public high school in western Sydney on Sunday 11 March 2012, the NSW Government announced Local Schools, Local Decisions (LSLD), with the Premier and Minister for Education flanked by representatives of two principal groups. It was a plan purporting to ‘empower’ schools. But the evidence is that a far more sinister purpose, some years in the planning, was driving the policy.

The issue of ‘school autonomy’ is hardly new. It has been an article of faith for many conservative politicians and some economists around the world since the 1970s. It has its origins in a neo-liberal economic theory which holds that public provision is wasteful and ineffective, that government expenditure should be reduced, that taxation should be lowered, and that the more competitive the environment in which government services operate, the more efficient they will become. It is a theory applied to all aspects of public sector management. ‘School autonomy’ is not an idea relating to teaching and learning that was developed by teachers or education theorists. Its origins and purpose lie in economics and finance.

This is why two international management consultancy and accountancy corporations were engaged by the NSW Treasury between October 2009 and January 2010 to conduct a detailed financial audit of the NSW Department of Education and Training (DET), the first NSW government agency to submit to the process. In time, this would provide Treasury with the economic rationale for LSLD.

The overarching work was undertaken by the Boston Consulting Group (BCG), which was contracted “…to undertake a scan of DET expenditure and to develop a methodology that will allow Treasury to undertake future scans of other agencies.” Its purpose was to achieve significant financial savings. The January 2010 BCG document was called Expenditure Review of the Department of Education and Training (DET) – Initial Scan.i

The second corporation engaged at the time to undertake complementary work was Price Waterhouse Coopers (PWC). Its December 2009 report, DET School-based employee related costs review – Interim Report, was also prepared for the NSW Cabinet. While the BCG scan dealt with all the operations of the Department, the PWC report dealt specifically with staffing costs. As stated in its objectives, the report was to “…review areas of expenditure relating to DET’s School-based employees where there is scope for change and recommend actions to reduce DET’s expenditure in these areas.”ii

SECRET CABINET DOCUMENTS LEAKED

Neither of these Cabinet-in-Confidence documents was ever meant to be seen by the community or the teaching profession. However, in the lead-up to the March 2011 state election they came into the possession of the NSW Teachers Federation, which, in response, reiterated the union’s concerns that ‘school autonomy’ models had seriously weakened the public provision of education. The evidence for this had been mounting overseas for many years. In Australia, during the 1990s the Victorian Liberal Government instigated a dramatic experiment in devolution through the passing of the Education (Self-Governing Schools) Act (1998). It was later repealed by an incoming Labor Government, but not before Victoria’s performance in the international PISA testing program, the benchmark governments themselves use for achievement, had fallen below the Australian average in all tested areas: reading, mathematics and science.iii

In the week leading up to the 2011 NSW state election, the Sydney Morning Herald (SMH) revealed the intent of the secret BCG and PWC reports.iv “The shock comes not so much from the report’s far-reaching findings – which cut deep – but in the way it has been kept secret for so long. The deception used to get hard-working principals and teachers to, in effect, do the dirty work, will strike them as a betrayal.”v

And the betrayal was clearly articulated in the BCG report, “We have identified some quick wins, but have focused mostly on identifying the major opportunities to drive significant savings over time.”

To achieve this, the BCG argued throughout the review for the merits of Victoria’s devolved school autonomy model and, indeed, used Victoria as the benchmark. It noted that “NSW appears to have approximately 9000 more ‘in-school’ staff than Victoria”, that “NSW appears to have 13% more school related staff than Victoria”, and that “NSW appears to have 12% more non-teaching staff than Victoria.” The review went on to argue that once a model of devolution similar to Victoria’s was adopted, “DET should aim to capture as much of this gap [in staffing levels] as possible.”vi

In essence, the BCG review argued that cost cutting through devolution could provide “opportunities … worth $500-$700 million in recurrent costs and $800-$1000 million in one-off benefits.” The BCG review even advised how the devolved model could be sold to the public: “Possible to position these initiatives as part of a broader school regeneration or schools for the future program.”vii

What was becoming clear was that the NSW Treasury was determined to reduce the number of employees across the NSW public education system, and this was the focus of the second scan, undertaken by PWC. The strategy was to ensure principals delivered the savings. Indeed, one section was headed “Empower Principals to act”, in which the report stated, “We believe that increasing Principal accountability for managing School-based costs should be focused on driving a positive financial impact in the short to medium term while also maintaining educational outcomes.”viii

REPORTS REJECTED THEN DUSTED OFF

These two reports could easily have been dismissed, provided as they were to the NSW Cabinet in the final months of the Labor administration, with a state election impending in March 2011. It should be noted that the extreme nature of the reports’ recommendations led the then Labor Education Minister to shelve both of them. However, they cannot be so easily ignored, because both reports were to inform, and were referenced in, the incoming NSW Coalition Government’s Commission of Audit, which released an Interim Report into Public Sector Managementix in late January 2012 and a Final Report: Government Expenditurex in May 2012. Indeed, in the latter paper there are 64 references to the benefits of devolution as a means of achieving efficiencies across the whole of government.

The NSW Commission of Audit Final Report of May 2012 stated,

“For many years financial management in NSW has been confusing, lacking in transparency and below the standards expected of efficient and effective government. This situation is not sustainable.”

The answer, it argued, was that,

“The devolution of authority and accountability, specifically in the areas of education and health, means expenditure (and power) must move from the centre to more local units.”

“The Commission is generally of the view that devolution should not increase expenditure in aggregate though capabilities and systems will need attention at the start. Expenditure in local units should however increase and be offset by reductions at the centre. These are exciting reforms that offer a new era for TAFE, more power and responsibility to school principals, and more community and clinician input and responsibility within Health.”xi


THE 47 SCHOOL TRIAL

Running parallel to the work that BCG and PWC were undertaking from October 2009 until January 2010 was a devolution trial involving 47 schools, called the School-Based Management Pilot, which was to test some of the key BCG and PWC concepts, notably whether local decision-making could produce savings similar to those captured in Victorian schools. The trial, which also began in late 2009, had originally been planned to end in 2010 but continued through to late 2011. Just a few weeks later, in January 2012, the Final Report of the Evaluation of the School-Based Management Pilot was released.xii

Even though the justification for the 47 schools trial was that it would lift student achievement, the entire section on student results in the final evaluation report was a mere 85 words in a document that ran to 92 pages. This should not have been a surprise, as no baseline academic data was collected at the beginning of the trial, nor any other key data on student suspensions, behaviour referrals, attendance or staff turnover. In fact, the only data collected by the NSW Department of Education related to student enrolments, data that is collected from every school annually. This revealed that, for the duration of the trial, 21 of the 47 schools lost enrolments, but this data was excluded from the final report. Instead, the evaluation based its positive findings on scant empirical evidence, relying on anecdotal and subjective observations, which included the supposed comments of four different principals who all uttered almost identical phrases: “This has created a positive buzz in the school”; “[There’s] a buzz about the school in the town”; “Another principal reports ‘a buzz around the school in the community’”; “and there is a buzz about the school in town.” Four different principals, all commenting on a perceived “buzz”. Yet this woefully inadequate evaluation did not prevent the new Coalition Education Minister from citing the trial’s “success” as a key justification for the introduction of LSLD.

The true purpose of the 47 schools trial was made clear in the earlier BCG report, which revealed that the quarantined devolution model had delivered savings of $15-25 million.xiii Later in the report it was argued, “To capture savings from devolution requires more than the current rollout of the current [47 schools] trial. Current trial involves additional costs that will need to be phased out (e.g., to cover higher than average staff costs in some schools) and does not yet address staffing implications at the State and Regional Office.”xiv The “additional costs” were the significant additional funding each of the 47 schools received from the Department, in effect a temporary financial sweetener that would ensure a positive evaluation. It was only the BCG report that exposed that there was no intention to maintain this level of funding support beyond the trial. Towards the end of the BCG report the strategic thinking behind the trial was laid bare: “[Must] test and measure impact and risk of devolved model(s) to prove concept. Assess risks and put in place any mitigation strategies to manage them.”xv

When LSLD was announced in March 2012, it was marketed as an education policy. This was the first of many falsehoods promulgated by the Government. There was no mention of the Boston Consulting Group report of 2010; no mention of the Price Waterhouse Coopers report; and no mention of the NSW Commission of Audit Reports of 2012 either. Nor did the Government ever reveal the real purpose of the 47 schools trial.

In reality, LSLD was always going to be about expenditure and the efficiency savings that could be secured: “There is considerable scope in NSW to reallocate expenditure in education and training to improve outcomes, through greater devolution of resource allocation decisions to principals and TAFE Institute Directors. This can occur within existing expenditure budgets.”xvi It is worth noting that the findings of the BCG report regarding the savings that could be accrued through devolution were referenced in the 2012 NSW Commission of Audit report.

So, what did the NSW Commission of Audit’s recommended ‘reductions at the centre’, a critical feature of Local Schools, Local Decisions, mean in practice? It is important to revisit the NSW Treasury’s demand on the Department of Education at the time.

Savings measures had to be identified by the Department in the 2011-2012 NSW budget to cover the four-year budget period up to 2015-2016. These measures were implemented as “general expenses in the education and communities portfolio have still outstripped the growth in government revenue”.xvii

The Department needed to find $201 million in savings from the 2012-2013 budget and $1.7 billion over the four-year forward estimates period. The measures also included the 2.5 per cent labour expense cap, as detailed in the NSW Public Sector Wages Policy, which had been reinforced by changes to the NSW Industrial Relations Act.

The savings demanded of the Department were introduced at the same time that Local Schools, Local Decisions was rolled out. In reality the ‘reductions at the centre’ resulted in a significant and unprecedented loss of positions from the Department, both public servant and non-school based teaching positions. And this, not a lift in student outcomes, was the primary objective of Local Schools, Local Decisions.

Ken Dixon, the general manager of finance and administration within the NSW Department of Education at the time, later described the policy to give principals more autonomy over school budgets as being driven by cost savings. In public comments he argued, “The Local Schools, Local Decisions policy is just a formula to pull funding from schools over time.” Mr Dixon, in a key senior Departmental position at the time the policy of Local Schools, Local Decisions was being developed, also revealed that the loss of at least 1600 jobs in the Department was factored into the business case.xviii

The ‘reductions at the centre’ included the loss of hundreds of non-school based teachers and support staff from programs throughout NSW including from curriculum support, professional development, staffing, drug and alcohol education, student welfare, student behaviour, community liaison, staff welfare, the equity unit, rural education, assessment and reporting, special education, and multicultural education. In essence, the capacity for the Department to initiate and fund system-wide support for teachers was decimated. To this day, the Department of Education has not been able to rebuild any significant systemic support.

6 MONTHS ON: THE CUTS ARE CONFIRMED

From the day LSLD was announced, the NSW Teachers Federation had opposed it, providing members and the public with the evidence revealed to the union in the leaked BCG and PWC reports. An intense state-wide campaign was instigated. The union had been researching ‘school autonomy’ since at least 1988, prompted by the Metherell crisis. It had also studied closely the impact of devolution in other jurisdictions, including Victoria, New Zealand and the UK. And there had been more recent experience of ‘school autonomy’ policies imposed in NSW.

Just a few years earlier, in 2008, the Federation had been involved in a bitter and protracted industrial dispute with the NSW government over staffing, including the loss of service transfer rights for teachers. The concern was the dramatic negative consequences for difficult-to-staff schools in outer metropolitan and rural areas. In a fax sent to all schools in the lead-up to a 24-hour strike, the union showed remarkable prescience in warning that, “[The Government’s procedures will] establish the preconditions for the full deregulation agenda as in Victoria. Federation is in no doubt that if the NSW government succeeds in destroying the state-wide teacher transfer system that the next step is to introduce devolved staffing budgets to schools which include teacher and non-teacher salaries.”xix Just four years later this was a fundamental element of the LSLD model.

It was also in special education that the NSW government instigated a devolved funding model, trialled in the Illawarra in 2011 and implemented across the state in 2012. This new method of allocating funding had been foreshadowed in the BCG report, which identified potential savings of up to $100 million from the “fast growing special education area”. Once again comparing NSW to Victoria, the BCG report argued, “Victoria introduced reform initiatives in 2005 which stemmed growth of special education and suggests a broad opportunity exists to streamline NSW special education/equity programs”.xx The scheme was promoted to the community as Every Student, Every School, but it was clear that not every student in every school would receive the support they needed. The reduction in centralised support, for instance, led to funding cuts for thousands of students with autism and mental health concerns who were excluded from the Integration Funding Support program.xxi

It was not until 11 September 2012, six months after the LSLD announcement, that the intention to dramatically cut funding to the school system and TAFE was finally revealed by the then NSW Premier, Barry O’Farrell, who described the decision as “difficult but necessary”. The education funding to be cut totalled $1.7 billion, almost exactly the figure the BCG and PWC reports had recommended could be achieved by devolving budgets to local principals and TAFE institute managers. Also confirmed in the announcement was the loss of 1800 non-school based teaching and support staff positions from Department offices, both central and regional. This was similar to the number that Ken Dixon had explained was factored into the LSLD “business case”.xxii

For months following the March 2012 public release of the LSLD policy, the Federation had been attacked by the Government, which accused the union of lying to the profession about the intention to cut funding. Even though it was now vindicated, the Federation still found the news of the $1.7 billion cuts grim. Earlier, in response to the LSLD announcement, it had called all members out on strike, first in May 2012 for a two-hour stoppage and later in June for a 24-hour strike.

While they did not prevent the full impact of the cuts to education, the strikes did achieve some important protections, at least for public school teachers. In response to the industrial action, the Department withdrew its plan to provide all schools with an actual staffing budget, making it notional instead. A school’s staffing entitlement, which was to have been replaced by a principal’s unregulated choice of the ‘mix and number’ of staff, was also protected.

The Commission of Audit had declared that all staff ratios were to be removed from industrial agreements, citing NSW public school class sizes as the first example. “The Commission of Audit agrees that some workforce management policies and input controls are managerial prerogatives and should not be incorporated into awards…Examples are: teacher to pupil ratios…”xxiii

A public campaign in the lead-up to the strikes led to references to class sizes being reconfirmed in subsequent industrial agreements. Finally, the plan to abolish the incremental pay scale was also withdrawn.

GONSKI – A POLITICAL LIFELINE

Following a long campaign led by the Australian Education Union, and strongly supported by the NSW Teachers Federation, the Federal Labor Government announced a comprehensive inquiry into schools funding in April 2010. The inquiry panel was chaired by David Gonski, whose name would become synonymous with the report delivered to government in November 2011. It was not until 20 February 2012 that the report was released publicly. In April the following year, the Federal Government announced a new national $14.5 billion schools funding model, to be delivered over a six-year transition period from 2014 to 2019, with two-thirds of the funding provided in the final two years.

At its heart was the Schooling Resource Standard (SRS), effectively the minimum level of funding a school needs for the vast majority of its students to meet national outcomes. In essence, the more complex a school’s student profile, the greater the level of funding it would attract; in the case of NSW public schools, the additional funding would be provided to the system to distribute on a needs basis.

On 23 April 2013, NSW became the first state to sign a bilateral agreement with the Commonwealth, little more than seven months after the announcement of the $1.7 billion cuts at the state level. In reality, the Gonski funding model was seen by the then NSW Education Minister as a political lifeline. The NSW Department of Education was faced with a serious contradiction: on the one hand, it had built a financial model to implement LSLD that was designed to de-fund the system in order to deliver $1.7 billion in savings; on the other, from 2014 there would be additional money provided to schools. It soon became a case of a wasted opportunity. None of the additional recurrent funding could be used for significant and much-needed whole-of-system improvement, such as reduced class sizes, which for junior primary and lower secondary schools had not changed in many decades, or a reduction in face-to-face teaching loads, which also had not improved in decades. Indeed, little funding was retained by the Department at the centre to rebuild the programs that had been decimated back in 2012. In other words, the government had squandered the opportunity to capitalise on a key advantage of the public education system: its capacity to achieve massive economies of scale.

In a crude attempt to engender support for LSLD, the Department deliberately attempted to link LSLD with the additional Gonski funding in schools, as though the BCG and PWC audits, the Commission of Audit reports, the Public Sector Wages Policy, and the NSW government’s demand that departments reduce labour expenses every year had not occurred. This re-writing of history led the Department’s Centre for Education Statistics and Evaluation (CESE) to develop a survey instrument that linked two disparate variables, LSLD and the additional funding. But neither of these variables, the system-wide change to governance and the increase in Commonwealth and State funding, was dependent on the other. To conflate them in the first evaluation question, where each variable is portrayed as interdependent with the other, was a serious error, offending a basic tenet of research methodology. In response, the Federation raised the fundamental question of exactly what was being evaluated: a change to the governance model of the public school system announced in March 2012, or the additional funding achieved two years later, in 2014, through the National Education Reform Agreement (NERA).

LOST OPPORTUNITY

The additional funding had been allocated to individual schools untied, with few guidelines, minimal accountability and almost no programmatic system-wide support. Little wonder that even CESE’s Local Schools, Local Decisions Evaluation – Interim Report stated, “…we were unable to determine…what each school’s [Resource Allocation Model] RAM equity loading allocation was spent on.”xxiv

Firstly, the devolution model was never designed to make funding information transparent. Indeed, it was designed to do the exact opposite: to make funding matters more opaque. This was because the model was expressly designed for twin purposes: to deliver savings back to central government, and to allow governments to shift responsibility for those savings to local managers. It was only ever intended to give local schools the illusion of control.

Secondly, the model was never designed to distribute and manage significant increases in funding. There now existed no comprehensive, systemic, state-wide programs designed to lift student outcomes across all schools: “In terms of differential change over time, we found no relationship between changes over time in these engagement measures and levels of need, with the notable exception that students in higher-need schools typically showed less positive change over time in levels of social engagement than students in lower-need schools. In other words, the gap in this measure between higher-need and lower-need schools increased over time, rather than decreased.” [Author’s emphasis]xxv

CRITICAL VOICES IGNORED

Over the years, there has been a tendency for government departments, like the NSW Department of Education, to declare that policies are developed from ‘evidence-based decision-making’. Yet, in the case of Local Schools, Local Decisions, this assertion must be contested. Moreover, it may actually be the case that the declaration of ‘evidence’ is a strategy to shut down debate, noting that very little in education policy, practice and theory exists without competing points of view.

The extreme Local Schools, Local Decisions policy was implemented dishonestly. Its true intentions were hidden from the profession, with critical voices and available research ignored. In relation to ‘school autonomy’ models, John Smyth observed, “Sometimes an educational idea is inexplicably adopted around the world with remarkable speed and consistency and in the absence of a proper evidence base or with little regard or respect for teachers, students or learning.”xxvi

In his essay, The disaster of the ‘self-managing school’ – genesis, trajectory, undisclosed agenda, and effects, Professor Smyth went on to argue that ‘school autonomy’ in reality is government “…steering at a distance, while increasing control through a range of outcomes-driven performance indicators.”

Further he said, “The argument was that schools would be freed up from the more burdensome aspects of bureaucratic control, and in the process allowed to be more flexible and responsive, with decisions being able to be made closer to the point of learning. Many of these claims have proven to be illusory, fictitious, and laughable to most practising school educators.”

Dr Ken Boston, one of the members of the Review of Funding for Schooling panel chaired by David Gonski, expressed frustration at the continuing promotion of devolution, arguing that “. . . school autonomy is an irrelevant distraction. I worked in England for nine years, where every government school . . . has the autonomy of the independent public schools in WA – governing boards that can hire and fire head teachers and staff, determine salaries and promotions, and so on. Yet school performance in England varies enormously from school to school, and from region to region, essentially related to aggregated social advantage in the south of the country and disadvantage in the north.”xxvii

Plank and Smith, in their paper Autonomous Schools: Theory, Evidence and Policy, argued, “Placing schools at the centre of the policy frame, freeing them from bureaucracy and exhorting them to do better has not by itself generated many of the systemic improvements, innovation, or productivity gains that policy makers hoped for.”xxviii

Professor Stephen Dinham from the University of Melbourne acknowledged the lack of evidence for ‘school autonomy’ models: “The theory that greater school autonomy will lead to greater flexibility, innovation and therefore student attainment is intuitively appealing and pervasive. School autonomy has become something of an article of faith. However, establishing correlation and causation is not so easy.” Dinham says, “What is needed above all however, is clear research evidence that the initiative works, and under what conditions, rather than blind enthusiasm for the concept.”xxix

‘School autonomy’ was responsible for a “lost decade” in education, according to one of New Zealand’s leading education researchers, Dr Cathy Wylie, formerly of the New Zealand Council for Educational Research (NZCER). In her book, Vital Connections: Why We Need More Than Self-Managing Schools,xxx Wylie argued that schools in New Zealand needed more central support, and that devolution had caused the loss of ‘vital connections’ between schools.

Even the OECD was ignored. In its PISA 2009 cross-country correlation analysis, PISA 2009 Results: What Makes a School Successful? – Resources, Policies and Practices (Volume IV), the OECD authors argued that “. . . greater responsibility in managing resources appears to be unrelated to a school system’s overall student performance” and that “… school autonomy in resource allocation is not related to performance at the system level.”xxxi

And yet, this OECD report was released three years before the 2012 NSW Commission of Audit argued enthusiastically for a devolution model (sold later as Local Schools, Local Decisions).

A decade on, the catastrophic policy failure of Local Schools, Local Decisions is clear. The findings of the Department’s own research body, CESE, amplify this:

  • “To date, LSLD appears to have had little impact on preliminary outcome measures.”

  • “These results suggest that LSLD has not had a meaningful impact on attendance or suspensions.”

  • “However, the direction of the relationship was not as we expected: students in higher-need schools showed less growth in social engagement than students in lower-need schools.”xxxii

So, what has occurred after this lost decade? No lift in student outcomes, the gap between the advantaged and disadvantaged widening, a massive increase in casual and temporary positions in schools, no improvements in attendance, no improvement in suspension rates, no lessening of ‘red-tape’, a dramatic increase in workload, growing teacher shortages, and the salary cap still in place. The paradox is, of course, that the more localised the decision-making, the more onerous, punitive and centrally controlled are the accountability measures.

By 2022, the Local Schools, Local Decisions policy has left the NSW Department of Education with no levers: no capacity to develop, fund and implement systemic improvements to lift all schools or to achieve massive economies of scale. Purportedly, the bulk of funding sits in school bank accounts, with the Department unable to determine what it is being spent on. Instead, we are left with policy by anecdote, as revealed in the comments quoted within the CESE evaluation.

The tragedy of Local Schools, Local Decisions is that its structure remains in place, even if its name has changed. By 2021, the NSW Department had realised that LSLD had failed public schools, their teachers and their students. It had also failed the community of NSW. Addicted to policy by alliteration, the Department rebadged it as the School Success Model (SSM). But this title reveals the continuing mindset of both the Government and the Department. If we have learnt anything from the last decade, it is that schemes like LSLD are essentially a cover for a government to abrogate its obligation to all children, all teachers and all public schools. What is needed instead is for the NSW government, through its department, to accept that it has an onus to provide systemic programmatic support rather than devolve the risk and responsibilities onto individual schools. Finally, the time to listen to and accept the advice of the teaching profession, and to dismiss the powerful, politically connected accountancy firms, is long overdue.

Original article available here (Journal for Professional Learning)

i Boston Consulting Group (BCG) Expenditure Review of the Department of Education and Training (DET) – Initial Scan (2010) pp 188-193

    ii PriceWaterhouse Coopers (PWC) DET School-based employee related costs review – Interim Report (2009) p2

    iii AEU (VIC) Submission to the Victorian Competition and Efficiency Commission Inquiry into School Devolution and Accountability (2012) p2

    iv Anna Patty SMH Secret cuts to schools (19 March 2011)

    v Anna Patty SMH Secret report administers a shock to the system (19 March 2011)

    vi BCG op.cit. pp 188-193

    vii BCG op. cit. p92

    viii PWC op. cit. p18

    ix NSW Commission of Audit Interim Report into Public Sector Management (January 2012)

    x NSW Commission of Audit Final Report: Government Expenditure (May 2012)

    xi NSW Commission of Audit op. cit. p10

    xii NSW DET Final Report of the Evaluation of the School-Based Management Pilot (2012)

    xiii BCG op. cit. p13

    xiv BCG op. cit. p34

    xv BCG op. cit. p146

xvi NSW Commission of Audit op. cit. p71

    xvii NSW Department of Education and Communities Saving measures to meet our budget (2011)

    xviii Anna Patty SMH Tip of the iceberg: warning 1200 more education jobs to go (14 September 2012)

    xix NSW Teachers Federation fax to all schools (13 May 2008)

    xx BCG op. cit. p58 and p150

    xxi “Reform funding on need” in Education (NSWTF) (16 August 2022)

    xxii Anna Patty SMH NSW to slash $1.7 billion from education funding (11 September 2012)

xxiii NSW Commission of Audit Interim Report into Public Sector Management (24 January 2012) p83

xxiv Centre for Education Statistics and Evaluation (CESE) LSLD Evaluation Interim Report (July 2018) p8

xxv CESE op. cit. p8

xxvi John Smyth The disaster of the ‘self-managing school’ – genesis, trajectory, undisclosed agenda, and effects Journal of Educational Administration and History 43(2): 95-117 (May 2011)

xxvii Quoted in Maurie Mulheron On Evidence Based Decision-Making Education Vol 97 No 7 (7 November 2016)

    xxviii David N Plank and BetsAnn Smith Autonomous Schools: Theory, Evidence and Policy in Handbook of Research in Education Finance and Policy Helen F. Ladd and Edward Fiske (eds) (2007)

    xxix Stephen Dinham The Worst of Both Worlds: How the US and UK Are Influencing Education in Australia Journal of Professional Learning (Semester 1 2016)

    xxx Cathy Wylie Vital Connections: Why We Need More Than Self-Managing Schools (2012)

    xxxi OECD PISA 2009 Results: What Makes a School Successful? – Resources, Policies and Practices (Volume IV) (2010)

xxxii CESE op. cit. pp 53, 51, 51

Australian Education Union (AEU Victoria) (2012) Submission to the Victorian Competition and Efficiency Commission Inquiry into School Devolution and Accountability

Boston Consulting Group (BCG) (2010) Expenditure Review of the Department of Education and Training (DET) – Initial Scan

Centre for Education Statistics and Evaluation (CESE) (July 2018) LSLD Evaluation Interim Report

Dinham, S. (2016) The Worst of Both Worlds: How the US and UK Are Influencing Education in Australia, Journal of Professional Learning (Semester 1 2016)

Gonski, D. et al (2011) Review of Funding for Schooling, Department of Education, Employment and Workplace Relations (DEEWR)

NSW Commission of Audit (January 2012) Interim Report into Public Sector Management

NSW Commission of Audit (May 2012) Final Report: Government Expenditure

OECD (2010) PISA 2009 Results: What Makes a School Successful? – Resources, Policies and Practices (Volume IV)

Patty, A. (19 March 2011) Secret cuts to schools, Sydney Morning Herald

Patty, A. (19 March 2011) Secret report administers a shock to the system, Sydney Morning Herald

Patty, A. (11 September 2012) NSW to slash $1.7 billion from education funding, Sydney Morning Herald

Patty, A. (14 September 2012) Tip of the iceberg: warning 1200 more education jobs to go, Sydney Morning Herald

Plank, D.N. and Smith, B. (2007) Autonomous Schools: Theory, Evidence and Policy, in Helen F. Ladd and Edward Fiske (eds) Handbook of Research in Education Finance and Policy

PriceWaterhouse Coopers (PWC) (2009) DET School-based employee related costs review – Interim Report

Smyth, J. (May 2011) The disaster of the ‘self-managing school’ – genesis, trajectory, undisclosed agenda, and effects, Journal of Educational Administration and History 43(2): 95-117

Wylie, C. (2012) Vital Connections: Why We Need More Than Self-Managing Schools
