House of Commons
Science and Technology Committee

Evidence Check 1: Early Literacy Interventions: Government Response to the Committee's Second Report of Session 2009-10

Second Special Report of Session 2009-10

Ordered by the House of Commons to be printed 24 February 2010

HC 385
Published on 4 March 2010 by authority of the House of Commons
London: The Stationery Office Limited
£4.50

The Science and Technology Committee

The Science and Technology Committee is appointed by the House of Commons to examine the expenditure, administration and policy of the Government Office for Science. Under arrangements agreed by the House on 25 June 2009 the Science and Technology Committee was established on 1 October 2009 with the same membership and Chairman as the former Innovation, Universities, Science and Skills Committee, and its proceedings were deemed to have been in respect of the Science and Technology Committee.

Current membership

Mr Phil Willis (Liberal Democrat, Harrogate and Knaresborough) (Chairman)
Dr Roberta Blackman-Woods (Labour, City of Durham)
Mr Tim Boswell (Conservative, Daventry)
Mr Ian Cawsey (Labour, Brigg & Goole)
Mrs Nadine Dorries (Conservative, Mid Bedfordshire)
Dr Evan Harris (Liberal Democrat, Oxford West & Abingdon)
Dr Brian Iddon (Labour, Bolton South East)
Mr Gordon Marsden (Labour, Blackpool South)
Dr Doug Naysmith (Labour, Bristol North West)
Dr Bob Spink (Independent, Castle Point)
Ian Stewart (Labour, Eccles)
Graham Stringer (Labour, Manchester, Blackley)
Dr Desmond Turner (Labour, Brighton Kemptown)
Mr Rob Wilson (Conservative, Reading East)

Powers

The Committee is one of the departmental Select Committees, the powers of which are set out in House of Commons Standing Orders, principally in SO No. 152. These are available on the Internet via www.parliament.uk.

Publications

The Reports and evidence of the Committee are published by The Stationery Office by Order of the House. All publications of the Committee (including press notices) are on the Internet at http://www.parliament.uk/science. A list of reports from the Committee in this Parliament is included at the back of this volume.

Committee staff

The current staff of the Committee are: Glenn McKee (Clerk); Richard Ward (Second Clerk); Dr Christopher Tyler (Committee Specialist); Xameerah Malik (Committee Specialist); Andy Boyd (Senior Committee Assistant); Camilla Brace (Committee Assistant); Dilys Tonge (Committee Assistant); Melanie Lee (Committee Assistant); Jim Hudson (Committee Support Assistant); and Becky Jones (Media Officer).

Contacts

All correspondence should be addressed to the Clerk of the Science and Technology Committee, Committee Office, 7 Millbank, London SW1P 3JA. The telephone number for general inquiries is 020 7219 2793; the Committee's e-mail address is [email protected].

Second Special Report

On 18 December the Science and Technology Committee published its Second Report of Session 2009-10, Evidence Check 1: Early Literacy Interventions [HC 44]. On 16 February 2010 the Committee received a memorandum from the Government which contained a response to the Report. The memorandum is published as an Appendix to this Report.

Appendix: Government response

Every Child a Reader: Reading Recovery

1. The Government's policy that literacy interventions should take place early on in formal education is in line with the evidence.
(Paragraph 25)

2. The Government's position that early literacy interventions are an investment that saves money in the long run is evidence-based. (Paragraph 32)

We welcome the report's findings and are glad the Committee is in agreement with us on these points.

3. Ms Willis is right to acknowledge the need to compare Reading Recovery with alternative interventions. We conclude that, whilst there was evidence to support early intervention, the Government should not have reached the point of a national roll-out of Reading Recovery without making cost-benefit comparisons with other interventions. (Paragraph 37)

Firstly, it is important to emphasise that Reading Recovery is just one part of the three-wave Every Child a Reader (ECaR) programme roll-out. We believe that the evidence base on ECaR and Reading Recovery was strong enough to justify implementing a large-scale intervention programme which incorporates intensive specialist tuition for the lowest attaining pupils with a wider commitment to improving teaching throughout the school. The evidence base at the time the decision was taken included:

• Research into the first year of ECaR, commissioned by the Institute of Education (Burroughs-Lange, 2006).
• Management information about ECaR, which provides pre- and post-test results for children who receive the intervention, and qualitative insight into schools' and pupils' views (ECACT, 2006).
• A long history of research into the effectiveness of Reading Recovery, the core intensive intervention of ECaR.
• In-house analysis of school attainment in ECaR schools, compared to other schools.
• An economic assessment of the return on investment of early intervention to address literacy difficulties (KPMG, 2006).
• Brooks (2002; revised edition 2007), which concluded that Reading Recovery is an effective intervention, with effect sizes from different studies standing up well in comparison to a range of other interventions.

Another important basis for adopting the Reading Recovery programme was scalability: currently, Reading Recovery provides the only national network in the UK capable of training sufficient numbers of teachers to deliver on the scale required. Together with evidence of effectiveness, the issue of scalability meant that Reading Recovery was accepted as part of the ECaR programme.

In the case of complex social policy issues, it is rare to find an evidence base which clearly indicates exactly how problems should be tackled. In such cases, it is imperative that Government acts on the best available evidence in order to begin to tackle entrenched problems such as poor literacy, and evaluates the results.

4. We are concerned by the low quality of data collection in UK trials on literacy interventions. Government-funded trials should seek the best data so as to make the results as powerful as possible. Running trials that do not collect the best data is a failure both in terms of methodological approach and in terms of value for money. (Paragraph 40)

As a Department, we can clearly only comment on our own studies and not more generally on UK trials on literacy interventions, although we are doing more to encourage independent researchers to strengthen their assessments of cost-effectiveness. We are committed to collecting robust data on our pilots and interventions that is of sufficient quality to inform policy decisions.
The Department carefully chooses the appropriate methodology for evaluating its policies and pilots, based on advice from expert social researchers on scientific credibility and practicality in each case, and through discussions at its Research Approvals Committee (RAC) and Policy Evaluation Group (PEG). The Department is increasing the involvement of expert researchers at an early stage in pilot development to help ensure that the methodological data needs drive the design of the pilot. More generally, the analytical community work on behalf of the Chief Scientific Adviser to ensure that evidence underpins policy development decisions across the Department.

We believe that research involving carefully matched comparison groups can give good quality evidence, especially when there are so many variables affecting social policy. When deciding on the methodology for evaluating pilots and trials, we need to take account of the relative burden of data collection, the cost, and the timescales involved, as well as the quality of the data.

5. The Government should be careful when selecting evidence in support of educational programmes that have changed over time. Reading Recovery today differs from its 1980s and 1990s ancestors. Evidence used to support a national roll-out of Reading Recovery should be up-to-date and relevant to the UK. The Government's decision to roll out Reading Recovery nationally is not based on the best quality, sound evidence. (Paragraph 45)

We agree with the Committee that research findings need to be interpreted carefully to reflect both the time period and the country in which the study has taken place. However, they can still give useful evidence on the potential effectiveness of interventions. ECaR was piloted in the UK in advance of decisions about roll-out.

It is important to emphasise again that Reading Recovery is just one element of ECaR. The choice of Reading Recovery as the one-to-one intervention wave of the ECaR programme was made during the pilot phase led by the Every Child a Chance Trust. Information and evidence collected from the pilot was of high quality and, as noted in our response to point 3 (above), we felt we had a range of good and relevant evidence to support the roll-out decision.

6. We recommend that the Government should draw up a set of criteria on which it decides whether a research project should be a randomised controlled trial. (Paragraph 49)

The Magenta Book provides guidance to Government officials undertaking evaluations. It is currently being revised to become the single source of guidance on evaluation for analysts and policy makers in government (see response to point 15). There are several different types of evaluation and methods for conducting evaluations, including Randomised Controlled Trials (RCTs). The revised Magenta Book will outline the pros and cons of RCTs and other methods for addressing different types of research questions. It will also provide analysts and policy developers with criteria to determine the appropriate evaluation method in a given scenario, and offer guidance on how to conduct the evaluation.

7. We conclude that a randomised controlled trial of Reading Recovery was both feasible and necessary. (Paragraph 54)

A Randomised Controlled Trial (RCT) of ECaR, which is a school-level programme, would have required random allocation of schools to the programme. This would have been very complex in the circumstances, given the need for teacher training to be planned and supported at local level. An RCT of Reading Recovery itself was not considered good value because there is already a strong evidence base on the effectiveness of the programme, including international RCTs.

It is also the case that research involving carefully matched comparison groups can give evidence of an acceptable quality, especially when, as in many areas of social policy, there are so many different variables which affect the outcome. Well-constructed quasi-experimental studies can therefore provide a fit-for-purpose solution, and the evidence gained from the ECaR pilot was robust.

8. We recommend that the Government identify some promising alternatives to Reading Recovery and commission a large randomised controlled trial to identify the most effective and cost-effective early literacy intervention. (Paragraph 55)

There are a very large number of initiatives currently available commercially. It would not be feasible to include all of them in one evaluation. Instead, reviews such as those undertaken by Brooks compare the impacts of different initiatives across different studies. We have already committed to regularly updating 'What Works for Children with Literacy Difficulties' (Brooks, 2007) as part of our on-line guidance, a recommendation from the report 'Identifying and Teaching Children and Young People with Dyslexia and Literacy Difficulties' (Rose, 2009).

Also, as the Committee is aware, DCSF has commissioned an independent evaluation of Every Child a Reader, which is running from autumn 2009 to March 2011. The evaluation is using a mixed-method approach to evaluate how the programme has been implemented, its impact on outcomes, and its value for money. We will be keeping the evaluation process and the outcomes under review.

9. Teaching children to read is one of the most important things the State does. The Government has accepted Sir Jim Rose's recommendation that systematic phonics should be at the heart of the Government's strategy for teaching children to read. This is in conflict with the continuing practice of word memorisation and other teaching practices from the 'whole language theory of reading' used particularly in Wave 3 Reading Recovery. The Government should vigorously review these practices with the objective of ensuring that Reading Recovery complies with its policy. (Paragraph 59)

The term 'whole language theory of reading' has acquired a pejorative meaning, and is used to imply that a theory or methodology has no regard for the use of print information, sound/letter learning, phonics or the ability to decode text, and that children are encouraged to invent or guess text. This is not an accurate description of Reading Recovery. The reference to 'the practice of word memorisation' in Reading Recovery is equally inappropriate. Reading Recovery teachers are given very explicit instructions to the contrary. Memorisation is not an alternative to decoding in Reading Recovery.
However, the Rose review of early literacy clearly states that the ability to decode is not the end point of literacy learning:

'It is widely agreed that phonic work is an essential part, but not the whole picture, of what it takes to become a fluent reader and skilled writer, well capable of comprehending and composing text.' (Rose, 2006, paragraph 37)

Decoding is an essential skill for a reader encountering new, unexpected or unfamiliar words, but an efficient, fluent reader must develop a system whereby words read frequently can be read automatically. It is evident from the research literature that the balance of learning needs across these two dimensions changes as children become more fluent and automatic readers of words. Most readers learn intuitively how to retain in the memory that which they have deciphered, but for very low attaining children in Reading Recovery this skill may have to be taught explicitly. This is entirely different from the use of memorisation to circumvent the development of decoding skills.

Dyslexia

10. The Rose Report's definition of dyslexia is exceedingly broad and says that dyslexia is a continuum with no clear cut-off points. The definition is so broad and blurred at the edges that it is difficult to see how it could be useful in any diagnostic sense. (Paragraph 71)

We recognise that views on identifying and supporting children with dyslexia vary. The Rose report reviewed the many published definitions of dyslexia and the Government has accepted the report's definition, which proposes that dyslexic difficulties are best thought of as existing on a continuum of wider literacy difficulties, rather than forming a discrete category. Dyslexia is not unique in this regard: a number of other conditions, such as autism spectrum disorder, are on a continuum. We believe it is useful to have a working definition of dyslexia so that, as the Rose report concluded, "we can build professional expertise in identifying dyslexia and developing effective ways to help learners overcome its effects."

11. We conclude that 'specialist dyslexia teachers' could be renamed 'specialist literacy difficulty teachers'. There is a range of reasons why people may struggle to learn to read, and the Government's focus on dyslexia risks obscuring the broader problem. The Government's support for training teachers to become better at helping poor readers is welcome and to be supported, but its specific focus on 'specialist dyslexia teachers' is not evidence-based. (Paragraph 77)

The Government accepts Sir Jim Rose's view that: "There is also a need to develop better access for schools, parents and children to the advice and skills of specialist dyslexia teachers, who can devise tailored interventions for children struggling most with literacy, whether or not they have been identified as having dyslexia."

The term "specialist dyslexia teacher" is widely used and the British Dyslexia Association accredits courses meeting its professional criteria. Dyslexic difficulties are often not confined to literacy difficulties: pupils with dyslexia often have co-occurring difficulties which may be seen in aspects of language, motor co-ordination, mental calculation, concentration and personal organisation. Given these points, we believe it is sensible to continue with the "specialist dyslexia teacher" term rather than introduce an additional one.
However, we do accept that accreditation needs to reflect the best evidence-based practice, which will change over time. We have worked with the British Dyslexia Association to review its accreditation criteria for courses for specialist dyslexia teachers.

12. We recommend that future research on the impact of literacy interventions on children with dyslexia should be well-designed randomised controlled trials, using appropriate control groups (including children with other reading difficulties and 'normal' children), and test a range of literacy interventions. (Paragraph 82)

The Government recognises that more good quality evidence is required around the effectiveness of particular interventions for children with dyslexia. In addition, the Department recognises that Randomised Controlled Trials (RCTs) are an important research tool and is committed to carrying them out where appropriate. However, it is important to choose the most appropriate methodology: for a number of reasons, including ethical, funding, recruitment and timing issues, RCTs are not always possible or advisable in every circumstance. The Department therefore carefully considers all factors before commissioning research that will most usefully add to the existing evidence base. In addition, the Department will take on board the results of other research that has been carefully scrutinised and deemed to be of high quality when making policy and funding decisions related to interventions for dyslexia.

We will be asking Ofsted to undertake a survey to evaluate the extent to which, and with what impact, primary and secondary schools are using interventions to advance the progress of children and young people experiencing a wide range of literacy difficulties. It will be timed to provide an opportunity to evaluate the implementation of Jim Rose's recommendations.

13. We recommend that the Government be more independently minded: it should prioritise its efforts on the basis of research, rather than commissioning research on the basis of the priorities of lobby groups. (Paragraph 84)

The Government has set out its priority to improve the educational outcomes for children with special educational needs and has commissioned its own research. We will shortly be publishing "Special Educational Needs and Disability: Understanding Local Variation in Prevalence, Service Provision and Support". Expert stakeholder groups have a role to play in the development of policy and research priorities; indeed, some stakeholder groups actively commission high quality research which helps inform debate. The ultimate decisions on its research and policy priorities lie with the Government.

Conclusions

14. In broad conclusion, we found that there was a willingness from the Department to base its approach to early literacy interventions on the evidence. However, we discovered worryingly low expectations regarding the quality of evidence required to demonstrate the relative effectiveness and, in particular, the cost-effectiveness of different programmes. (Paragraph 87)

We believe that, when implementing ECaR, the decision to roll out nationally was made on the basis of sound evidence that the programme was effective, coupled with the need to deliver a stable intervention programme.

We absolutely recognise the importance of strengthening analysis of the cost-effectiveness of different policies in the future.
We can reassure the Committee that we have built this into our research programme and now consider whether a stronger focus on cost-effectiveness is required in all our research projects as part of our Research Approvals Committee process. Furthermore, the Department is doing more to encourage academics and other researchers to build a cost-effectiveness component into all their (externally funded) research work, and to avoid focusing only on impact/effect size.

15. We recommend that the Government review its Magenta Book with a view to raising its expectations of social science research and evidence in relation to policy. (Paragraph 88)

The Magenta Book is already under review. A revision was commissioned by the Government Social Research Unit in autumn 2009 and was awarded to the research consultancy SQW. A revised version will be available later in 2010. The aim of the revision is to edit and develop the Magenta Book to increase its influence and utility, and to make it the single source of government guidance on evaluation for analysts and policy makers.

The review of the Magenta Book is part of a programme of work to strengthen government capacity and skill in evaluation. This is being led by the Cross Government Evaluation Group (CGEG), set up last year in response to a Government Economic Service initiative. The remit of this cross-disciplinary group is to:

a) Identify and disseminate the conditions necessary for successful evaluation in government.
b) Strengthen guidance on evaluation and raise its status.
c) Strengthen training on evaluation and raise its status.
d) Identify ways to increase the perceived value of evaluation amongst policy and delivery colleagues in their decision-making, including advising on when and what to evaluate.
e) Support policy and delivery colleagues in their use of evaluation findings.
f) Develop internal evaluation capacity through facilitating cross-departmental knowledge sharing, learning and development, and peer review.
g) Strengthen links with academics, evaluation experts and relevant stakeholders both nationally and internationally, building international good practice.
h) Identify and address cross-cutting policy, programme and operational evaluation issues.
i) Clarify the role and value of evaluation in relation to the range of other commitments such as post-implementation monitoring, audits and reviews carried out by the NAO, PMDU, etc. Where possible, try to strengthen links between the various requirements and those responsible for them (e.g. those responsible for evaluation and those responsible for spending reviews).
j) Report main findings to Heads of Analysis (and the Government Chief Economist where necessary) and encourage them to influence top-level stakeholders.

List of Reports from the Committee during the current Parliament

The reference number of the Government's response to each Report is printed in brackets after the HC printing number.
Session 2009-10

First Report: The work of the Committee in 2008-09, HC 103
Second Report: Evidence Check 1: Early Literacy Interventions, HC 44 (HC 385)
Third Report: The Government's review of the principles applying to the treatment of independent scientific advice provided to government, HC 158-I (HC 384)
Fourth Report: Evidence Check 2: Homeopathy, HC 45

Session 2008-09

First Report: Re-skilling for recovery: After Leitch, implementing skills and training policies, HC 48-I (HC 365)
Second Report: The Work of the Committee 2007-08, HC 49
Third Report: DIUS's Departmental Report 2008, HC 51-I (HC 383)
Fourth Report: Engineering: turning ideas into reality, HC 50-I (HC 759)
Fifth Report: Pre-appointment hearing with the Chair-elect of the Economic and Social Research Council, Dr Alan Gillespie CBE, HC 505
Sixth Report: Pre-appointment hearing with the Chair-elect of the Biotechnology and Biological Sciences Research Council, Professor Sir Tom Blundell, HC 506
Seventh Report: Spend, spend, spend? The mismanagement of the Learning and Skills Council's capital programme in further education colleges, HC 530 (HC 989)
Eighth Report: Putting Science and Engineering at the Heart of Government Policy, HC 168-I (HC 1036)
Ninth Report: Pre-appointment hearing with the Chair-elect of the Science and Technology Facilities Council, Professor Michael Sterling, HC 887
Tenth Report: Sites of Special Scientific Interest, HC 717 (HC 990)
Eleventh Report: Students and Universities, HC 170-I (HC 991)

Session 2007-08

First Report: UK Centre for Medical Research and Innovation, HC 185 (HC 459)
Second Report: The work and operation of the Copyright Tribunal, HC 245 (HC 637)
Third Report: Withdrawal of funding for equivalent or lower level qualifications (ELQs), HC 187-I (HC 638)
Fourth Report: Science Budget Allocations, HC 215 (HC 639)
Fifth Report: Renewable electricity-generation technologies, HC 216-I (HC 1063)
Sixth Report: Biosecurity in UK research laboratories, HC 360-I (HC 1111)
Seventh Report: Pre-legislative Scrutiny of the Draft Apprenticeships Bill, HC 1062-I (HC (2008-09) 262)
First Special Report: The Funding of Science and Discovery Centres: Government Response to the Eleventh Report from the Science and Technology Committee, Session 2006-07, HC 214
