Case studies allow evidence to be contextualized and a story to be told, enable assessment in the absence of quantitative data, and preserve a distinctive account or disciplinary perspective. On the other hand, automated collation of evidence is difficult, incorporating perspective can make critical assessment difficult, and the approach rewards those who can write well and/or afford to pay for external input. If basic research is to be assessed alongside more applied research, it is important that we are at least able to determine the contribution of basic research. The Consortia for Advancing Standards in Research Administration Information (CASRAI), for example, has put together a data dictionary that aims to set standards for the terminology used to describe impact and indicators, so that it can be incorporated into systems internationally, and it appears to be building momentum in this area.
The introduction of impact assessments with the requirement to collate evidence retrospectively poses difficulties because evidence, measurements, and baselines have, in many cases, not been collected and may no longer be available. In the UK, there have been several Jisc-funded projects in recent years to develop systems capable of storing research information, for example, MICE (Measuring Impacts Under CERIF), the UK Research Information Shared Service, and the Integrated Research Input and Output System, all based on the CERIF standard. By evaluating the contribution that research makes to society and the economy, future funding can be allocated where it is perceived to bring about the desired impact. There are areas of basic research where the impacts are so far removed from the research, or are so impractical to demonstrate, that it might be prudent to accept the limitations of impact assessment and provide the potential for exclusion in appropriate circumstances. (Figure replicated from Hughes and Martin 2012.) Decker et al. (2007) surveyed researchers at top US research institutions during 2005; the survey of more than 6,000 researchers found that, on average, more than 40% of their time was spent on administrative tasks. Concerns over how to attribute impacts have been raised many times (The Allen Consulting Group 2005; Duryea et al. 2006; Nason et al. 2007). Aspects of impact, such as the value of Intellectual Property, are currently recorded by universities in the UK through their Higher Education Business and Community Interaction Survey return to the Higher Education Statistics Agency; however, as with other public and charitable sector organizations, showcasing impact is an important part of attracting and retaining donors and support (Kelly and McNicoll 2011).
Impact is often the culmination of work within and across research communities (Duryea et al. 2006). The most appropriate type of evaluation will vary according to the stakeholder we wish to inform. Case studies are ideal for showcasing impact, but should they be used to critically evaluate impact? Two areas of research impact (health and biomedical sciences, and the social sciences) have received particular attention in the literature by comparison with, for example, the arts. Impact is assessed alongside research outputs and environment to provide an evaluation of research taking place within an institution.
Impact is derived not only from targeted research but from serendipitous findings, good fortune, and complex networks interacting and translating knowledge and research. If knowledge-exchange events could be captured, for example, electronically as they occur, or automatically if flagged from an electronic calendar or diary, then far more of these events could be recorded with relative ease. These metrics may be used in the UK to understand the benefits of research within academia and are often incorporated into the broader perspective of impact seen internationally, for example, within Excellence in Research for Australia and Star Metrics in the USA, in which quantitative measures are used to assess impact, for example, publications, citations, and research income.
The quality and reliability of impact indicators will vary according to the impact we are trying to describe and link to research.
Examples include the work of Wooding et al. (2006) on the impact arising from health research. In some cases, a specific definition of impact may be required, for example, in the Research Excellence Framework (REF): the Assessment framework and guidance on submissions (REF2014 2011b) defines impact as 'an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia'. Donovan (2011) asserts that there should be no disincentive for conducting basic research. The Oxford English Dictionary defines impact as a 'marked effect or influence'; this is clearly a very broad definition. This distinction is not so clear in impact assessments outside the UK, where academic outputs and socio-economic impacts are often viewed as one, to give an overall assessment of the value and change created through research. The reasoning behind the move towards assessing research impact is undoubtedly complex, involving both political and socio-economic factors, but, nevertheless, we can differentiate between four primary purposes. Incorporating assessment of the wider socio-economic impact began with metrics-based indicators such as Intellectual Property registered and commercial income generated (Australian Research Council 2008). Cooke and Nadim (2011) also noted that using a linear-style taxonomy did not reflect the complex networks of impacts that are generally found.
To allow comparisons between institutions, identifying a comprehensive taxonomy of impact, and the evidence for it, that can be used universally is seen as very valuable. The Goldsmiths report concluded that general categories of evidence would be more useful, such that indicators could encompass dissemination and circulation; re-use and influence; collaboration and boundary work; and innovation and invention. It has been acknowledged that outstanding leaps forward in knowledge and understanding come from immersion in a background of intellectual thinking: one is able to see further by standing on the shoulders of giants.
It incorporates both academic outputs and wider societal benefits (Donovan and Hanney 2011) to assess the outcomes of health sciences research. This is recognized as being particularly problematic within the social sciences, where informing policy is a likely impact of research. There is a great deal of interest in collating terms for impact and indicators of impact. If metrics are available as impact evidence, they should, where possible, also capture any baseline or control data. Here we address the types of evidence that need to be captured to enable an overview of impact to be developed. Impacts risk being monetized or converted into a lowest common denominator in an attempt to compare the cost of a new theatre against that of a hospital.
The Payback Framework (Wooding et al. 2005; Nason et al. 2008) was developed during the mid-1990s by Buxton and Hanney, working at Brunel University.