
Changing the Peer Review or Changing the Peers — Recent Development in Assessment of Large Research Collaborations

Article published in Higher Education Policy.

Abstract

Peer review of research programmes is changing. The problem is discussed through a detailed study of the selection process for a call for collaborations in the energy sector issued by the European Institute of Innovation and Technology. The authors were involved in an application for a Knowledge and Innovation Community. Through an analysis of this case, the article discusses the role of researchers acting in their new role as reviewers in situations where a number of important decision-making dimensions are removed from the researcher's formerly direct influence on quality assessment. In connection with this discussion, the article provides input to a critical review of the use of quantified scales developed to systematize quality assessment on top of peer review-based assessments. In addition, the article discusses the challenges posed to the peer review system by Mode 2 science and by cross-disciplinary research collaborations. The article is a contribution to the assessment of the new roles of peer reviewers in large policy decision-making systems such as research funding, and an input to further discussion.


Notes

  1. C.P. Snow opened up the discussion of the conflict between the humanities and the natural sciences in 1959 in his famous Rede Lecture (Snow, 1993).

  2. NIH — National Institutes of Health; NSF — National Science Foundation.

  3. The CUDOS norms, describing ideal norms for scientists' behaviour intended to promote the best possible scientific results, were developed by Merton in the wake of crude state interference in science before WW2 in Germany and the USSR (Merton, 1938, 1942, 1968a).

  4. A recent survey of more than 4,000 researchers' views on the peer review system (Ware, 2008) disclosed unexpectedly strong support for the classic system, in light of the often harsh critique from researchers and policymakers — in the survey the researchers acknowledged the criticisms raised against the peer review system (bias etc.) but chose to support it nevertheless.

  5. http://www.ref.ac.uk/ (accessed 22 May 2012).

  6. From historical studies of the genesis of standardized quantification systems we know how difficult and time-consuming the quantification of formerly qualitative processes can be (Porter, 1992).

  7. At step 2, 'Management, governance and organization of the partnership and co-location, covering also financial and legal aspects of the KIC' accounts for 50 of 100 points, and the other 50 come from the combined strength of the partners. At step 2 content and originality are gone, and only financial, legal and management issues are at stake — but how is this assessed? That is, content and novelty do not count in the second round.

  8. SUCCESS = Searching Unprecedented Cooperations on Climate and Energy to ensure Sustainability.

References

  • Alvesson, M. (2003) ‘Methodology for close up studies — struggling with closeness and closure’, Higher Education 46 (2): 167–193.

  • Barker, K. (2007) ‘The UK research assessment exercise: the evolution of a national research evaluation system’, Research Evaluation 16 (1): 3–12.

  • Bornmann, L. (2008) ‘Scientific peer review: an analysis of the peer review process from the perspective of sociology of science’, Human Architecture 33 (2): 23–38.

  • Bornmann, L. (2011) ‘Scientific peer review’, Annual Review of Information Science and Technology 45: 199–245.

  • Bornmann, L. and Daniel, H.D. (2009) ‘Reviewer and editor biases in journal peer review: an investigation of manuscript refereeing at Angewandte Chemie International Edition’, Research Evaluation 18 (4): 262–272.

  • Chiesa, V. and Manzini, R. (1997) ‘Managing virtual R&D organizations: lessons from the pharmaceutical industry’, International Journal of Technology Management 13 (5/6): 471–485.

  • Chubin, D.E. and Hackett, E.J. (1990) Peerless Science: Peer Review and U.S. Science Policy, Albany, NY: State University of New York Press.

  • Cicchetti, D. (1991) ‘The reliability of peer review for manuscript and grant submission’, Behavioral and Brain Sciences 14 (1): 119–186.

  • Cohen, M.D., March, J.G. and Olsen, J.P. (1972) ‘A garbage can model of organizational choice’, Administrative Science Quarterly 17 (1): 1–25.

  • Cole, S. (1998) ‘How does peer review work and how can it be improved?’, Minerva 36 (2): 179–189.

  • Cole, S., Cole, J.R. and Simon, G.A. (1981) ‘Chance and consensus in peer review’, Science 214 (4523): 881–886.

  • Ernø-Kjølhede, E. and Hansson, F. (2011) ‘Measuring research performance during a changing relationship between science and society’, Research Evaluation 20 (2): 130–142.

  • Etzkowitz, H. and Leydesdorff, L. (2000) ‘The dynamics of innovation: from national systems and “mode 2” to a triple helix of university–industry–government relations’, Research Policy 29 (2): 109–123.

  • Geuna, A. and Martin, B. (2003) ‘University research evaluation and funding: an international comparison’, Minerva 41 (4): 277–304.

  • Gibbons, M., Limoges, C., Nowotny, H., Schwartzman, S., Scott, P. and Trow, M. (1994) The New Production of Knowledge: The Dynamics of Science and Research in Contemporary Societies, London: Sage Publications.

  • Hansson, F., Brenneche, N.T., Mønsted, M. and Fransson, T. (2009) ‘Benchmarking successful models of cooperation’, SUCCESS Work Package 1, Karlsruhe, http://openarchive.cbs.dk/handle/10398/6347.

  • HEFCE. (2009) Research Excellence Framework: Second Consultation on the Assessment and Funding of Research, September 2009/38, London: HEFCE.

  • HEFCE. (2012) REF 01.2012: Panel Criteria and Working Methods, London: HEFCE.

  • Hessels, L.K. and van Lente, H. (2008) ‘Re-thinking new knowledge production: a literature review and a research agenda’, Research Policy 37 (4): 740–760.

  • KIC InnoEnergy. (2010) http://eit.europa.eu/kics1/kic-innoenergy.html, accessed 1 February 2012.

  • Kostoff, R.N. and Geisler, E. (2007) ‘The unintended consequences of metrics in technology evaluation’, Journal of Informetrics 1 (2): 103–114.

  • Kuhn, T.S. (1970) The Structure of Scientific Revolutions, Chicago, IL: University of Chicago Press.

  • Lamont, M. (2009) How Professors Think: Inside the Curious World of Academic Judgment, Cambridge, MA: Harvard University Press.

  • Lamont, M. and Huutoniemi, K. (2011) ‘Comparing Customary Rules of Fairness: Evaluative Practices in Various Types of Peer Review Panels’, in C. Camic, N. Gross and M. Lamont (eds.) Social Knowledge in the Making, Chicago, IL: University of Chicago Press, pp. 209–232.

  • Langfeldt, L. (2006) ‘The policy challenges of peer review: managing bias, conflict of interest and interdisciplinary assessment’, Research Evaluation 15 (1): 31–41.

  • Laudel, G. (2006) ‘Conclave in the Tower of Babel: how peers review interdisciplinary research proposals’, Research Evaluation 15 (1): 57–68.

  • Leifer, R., McDermott, C.M., O’Connor, G.C., Peters, L.S., Rice, M. and Veryzer, R.W. (2000) Radical Innovation: How Mature Companies Can Outsmart Upstarts, Boston, MA: Harvard Business School Press.

  • Lindblom, C.E. (1959) ‘The science of “muddling through”’, Public Administration Review 19 (2): 79–88.

  • Martin, B.R. (2003) ‘The Changing Social Contract for Science and the Evolution of the University’, in A. Geuna, A.J. Salter and W. Edward Steinmueller (eds.) Science and Innovation, Cheltenham: Edward Elgar, pp. 7–27.

  • Merton, R.K. (1938) ‘Science and the social order’, Philosophy of Science 5 (3): 321–337.

  • Merton, R.K. (1942) ‘The Normative Structure of Science’, reprinted in R.K. Merton (1973) The Sociology of Science: Theoretical and Empirical Investigations, Chicago, IL: University of Chicago Press, pp. 267–280.

  • Merton, R.K. (1968a) ‘Science and Democratic Social Structure’, in Social Theory and Social Structure, New York: The Free Press, pp. 604–615.

  • Merton, R.K. (1968b) ‘The Matthew effect in science’, Science 159 (3810): 56–63.

  • Nowotny, H., Gibbons, M. and Scott, P. (2001) Re-thinking Science: Knowledge and the Public in an Age of Uncertainty, Oxford: The Polity Press.

  • Nowotny, H., Scott, P. and Gibbons, M. (2003) ‘“Mode 2” revisited: the new production of knowledge’, Minerva 41 (3): 179–194.

  • Olbrecht, M. and Bornmann, L. (2010) ‘Panel peer review of grant applications: what do we know from research in social psychology on judgment and decision-making in groups?’, Research Evaluation 19 (4): 293–304.

  • Porter, T.M. (1992) ‘Quantification and the accounting ideal in science’, Social Studies of Science 22 (4): 633–652.

  • Seglen, P.O. (1997) ‘Citations and journal impact factors: questionable indicators of research quality’, Allergy 52 (11): 1050–1056.

  • Snow, C.P. (1993) The Two Cultures, Cambridge: Cambridge University Press.

  • Stake, R.E. (2001) ‘Case Studies’, in N.K. Denzin and Y.S. Lincoln (eds.) Handbook of Qualitative Research, Vol. 3, London: Sage, pp. 435–454.

  • Starbuck, W.H. (2005) ‘How much better are the most-prestigious journals? The statistics of academic publication’, Organization Science 16 (2): 180–200.

  • van den Besselaar, P. and Leydesdorff, L. (2009) ‘Past performance, peer review and project selection: a case study in the social and behavioral sciences’, Research Evaluation 18 (4): 273–288.

  • Ware, M. (2008) Peer Review in Scholarly Journals: Perspective of the Scholarly Community — An International Study, London: Publishing Research Consortium.

  • Whitley, R. (2000) The Intellectual and Social Organization of the Sciences, Oxford: Oxford University Press.

  • Whitley, R. (2010) ‘Changing Governance of the Public Sciences’, in R. Whitley and J. Gläser (eds.) The Changing Governance of the Sciences, Dordrecht: Springer, pp. 3–30.


Acknowledgements

The authors thank Thomas Basbøll, CBS, for his comments on the final version.



Cite this article

Hansson, F., Mønsted, M. Changing the Peer Review or Changing the Peers — Recent Development in Assessment of Large Research Collaborations. High Educ Policy 25, 361–379 (2012). https://doi.org/10.1057/hep.2012.17
