
Identifying research fields within business and management: a journal cross-citation analysis

General Paper

Journal of the Operational Research Society

Abstract

A discipline such as business and management (B&M) is very broad, containing many fields that range from fairly scientific ones, such as management science or economics, to softer ones, such as information systems. There are at least three reasons why it is important to identify these sub-fields accurately. First, to give insight into the structure of the subject area and to identify perhaps unrecognised commonalities; second, to normalise citation data, since it is well known that citation rates vary significantly between disciplines; and third, because journal rankings and lists tend to split their classifications into different subjects (for example, the Association of Business Schools list, a standard in the UK, has 22 different fields). Unfortunately, at present these classifications are created in an ad hoc manner with no underlying rigour. The purpose of this paper is to identify possible sub-fields in B&M rigorously, based on actual citation patterns. We examined 450 B&M journals included in the ISI Web of Science and analysed the cross-citation rates between them, enabling us to generate sets of coherent and consistent sub-fields while minimising the extent to which journals appear in several categories. Implications and limitations of the analysis are discussed.
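To illustrate the kind of analysis involved, the following is a minimal sketch of one way sub-fields could be derived from a journal cross-citation matrix, assuming the citation counts are available as a square matrix. The toy journal names, the matrix values, and the choice of Louvain community detection (via Python's networkx) are illustrative assumptions, not the authors' actual data or procedure.

```python
# Minimal sketch: group journals into disjoint sub-fields from a
# cross-citation count matrix. All names and values are illustrative.
import networkx as nx

def cluster_journals(journals, C):
    """Partition journals into sub-fields.

    C[i][j] is the number of citations from journal i to journal j.
    """
    G = nx.Graph()
    G.add_nodes_from(journals)
    n = len(journals)
    for i in range(n):
        for j in range(i + 1, n):
            weight = C[i][j] + C[j][i]  # total cross-citation traffic
            if weight > 0:
                G.add_edge(journals[i], journals[j], weight=weight)
    # Louvain modularity maximisation assigns each journal to exactly one
    # community, which keeps the resulting sub-fields disjoint.
    return nx.community.louvain_communities(G, weight="weight", seed=0)

# Toy example: two management-science journals cite each other heavily,
# while the information-systems journal mostly cites itself.
journals = ["MS-A", "MS-B", "IS-C"]
C = [
    [10, 8, 1],  # citations from MS-A
    [7, 12, 0],  # citations from MS-B
    [1, 1, 9],   # citations from IS-C
]
print(cluster_journals(journals, C))  # e.g. [{'MS-A', 'MS-B'}, {'IS-C'}]
```

A hard partition such as this directly addresses the goal of minimising the extent to which journals appear in several categories; softer alternatives (for example, factor analysis of the citation matrix) would instead allow overlapping memberships.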



Acknowledgements

We are grateful to Thomson Reuters for permission to use the Journal Citation Reports (JCR) data.

Author information

Corresponding author

Correspondence to John Mingers.


About this article

Cite this article

Mingers, J., Leydesdorff, L. Identifying research fields within business and management: a journal cross-citation analysis. J Oper Res Soc 66, 1370–1384 (2015). https://doi.org/10.1057/jors.2014.113
