Where is Knowledge Generated? On the Productivity and Impact of Political Science Departments in Latin America


Abstract

Clear rules that encourage meritocracy, including the evaluation of scholarly productivity, are slowly and unevenly taking hold in academic life in Latin America. While some countries have official rankings of political science departments, others rely only on informal assessments. In a third set of countries, no such competition exists at all because the market is dominated by a state monopoly. This article provides a first systematic study of scientific productivity and its concomitant impact in more than twenty departments of Political Science and International Relations in the region. I show that scholars’ productivity is intimately related to where they pursued graduate studies, the subfield they work in, and the explicit adoption of rules that encourage meritocracy and academic careerism.


Notes

  1. I thank the anonymous referees of European Political Science and the following colleagues: Ana De Luca Zuria, Ana Laura Rodriguez, Andres Malamud, Angelika Rettberg, Anthony Pezzola, Artur Zimerman, Carlos Ranulfo, Catalina Smulovitz, Clara Riba, Daniel Buquet, Daniel Chasquetti, Eric Magar, Fabiano Santos, Felipe Botero, Gideon Rahat, Gilberto Aranda, Izabel Noll, Jacint Jordana, Joy Langston, Manuel Alcántara, Marcelo Leiras, Maria Gloria Municoy, Mariana Magaldi de Sousa, Miguel A. López, Miguel de Luca, Rachel Meneguello, Rafael Velazquez, Roberto Breña, Rossana Castiglioni, and Simon Hug. This research fits within the orbit of a FONDECYT project. All caveats apply.

  2. See Ballard and Mitchell (1998), Garand and Graddy (1999), Hix (2004), Jackman and Siverson (1996), Katz and Eagles (1996), Lowry and Silver (1996), McCormick and Rice (2001), Miller et al. (1996), Schmitter (2002), Welch and Hibbing (1983), and Giles and Garand (2007).

  3. For a general overview, see Altman (2005). Also see articles on Argentina (Leiras et al., 2005), Brazil (Neto and Santos, 2005), Chile (Fuentes and Santana, 2005), Colombia (Bejarano and Wills, 2005), Mexico (Loaeza, 2005), Uruguay (Garcé, 2005), and Venezuela (Alvarez and Dahdah, 2005).

  4. Online information on Venezuelan universities is incomplete, and I received no responses to my inquiries.

  5. Yet it is important to note that, in recent years, WoK has extended its coverage of Spanish-language journals.

  6. Publish or Perish uses Google Scholar to obtain raw citations, which it then analyzes to produce many useful statistics: the total number of papers, total number of citations, average number of citations per year, Hirsch’s H-index, and related measures. These data are more ‘democratic’ than WoK’s, as they include newspaper columns, conference presentations, chapters, books, and so on. However, their use requires greater care, given that a good deal of ‘gray’ data is mixed in with reliable data. The program can be downloaded from http://www.harzing.com/pop.htm.
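
To make the H-index concrete: it is the largest number h such that a scholar has at least h papers with at least h citations each. A minimal sketch in Python, using invented citation counts:

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for one scholar's papers.
papers = [25, 8, 5, 3, 3, 1, 0]
print(h_index(papers))  # 3: three papers have at least 3 citations each
```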

  7. Indeed, some scholars have argued that the H-index is a slightly less accurate and precise predictor than even the simpler measure of mean citations per paper (Lehmann et al., 2006). The H-index also fails to account for self-citations and for the number of authors of a paper.
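
One way to see why the simpler measure can discriminate where the H-index cannot: two records may share the same H-index yet differ sharply in mean citations per paper. A small illustration with hypothetical counts:

```python
def h_index(citations: list[int]) -> int:
    ranked = sorted(citations, reverse=True)
    return max([0] + [r for r, c in enumerate(ranked, 1) if c >= r])

def mean_citations(citations: list[int]) -> float:
    return sum(citations) / len(citations)

scholar_x = [9, 9, 9, 9, 9]       # five solid papers
scholar_y = [100, 50, 20, 10, 5]  # one blockbuster, a steep tail
print(h_index(scholar_x), mean_citations(scholar_x))  # 5 9.0
print(h_index(scholar_y), mean_citations(scholar_y))  # 5 37.0
```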

  8. The H-index seems to be more useful for an advocacy group than for academic work.

  9. The assumption that each of a piece’s n authors did an equal (1/n) share of the work undermines David Collier’s popular saying that when he writes a paper with another scholar, each one ends up doing 75 per cent of the work. This assumption also sits poorly with works in which the workload was skewed, particularly professor–student collaborations (where many people assume that the student did most of the work and the professor’s name is just a blue ribbon). As these two cases are extremely hard to quantify, I stick to the ‘proportional’ focal point.
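
A minimal sketch of this ‘proportional’ convention, crediting each author with 1/n of an n-authored piece’s citations (all figures are hypothetical):

```python
# Each record is (citations, number_of_authors); credit is split evenly.
publications = [(10, 1), (12, 2), (9, 3)]

def fractional_credit(pubs: list[tuple[int, int]]) -> float:
    """Sum of citations, each weighted by 1/n for an n-author piece."""
    return sum(cites / n for cites, n in pubs)

print(fractional_credit(publications))  # 10/1 + 12/2 + 9/3 = 19.0
```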

  10. This is one of the reasons WoK is already working with a longer-window impact factor in the social sciences (5 years instead of 2).
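
For reference, an impact factor for year t over an N-year window is the ratio of citations received in year t to items published in the previous N years, divided by the number of citable items published in those years. A sketch with invented figures for a slow-citing field, where the 5-year window captures citations the 2-year window misses:

```python
# cites[y]: citations received in the target year to items published in year y.
# items[y]: citable items published in year y. All figures are hypothetical.
def impact_factor(t: int, window: int,
                  cites: dict[int, int], items: dict[int, int]) -> float:
    years = range(t - window, t)
    return sum(cites[y] for y in years) / sum(items[y] for y in years)

cites = {2005: 30, 2006: 28, 2007: 25, 2008: 12, 2009: 5}
items = {y: 30 for y in range(2005, 2010)}
print(round(impact_factor(2010, 2, cites, items), 3))  # 17/60  = 0.283
print(round(impact_factor(2010, 5, cites, items), 3))  # 100/150 ≈ 0.667
```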

  11. Self-citations are also excluded in cases of co-authorship. While I am confident that the data provided in this paper are reliable, I must acknowledge that some problems could arise simply because WoK indexes names by the last surname provided. Authors with two non-hyphenated surnames are sometimes indexed under the second surname rather than the first. Though I checked both surnames, some entries may still have gone uncounted.

  12. A natural question is: what about a scholar who has one or two widely cited articles? Alternatively, what about dividing citations by publications to create a per-article citation rate? That proposal is useful for assessing leading articles, but not necessarily for calculating ‘areas of scientific research’ or for comparing scholars or departments. For instance, imagine a colleague, A, who has just one article with fifty citations, and another, B, with five articles of ten citations each. Is there any rule of thumb to determine who is the ‘better’ scholar? On the one hand, scholar A had a ‘silver bullet’; on the other, scholar B has shown a much more even pattern of production (which is also notably valuable by current academic standards). This is one of the reasons the ‘area of scientific research’ is appealing: it is not contingent on the distribution of citations. (Five articles with five citations each would be equal to four articles with no citations and one with twenty-five; in both cases the citation ‘units’ equal twenty-five.) Moreover, imagine a department with ten articles cited ten times each, and another department that has published twenty articles with nine citations each. Is department A better than department B because it has a higher rate of citations per article? I do not believe so.
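
A numeric sketch of this contrast, computing both measures for the hypothetical scholars and departments described above:

```python
def per_article_rate(citations: list[int]) -> float:
    return sum(citations) / len(citations)

def citation_units(citations: list[int]) -> int:
    return sum(citations)

scholar_a = [50]                  # one 'silver bullet' article
scholar_b = [10, 10, 10, 10, 10]  # an even production pattern
print(per_article_rate(scholar_a), citation_units(scholar_a))  # 50.0 50
print(per_article_rate(scholar_b), citation_units(scholar_b))  # 10.0 50

dept_a = [10] * 10  # ten articles, ten citations each
dept_b = [9] * 20   # twenty articles, nine citations each
print(per_article_rate(dept_a), citation_units(dept_a))  # 10.0 100
print(per_article_rate(dept_b), citation_units(dept_b))  # 9.0 180
```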

  13. This variable measures the degree to which publications carry a financial subsidy. Colleagues from each department were contacted and asked about this, and the criteria were constructed inductively. As the research incorporates more departments, these criteria will probably have to be expanded. Universidad de los Andes uses two criteria: one compiled by the Departamento Administrativo de Ciencia, Tecnología e Innovación de Colombia (COLCIENCIAS) (see: http://scienti.colciencias.gov.co:8084/publindex/EnIbnPublindex/resultados.do) and one developed by its Department of Political Science. The story at CIDE is very similar: it uses both the selection made by the Consejo Nacional de Ciencia y Tecnología (http://www.conacyt.gob.mx/Indice/Paginas/default.aspx) and an in-house index.

  14. One could argue that in many universities in the region, good publications have an indirect impact on scholars’ incomes because they strengthen nominations for national competitive funds (the Sistema Nacional de Investigadores in Uruguay, the Consejo Nacional de Investigaciones Científicas y Técnicas in Argentina, etc). Because this indirect effect applies to all the cases studied, given the selection criteria used in this work, it was not considered here.

  15. These countries are Austria, Canada, Germany, Italy, and Switzerland.

  16. Indeed, I met with some of the scholars at prestigious universities who are in charge of establishing agreements with other universities, whether for shared degrees or for student exchanges (though most of the time these are one-directional), and they were eager to read more about inter-departmental comparisons. They pushed for the continuation of this research agenda because, as they send more and more students to the region for a semester or year of study, they need to know where to send them. Of course, their decisions are not based solely on departmental ‘strength’ (whatever that means); they also consider student safety, services, and many other unrelated dimensions. Nevertheless, what to study, and with whom, remain critical criteria for them.

References

  • Altman, D. (2005) ‘La Institucionalización de la Ciencia Política en Chile y América Latina: Una Mirada desde el Sur’, Revista de Ciencia Política 25 (1): 3–15.

  • Alvarez, A.E. and Dahdah, S. (2005) ‘La Ciencia Política en Venezuela: Fortalezas Pasadas y Vulnerabilidades Presentes’, Revista de Ciencia Política 25 (1): 245–260.

  • Ballard, M.J. and Mitchell, N.J. (1998) ‘The good, the better, and the best in political science’, PS: Political Science & Politics 31 (4): 826–828.

  • Bejarano, A.M. and Wills, M.E. (2005) ‘La Ciencia Política en Colombia: De Vocación a Disciplina’, Revista de Ciencia Política 25 (1): 111–123.

  • Fuentes, C. and Santana, G. (2005) ‘El “Boom” de la Ciencia Política en Chile: Escuelas, Mercado y Tendencias’, Revista de Ciencia Política 25 (1): 16–39.

  • Garand, J.C. and Graddy, K.L. (1999) ‘Ranking political science departments: Do publications matter?’ PS: Political Science & Politics 32 (1): 113–116.

  • Garcé, A. (2005) ‘La Ciencia Política en Uruguay: Un Desarrollo Tardío, Intenso y Asimétrico’, Revista de Ciencia Política 25 (1): 232–244.

  • Giles, M.W. and Garand, J.C. (2007) ‘Ranking political science journals: Reputational and citational approaches’, PS: Political Science & Politics 40 (4): 741–751.

  • Hix, S. (2004) ‘A global ranking of political science departments’, Political Studies Review 2 (3): 293–313.

  • Jackman, R.W. and Siverson, R.M. (1996) ‘Rating the rating: An analysis of the national research council's appraisal of political science Ph.D. programs’, PS: Political Science & Politics 29 (2): 155–160.

  • Katz, R.S. and Eagles, M. (1996) ‘Ranking political science departments: A view from the lower half’, PS: Political Science & Politics 29 (2): 149–154.

  • LaPalombara, J. (1968) ‘Macrotheories and microapplications in comparative politics’, Comparative Politics 1 (October): 52–78.

  • Lehmann, S., Jackson, A.D. and Lautrup, B.E. (2006) ‘Measures for measures’, Nature 444 (December): 1003–1004.

  • Leiras, M., Medina, J.M.A.(h.) and D’Alessandro, M. (2005) ‘La Ciencia Política en Argentina: El Camino de la Institucionalización Dentro y Fuera de las Aulas Universitarias’, Revista de Ciencia Política 25 (1): 76–91.

  • Loaeza, S. (2005) ‘La Ciencia Política: El Pulso del Cambio Mexicano’, Revista de Ciencia Política 25 (1): 192–203.

  • Lowry, R.C. and Silver, B.D. (1996) ‘A rising tide lifts all boats: Political science department reputation and the reputation of the university’, PS: Political Science & Politics 29 (2): 161–167.

  • McCormick, J.M. and Rice, T.W. (2001) ‘Graduate training and research productivity in the 1990s: A look at who publishes’, PS: Political Science & Politics 34 (3): 675–680.

  • Miller, A.H., Tien, C. and Peebler, A.A. (1996) ‘Department rankings: An alternative approach’, PS: Political Science & Politics 29 (4): 704–717.

  • Neto, O.A. and Santos, F. (2005) ‘La Ciencia Política en el Brasil: El Desafío de la Expansión’, Revista de Ciencia Política 25 (1): 101–110.

  • Schmitter, P. (2002) ‘Seven (disputable) theses concerning the future of “Transatlanticised” or “Globalised” political science’, European Political Science 1 (2): 23–40.

  • Welch, S. and Hibbing, J.R. (1983) ‘What do the new ratings of political science departments measure?’ PS 16 (3): 532–540.


Appendix

Table A1: Descriptive statistics


Cite this article

Altman, D. Where is Knowledge Generated? On the Productivity and Impact of Political Science Departments in Latin America. Eur Polit Sci 11, 71–87 (2012). https://doi.org/10.1057/eps.2010.82
