“The Hardest Task”—Peer Review and the Evaluation of Technological Activities

Published in Minerva.

Abstract

Technology development and innovation are fundamentally different from scientific research. In many circumstances, however, they are evaluated jointly and by the same processes. In these cases, peer review, the most common procedure for evaluating research, is also applied to technological products and innovation activities. This can lead to unfair results and discourage researchers from engaging in these fields. This paper analyzes the evaluation processes of Uruguay's National System of Researchers, in which all members' activities, both scientific and technological, are evaluated by peer committees. Based on documentary analysis and semi-structured interviews, it explores the difficulties evaluators face in assessing technological products. The article highlights the persistence of a linear conception of the link between science and technology and describes the obstacles to assimilating the particularities of technological activities. Refereed publications emerge as the only uncontested product; other types of output are viewed with suspicion. The study emphasizes the need for specific mechanisms to evaluate technological production within academic careers.

Fig. 1. Source: Prepared by the authors.

Availability of data and material

Not applicable

Code availability

Not applicable

Notes

  1. This problem goes beyond the Latin American region. In the United States and Canada, tenure and promotion committees in universities do not place much value on the public impact of research (Schimanski and Alperin 2018; Alperin et al. 2019).

  2. Only researchers living in Uruguay and holding a position in a university or research institute are eligible to receive the financial incentive.

  3. Unlike the usual practice in other research assessment systems, SNI has only one set of criteria applicable to all disciplines.

  4. Plant varieties are registered in accordance with UPOV conventions. See Sanderson (2019) for a detailed account.

References

  • Abramo, Giovanni, Ciriaco D’Angelo, Marco Ferretti, and A. Adele Parmentola. 2012. An individual-level assessment of the relationship between spin-off activities and research performance in universities. R&D Management 42(3): 225–242.

  • Alperin, Juan, Carol Muñoz Nieves, Lesley Schimanski, Gustavo Fischman, Meredith Niles, and Erin McKiernan. 2019. How significant are the public dimensions of faculty work in review, promotion and tenure documents? eLife 8: e42254. https://doi.org/10.7554/eLife.42254

  • Alperin, Juan, and Cecilia Rozemblum. 2017. La reinterpretación de visibilidad y calidad en las nuevas políticas de evaluación de revistas. Revista Interamericana De Bibliotecología 40(3): 231–241.

  • Ambos, Tina, Kristiina Mäkelä, Julian Birkinshaw, and Pablo D’Este. 2008. When Does University Research Get Commercialized? Creating Ambidexterity in Research Institutions. Journal of Management Studies 45(8). https://doi.org/10.1111/j.1467-6486.2008.00804.x

  • ANII - Agencia Nacional de Investigación e Innovación. 2018. Sistema Nacional de Investigadores. Informe de monitoreo 2008-2018. Montevideo: ANII. https://www.anii.org.uy/upcms/files/listado-documentos/documentos/informe-de-monitoreo-sistema-nacional-de-investigadores-2008-2018.pdf

  • Arboleya, Jorge, and Ernesto Restaino. 2003. Agricultural extension models in South America: A description of systems in use in Argentina, Brazil, Paraguay, and Uruguay. HortTechnology 14(1): 14–19.

  • Archambault, Éric, Étienne Vignola-Gagné, Grégoire Côté, Vincent Larivière, and Yves Gingras. 2006. Benchmarking scientific output in the social sciences and humanities: The limits of existing databases. Scientometrics 68(3): 329–342.

  • Arocena, Rodrigo, and Judith Sutz. 2010. Weak knowledge demand in the South: learning divides and innovation policies. Science and Public Policy 37(8): 571–582.

  • Babini, Dominique, and Juan Machin-Mastromatteo. 2015. Latin American science is meant to be open access: Initiatives and current challenges. Information Development 31(5): 477–481.

  • Babini, Dominique. 2020. Toward a Global Open-Access Scholarly Communications System: A Developing Region Perspective. In Reassembling Scholarly Communications: Histories, Infrastructures, and Global Politics of Open Access, eds. Martin Eve and Jonathan Gray. Cambridge: MIT Press. https://doi.org/10.7551/mitpress/11885.003.0033

  • Beigel, Fernanda. 2014. Publishing from the periphery: Structural heterogeneity and segmented circuits. The evaluation of scientific publications for tenure in Argentina’s CONICET. Current Sociology 62(5): 743–765.

  • Beigel, Fernanda. 2017. Peripheral Scientists, between Ariel and Caliban. Institutional know-how and Circuits of Recognition in Argentina. The “Career-best Publications” of the Researchers at CONICET. Dados 60(3): 63–102.

  • Bianco, Mariela, Natalia Gras, and Judith Sutz. 2016. Academic Evaluation: Universal Instrument? Tool for Development? Minerva 54(4): 399–421.

  • Bin, Adriana, Cecilia Gianoni, Paule J. V. Mendes, Carolina Rio, Sergio L.M. Salles-Filho, and Luiza M. Capanema. 2013. Organization of Research and Innovation: a Comparative Study of Public Agricultural Research Institutions. Journal of Technology Management & Innovation 8(1). https://doi.org/10.4067/S0718-27242013000300048

  • Bornmann, Lutz. 2013. What is societal impact of research and how can it be assessed? A literature survey. Journal of the American Society for Information Science and Technology 64(2): 217–233.

  • Bortagaray, Isabel. 2017. Cultura, innovación, ciencia y tecnología en Uruguay. Trazos de sus vinculaciones. Revista de Ciencias Sociales UDELAR 41(30): 87–110.

  • Bozeman, Barry, and Daniel Sarewitz. 2011. Public value mapping and science policy evaluation. Minerva 49(1): 1–23.

  • Bozeman, Barry, Daniel Fay, and Catherine Slade. 2012. Research collaboration in universities and academic entrepreneurship: the-state-of-the-art. The Journal of Technology Transfer 38(1): 1–67.

  • Bruun-Jensen, Casper. 2011. Making Lists, Enlisting Scientists: the Bibliometric Indicator, Uncertainty and Emergent Agency. Science Studies 24: 64–84.

  • Cañibano, Carolina, Inmaculada Vilardell, Carmen Corona, and Carlos Benito-Amat. 2018. The evaluation of research excellence and the dynamics of knowledge production in the humanities: The case of history in Spain. Science and Public Policy 45(6): 775–789.

  • D’Onofrio, Guillermina. 2020. Efectos de los sistemas de evaluación de la investigación en las experiencias de carrera de biólogos moleculares y biotecnólogos en Argentina. PhD Dissertation. FLACSO Argentina.

  • Derrick, Gemma. 2018. The Evaluator’s Eye. Impact Assessment and Academic Peer Review. London: Palgrave Macmillan.

  • Donovan, Claire, and Stephen Hanney. 2011. The ‘Payback Framework’ explained. Research Evaluation 20(3): 181–183.

  • Echeverría, Javier. 2001. Tecnociencia y sistemas de valores. In Ciencia, Tecnología, Sociedad y Cultura, eds. J. A. López Cerezo, and J. M. Sánchez Ron, 221-230. Madrid: OEI.

  • FONTAGRO. 2017. Fortaleciendo el capital humano. Lineamientos de una estrategia para el fortalecimiento de capacidades en países miembros de FONTAGRO. Washington DC. https://www.fontagro.org/es/publicaciones/fortalecimiento-del-capital-humano-lineamientos-de-una-estrategia-para-el-fortalecimiento-de-capacidades-en-paises-miembros-de-fontagro/

  • Hackett, Edward. 1990. Peerless Science. Peer Review and U.S. Science Policy. New York: SUNY Press.

  • Hellström, Tomas, and Christina Hellström. 2017. Achieving impact: impact evaluations and narrative simplification. Prometheus 35(3): 215–230.

  • Hicks, Diana, Paul Wouters, Ludo Waltman, Sarah de Rijcke, and Ismael Rafols. 2015. Bibliometrics: The Leiden Manifesto for research metrics. Nature 520: 429–431.

  • Holbrook, J. Britt. 2005. Assessing the science–society relation: The case of the US National Science Foundation’s second merit review criterion. Technology in Society 27(4): 437–451.

  • Invernizzi, Noela, and Amílcar Davyt. 2019. Críticas recientes a la evaluación de la investigación: ¿vino nuevo en odres viejos? Redes (Bernal) 25(49): 233–252.

  • Joly, Pierre-Benoît, Ariane Gaunand, Laurence Colinet, Philippe Larédo, Stéphane Lemaire, and Mireille Matt. 2015. ASIRPA: A comprehensive theory-based approach to assessing the societal impacts of a research organization. Research Evaluation 24(4): 440–453.

  • Kaltenbrunner, Wolfgang, and Sarah de Rijcke. 2016. Quantifying ‘output’ for evaluation: Administrative knowledge politics and changing epistemic cultures in Dutch law faculties. Science and Public Policy 44(2): 284–293.

  • Kreimer, Pablo. 2019. Science and Society in Latin America. Peripheral Modernities. New York: Routledge.

  • Lamont, Michele. 2009. How Professors Think. Inside the Curious World of Academic Judgment. Cambridge: Harvard University Press.

  • Langfeldt, Liv, and Svein Kyvik. 2011. Researchers as evaluators: Tasks, tensions and politics. Higher Education 62(2): 199–212.

  • Laudel, Grit. 2017. How do national career systems promote or hinder the emergence of new research lines? Minerva 55(3): 341–369.

  • Lin, Min-Wei, and Barry Bozeman. 2006. Researchers’ industry experience and productivity in university-industry research centers: A ‘scientific and technical human capital’ explanation. Journal of Technology Transfer 31: 269–290.

  • Ma, Lai, Junwen Luo, Thomas Feliciani, and Kalpana Shankar. 2020. How to evaluate ex ante impact of funding proposals? An analysis of reviewers’ comments on impact statements. Research Evaluation, rvaa022. https://doi.org/10.1093/reseval/rvaa022

  • Macnaghten, Phil. 2016. Responsible innovation and the reshaping of existing technological trajectories: the hard case of genetically modified crops. Journal of Responsible Innovation 3(3): 282–289.

  • Naidorf, Judith, Federico Vasen, and Mauro Alonso. 2019. Aunar criterios en un sistema fragmentado. Tensiones en torno a evaluación de la investigación aplicada y el desarrollo tecnológico en el origen de los Proyectos de Desarrollo Tecnológico y Social. EccoS Revista Científica 49: 1–21.

  • Naidorf, Judith, Federico Vasen, Mauro Alonso, and Melisa Cuschnir. 2020. De evaluar diferente a orientar como siempre. Burocratización e inercias institucionales en la implementación de una política científica orientada al desarrollo tecnológico y social. Revista Iberoamericana de Ciencia, Tecnología y Sociedad 15(45): 163–182.

  • Neff, Mark. 2018. Publication incentives undermine the utility of science: Ecological research in Mexico. Science and Public Policy 45(2): 191–201.

  • Packer, Abel. 2020. The Pasts, Presents, and Futures of SciELO. In Reassembling Scholarly Communications: Histories, Infrastructures, and Global Politics of Open Access, eds. Martin Eve and Jonathan Gray. Cambridge: MIT Press. https://doi.org/10.7551/mitpress/11885.003.0030

  • Percy, Helen, James Turner, and Wendy Boyce. 2019. Five principles of co-innovation. Blogpost. Integration and Implementation Insights. Australian National University. https://i2insights.org/2019/07/16/five-principles-of-co-innovation/

  • Reymert, Ingvild. 2021. Bibliometrics in Academic Recruitment: A Screening Tool Rather than a Game Changer. Minerva 59(1): 53–78. https://doi.org/10.1007/s11024-020-09419-0.

  • RICYT - Red de Indicadores de Ciencia y Tecnología. 2021. El Estado de la Ciencia. Principales Indicadores de Ciencia y Tecnología Iberoamericanos/Interamericanos. Buenos Aires: OEI-UNESCO.

  • Rushforth, Alexander, and Sarah de Rijcke. 2015. Accounting for impact? The journal impact factor and the making of biomedical research in the Netherlands. Minerva 53(2): 117–139.

  • Samuel, Gabrielle, and Gemma Derrick. 2015. Societal impact evaluation: Exploring evaluator perceptions of the characterization of impact under the REF2014. Research Evaluation 24(3): 229–241.

  • Sanderson, Jay. 2019. Plants, people and practices: the nature and history of the UPOV Convention. Cambridge: Cambridge UP.

  • Sandoval-Romero, Vanessa, and Vincent Larivière. 2020. The national system of researchers in Mexico: implications of publication incentives for researchers in social sciences. Scientometrics 122: 99–126. https://doi.org/10.1007/s11192-019-03285-8.

  • Sanz-Menéndez, Luis. 1995. Research actors and the state: research evaluation and evaluation of science and technology policies in Spain. Research Evaluation 5(1): 79–88.

  • Sarthou, Nerina. 2016. Twenty years of merit-pay programme in Argentinean universities: Tracking policy change through instrument analysis. Higher Education Policy. https://doi.org/10.1057/s41307-016-0001-0.

  • Schimanski, Lesley, and Juan Alperin. 2018. The evaluation of scholarship in academic promotion and tenure processes: Past, present, and future. F1000Research 7: 1605.

  • SNI – Sistema Nacional de Investigadores. 2014. Reglamento. https://sni.org.uy/wp-content/uploads/2016/07/Reglamento-del-SNI-aprobado-28-3-2014.pdf

  • SNI – Sistema Nacional de Investigadores. 2020. Criterios de evaluación. https://www.anii.org.uy/upcms/files/SNI2020/criterios-de-evaluaci-n-sni-2020.pdf

  • Spaapen, Jack, and Leonie van Drooge. 2011. Introducing ‘productive interactions’ in social impact assessment. Research Evaluation 20(3): 211–218.

  • Sugimoto, Cassidy, and Vincent Lariviere. 2017. Measuring research. New York: Oxford University Press.

  • Temple, Ludovic, et al. 2018. Assessing impacts of agricultural research for development: A systemic model focusing on outcomes. Research Evaluation 27(2): 157–170.

  • Thomas, Hernán, and Carlos Gianella. 2009. Procesos socio-técnicos de construcción de perfiles productivos y capacidades tecnológicas en el Mercosur. In Innovación a escala MERCOSUR, eds. Guillermo Rozenwurcel, Carlos Gianella, Gabriel Bezchinsky, and Hernán Thomas. Buenos Aires: Prometeo.

  • Turner, James, Laurens Klerkx, Kelly Rijswijk, Tracy Williams, and Tim Barnard. 2016. Systemic problems affecting co-innovation in the New Zealand Agricultural Innovation System: Identification of blocking mechanisms and underlying institutional logics. NJAS – Wageningen Journal of Life Sciences 76: 99–112.

  • Vasen, Federico. 2018. La ‘torre de marfil’ como apuesta segura: políticas científicas y evaluación académica en México. Archivos Analíticos De Políticas Educativas 26: 95.

  • Vasen, Federico, and Ivonne Lujano. 2017. Sistemas nacionales de clasificación de revistas científicas en América Latina: tendencias recientes e implicaciones para la evaluación académica en ciencias sociales. Revista Mexicana De Ciencias Sociales y Políticas 62(231): 199–228.

  • Vasen, Federico, Nerina Sarthou, Silvina Romano, Brenda D Gutiérrez, María Eugenia Ortiz, and Manuel Pintos. 2021a. Sistemas Nacionales de Categorización de Investigadores en Iberoamérica: la configuración de un modelo regional. Documentos de trabajo PICT2018-2794, 1. Available at SSRN: https://doi.org/10.2139/ssrn.3891052

  • Vasen, Federico, Miguel Sierra, José Paruelo, Carlos Negro, Federico Nolla, Joaquín Lapetina, and Marcelo Salvagno. 2021b. Evaluation of Technical Production in Agricultural Sciences. Agrociencia Uruguay 25(2): e491. https://doi.org/10.31285/AGRO.25.491.

  • Vessuri, Hebe, Jean-Claude Guédon, and Ana María Cetto. 2013. Excellence or quality? Impact of the current competition regime on science and scientific publishing in Latin America and its implications for development. Current Sociology 62(5): 647–665.

  • Von Schomberg, René, and Jonathan Hankins, (eds.). 2019. International Handbook on Responsible Innovation. London: Edward Elgar.

  • Whitley, Richard. 2003. Competition and pluralism in the public sciences: the impact of institutional frameworks on the organisation of academic science. Research Policy 32: 1015–1029.

  • Yegros-Yegros, Alfredo, Joaquín Azagra-Caro, Mayte López-Ferrer, and Robert Tijssen. 2016. Do university–industry co-publication outputs correspond with university funding from firms? Research Evaluation 25(2): 136–150.

Funding

This work was supported by Instituto Nacional de Tecnología Agropecuaria and Agencia Nacional de Promoción Científica y Tecnológica [grant PICT2018-2794].

Author information

Authors and Affiliations

Authors

Contributions

Not applicable

Corresponding author

Correspondence to Federico Vasen.

Ethics declarations

Conflicts of interest

None declared

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Vasen, F., Sierra Pereiro, M. “The Hardest Task”—Peer Review and the Evaluation of Technological Activities. Minerva 60, 375–395 (2022). https://doi.org/10.1007/s11024-022-09461-0
