Export Publication

The publication can be exported in the following formats: APA (American Psychological Association) reference, IEEE (Institute of Electrical and Electronics Engineers) reference, BibTeX, and RIS.

Export Reference (APA)
Santos, J. M. (2023). Quis judicabit ipsos judices? A case study on the dynamics of competitive funding panel evaluations. Research Evaluation, 32(1), 70–85. https://doi.org/10.1093/reseval/rvac021
Export Reference (IEEE)
J. M. Santos, "Quis judicabit ipsos judices? A case study on the dynamics of competitive funding panel evaluations," Research Evaluation, vol. 32, no. 1, pp. 70–85, 2023.
Export BibTeX
@article{santos2023_1731980307113,
	author = "Santos, J. M.",
	title = "Quis judicabit ipsos judices? A case study on the dynamics of competitive funding panel evaluations",
	journal = "Research Evaluation",
	year = "2023",
	volume = "32",
	number = "1",
	doi = "10.1093/reseval/rvac021",
	pages = "70-85",
	url = "https://academic.oup.com/rev"
}
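As an illustration only (not part of any export tool shown here), the fields of a flat, double-quoted BibTeX entry like the one above can be pulled out with a few lines of Python. This sketch assumes one `key = "value"` field per line; a real parser (e.g. the bibtexparser library) handles braces, nesting, and escapes.

```python
import re

def parse_bibtex_fields(entry: str) -> dict:
    """Extract key = "value" fields from a single flat BibTeX entry.

    Illustrative sketch: assumes double-quoted values with no nested
    quotes, as in the export above. Entry type and citation key are
    stored under the hypothetical keys "_type" and "_key".
    """
    fields = dict(re.findall(r'(\w+)\s*=\s*"([^"]*)"', entry))
    header = re.match(r'\s*@(\w+)\{([^,]+),', entry)
    if header:
        fields["_type"] = header.group(1)
        fields["_key"] = header.group(2)
    return fields
```

For the record above, `parse_bibtex_fields(entry)["doi"]` would return `"10.1093/reseval/rvac021"`.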
Export RIS
TY  - JOUR
TI  - Quis judicabit ipsos judices? A case study on the dynamics of competitive funding panel evaluations
T2  - Research Evaluation
VL  - 32
IS  - 1
AU  - Santos, J. M.
PY  - 2023
SP  - 70
EP  - 85
SN  - 0958-2029
DO  - 10.1093/reseval/rvac021
UR  - https://academic.oup.com/rev
AB  - Securing research funding is essential for all researchers. The standard evaluation method for
competitive grants is through evaluation by a panel of experts. However, the literature notes that
peer review has inherent flaws and is subject to biases, which can arise from differing interpretations of the criteria, the impossibility for a group of reviewers to be experts in all possible topics within their field, and the role of affect. As such, understanding the dynamics at play during panel evaluations is crucial to allow researchers a better chance at securing funding, and also for the reviewers themselves to be aware of the cognitive mechanisms underlying their decision-making. In this study, we conduct a case study based on application and evaluation data for two social sciences panels in a competitive state-funded call in Portugal. Using a mixed-methods approach, we find that qualitative evaluations largely resonate with the evaluation criteria, and the candidate’s scientific output is partially aligned with the qualitative evaluations, but scientometric indicators alone do not significantly influence the candidate’s evaluation. However, the polarity of the qualitative evaluation has a positive influence on the candidate’s evaluation. This paradox is discussed as possibly resulting from the occurrence of a halo effect in the panel’s judgment of the candidates. By providing a multi-methods approach, this study aims to provide insights that can be useful for all stakeholders involved in competitive funding evaluations.
ER  -
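A RIS record is a sequence of `TAG  - value` lines terminated by `ER  - `, with long fields such as the abstract allowed to wrap onto continuation lines. As a minimal sketch (an assumption-laden illustration, not an official RIS parser), the record above can be read into a dict of tag → values like this:

```python
def parse_ris(text: str) -> dict:
    """Parse a single RIS record into {tag: [values]}.

    Sketch only: lines that do not start with "TAG  - " are treated as
    continuations of the previous field (e.g. a wrapped abstract), and
    parsing stops at the "ER" end-of-record tag.
    """
    record = {}
    last_tag = None
    for line in text.splitlines():
        if len(line) >= 6 and line[2:6] == "  - ":
            tag, value = line[:2], line[6:].strip()
            if tag == "ER":
                break
            record.setdefault(tag, []).append(value)
            last_tag = tag
        elif last_tag and line.strip():
            # Wrapped field: append continuation text to the last value.
            record[last_tag][-1] += " " + line.strip()
    return record
```

Applied to the record above, `parse_ris(text)["DO"]` yields `["10.1093/reseval/rvac021"]`, and the wrapped `AB` lines are joined back into a single abstract string.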