Scientometrics and management of scientific activities: once again about the global and Ukrainian
DOI: https://doi.org/10.15407/visn2019.09.081
Keywords: research evaluation, scientometrics, Scopus, Web of Science, h-index, journal metrics, population of Ukraine
Abstract
The main purpose of this paper is to provide a short review of the problem of using scientometric indicators for research evaluation in the Ukrainian context. Peculiarities in the usage of key scientometric terms in normative documents are examined. A number of case studies illustrate the ambiguity that arises when particular indicators are applied to rate authors, research groups, institutions or scientific journals. The importance of a balance between expert evaluation and quantitative analysis in the national system of research evaluation is highlighted, and the inadmissibility of any manipulation of scientometric terms and notions is underscored.
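As a small illustration of how a single indicator can blur real differences between researchers (the citation counts below are hypothetical and not taken from the paper), the following Python sketch computes the h-index as originally defined by J.E. Hirsch: an author has index h if h of their papers have each received at least h citations. Two markedly different citation records end up with the same value, which echoes the abstract's point about the ambiguity of rating authors by one number.

def h_index(citations):
    """Return the largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical records: one author with a single highly cited paper and modest
# further output, another with a uniform record -- both obtain h = 3.
author_a = [250, 3, 3, 1, 0]
author_b = [3, 3, 3, 3, 3]

print(h_index(author_a))  # 3
print(h_index(author_b))  # 3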