The Sustaining the Knowledge Commons project was made possible through a SSHRC Insight Development Grant (2014 – 2016) and a SSHRC Insight Grant (2016 – 2021). SSHRC has graciously granted a one-year extension for project completion due to COVID. Between now and spring 2022, the work of SKC will focus on completing projects already started, wrapping up the blog, and producing a final report and summary. Thanks to everyone who contributed to the SKC team over the years, and to everyone who read, shared, and/or commented on posts.
The book launch video is now available.
Please join us for a launch of the book “Global University Rankings”. The session will be recorded and made available for later viewing. I briefly introduced the book and my chapter in it in this post: https://sustainingknowledgecommons.org/2021/08/03/irrational-rationality-critique-of-metrics-based-evaluation-of-researchers-and-universities/
According to one of the most consulted of the global university rankings services, the QS World University Rankings 2022, the University of Toronto is the top ranked university in Canada. It shouldn’t take more than a brief pause to reflect on this statement to see the fiction in what is presented as objective empirical information (pseudoscience). In the real world, it is mid-June, 2021. The empirical “facts” on which QS is based are still in progress, in a year of pandemic with considerable uncertainty. It is not possible to complete data on 2021 until the year is over. Meanwhile, QS is already reporting stats for 2022; perhaps they are psychic?
Scratching slightly at the surface, anyone with even a little familiarity with Canadian universities is probably aware that the University of Toronto is currently under a rare censure due to a “serious breach of the principles of academic freedom” in a hiring decision. Censure is a “rarely invoked sanction in which academic staff in Canada and internationally are asked to not accept appointments, speaking engagements or distinctions or honours at the University of Toronto, until satisfactory changes are made”. I don’t know the details of the QS algorithms, but I think it’s fair to speculate that neither support for academic freedom nor a university’s ability to attract top faculty for appointments, speeches, distinctions or honours is factored in, or, if factored in, weighted appropriately.
Digging just a little deeper, someone with a modicum of understanding of the university system in Canada, and in Ontario in particular, would know that the University of Toronto is one of Ontario’s 23 public universities, all of which have programs approved and regularly reviewed for quality by the same government, are funded under the same formulae, and provide the same economic support for students. Degrees at a particular level are considered equivalent locally, and courses are often transferable between institutions. When not under censure, the University of Toronto is indeed a high-quality university; so is the University of Ottawa, where I work, Carleton (the other Ottawa-based university), and all the other Ontario universities. Specific programs frequently undergo additional accreditation. My department offers a Master of Information Studies program that is accredited by the American Library Association (ALA). Both the Ontario government and the ALA require actual data in their quality assurance / accreditation processes. This includes evidence of strategic planning, but not guesswork about future output.
If QS is this far off base in its assessment of universities in the largest province of a G7 country (the epitome of the Global North), how accurate are QS and other global university rankings in the Global South? According to Stack (2021) and the authors of the newly released book Global University Rankings and the Politics of Knowledge (http://hdl.handle.net/2429/78483), global university rankings such as QS and THE, and the push for the Global South to develop globally competitive “world class universities”, are more about reproducing colonial relations, marketizing higher education and commercializing research than assuring high-quality education. The attention paid to such rankings distracts universities, and even countries, from what matters locally. As Chou points out, the focus on rankings leads scholars in Taiwan to publish in English rather than Mandarin, although Mandarin is the local language. A focus on publishing in international, English-language journals creates a disincentive to conduct research of local importance almost everywhere.
My chapter in this work focuses on the intersection of critiques of metrics-based evaluation of research and how this feeds into the university rankings system. The first part of the chapter, Dysfunction in knowledge creation and moving beyond, provides a brief history and context of bibliometrics, the development of traditional and new metrics-based approaches, and major critique and advocacy efforts to change practice (the San Francisco Declaration on Research Assessment (DORA) and the Leiden Manifesto). The unique contribution of this chapter is a critique of the underlying belief behind both traditional and alternative metrics-based approaches to assessing research and researchers: the assumption that impact is good and an indicator of quality research, and that it therefore makes sense to measure impact, with the only questions being whether particular technical measures of impact are accurate or not. For example, if impact is necessarily good, then the retracted study by Wakefield et al. that falsely correlated vaccination with autism is good research by any metric – many academic citations both before and after retraction, citations in popular and social media, and arguably a factor in the real-world impact of the anti-vaccination movement, the subsequent return of preventable illnesses like measles, and the challenge of fighting COVID through vaccination. An alternative approach is suggested, using the University of Ottawa’s collective agreement with the APUO (the union of full-time professors) as a traditional means of evaluation that considers many different types of publications, and that considers quantity of publication in a way that gives evaluators the flexibility to take into account the kind of research and research output.
Morrison, H. (2021). What counts in research? Dysfunction in knowledge creation and moving beyond. http://ruor.uottawa.ca/handle/10393/39088 In: Stack, M. (2021). Global University Rankings and the Politics of Knowledge, pp. 109 – 130. http://hdl.handle.net/2429/78483
Stack, M. (2021). Global University Rankings and the Politics of Knowledge. http://hdl.handle.net/2429/78483
by: Heather Morrison, Luan Borges, Xuan Zhao, Tanoh Laurent Kakou & Amit Nataraj Shanbhoug
This study examines trends in open access article processing charges (APCs) from 2011 – 2021, building on a 2011 study by Solomon & Björk (2012). Two methods are employed: a modified replica and a status update of the 2011 journals. Data is drawn from multiple sources, and the datasets are available as open data (Morrison et al., 2021). Most journals do not charge APCs; this has not changed. The global average per-journal APC increased slightly, from 906 USD to 958 USD, while the per-article average increased from 904 USD to 1,626 USD, indicating that authors choose to publish in more expensive journals. Publisher size, type, impact metrics and subject affect charging tendencies, average APC and pricing trends. About half the journals from the 2011 sample are no longer listed in DOAJ in 2021, due to ceased publication or publisher de-listing. Conclusions include a caution about the potential of the APC model to increase costs beyond inflation, and a suggestion that support for the university sector (responsible for the majority of journals and nearly half the articles, with a tendency not to charge and very low average APCs) may be the most promising approach to achieving economically sustainable no-fee OA journal publishing.
A preprint of the full article is available here: https://ruor.uottawa.ca/handle/10393/42327
The two base datasets and their documentation are available as open data:
Morrison, Heather et al., 2021, “2011 – 2021 OA APCs”, https://doi.org/10.5683/SP2/84PNSG, Scholars Portal Dataverse, V1
Citation: for the article, cite the original URL rather than this blog post URL; if citing data, use the data citation above.
Morrison, H., Borges, L., Zhao, X., Kakou, T.L., Shanbhoug, A.M. (2021). Open access article processing charges 2020 – 2021. Preprint. Sustaining the Knowledge Commons. https://ruor.uottawa.ca/handle/10393/42327
by: Xuan Zhao, Luan Borges, & Heather Morrison
The Directory of Open Access Journals (http://doaj.org) is an excellent service that fulfills many important functions, in particular facilitating access to a vetted collection of over 15,000 freely available peer-reviewed journals. The DOAJ search services and metadata download are very useful for researchers as well. The purpose of this post is to alert researchers to some limitations of the DOAJ metadata that they need to take into account to avoid drawing erroneous conclusions. First, when downloading DOAJ metadata, it is necessary to open the .csv file as Unicode in order to retain non-English characters. We open the file in OpenOffice for this reason, then save it as an Excel file. The nature of the metadata means that some data is inserted in the wrong column; clean-up, as discussed below, is necessary before data analysis. When journal editors or others working on their behalf enter metadata into DOAJ, research is not the primary purpose of the exercise; for this reason, in-depth assessment and corrections may be necessary before analysis. Below, we present publisher size analysis as an example of what researchers may encounter. Finally, because the main purpose of DOAJ is connecting readers with content, the metadata of interest to a particular research project may not be up to date. As demonstrated below, as of Jan. 5, 2021, only 30% of DOAJ journals have a “last update” date within the previous year (2020). We do not know whether the “last update” date reflects a full or partial metadata review. We illustrate the potential impact on research results with the example of the SKC longitudinal APC study. Of the 4,292 DOAJ journals that responded “yes” to the APC question, only 30% have a last update date of 2020 or 2021. Even for this 30% of journals, we have no way of knowing whether the APC status and/or amount per se was updated, or only other, unrelated metadata.
This means that if we compare 2019 prices obtained from publisher websites in 2019 with 2021 DOAJ APC metadata, we will almost certainly get incorrect results, for example by falsely assuming that matching APC amounts mean no change in prices. DOAJ provides rich and useful metadata for the researcher, and the research question “is this journal listed in DOAJ?” is of value in and of itself. For this reason, we intend to continue using DOAJ metadata in addition to data derived from other sources, particularly data derived directly from publisher websites. See below for a link to an open data version of the DOAJ metadata reflecting the corrections explained in this post.
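For readers who prefer to script the encoding step, forcing UTF-8 when reading the download avoids the mangled characters; a minimal sketch using pandas (the file name and column names here are illustrative stand-ins, not the actual DOAJ export):

```python
import pandas as pd

# Illustrative stand-in for the DOAJ metadata download: a small CSV
# containing non-English characters (diacritics).
with open("doaj_sample.csv", "w", encoding="utf-8") as f:
    f.write("Journal title,Country of publisher\n")
    f.write("Revista Română de Biblioteconomie,Romania\n")

# Forcing UTF-8 preserves the non-English characters that can be
# mangled when a spreadsheet program guesses a legacy encoding.
df = pd.read_csv("doaj_sample.csv", encoding="utf-8", dtype=str)
print(df.loc[0, "Journal title"])  # Revista Română de Biblioteconomie
```

Saving from here to Excel also preserves the characters, which is the effect of the OpenOffice round trip described above.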
Correcting for displaced observations
As previously mentioned, the first step to confidently use the DOAJ metadata for analysis and research is identifying and correcting data inserted in the wrong column, herein also called displaced observations.
Below we can see an example of a displaced observation from the DOAJ metadata. Column BB has no assigned variable but contains some observations, apparently displaced one column to the right.
Users may follow different steps to correct for displaced data. Here we explain in more detail how we have identified these displacements and corrected them.
Before proceeding with any analysis, it is important to familiarize yourself with the DOAJ metadata. We recommend that users read the DOAJ Guide to applying, available online, because the metadata reflects responses to questions asked in the application process. The DOAJ metadata, as of 5 Jan. 2021, contains 53 variables ranging from Journal title to Country to Most recent article added. It may be helpful to start correcting observations in variables with easily identifiable responses, such as “Country” or “Country of Publisher”, or variables that allow only two types of answers (i.e. Yes or No), such as Author holds copyright without restrictions and APC. We recommend creating a pivot table to identify displaced observations, repeating this process until no observations are found in a wrong column.
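The pivot-table check on a Yes/No variable can also be scripted; a sketch of the same idea in pandas, using toy data (with one deliberately displaced row) standing in for the real metadata:

```python
import pandas as pd

# Toy metadata: the third row's values have been displaced, so a
# currency amount appears in a Yes/No column.
df = pd.DataFrame({
    "Journal title": ["Journal A", "Journal B", "Journal C"],
    "APC": ["Yes", "No", "1200 EUR"],
})

# Equivalent of a pivot table on a Yes/No variable: any value other
# than "Yes" or "No" flags a displaced observation to correct.
print(df["APC"].value_counts())
suspect = df.loc[~df["APC"].isin(["Yes", "No"]), "Journal title"]
print(suspect.tolist())  # ['Journal C']
```

Repeating this check across the Yes/No columns until no unexpected values remain mirrors the iterative pivot-table process described above.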
When cleaning up the DOAJ metadata, users will notice that in some cases only one observation was displaced; in other cases, an entire row was displaced beginning at a specific variable. In the example highlighted in yellow below, all observations beginning at the variable Publisher were displaced one column to the right.
Data entry inconsistencies
When correcting for displaced observations, we also identified some inconsistencies in the way observations are registered in the DOAJ metadata. The table below lists the main visible inconsistencies found for some variables. In the majority of instances, the inconsistencies will not impact DOAJ users looking up information for a particular journal. However, it is important to take these inconsistencies into account before proceeding to any automated statistical analysis. For example, the DOAJ metadata as is can be used to identify the number of journals with persistent article identifiers, but automated counting of DOI vs. ARK or other approaches would require some data manipulation in advance.
| Variable | Inconsistency |
| --- | --- |
| Alternative title | Some journals’ alternative titles are registered as a number, e.g. “2300-6633” and “0”. |
| Keywords | Some entries contain list numbering or special characters, e.g. “6. rheology, tribology, hydrodynamics, thermodynamics, mechanics of structures, mechatronics”; “water cycles, water environment, water treatment and reuse, water resource, water quality, hydrology”; “• natural sciences, • environmental sciences, • social sciences, agricultural sciences, veterinary medicine, medical sciences”. |
| Copyright information URL | Some URLs lack a letter “h” at the beginning or an “l” at the end, e.g. ttp://www.emeraldgrouppublishing.com/services/publishing/jiuc/authors.htm (there should be an “h” at the beginning and an “l” at the end). |
| Plagiarism information URL | Some URLs lack a letter “h” at the beginning or an “l” at the end. |
| URL for journal’s instructions for authors | Some URLs lack the letter “h” at the beginning. |
| Other submission fees information URL | Some URLs have extra letters, e.g. an extra “i” at the beginning; others lack the “h” at the beginning. |
| Preservation Services | Can be registered as a name or a website. |
| Preservation Service: national library | Can be registered as a name or a website. |
| Preservation information URL | Some URLs lack the letter “h” at the beginning. |
| Deposit policy directory | Can be registered as a name or a website. |
| Persistent article identifiers | Can be registered as an acronym (UDC, DOI, ARK), but also as a website, such as dc.identifier.uri (DSpaceUnipr) or NBN http://www.depositolegale.it/national-bibliography-number/. Another example: “UDC” and “UDC (Universal Decimal Classification)” are equivalent but were registered differently. |
| URL for journal’s Open Access statement | Some URLs lack a letter “h” at the beginning or at the end, or have an extra “h” at the beginning. |
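Some of the truncated-URL patterns lend themselves to a simple scripted repair; the sketch below is our own heuristic, not anything provided by DOAJ:

```python
def repair_url(url: str) -> str:
    # Two patterns noted above can be fixed mechanically: a missing
    # leading "h" ("ttp://...", "ttps://...") and an extra leading "h"
    # ("hhttp://..."). A missing final "l" (".htm" vs ".html") cannot be
    # repaired blindly, since ".htm" is itself a valid extension; those
    # URLs should only be flagged for manual review.
    url = url.strip()
    if url.startswith(("hhttp://", "hhttps://")):
        return url[1:]
    if url.startswith(("ttp://", "ttps://")):
        return "h" + url
    return url

print(repair_url("ttp://www.emeraldgrouppublishing.com/services/publishing/jiuc/authors.htm"))
```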
Publisher name duplicates: investigation and clean-up
The purpose of this project is to prepare a rough picture of publisher size to compare with Solomon & Björk’s findings (2012). To better perform the publisher size analysis, we specifically investigated publisher duplicates and corrected most of the obvious errors, such as small differences in punctuation and/or characters, extra spaces at the beginning and/or end, and minor variations in how the same publisher name was entered (please see examples in Table 4 – Investigative Strategies – Publisher Names Duplicates).
The clean-up process was divided into three stages. First, we created a pivot table for the publisher column to identify rows that were slightly different but were not grouped together. Second, when potential duplicates were found, we investigated to confirm the duplicates and/or to decide which name to keep (in priority order: use the name with the most journal entries; correct a name with an obvious typo; use the first name listed). Please see the investigative strategies below:
Third, after identifying inconsistencies in publisher names, we created a table (please see Table 5 – Corrections Gathering – Publisher Names Duplicates) to register all the corrections to the Publisher variable. About 500 inconsistencies were corrected. As a result, the number of publishers in the pivot table decreased from 7,218 entries (pivot table based on the DOAJ metadata) to 6,804 entries (pivot table based on the cleaned-up version of the database).
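A first pass at the duplicate detection described above can be automated before manual review; a sketch of a normalization that collapses the “obvious” variants (stray spaces, case, minor punctuation), using illustrative name variants:

```python
import re
from collections import defaultdict

def normalize(name: str) -> str:
    # Collapse the obvious variations: leading/trailing spaces, case
    # differences, minor punctuation, and doubled internal spaces.
    name = name.strip().lower()
    name = re.sub(r"[.,;:'\"]", "", name)
    name = re.sub(r"\s+", " ", name)
    return name

publishers = [
    "Editura Academica Brâncuşi",
    "editura academica brâncuşi ",
    "Editura  Academica Brâncuşi.",
]
groups = defaultdict(list)
for p in publishers:
    groups[normalize(p)].append(p)

print(len(groups))  # 1: all three variants collapse to one publisher
```

Note that a pass like this does not catch the “different beginning words” cases (e.g. “Eduem – Editora da Universidade Estadual de Maringá” vs. “Editora da Universidade Estadual de Maringá (Eduem)”); those still require manual investigation.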
As illustrated in the two tables above, there were different types of data inconsistencies. In order to respect the metadata to the greatest extent possible, we acted prudently when making decisions. In some minor variation cases, we clicked on the URLs to check publisher websites and collect convincing evidence. However, we encountered some complex challenges.
One of the challenges was language. Due to the large number and wide range of publishers (124 countries, 80 languages; DOAJ, 7 Feb. 2021, https://doaj.org/), we were unable to identify all of the sources of information. In addition, when URLs were invalid or information did not match, it was difficult to achieve any precision. Furthermore, among the 7,218 publisher name entries, some potential duplicates were not grouped together because they begin with different words, for example “Editora da Universidade Estadual de Maringá (Eduem)” vs. “Eduem – Editora da Universidade Estadual de Maringá” and “Academica Brâncuşi” vs. “Editura Academica Brâncuşi”. Such entries sort far apart and are hard to detect. More details can be found in Table 6 below:
Different beginning words (examples):

- “Academica Brâncuşi” vs. “Editura Academica Brâncuşi”
- “Alexandru Ioan Cuza University of Iaşi” vs. “Editura Universităţii ‘Alexandru Ioan Cuza’ Iaşi”
- “Editora da Universidade Estadual de Maringá (Eduem)” vs. “Eduem – Editora da Universidade Estadual de Maringá”
Unmatched publisher names (examples):

| Original publisher name | Possible correct name | URL |
| --- | --- | --- |
| Canadian Society for the Study of Education | The Canadian Association for Curriculum Studies | https://jcacs.journals.yorku.ca/index.php/jcacs/index |
| Badan Penelitian dan Pengembangan Kesehatan | Pusat Penelitian dan Pengembangan Biomedis dan Teknologi Dasar Kesehatan (the URL directs to a new web link with this publisher name) | |
| Shaheed Beheshti University of Medical Sciences and Health Services | Kowsarmedical | http://journals.sbmu.ac.ir/jme |
Invalid URLs (examples):

| Original publisher name | Original URL (invalid) |
| --- | --- |
| Alborz University of Medical Sciences | http://enterpathog.com/?page=home ; https://jehe.abzums.ac.ir/index.php?slc_lang=en&sid=1 (the first URL wrongly directs to a website whose contents are meaningless; when we searched the journal title, we were directed to https://enterpathog.abzums.ac.ir/) |
| Instituto Nacional de Salud (INS) | http://revistas.ins.gov.py/index.php/rspp/ |
| Instituto Superior de Ciências de Educação do Huambo | http://revista.isced-hbo.ed.ao/rop/index.php/ROP/index |
Given the barriers and challenges mentioned above, we can draw conclusions about the limitations of the publisher names clean-up project. Precision is not possible in this project because the question “who is the publisher?” is complex. Instead of making any definitive claims about publisher size, we are primarily interested in whether the long tail effect (a few big publishers, a few more middle-sized, most very small) reported by Solomon & Björk (2012) can still be observed in DOAJ in 2021.
DOAJ metadata update analysis
The following analysis was conducted to determine whether DOAJ metadata on article processing charges (APCs) – charging status and amount – would be sufficient for SKC’s longitudinal study on APC trends over time. The answer is clearly no. The metadata for the vast majority of journals in DOAJ (overall and APC-charging) has not been updated for more than a year, and it is unknown whether the most recent update included an update to APC or other metadata. We will continue to use DOAJ metadata, as it is rich and the question “is this journal listed in DOAJ?” is of value in and of itself; however, for price comparisons we cannot rely on this data, as doing so would likely lead to erroneous conclusions.
DOAJ journals by year of last update.
This chart illustrates the percentage of DOAJ journals by year of last update. Detailed figures are in the table below. Note that just under half the journals were last updated 2 or more years ago (2018 or earlier).
Table: DOAJ last update as of Jan. 5, 2021 (year; # journals last updated; % journals last updated).
DOAJ APC charging journals by year of last update
The chart above illustrates the percentage of journals that answered “yes” to the DOAJ question about charging APCs, by year of last update. The table below provides the detailed figures. Note that only 30% of DOAJ journals that charge APCs were updated in the past year (2020 or 2021). It is also unknown whether, in these cases, the last update was a thorough review of the metadata or an update of non-APC data only.
Table: DOAJ last update, APC-charging journals only, as of Jan. 5, 2021 (year of last update; # of journals last updated; % journals last updated).
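The year-of-last-update breakdown is straightforward to reproduce from the metadata; a sketch in pandas, assuming a column of ISO timestamps (the column name “Last updated Date” is our assumption about the DOAJ export, and the dates here are toy data):

```python
import pandas as pd

# Toy stand-in for the DOAJ "Last updated Date" column.
df = pd.DataFrame({"Last updated Date": [
    "2020-06-01T10:00:00Z", "2018-03-12T09:30:00Z",
    "2020-11-20T14:00:00Z", "2017-01-05T08:00:00Z",
]})

# Count journals by year of last update and convert to percentages,
# as in the tables above.
year = pd.to_datetime(df["Last updated Date"]).dt.year
pct = (year.value_counts(normalize=True) * 100).round(1).sort_index()
print(pct)  # 2017: 25.0, 2018: 25.0, 2020: 50.0
```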
A version of the Jan. 5, 2021 DOAJ metadata file reflecting the corrections explained in this post is available as open data here:
Directory of Open Access Journals; Zhao, Xuan; Borges, Luan; Morrison, Heather, 2021, “DOAJ_metadata_2021_01_05_with_SKC_clean_up”, https://doi.org/10.5683/SP2/G5LEXG, Scholars Portal Dataverse, V1
The Directory of Open Access Journals (DOAJ) online: https://doaj.org/
Solomon, D. J., & Björk, B. (2012). A study of open access journals using article processing charges. Journal of the American Society for Information Science and Technology, 63(8), 1485–1495. https://doi.org/10.1002/asi.22673
Cite as: Zhao, X., Borges, L., & Morrison, H. (2021). Some limitations of DOAJ metadata for research purposes. Sustaining the Knowledge Commons. https://sustainingknowledgecommons.org/2021/02/10/some-limitations-of-doaj-metadata-for-research-purposes/
Cross-posted from The Imaginary Journal of Poetic Economics
While many aspects of our lives and activities have slowed down during the COVID pandemic, this has not been the case with open access! The OA initiatives tracked through this series continue to show strong growth on an annual and quarterly basis. Important milestones are being reached, and others will be coming soon.
The Directory of Open Access Journals now lists over 15,000 fully open access, peer-reviewed journals, having added 379 journals (more than 4 per day) in the past quarter, and now provides article-level searching of over 5 million articles.
A PubMed search for “cancer” limited to literature from the past 5 years now links to full-text for over 50% of the articles.
The Bielefeld Academic Search Engine now cross-searches over 8,000 repositories and will soon surpass the milestone of a quarter billion documents.
Anyone worried about running out of cultural materials during the pandemic will be relieved to note that the Internet Archive has exceeded a milestone of 6 million movies in addition to over 27 million texts (plus audio, concerts, TV, collections, webpages, and software).
Analysis of quarterly and annual growth for 39 indicators from 10 services reflecting open access publishing and archiving (Internet Archive, Bielefeld Academic Search Engine, Directory of Open Access Books, bioRxiv, PubMedCentral, PubMed, SCOAP3, Directory of Open Access Journals, RePEc and arXiv) demonstrates ongoing robust growth beyond the baseline growth of scholarly journals and articles of 3 – 3.5% per year. Growth rates for these indicators ranged from 4% to 100% (doubling). 26 indicators had a growth rate of over 10%, 15 had a growth rate of over 20%, and 6 had a growth rate of over 40%. The full list can be found in this table.
Thank you to everyone in the open access movement for continuing the hard work that makes this growth possible.
The open data edition is available here:
Morrison, Heather, 2020, “Dramatic Growth of Open Access Sept. 30, 2020”, https://doi.org/10.5683/SP2/AVBOW6, Scholars Portal Dataverse, V2
This post is part of the Dramatic Growth of Open Access Series.
Cite as: Morrison, H. (2020). Dramatic Growth of Open Access September 30, 2020. The Imaginary Journal of Poetic Economics https://poeticeconomics.blogspot.com/2020/10/dramatic-growth-of-open-access.html
Notre Tanoh Laurent Kakou a créé un blog pour son propre projet de recherche en libre accès, C.A.S.A.D.: Centre d’Accès aux Savoirs d’Afrique et de sa Diaspora.
Quelques articles seront familiers aux lecteurs de Soutenir les savoirs communs, le travail de l’équipe; d’autres sont de nouvelles recherches faites par Tanoh. La vidéo Qu’est-ce que la revue Afroscopie?, un entretien avec Benoit Awazi, est éclairante pour quiconque s’intéresse à la recherche en Afrique francophone.
Our Tanoh Laurent Kakou has created a blog for his own research project in open access, C.A.S.A.D.: Centre d’Accès aux Savoirs d’Afrique et de sa Diaspora.
Some articles will be familiar to readers of Sustaining the Knowledge Commons as the work of the team; others are new research by Tanoh. The video Qu’est-ce que la revue Afroscopie?, an interview with Benoit Awazi, is enlightening for anyone who is interested in research in francophone Africa.
Thank you and congratulations to our Tanoh Laurent Kakou, a doctoral candidate in communication (and graduate of ÉSIS) on passing his comprehensive exam this summer! Best wishes to Tanoh and his research.
Update July 15: a 10-minute YouTube video overview of this work by Dr. Rahman and me can be viewed here.
The context of this paper is an analysis of three emerging models for developing a global knowledge commons. The concept of a ‘global knowledge commons’ builds on the vision of the original Budapest Open Access Initiative (2002) for the potential of combining academic tradition and the internet to remove various access barriers to the scholarly literature, thus laying the foundation for an unprecedented public good, uniting humanity in a common quest for knowledge. The global knowledge commons is a universal sharing of the knowledge of humankind, free for all to access (recognizing reasons for limiting sharing in some circumstances, such as to protect individual privacy), and free for everyone qualified to contribute to. The three models are Plan S / cOAlition S, an EU-led initiative to transition all of scholarly publishing to an open access model on a short timeline; the Global Sustainability Coalition for Open Science Services (SCOSS), a recent initiative that builds on Ostrom’s study of the commons; and PubMedCentral (PMC) International, building on the preservation and access to the medical research literature provided by the U.S. National Institutes of Health to support other national repositories of funded research and exchange of materials between regions. The research will involve analysis of official policy and background briefing documents on the three initiatives and relevant historical projects, such as Research Councils UK’s block grants for article processing charges, the EU-led OA2020 initiative, Europe PMC and the short-lived PMC-Canada. Theoretical analysis will draw on Ostrom’s work on the commons, theories of development, under-development, epistemic / knowledge inequity and the concepts of Chan and colleagues (2011) on the importance of moving beyond north-to-south access to knowledge (charity model) to include south-to-south and south-to-north (equity model).
This model analysis contributes to building a comparative view of transcontinental efforts toward a global knowledge commons built on shared values of open access, sharing and collaboration, in contrast to the growing trend of commodification of scholarly knowledge evident both in traditional subscription / purchase-based scholarly publishing and in commercial open access publishing. We anticipate that our findings will indicate that a digital world of inclusiveness and reciprocity is possible, but cannot be taken for granted, and that policy support is crucial. Global communication and information policy have much to contribute to the development of a sustainable global knowledge commons.
Full text: https://ruor.uottawa.ca/handle/10393/40664
Cite as: Morrison, H. & Rahman, R. (2020). Knowledge and equity: analysis of three models. International Association of Communication and Media Researchers (IAMCR) annual conference, July 2020.
By Anqi Shi & Heather Morrison
307 SpringerOpen titles for which we have data, covering journals that were fully open access at some point from 2010 to the present, were studied, with a primary focus on pricing and status changes from 2019 – 2020 and a secondary focus on longitudinal status changes. Of the 307 titles, 226 are active, fully open access and still published by SpringerOpen; 40 have ceased publication; 19 were transferred to another publisher; 18 journals that were formerly open access are now hybrid; and the remaining 4 journals were not found. 6 of the now-hybrid journals transitioned from free to hybrid in the past year. Of the 226 active journals published by SpringerOpen, 51% charge APCs. The average APC is 1,233 EUR, an increase of 3% over the 2019 average. Of the 101 journals for which we have both 2019 and 2020 data, 46.5% did not change in price, 13.9% decreased in price, and 39.6% increased in price. The extent of change in price was substantial, ranging from a 50% price drop to a 94% price increase.
Detail – download the PDF: springer open 2019-2020
Data (for DOAJ 2016 – 2019 data for journals that are now hybrid see columns BV – ): Springeropen_2019_2020
Cite as: Shi, A. & Morrison, H. (2020). SpringerOpen pricing trends 2019-2020. Sustaining the Knowledge Commons May 25, 2020 https://sustainingknowledgecommons.org/2020/06/11/springeropen-2019-2020/