Science is in need of fundamental reform. That is the belief of the initiators of Science in Transition. Science has become a self-referential system where quality is measured mostly in bibliometric parameters and where societal relevance is undervalued.
The Science in Transition initiators have put forward their ideas in a position paper (read pdf here). This has kindled a debate among researchers and policy makers in the Netherlands. In November 2013 the Science in Transition initiative organised a two-day conference. In separate follow-up meetings, the initiators will discuss the topics raised with representatives from KNAW, NWO, VSNU and others. This should lead to an agenda for change. A first draft of this agenda, formulated by Science in Transition, follows below.
UPDATE: In June 2014 Science in Transition released a status report with debate, progress and recommendations. Download pdf here.
UPDATE 2, June 2015: A new and updated English overview of Science in Transition milestones reached in a year and a half. Download pdf here.
Science in Transition joins a worldwide chorus
Of course, the Science in Transition initiative is not the first to notice that science has gone wrong. All over the world there is debate on how to assess research quality and make sure science does the right things. A few examples:
* The San Francisco Declaration On Research Assessment wants to put an end to the use of bibliometric parameters when deciding which researchers should receive grants or jobs. (December 2012)
* Newspaper The Economist made the problems in science a cover story (“How Science Goes Wrong“). It focuses on unreliable research and states that many errors in science go uncorrected. (October 2013)
* Nobel Prize winner Randy Schekman calls for a boycott of journals with high impact factors like Science, Nature and Cell. (December 2013)
* The Reproducibility Initiative wants to reproduce landmark studies since reproducing important papers in the current system is not rewarded, while it is of vital importance.
* Medical journal The Lancet wants to “increase value and reduce waste” in biomedical research. It discusses ways to do so in a series of articles. (January 2014)
* The US National Institutes of Health are exploring initiatives to restore the self-correcting nature of preclinical research. (January 2014)
* Promotion and grant committees should be reading through papers and judging research by its merit, says Nobel Prize winner Sydney Brenner. “I know of many places in which they say they need this paper in Nature, or I need my paper in Science because I’ve got to get a post doc. But there is no judgment of its contribution as it is.” (March 2014)
* Biomedical science in the US needs to be rescued from its “systemic flaws”, write Bruce Alberts and Harold Varmus in PNAS (March 2014). One of their recommendations is “to gradually reduce the number of entrants into PhD training in biomedical science — producing a better alignment between the number of entrants and their future opportunities—and to alter the ratio of trainees to staff scientists in research groups.”
* Academic environments often place more value on the discovery itself and less value on learning how to realize the potential benefit of its application. This should change, universities should foster implementation science, write three doctors in the New England Journal of Medicine (May 2014).
* The European Commission starts an online “Public consultation ‘Science 2.0’: Science in Transition” about the changing science system. The Science in Transition initiative features prominently in the background analysis. “In the Netherlands, an intensive debate has evolved on the basis of a position-paper entitled ‘Science in Transition’. The ongoing debate in the Netherlands addressed, among other, the issue of the use of bibliometrics in relation to the determination of scientific careers.”
* Former Secretary General of the European Molecular Biology Organization Gottfried Schatz analyses the effects of Big Science in an essay in Nature Reviews Molecular Cell Biology. The exponential growth of science has led to meaningless quantification, a crisis in peer review, reproducibility problems and the rise of fellowships (May 2014).
* Modify reward system for science to create reproducible and translatable research, says John Ioannidis in PLoS Medicine. With the current reward system “an estimated 85% of research resources are wasted”. (October 2014)
* Science should strive for “impact, not impact factor”, says PNAS Editor-in-Chief Inder Verma. “When it comes to judging the quality and significance of a body of work, there is no substitute for qualitative assessment. And it bears repeating that the impact factor is not an article-level metric, nor was it intended as a yardstick for comparing researchers’ scholarly contributions. However, at many institutions performance assessments hinge greatly on this number, which currently wields outsize influence on the advancement of scientific careers.”
* In an extensive review of quantitative indicators, ‘The Metric Tide’, a British committee concludes: “There is legitimate concern that some indicators can be misused or ‘gamed’: journal impact factors, university rankings and citation counts being three prominent examples.” (July 2015)
* In PNAS, two researchers “[s]how that biomedical research outcomes over the last five decades, as estimated by both life expectancy and New Molecular Entities approved by the Food and Drug Administration, have remained relatively constant despite rising resource inputs and scientific knowledge.” (July 2015)
The Science in Transition agenda for change
Science is broadly appreciated, and rightfully so. Its successes are staggering, and modern societies are inconceivable without its influence. At the same time, many aspects of science are under discussion. The way in which the quality of research is guaranteed, the value of the many degrees awarded, the connection to the social and economic agenda, the way in which knowledge figures in public debate: all of these issues raise questions. Many developments in the near future (complex social issues, rising student numbers, cutbacks, polarised political debate) can be expected to only increase the pressure to come up with adequate answers.
Science in Transition wishes to consider the various issues involving science in a comprehensive way. On the basis of an agenda, it wishes to inspire a broad discussion in order to explore the nature of the situation, the possibilities and desirability of improvement and the parties that are best suited to take the initiative.
1. The image of science
Science is often referred to in the singular. In reality, “science” encompasses many divergent disciplines using various methods and practices. Notwithstanding this variation, which can hardly be classified under one denominator, the question “what is science” is still important. The answer will never be solely descriptive (“how it works”) but will always contain a normative element (“how it should be”). The image of science binds scientists and tests their performance: it expresses the underlying values and sets the standard to which people practicing science have to conform in order to arrive at valid statements. The image of science is also important for the outside world. It constitutes the basis on which the outside world grants science a relatively autonomous position, and gives it trust and funding. But if outdated or romanticised images prevail, practice does not always accord with expectation. If science’s reputation is one of disinterestedness while it is encouraged to collaborate with business, and if curiosity is supposed to be science’s only drive while ambition gets it onto the front page, friction will arise and problems will surface.
Is it important to disseminate a public image of what science is? What kind of image should this be? A correction of an overly mythical image? And according to what normative model should science behave? Is transparency a solution or a path to disaster? And who exactly should be transparent? Ownership of the problem lies not only with scientists but also with administrators and scientific organisations (KNAW, NWO) and, to a certain extent, with journalists and “the media”.
2. Trust in science
Opinion polls show that public trust in science is high. Such trust is important; no sector can do without it. But to a considerable extent, science is internally organised on the basis of scepticism and mistrust: the claims made are systematically and critically tested (which – in other ways – also applies to economics and politics). Distrust as such is therefore not wrong. Regularly, issues come under discussion in which science plays an important role and which make tempers flare. Think for instance of discussions about fracking, the climate, sustainable energy, food safety and the promises of new medical treatments. In these cases, the question is whether trust and mistrust are still in balance. The discussion about genetically modified food has reached a stalemate between pro and contra. Are the different sides still attempting to acquire a critical perspective on assumptions and methods, or has scepticism become fruitless here, leading to lost opportunities? The question is how the right balance between trust and mistrust can be reached with regard to these very different issues. Where and by whom can changes be instigated? Are better procedures needed to streamline the knowledge-policy relationship and to prevent the abuse of science? Or is this not about procedures but about the recognition of the intrinsically political character of some issues, and the limited role science can play there? It is hard to give an unequivocal definition of “trust”. And yet, everyone agrees that it is a precarious commodity: it arrives on foot and leaves on horseback. Unreflective trust or mistrust is therefore undesirable. And who should take on this task?
3. Quality
The most important mechanism for quality control in science is self-regulation. Scientists review each other’s work as colleagues, as a professional duty. But how can this be done in an objective and neutral manner? The currently dominant system relies predominantly on numbers of publications, citations, awards won, research applications granted, funding acquired and possible patents. It measures and quantifies. But not everything that is of importance, and not every kind of product or possible scientific result, can be assessed fairly in this way. For instance, we should be aware that the natural sciences, the social sciences and the humanities each have their own, mutually different styles. Not all kinds of scientific practice are equally suited to the quantitative expression of quality. And long-term positive social and economic effects are in danger of disappearing from view.
Is the present system still adequate for measuring quality? Is it possible to map more diverse results? How can the discussion about the nature of quality, and about the qualities we find desirable, best be conducted? How can such a dialogue be made a structural part of scientific work? And who should take on this task? NWO, VSNU, KNAW and NFU – in general, the organisations where many if not all scientists gather – so that everyone commits to the quality criteria, nobody can shirk a discussion about desirable assessment criteria, and (even more importantly) the very different objectives can be harmonised.
4. Fraud and deceit
Integrity is a value scientists strongly subscribe to, as any self-respecting professional group does. But recent cases have repeatedly shown that fraud and deceit cannot be ruled out among scientists. The control mechanisms of the sciences are meant to give them a self-cleansing function: the critical monitoring of each other’s work should prevent abuses. But these control mechanisms are primarily equipped for the quantitative assessment of quality, expressed in numbers of publications in recognised top journals. The precise substantiation of published articles is often insufficiently checked. Most scientists who judge the work of a fellow scientist do not consider this their task. Moreover, the time and resources to do it properly are lacking. Checking the work of others is not valued in itself and is seen as less rewarding than being in the news oneself. The incentives and rewards for it are therefore meagre.

How can control in the sciences be improved? Should the present mechanisms be enhanced with incentives that make checking as appealing as publishing one’s own work? Or should completely different mechanisms be put in place, for instance checking by third parties or by research funders? Could more transparency and data sharing prevent much of the damage? Or does the challenge primarily lie at the level of personal integrity and professional culture? Who should take on this task? Should a new control institution be established? Should this be centrally managed, via NWO or KNAW? Or should its management be decentralised, at the level of the universities? Is coordination per discipline a better idea? And is there a role in all of this for area administration and research schools?
5. Science communication
Science takes up a much larger part of the daily news than the size of the science supplement would suggest. Articles and news items on health, but also on food, climate, energy and the environment, not to mention economic and financial stability, are often underpinned by insights or applications from the sciences. But how do these items reach the news? In what form are they supplied? With what expectations? And in what way do scientific insights influence people’s lives? Science often has a formative role. New technology changes people’s lives, can make some professions almost superfluous or usher in new business. Historical research partly determines the identity of a country or a people. Concepts from the social sciences (such as, in the Netherlands, the distinction between “autochtoon” and “allochtoon”) have great bearing upon the self-image of a society.

Are scientists not relying one-sidedly on a limited number of specific communication channels: on publications if they wish to reach colleagues, on the media for the public at large? Should scientists not communicate with much more focus? Do they leave communication to communications officers, while they themselves have a role in boosting public debate? Are the communications offices of the universities not first and foremost concerned with marketing? Or do they also have a task in genuinely scientific communication, and can these two tasks be combined? Or in the careful transmission of relevant insights to the groups concerned? Is the science press sufficiently critical towards information from the scientific world, or is it only interested in ready-made news items? Do science journalists know how science arrives at its facts? Are they familiar with the present practice of knowledge production? Who should take on this task? The press as a whole (and its umbrella organisation), but also the schools of journalism and the programmes where communications officers and other public information officers are trained.
6. Democracy and policy
Science and politics constitute separate domains. The first revolves around the search for truth: around the structure of reality and the ways to make plausible statements about it. The second revolves around collective decision-making in a democratically legitimate manner. This distinction is of crucial importance to guarantee the objectivity and disinterestedness of science, and to prevent politics from becoming a matter for experts who determine what is best for others while avoiding democratic accountability. Yet many of today’s issues demand a more intensive exchange between the two. “Fact-free politics” is a nightmare, and a science cut loose from society is a fantasy of very few. But the course of science is not decided through elections or broad public debate alone. Where may we look for change? The agenda is difficult to determine: how should we resolve the paradox of democracy and technocracy? It is unclear how the agenda for scientific research is drawn up. Scientists themselves have ways to assess the relevance of the research agenda internationally, for instance via articles, journals and conferences. But are their ways of following the social agenda really adequate? Is the Top Sector policy sufficiently attuned to the issues that are relevant to society as a whole? Public participation in scientific processes is a delicate matter. But many cases are known in which lay knowledge, or the involvement of volunteers carrying out research tasks, has yielded palpable results and, at least as important, has led to a constructive form of coproduction. Who should take on this task? Committees of the House of Representatives, ministerial committees, WRR, the Rathenau Institute, NWO, KNAW?
7. The connection between education and research
The connection between education and research is typical of universities. In contrast to pure research institutes and to colleges largely devoted to education, the university flourishes in the exchange between the two. On the one hand, there are professors who both develop knowledge and transfer it; on the other, there are students who not only absorb knowledge but also learn how it can be acquired in a valid way. Let us be clear: this is an ideal, and it is under pressure. Large student numbers and the increasing importance of a degree, combined with the threat of its decreasing significance, put great pressure on the educational mission. Bachelor students have less and less of a sense of what science is actually about. Professors, in their turn, are assessed largely on the basis of their research results, notwithstanding all the hours they work in the classroom. There is a need for a fundamental discussion about the level of education to be given at universities – and about the advantages and disadvantages, the possibilities and impossibilities, of the connection between education and research.
Huub Dijstelbloem (Scientific Council for Government Policy, University of Amsterdam)
Frank Huisman (University Medical Center Utrecht, Descartes Centre, Utrecht University)
Frank Miedema (University Medical Center Utrecht)
Wijnand Mijnhardt (Descartes Centre, Utrecht University)