Academia’s obsession with silly questions and made up figures

By Dave Abson


First, let me state that I really like Trends in Ecology and Evolution; it is an excellent journal that publishes many interesting articles (see, for example, academia’s obsession with quantity; Fischer et al. 2013*†). Indeed, I would be very happy to have an article in such an esteemed organ, so the following should not be seen as a critique of TREE. It just happens to be the case that there are always copies of TREE on the table around which we hold our weekly team meeting. Therefore, TREE is the only journal that I read—while giving my complete and undivided attention to important team meeting stuff, obviously—cover to cover. So it is in TREE that the issues of “silly questions” and “made up figures” are most obvious to me, although I see the same issues in many other journals as well.

What do I mean by silly questions? Well, here are a few examples of article titles from recent(ish) issues of TREE: Are we willing to build a better future? Do simple models lead to generality in ecology? Does research help to safeguard protected areas? Can evolutionary design of social networks make it easier to be ‘green’? Peerage of Science: will it work? Do we need a global strategy for microbial conservation?

The answers to these questions (in case you are not sure) are: yes, no, yes, no, no, yes. Only joking; in fact, the reason I think these are “silly” questions is that in every case above the answer can be (more or less) summarised as “Well, we are not sure; we think that maybe, sometimes, the answer is yes/no; it depends on the particular circumstances”. It is not that such “silly question” papers can’t offer interesting insights into important issues; it is just that they are framed so broadly and deal with such complex, multi-faceted and context-dependent problems that it is simply not possible to provide “correct”, or perhaps even meaningful, answers to them.

So why ask the question in the first place? I think the answer has something to do with that second great obsession of academia (alongside quantity), namely novelty. There seems to be an increasing mania for academics to always ask the new, big, globally relevant question, even when there is no answer to such a question. It seems to me that these big, unanswerable questions are published at the expense of smaller, more focused and contextualized questions for which meaningful answers can be given.

I think “made up figures” are part of the same cultural malaise (see Figure 1, for example**). The rush to be the first to discuss the new big topic means that publications are often not based on detailed empirical evidence. Who has time to waste collecting and analysing actual evidence for some proposed relationship between x and y, when someone else can provide a figure showing “the conceptualised relationship between x and y” and beat you to publication by a couple of years? Don’t get me wrong: I think such conceptualised figures can have a useful heuristic purpose, but too often they seem to be used as a substitute for actual data, and the fact that they are simply made-up lines on a graph is often skated over in the publications.

[Figure 1]

Increasingly, academia seems to value the “big”, “broad”, “new” and “cutting edge” while marginalizing the small, focused, incremental and well-tested accumulation of knowledge that was previously seen as the cornerstone of the scientific endeavour. I am not convinced that this is a particularly good thing.

 

† You may also wish to read and cite the six other excellent publications that Professor Fischer has in this highly respected journal.

* Full disclosure: a) I am a sometime employee of Professor Fischer; b) Professor Fischer is considerably larger than I am; this means that c) I am more or less contractually obliged to only say nice things about Professor Fischer, his work and his behaviour towards his employees; and d) I am not at all physically intimidated when Professor Fischer “politely asks me in a calm and reasoned manner” if I would consider writing an (unpaid) blog entry for him… see point c for contextual clarification of point d.

** Warning: this figure is not based on empirical evidence; real relationships may differ.


7 thoughts on “Academia’s obsession with silly questions and made up figures”

    • If I were not so afraid I would take offense at that remark. As it is, I will simply say that I have always believed that “sounding smart” was what I was good at…

  1. Interesting blog post that raises a few good points. For example, striving for novelty can be a problem, and testing something a bunch of times, in different places and in different ways, allows us to see how general relationships are.

    I agree it is a pity that much fieldwork is now marginalised. I have had colleagues who have had problems publishing such work because it is “too narrow” or “is only a single site study.” This shouldn’t happen.

    I’m not sure I completely agree with your opinions about broad, globally relevant work though. You seem to be suggesting that this is in some way “bad science” – I suggest you look at the recent blog post at Dynamic Ecology for a (kind of) rebuttal of this (http://dynamicecology.wordpress.com/2013/11/07/the-one-true-route-to-good-science-is/).

    Do you have a problem in principle with such broad, big and cutting edge work?

  2. I don’t have any problem at all with broad, big and cutting edge work, but I don’t think these things should be the only (non rigour/quality based) criteria journals use for publication. Moreover, I think there are times when the rush to do broad and “big” science means that the edge applied to the problems is not as sharp as it might have been if a little more time had been spent on understanding the details. One example of this blunt tool would be the (in)famous ecosystem services paper by Costanza (1997), in which a value for the world’s ecosystem services was provided. The paper is hugely cited, but littered with flaws and conceptual problems that have propagated through ecosystem services research ever since.

    I think that sometimes there is a pressure to present big science ideas without actually doing the big science (so it is not the big science that is the problem).

    • I agree that the obsession with the new can sometimes mean that poor work gets published in big journals.

      I partly agree with you about the Costanza paper. It is all complete bollocks and makes no sense whatsoever. I would love to say that people have noticed this but I saw Costanza present a couple of years back and he still stood by it. Positives did come out of his research though – I think it really kicked off the current ecosystem services obsession which is generally a good thing, even if a lot of it is done quite badly.

      I’m glad to hear that you don’t think ‘big science’ is per se the problem, because I have been getting the feeling that there is a bit of an antagonistic relationship developing between field researchers and those who use data collected by others.

      I think we need to foster this relationship by, for example, giving credit when we use the data of others. Without this, research will ultimately suffer.

      If you’re interested in this issue Jarrett Byrnes has set up an open letter to ISI (the company behind Web of Knowledge) to improve the attribution of credit to those who work collecting data in the field: https://t.co/vw0JHy0A2H

  3. Pingback: Friday links: the history of “Big Data” in ecology, inside an NSF panel, funny Fake Science, and more | Dynamic Ecology

  4. Pingback: How (not) to influence the direction of your field | Dynamic Ecology
