The evolution of cooperation isn’t so puzzling

There’s a bit of a vibe in evolutionary anthropology/biology/theory that the evolution of cooperation is puzzling. A recent book, The Moral Brain, says “cooperation between unrelated individuals poses a puzzle from both the perspective of natural selection and that of rational self-interest”. Still, there are loads (and I mean *loads*) of theoretical models showing a variety of ways for cooperation to evolve. In a moment of heightened procrastination, I decided to delve into the literature and find out just how puzzling scientists find cooperation (warning: this is not even remotely a scholarly piece of work).
Continue reading “The evolution of cooperation isn’t so puzzling”

A tool to visualise the scientific literature

I spent an enjoyable couple of days last week at the beautiful Wellcome Trust, taking part in a data/text mining workshop run by ContentMine.

The idea behind their project is to develop tools that help scientists and other interested sorts pull data from published articles, potentially on a large scale. If you want to learn more, all their presentations are online, as are the materials from the Wellcome workshop.

The second day was a hackday. For this, my team wanted to build on the ContentMine tools to create something that helps you explore the scientific literature and find connections between papers you might not otherwise have found. We thought it would work something like this:
Continue reading “A tool to visualise the scientific literature”

My reviews from the UK Conference of Science Journalists 2014

Here are three things from last year’s UK Conference of Science Journalists (UKCSJ).

I was one of three people selected to pitch to real-life science editors in a Dragons’ Den-style session. My idea was to write an article about the increasing prevalence of shortsightedness (myopia) around the world. Scary stat: nearly one in three people might not have 20/20 vision by the year 2020. I didn’t get commissioned but you can watch the video here anyway.

After that intimidating experience, I reviewed two sessions. The first was about the use and misuse of statistics in science journalism:

“The Statistics in Science Journalism session at UKCSJ 2014 was a head-on collision between passionate journalists and the confusing monstrosity that is statistics. Deborah Cohen, the BMJ’s investigations editor, produced this session to help us understand how not to get things wrong.

Ivan Oransky, vice president of MedPage Today and co-founder of the excellent Embargo Watch and Retraction Watch blogs, led proceedings by taking us on a slide-by-slide journey through a realm of shoddy studies and equally shoddy reporting.”

Oransky’s presentation should be on his SlideShare page, but I couldn’t find it just now. He has plenty of other deeply interesting things on there, though.

The other session was about the issue of reproducibility in science:

“Science is in crisis, they say. Negative results don’t get published, while gibberish occasionally does; shaky studies are under-powered and over-reported; peer reviewers miss obvious mistakes and accept results that agree with their biases, regardless of merit; field-defining results cannot be replicated.

The current culture of ‘publish or perish’ doesn’t help matters. A scientist’s worth is judged based on how many papers they publish, how many times those papers are cited, and how much money they pull in.

Scientists, science journalists and others are beginning, however, to rage against the machine.”

Professor Chris Chambers, one of the speakers, put his excellent presentation (about pre-registering studies, replicating them and making data open) online here.

Me pitching to the UKCSJ (source: Twitter)

Daniel Kahneman on criticism within science

If you visit a courtroom you will observe that lawyers apply two styles of criticism: to demolish a case they raise doubts about the strongest arguments that favour it; to discredit a witness, they focus on the weakest part of the testimony. The focus on weaknesses is also normal in political debates. I do not believe it is appropriate in scientific controversies, but I have come to accept as a fact of life that the norms of debate in the social sciences do not prohibit the political style of arguments, especially when large issues are at stake—and the prevalence of bias in human judgment is a larger issue.

—Daniel Kahneman, Thinking, Fast and Slow (p.165)