
Science is hard

As David Shaywitz recounts in today’s Forbes, science is hard. Communication is hard (especially for scientists).

This is becoming increasingly apparent in the era of big-data science. The scientific complexities are compounded by current modes of communication, specifically two-dimensional text representations of complex analytics, which have become wholly insufficient for appropriate peer review and understanding of scientific claims.

As was explored by Kesselheim et al. (paywall), funding disclosures have a profound impact on how physicians interpret research findings. But ultimately an individual’s perception of research results comes down to trust. And the further one is removed from the science itself, the more assumptions one has to make, and ultimately the more one has to rely on preconceived notions to form an opinion. Under the current model of scientific communication, interpretation requires not only trust but a full-on leap of faith. Giving consumers a bridge across this vast chasm between scientific claims and the scientific process itself is one way to gain trust. Narrowing this gap is important specifically because it reduces reliance on the preconceived notions, such as those about a study’s funding sponsors, that are otherwise needed to form an opinion.

The clearScience pilot, funded by the Alfred P. Sloan Foundation, is specifically positioning itself to build infrastructure for more effective scientific communication. By leveraging the open APIs of GitHub, Amazon Web Services, and Synapse, clearScience demonstrates how scientists can easily transition from exploring data—executing science—to providing the scientific community with all the resources and artifacts needed to recreate their analyses. Conducting research in this manner allows reproducibility to be a byproduct of the process rather than a burden. More importantly, it provides a framework for the science to be extended, rather than treating publication as a finite endpoint for research.
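To make that concrete, here is a minimal, purely illustrative sketch of one such hand-off, using the Synapse Python client to store an analysis artifact together with provenance pointing back to the input data and the GitHub-hosted code that produced it. The Synapse IDs, file path, and repository URL are hypothetical placeholders, not artifacts of the clearScience pilot itself.

    import synapseclient
    from synapseclient import File, Activity

    # Log in to Synapse (credentials come from a local config file or auth token).
    syn = synapseclient.Synapse()
    syn.login()

    # Record provenance: the input data the analysis "used" and the exact
    # code revision it "executed" (here, a file at a specific GitHub commit).
    provenance = Activity(
        name="differential expression analysis",
        used=["syn000111"],  # hypothetical Synapse ID of the raw input data
        executed=["https://github.com/example-lab/analysis/blob/abc1234/run_analysis.R"],
    )

    # Store the result file in a Synapse project, attaching the provenance,
    # so reviewers can trace the claim back to its data and code.
    result = syn.store(
        File("results/deg_table.csv", parent="syn000222"),  # hypothetical project ID
        activity=provenance,
    )

Because the provenance travels with the artifact, anyone reading the eventual claim can walk backward to the exact data and code revision behind it.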

simplystatistics:

In our most recent video, Steven Salzberg discusses the ENCODE project. Some of the advantages and disadvantages of top-down science are described. Here, top-down refers to big coordinated projects like the Human Genome Project (HGP). In contrast, the approach of funding many small…

simplystatistics:

In a recent conversation with Brian (of abstraction fame), the relationship between mathematics and statistics came up. Statistics, for historical reasons, has been treated as a mathematical sub-discipline (this is the NSF’s view).

One reason statistics is viewed as a sub-discipline of math is…

An interesting question.

Methods, computational and otherwise, are also important. This is in part because people need to be able to (in theory) reproduce your paper, and also because in larger part progress in biology is driven by new techniques and technology. If the methods aren’t published in detail, you’re short-changing the future. As noted above, this may be an excellent strategy for any given lab, but it’s hardly conducive to advancing science. After all, if the methods and technology are both robust and applicable to more than your system, other people will use them—often in ways you never thought of. {gratis Mark DeLong for this link}

Methods & Results

I was supposed to be a concert violinist first.

Of course this meant many hours practicing daily; “commuting” 300 miles every other week for lessons; going to the Upstate NY summer music camp (almost uniformly called the “Concentration Camp”) that all the Juilliard kids went to… In music there are a couple of “phenotypes” you often see: the kids who are “technicians”—perfect intonation, meticulous fingering and bowing technique, flashy brilliance with technical passages; and the “musicians”, technically excellent—not quite the same level—but who exhibit a real affinity for interpretation, phrasing, great dynamic range, great feel for the piece at hand. The superstars were the people who had both (perhaps only one or two a generation). And to me, the epitome of such a superstar is Jascha Heifetz. He had a distinct musical style that was fully enabled by a preternatural technique that brought his musicality to a different level. His tremendous elasticity with phrasing, tempi, and tone served the music and was only possible because he had the absolute best technique of his generation—Roger Federer with zero unforced errors. Heifetz defined an entirely different phenotype because he was both a brilliant technician and musician.

So let’s get back to science.

In Mark DeLong’s blog post in response to the first post on this blog, “A scientific intervention”, he laments, “Look for the smallest font size in a scientific journal and you find the ‘methods section’. There’s a reason for the small typeface, and it seems to me the literal diminishment of method is a clear problem for science.” And I think Mark is making a key point here. How science is done takes a subsidiary role to what it produces, when the two really should have equal status.

It’s easy to see how things get this way. The “results” of biomedical research hopefully produce diagnostics or medicines that transmogrify research into practice—hence our obsession with results.

But science is a process, and as scientists we ignore process at our own peril. Our “technique”, our methods, really are part and parcel of our result. How we do something is inseparable from what we produce. We should treat them the same.

Our hope with clearScience, by exposing all the steps that lead up to a “claim” or a “result”, really is to make the process part of the result.

When I listen to a Heifetz performance, the thing that raises the hair on the back of my neck is that he does things musically that are only possible because his technique gives him the liberty to do them.

In the same fashion, scientific conclusions are an extension of how we arrived at them. And Mark’s point about the literal and figurative diminishment of method is an insight into why our field is struggling with reproducibility. We’re good at assertions rather than illustrations.

One day I watched Nat deleting and changing a lot of code that people had obviously spent a lot of time writing. Some people might feel scared to even save the file after deleting so much code. Nat didn’t hesitate at all. He said, “Ok, well this is all in git,” and just started deleting. He was right. There was nothing he could do that would set back anyone else’s work, and even if he pushed to a development server (not likely unless he was sure it was a good commit), it would probably only take someone a few minutes to roll things back to the way things were…
A blog post about “coding fearlessly”: technology (i.e., Git) that supports creative anarchy with the security that you can back out of your mistakes.

markdelong:

Essentially, “results” are tied to methods much more tightly than our current scientific publication modes let on. Look for the smallest font size in a scientific journal and you find the “methods section.” There’s a reason for the small typeface, and it seems to me the literal diminishment of method is a clear problem for science.

There’s nothing wrong with being wrong in science. Science is supposed to move forward as scientists test out one another’s ideas and results. But 21st-century science struggles to live up to this ideal. Scientific journals prize flashy, original papers (in part because journalists like me write about them). A disappointing follow-up simply doesn’t have the same cachet…