Everyone should have access to the same scientific information (what that information should be is the subject of another post). It’s not even about who pays for it. It is science, and as such belongs to everyone.
This post is my entry for Open Access Day 2008. It is about blogging, peer-review publishing, open access, elitism, prejudice and exclusion.
I had this dream: publishing directly on the web, combined with interactive comments on individual publications, would replace traditional peer-reviewed publishing in scientific journals. I set up SciPhu.com to achieve this. The core idea was (and still is) that any publication would remain fluid in its format and content, since communication with referees would be continuous and interactive. By letting the publication evolve in this way, it should asymptotically approach perfection.
I took inspiration for how a series of comments can paint the whole picture from posts (and their comment threads) like Anna Koushnir’s post on female scientists (where the comments are sometimes borderline harassment, but taken together they nail the relevant issues, I think) and Olivia Judson’s post “The monster is back” (I could have picked any post with more than, say, ten comments).
I realize now that I was a bit naive in believing this could completely replace a system that has worked so well for so long. The system has after all allowed the scientific community to retain an unsurpassed credibility, worldwide.
But I am not accepting defeat just yet, because traditional peer-review publishing has its problems.
Firstly, most published articles are not open access. That means that if you need a scientific article and you do not work at an institution that has paid a large amount of money to subscribe to the given journal (and you have no private subscription), you will not be able to access it (you can pay for individual articles, but that becomes very expensive over time). In contrast, blog publishing would be free and accessible to all. The cost of publishing on a blog is very low indeed: it needs no printing and no conventional employees. Editors and peer reviewers (those who comment) would all work on a volunteer basis. Consequently, there would be no need to charge the readers, at least not for scientific use (commercial use could potentially have a different set of rules).
Secondly, traditional paper publishing is painstakingly slow and, compared to web publishing, extremely inefficient. Blogs and scientific news sites can publish in a matter of minutes and provide continuous updates (and corrections) on almost any topic. Web publishing also lets you track your readers in an unprecedented way, and, most importantly, it provides a feedback channel for them. This feedback, I think, is essential in the next era of publishing. It does, however, need some structure; the commenting anarchy of today will not suffice.
Thirdly, the complete lack of such anarchy has been one of the strengths of the traditional peer-review model. But peer review has been controlled to such an extent that it has become the opposite of anarchy, namely dictatorship. Reviewers are handpicked by the publishers, creating potential old-boy networks with the journal editors as presidents. To add to this, reviewers are usually anonymous, potentially masking conflicts of interest, be they financial, work-related or personal. In addition, editors and peer reviewers tend to have agendas different from those who submit their papers. Thus, the traditional system does not serve to get as much quality scientific information out as possible, contrary to its intentions. Below, I’ll quote from a discussion I had with (editor) T. Ryan Gregory, to illustrate elitism, prejudice and exclusion as examples of editor agendas (comments from this Genomicron post):
My concern is that publish first, comment second represents an easy way around the rigors of review by experts in which publication is dependent on positive reviews and revision. I am all for open discussion, but initiatives started by people who don’t publish much in the peer-reviewed literature or do not themselves review many manuscripts do not really appeal to me because it adds to my sense of concern that this is a backdoor. I am not trying to seem elitist, I am just saying that peer review, for all its problems, is there for a reason. Submitted by T Ryan Gregory on 22 May 2008 – 8:10am.
May I ask what your record is in terms of reviewing manuscripts and publishing peer reviewed articles? Submitted by T Ryan Gregory on 22 May 2008 – 9:00am.
One might also be forgiven for thinking that someone with only a few publications might be looking to skip the hard (but necessary) stage of getting through reviewers. Isn’t it a bit odd to complain about the anonymous nature of peer review while moderating a “review” blog anonymously? Submitted by T Ryan Gregory on 22 May 2008 – 9:00am.
Peer review isn’t supposed to be democratic. It is supposed to be done by peers — a set of individuals with highly specific knowledge in a particular field. The democratic part comes only once the paper gets through that filter, when it is made accessible to the entire community. Peer review is a vetting process, not a rating process. Submitted by T Ryan Gregory on 22 May 2008 – 9:42am.
By now I have concluded that SciPhu and similar blog-publishing alternatives are not going to replace traditional publishing, and that other alternatives will have to. Blogs can and will, however, still be a valuable addition to traditional publishing, perhaps serving to correct some of its flaws:
“….fact is, at least now, if you come up for review and your citations are all on SciPhu or PLoS, you are going to get clobbered.”
This is very true. What I am saying is that I hope it will be different in the future. SciPhu is probably not the final solution, but it is a starting point, and hopefully one of many similar initiatives to come. A site like this can be developed into a wiki, or it can have staffed (unpaid) experts in given fields as reviewers, or it can develop in any other direction. But, and this is important, it should never require fees of any sort from referees, authors or readers. There are no fees attached to the SciPhu site (it doesn’t even have Google ads); it’s all non-profit scientific idealism. Submitted by Sciphu (not verified) on 22 May 2008 – 10:21am.
Money and open access, – this is the imperative issue that needs to be sorted out. Development of new publishing methods will most certainly follow. Peer-review publishing is dead, long live peer-review publishing.
Hence I pledge my complete and unrestricted support to any open-access initiative.
Everyone’s talking about their pet peeves, so I thought maybe I should too. Here are three of them:
1. Labeling DNA with unknown function as “junk”
2. Scientists in ivory towers and on top of their high horses.
3. Holding back on scientific arguments in fear that someone will use them in an unscientific way.
The last two are really about how scientists communicate with the rest of the world, and I’ll get back to that.
The post that lets me comment on all these issues at the same time is Scientists Cynical use of “Junk DNA” at Michael Eisen’s blog (I know the post is rather old, but it is new to me). Coincidentally, his post allows me to sum up my recent “junk” posts and the “creationist terror” post.
Quotes for each peeve
Unfortunately, for initially practical reasons, a disproportionate amount (surely in excess of 90%) of research has focused on protein-coding genes, fostering the faulty impression – amongst scientists as well as science writers – that the ~3% of the human genome that is protein-coding contains > 90% of the function.
This is good, if it means what I take it to mean: that labeling DNA of unknown function as “junk” by default is wrong. Which it most certainly is. For more on this topic, see my six-post discussion with Larry Moran (1, 2, 3, 4, 5, 6).
But then Eisen starts criticizing the press release for using this “junk” term:
They work on non-coding DNA precisely because they know it is NOT junk. So why, when it’s time to make a pitch to the local press officer, do they fall back on this old bromide? It obviously appeals to writers – who love it when they can pitch a story as overturning orthodoxy. It seems minor, but pegging it this way leads to some really attrocious misrepresentations of current biological knowledge.
This is bad for two reasons. Firstly, the term is not wrong; it is used for a piece of DNA with a previously unknown function. A lot of this DNA is currently labeled “junk” by the “junk people”. The only wrongdoing here is that they didn’t specify that regulatory elements have been known for some time, and that’s hardly a grave error. On the contrary, phrasing it like this in the press release underscores and highlights that much of what we previously labeled as “junk” in fact isn’t. That is really, really good; in fact, it’s an excellent way of enlightening the public that non-coding DNA isn’t necessarily “junk”. Secondly, when scientists communicate with the rest of the world, it is important to use terms that will not alienate. Only to a certain extent, of course, because oversimplification can easily distort the true message. But this press release is not an example of oversimplification. The critique of these news pieces amounts to nitpicking, and it strengthens the view of scientists as locked inside ivory towers or sitting on top of their high horses. There are plenty of examples of press releases that misrepresent science, are inaccurate, or are just plain wrong. The over-hyping of imminent cures for cancer, diabetes and Alzheimer’s is a good example of bad science news reporting, and it happens almost daily all over the world. Another example is getting the statistics all wrong when reporting on genetic tests. (There are, however, initiatives to try to fix this situation: you are hereby encouraged to go see HelixGene, and I strongly recommend joining if you can help, or reporting bad news coverage of science if you find any.)
Towards the end of the post I find my third pet peeve:
A second, and less obvious, problem is that this view has played into the hands of the intelligent design crowd.
And every time a new study comes out reporting that “junk DNA” is not junk, the ID’ers jump on it as validation of the predictions of ID. It’s hooey of course, but we needn’t give them the opportunity.
Which shows us Eisen is a victim of creationist terror (see my previous post on this topic), and it makes me sad that we as scientists do not have the guts to stand up against this terror. We must feel free to express whichever valid scientific argument we find relevant in a given topic or field. That some of us don’t makes me really, really unhappy.
To: HelixGene Genomic Experts
Thank you for adding me to your google group. I am excited to see what topics will emerge on HelixGene Genomic Experts and how the discussions will turn out.
I would love to see any project aiming to build a bridge between the popular press and scientists gain momentum. As I mentioned in the comments to the Gene Sherpa blog post, I have been wanting to do something similar for some time (SciPhu.com is supposed to be the starting point), but you are in a much better position than me to achieve this. To further the cause, I’d like to share my thoughts on “peer reviewing” popular-press scientific information by giving you a short summary of the SciPhu project proposal / business plan:
SciPhu.com shall be a website that brings experts and journalists (or the general public) together to produce accurate scientific information. Experts will “peer-review” what journalists have written, or what a company wants to use as PR material, and a label will go on the particular text to confirm its scientific accuracy. Experts need to be approved, but can be recruited from any part of the world, in any scientific field and, in theory, at most academic levels (skillful grad students and postdocs could easily participate). Experts could be graded on the quality of their peer reviews, and such grading could be used as a screen for doing peer review for paying customers like news agencies and company PR departments (the money incentive to get academics to participate). The website would thus serve as a 24/7 resource for anyone wanting to do quality assurance on scientific information; some would pay for this service (news agencies, PR companies/departments) and others would get it for free (institutions, government officials, the general public).
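To make the proposal concrete, here is a minimal sketch of how such a review record might be modeled. This is purely illustrative: every class, field and threshold below (Review, Submission, the two-review minimum, the 1–5 grade scale) is my own assumption, not part of any existing SciPhu code.

```python
# Hypothetical sketch of the SciPhu review workflow described above.
# All names, fields and thresholds are illustrative assumptions.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Review:
    expert: str      # approved expert's name
    accurate: bool   # did the text pass scientific scrutiny?
    grade: int       # community grade of the review itself (assumed 1-5 scale)

@dataclass
class Submission:
    text: str                      # journalist or PR text under review
    reviews: list = field(default_factory=list)

    def accuracy_label(self, min_reviews: int = 2) -> str:
        """Award the accuracy label only once enough experts agree."""
        if len(self.reviews) < min_reviews:
            return "pending"
        if all(r.accurate for r in self.reviews):
            return "scientifically accurate"
        return "disputed"

def expert_score(name: str, reviews) -> float:
    """Average grade of one expert's reviews: the screen for paid reviewing work."""
    grades = [r.grade for r in reviews if r.expert == name]
    return mean(grades) if grades else 0.0
```

The design choice worth noting is that the label is a property of the submission, not of any single review, which mirrors the idea that accuracy is certified by agreement among approved experts rather than by one gatekeeper.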
One of the HelixGene Genomic Experts approaches is to grade texts after publication. The other approach, as mentioned in your about section, is to act as a liaison between journalists and scientists, and this is what I strongly believe needs to be focused on. To that end, I have been looking at HARO (HelpAReporterOut), which is an e-mail group where journalists ask for expert opinions prior to writing their story. Incorporating such services could, I think, have great benefits (although the current HARO is focused on lifestyle/travel and other trivialities of life, rather than hardcore science).
I have more details worked out in my mind and plenty more (premature) ideas, but neither the programming/web skills nor the time to go through with this alone. I’ll just repeat my willingness to contribute and collaborate. I am very happy to see that my thoughts are shared by others.
The goal is accurate information. Information is fundamental in shaping the developments that constitute our future. This is the time to put some quality control on that information. HelixGene is a call out to all scientists: participate in this mission.
Nils Reinton, PhD
The other day I was listening to a talk on evidence-based medicine and how to navigate the literature without getting lost in too much information. The speaker went through some of the guidelines for effectively using scientific evidence in clinical practice and medical research. These guidelines alone seem to constitute information overload, but if you want to try for yourself, a starting point can be found at Cambridge University Library.
One of his conclusions that I found particularly interesting was that to penetrate this massive amount of information, you need to use Sturgeon’s law. The law states that, to avoid information overload, you have to assume that
“Ninety percent of everything is crap”.
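As a toy illustration (mine, not the speaker’s), Sturgeon’s law applied to a feed backlog is just a top-10% cut over some relevance ranking. The scoring here is entirely made up; the point is only the shape of the filter:

```python
# Illustrative only: Sturgeon's law as a feed filter.
# Keep the top 10% of items by a (hypothetical) relevance score.

def sturgeon_filter(items, key, keep_fraction=0.10):
    """Return the best `keep_fraction` of `items`, ranked by `key`."""
    ranked = sorted(items, key=key, reverse=True)
    cutoff = max(1, int(len(ranked) * keep_fraction))  # always keep at least one
    return ranked[:cutoff]

# Example: 1000 unread feed items with made-up relevance scores.
feed = [{"title": f"item {i}", "relevance": i % 97} for i in range(1000)]
worth_reading = sturgeon_filter(feed, key=lambda item: item["relevance"])
print(len(worth_reading))  # 100 items left of the 1000
```

Of course, the hard part in practice is the `key` function: deciding what counts as the non-crap ten percent is exactly the triage problem the rest of this post is about.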
I immediately thought of friendfeed, not because there’s a lot of crap there (there isn’t, not in my crowd anyway... not yet), but because friendfeed is the current spearhead aggregator of information, information from channels that were already overloaded, like twitter, the blogosphere and web news.
Here’s a descriptive tweet (on friendfeed) from Berci Mesko some time ago:
“I’m absolutely not worried when I see I have 1500 feed items to read. Am I totally mad?”
A possible solution (using something similar to Sturgeon’s law) comes from the blog post Why I Stopped Reading Blogs (for a while):
1000+ items. That’s what Google Reader told me I need to read to catch up with my RSS subscriptions. It’s intimidating. My RSS feeds were mocking me. I could see them with sneaky voices “hee hee, you’ll never read me, you don’t have the time. ha ha.” The sad part is, they were right………..
….took a nice long look at the list and asked myself – does this matter to me? Do I even know this person? Will I be worse off without this content in my life? No. No. No.
And doing something like this may help you avoid some dire consequences (from Slaw.ca):
There are numerous studies to suggest that information overload makes us dumber: persons exposed to excessive amounts of information are less productive, prone to make bad decisions, and risk suffering serious stress-related diseases.
Me, on the other hand: I never got to the point where I had 1000+ entries in my reader, as I only have three or so blogs there. But I also follow twitter and friendfeed. And then I’ve got a couple (three? maybe four?) of science news sites I go through on a semi-daily basis. In addition, I follow feed networks like The DNA Network, and there’s mail correspondence and of course journals to skim through (and, consequently, articles to examine), as well as a couple of books I’d like to read.
What I do to keep from overloading is simply to click and read only when I have some time to spare. I also very rarely go beyond the first page of friendfeed, and seldom look at historic postings on blogs or news sites. I just do not have the time. I have this life I need to live, and it keeps getting in the way of the internet and reading in general.
This does, however, mean that I am missing out on a hell of a lot, and that timing is essential to the information I get. Thus, although I have tried to optimize the information channels I take in, I am basing my information (knowledge?) on luck of timing.
I will continue to do so, I guess, because you just can’t have your cake and eat it too. Which basically is Sturgeon’s law, only reformulated, and a comfort to my ignorance.
Sean O’Donoghue and Lars Jensen
|Reflect: Automated Annotation of Scientific Terms|
Timothy Baldwin, Lawrence Cavedon, Sarvnaz Karimi, David Martinez, David Newman, Falk Scholer and Justin Zobel
|Effective Search, Classification, and Visualisation of Information from Large Collections of Biomedical Literature|
Vit Novacek, Siegfried Handschuh and Tudor Groza
|Teaching Machines to Teach Us: A Truly Knowledge-Based Publication Management|
Amr Ahmed, Andrew Arnold, Luis Pedro Coelho, Saboor Sheikh, Eric Xing, William Cohen and Robert F. Murphy
|Information Retrieval and Topic Discovery using both Figures and Captions in Biological Literature|
Stephen Wan, Robert Dale and Cecile Paris
|In-Context Summaries of Cited Documents: A Research Prototype for Academic and Scholarly Literature|
|Towards realising Darwin’s dream: setting the trees free|
Jose Gonzalez-Brenes, Aabid Shariff, Sourish Chaudhuri and Carolyn Rose
|Automating the Generation of Life Science Protocols|
Glenn Ford, Sameer Antani, Dina Demner Fushman and George Thoma
|Tools to build and use Interactive Publications|
Michael Greenacre and Trevor Hastie
|Guided Tours in N-Dimensional Space: Dynamic Visualization of Multivariate Data|
Alexander Garcia and Alberto Labarga
|A tale of two cities in the land of serendipity: The semantic web and the social web heading towards a living document in life sciences.|
Quotes from The Next Renaissance, A talk by Douglas Rushkoff
I am not a programmer. I thought maybe blogging would suffice in doing my part to change the world, – that has been, and still is, the distant goal of my blogging endeavor.
Computers and networks finally offer us the ability to write. And we do write with them. Everyone is a blogger, now. Citizen bloggers and YouTubers who believe we have now embraced a new “personal” democracy. Personal, because we can sit safely at home with our laptops and type our way to freedom.
But reading further in this piece in a recent Edge edition made me realize that to truly make an impact, knowing some molecular biology and writing about it, will not cut it.
But writing is not the capability being offered us by these tools at all. The capability is programming—which almost none of us really know how to do. We simply use the programs that have been made for us, and enter our blog text in the appropriate box on the screen. Nothing against the strides made by citizen bloggers and journalists, but big deal. Let them eat blog.
At the very least on a metaphorical level, the opportunity here is not to write about politics or—more likely—comment on what someone else has said about politics. The opportunity, however, is to rewrite the very rules by which democracy is implemented. The opportunity of a renaissance in programming is to reconfigure the process through which democracy occurs.
Since for the time being I do not have the time or the money to educate myself a second time around, blogging will have to do. And I still believe there’s some impact in that (maybe not in my blogging, but there’s without a doubt power in the blogosphere as a whole).
At some point, however, since true future power apparently lies in programming: off to school again, on a mission to rule the world.
In addition to posting news, a science blog should serve to make science more accessible by being less formal in its presentation. By blogging a personal (and political) viewpoint, a scientist can express thoughts they would otherwise be unable to express in a scientific journal or publication. A good science blog serves as a discussion hub for scientists, and I am pretty sure that blogs will in this way continue to be important information channels, both between scientists and out to the public.
Which blog is the best is, I think, irrelevant; we need a lot of them. Twitter and Friendfeed are just a bit too shallow for proper scientific arguments to be made, but just as blogs serve as an extension to journals, they serve as extensions to blogs.
With this post, my posts on genetic counseling now form a trilogy (which somewhat unfairly puts them in the same category as some amazing literature, and films).
From a recent Nature News Special Report:
No one denies that genetic test results can be life-altering for some individuals. But research by Theresa Marteau, a health psychologist at King’s College London, and others has shown that most people are remarkably resilient in the face of traumatic genetic test results. They typically report feeling anxious or depressed around the time of testing, but these effects dwindle within a few months.
This fits well with my first post, where I argued that the need for genetic counselors was overrated. After reading an article on Huntington’s disease, however, I changed my mind and wrote another blog post. But now this quote contradicts what I thought were my final conclusions, and I am left wondering where I stand... again:
Studies by Aad Tibben, a psychologist and psychotherapist at Leiden University Medical Centre in the Netherlands, and his colleagues showed that people who took predictive tests for Huntington’s disease mostly recovered from the shock. Many actually felt more in control after testing because they could make arrangements for care, or even for euthanasia.
And I am not the only one who is confused on these matters:
With so much uncertainty about how people deal with genetic risk, is genetic counselling necessary or helpful for people undergoing the less definitive tests for an increased propensity for heart conditions or diabetes? “I’m convinced it’s necessary,” says Tibben. But he and others in the field acknowledge that there is little in the way of controlled trials to support their belief.
I have decided to go with the conclusion that the best thing is probably to do the genetic counseling, then evaluate, and then stop doing it if it doesn’t work. This is simply because, to my knowledge, genetic counseling doesn’t do any harm. It may even do some good, even if the effect is all placebo:
“…Did the counsellor help the patient understand complicated risks, or just provide some face-to-face contact and empathy in a confusing medical world?
So, until someone comes out with a study saying that genetic counseling is harmful, this post will reflect my final (!?) position. End of story (trilogy).
After reading “Living at risk: Concealing risk and Preserving Hope“, which was an eye-opening experience, I am ready to argue against myself and the arguments in my previous post “Now why do we need genetic counselors ?”.
In that post I predicted that genetic counselors may soon be obsolete because nobody cares about low-risk alleles. In addition, I argued that information on high-risk alleles is better managed by physicians.
The not-caring bit is still true (unfortunately), as far as I know, but after reading the above-mentioned paper, I need to modify my opinion on high-risk tests. High-risk tests, in this context, are tests that, if positive, mean developing the disease in the near future. Testing for Huntington’s disease is a model example of such a test, since:
“Penetrance, the likelihood of showing symptoms of the disease if the associated genetic mutation is present, is virtually 100%.”
Thus, this is a clear medical case, and a physician should be able to give adequate counsel to the patient. But the issues a practitioner would face involve so much more than medicine alone, and the recommendations for counseling go beyond what is expected of a primary care physician:
Nearly every participant with children experienced terrible difficulty in talking to their children about their risk, even when the children were grown. We infer from this difficulty that practitioners could, and should, find ways to help people at risk develop plans for educating their children at an appropriate age. We envision such plans to be developmentally based, geared to answering questions at the child’s level, as well as being persistent and gradual in the presentation of the issues of importance.
This, to me, sounds like genetic counseling. Further arguments for genetic counseling come from the recommendations to clinicians:
Clinicians also need to reflect on their own beliefs and biases about genetic testing, and to examine the extent to which those beliefs and biases present themselves in their care for people at risk for HD. Primary care health professionals need to be cognizant of the fact that just because a test can be done does not mean that it should be done. What these men and women are telling us is that it is not safe to assume that genetic testing for incurable diseases will necessarily provide information that is wanted, or needed, by those at risk and that testing may have a significant negative impact on the lives of their patients.
Objectively reflecting on genetic testing, as well as telling the patient that it may actually be wise not to get tested, are probably things a genetic counselor would do better than a primary care physician.
So the conclusion must be: I was wrong, we need genetic counselors. Reading “Living at risk: Concealing risk and Preserving Hope” will tell you this. In addition, it will teach you that 80–85% of at-risk individuals elect not to undertake predictive genetic testing. They do so to survive, since a lack of hope can be devastating:
Something that my uncle said, that I think really stuck with me, is he wrote a suicide note. He said that there’s such a big difference between living with hope and living with knowledge. And that he would take the living with hope any day. And so he really did not think we needed to know, one way or another.
…and proper counseling (not only genetic) may help people handle their life at risk, since:
It is noteworthy that several participants said the interview for this study was their first opportunity to talk about the emotional side of HD, despite their years of experience with neurological, cognitive, and psychomotor testing …………… We think that unstructured interviews might actually change the views and actions of the participants with respect to their careful concealment of risk and their preservation of hope.
This is probably true. Regardless of the extent of counseling, it seems to me that genetic counseling for these patients and their family members is a good starting point.
The final take home message must be that not testing for a condition has significant value, especially when treatment options are scarce or non-existent.
Hope is sometimes a life saver. Knowledge, on the other hand, can put people’s lives in ruins. Use this as a guiding light if you will; I know I am going to.