Is it me, or did we just read an article that began “Once upon a time,” only to find out that it was a 32-page-long thick-tongued meta-analysis of the depressing state of psychological science? Talk about false advertising.
There are two things I walked away from the Westen, Novotny & Thompson-Brenner (2004) article thinking: the DSM is worthless, and so are psychology articles written by biased researchers (and by biased researchers, I mean all researchers).
I propose three solutions:
1) A DSM Congress.
2) Data in place of language.
3) Open-source data sharing.
Westen et al. outline many concrete reasons why the current DSM is hindering the validity of research on ESTs: among many others, it doesn’t apply to most people, and it leaves out entire conditions, which have come to be ignored simply because they are not mentioned (p. 634). Many of the overall assumptions of EST research rest on the premise that we must draw lines between disorders, and those lines are based on DSM categorizations, which are bogus. Most of Westen et al.’s problems with the assumptions in EST research would be solved if an overhaul of our diagnostic system were put into place. Based on this article and the class discussion on September 5th, it seems that the system should be dynamic, so that it adjusts to the changing literature, and public, so that researchers always know the rationale behind every decision. A DSM congress, modeled after the current structure of the US Congress, seems to make sense. With knowledgeable representatives, a system for proposing bills and amendments, a public record, and the ability to constantly change the system, we would not have to worry about rewriting the constitution every twenty years, or about giving researchers reasons to write novella after meta-analytic novella on the problems contained in an unchangeable manual written two decades ago. A DSM congress would allow the diagnostic system to actually benefit from up-to-the-minute research discoveries and create an open forum for debate. It would allow growth in places where, under the current DSM-IV-TR state of affairs, we are crying out for change in the direction of an abyss.
In their section on “Maximizing the Efficacy of Clinical Trials,” Westen et al. elaborate on what scientists should be reporting in their publications, repeatedly pointing out cases in which researchers clearly misinformed the reader by wrapping their information up to “tell the best story” (p. 653). For example, they point out that when setting criteria for qualifying a participant as having completed the therapy, “many of the reports…used different definitions in different analyses. The only reason we even noticed this problem was that we were meta-analyzing data that required us to record Ns, and noticed different Ns in different tables” (p. 654).
In general, it seems that this is a problem with language: articles are written for people to read. But it is silly to read data in the form of words and expect it to lack bias. How many times have you looked at the procedure and results sections of a psychology article and just scanned for the vitals? We don’t need to read these sections; we need a list. What’s more, lists can be standardized. I say throw out everything between the intro and the discussion, and replace it with a standardized chart created by the publishing journal before the paper is even submitted. The procedure and results sections would then contain ALL the information we actually need, much like the little box for carbon on the periodic table tells us that it’s got 6 protons and weighs 12.01 amu. Let the reader know everything, save time, and conserve space. [edit] Meanwhile, slanted opinions and biases can be shared in the introduction and the discussion sections. This might even lead discussion sections to grow in length: the reader, not having just waded through statistics in the form of words, has the energy to absorb a well-synthesized argument for why the data are significant. [end edit]
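To make the “standardized chart” idea concrete, here is a minimal sketch of what a journal-issued record might look like if it replaced the prose procedure and results sections. This is only an illustration: the field names and the filled-in values are hypothetical placeholders, not an existing reporting standard.

```python
# A sketch only: every field name and value below is a made-up placeholder,
# meant to illustrate what a journal-standardized "results chart" might hold.
from dataclasses import dataclass, asdict
import json


@dataclass
class TrialRecord:
    study_id: str
    n_screened: int
    n_randomized: int
    n_completed: int
    completer_definition: str  # one explicit definition, used in every analysis
    primary_outcome: str
    effect_size_d: float
    followup_months: int


record = TrialRecord(
    study_id="hypothetical-trial-001",
    n_screened=220,
    n_randomized=120,
    n_completed=94,
    completer_definition="attended at least 12 of 16 sessions",
    primary_outcome="BDI score at post-treatment",
    effect_size_d=0.62,
    followup_months=12,
)

# The same record can be rendered as a printed table for readers or exported
# as machine-readable JSON for meta-analysts.
print(json.dumps(asdict(record), indent=2))
```

The point is simply that a single, journal-mandated record would make it impossible for the Ns in one table to quietly differ from the Ns in another, which is exactly the problem Westen et al. stumbled onto.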
Regardless of whether the above proposal makes any sense, if research is to move forward, it is clear that we need to change our approach to data sharing. The more people have access to a given data set, the better the research will be and the faster we will find the answers we are looking for. Sharing data is not unlike sharing the source code of software programs, the model behind “open-source software.” In the past, hoarding source code as a company secret has rendered software weak, such as Microsoft’s Windows and pretty much everything associated with it (have you ever tried to successfully use Windows Media Player? I mean come on). By contrast, take the success of Mozilla Firefox, one of the most functional and secure internet browsers around, built on open-source code. Firefox blows Internet Explorer out of the water with its ability to block pop-ups, its easy-to-use interface, and its overall top-notch security. And it’s free. Amazing!
All of the complaints about money and grant-writing skills guiding the research agenda might be assuaged if the field used the internet the way the tech industry has. Unfortunately, we are still using clunky programs like PsycInfo and then manually downloading articles as PDF files, which are essentially pictures of pages set in a certain order. It’s 2007, and psychological science is using pictures of text as its main source of information. I keep looking for the technological progress in this state of affairs and find none.
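Here is an equally minimal sketch of what “open-source data” could mean in practice: instead of a PDF of someone else’s tables, a study ships a plain machine-readable file that any reader can re-analyze. The file name and column names below are hypothetical.

```python
# A sketch only: "trial_outcomes.csv" and its columns are hypothetical
# stand-ins for a dataset a study might publish alongside the article.
import csv
from statistics import mean

with open("trial_outcomes.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Any reader can recompute the basics instead of trusting how the authors
# chose to narrate them.
completers = [row for row in rows if row["completed"] == "1"]
print(f"N in file: {len(rows)}")
print(f"N completers: {len(completers)}")
print(f"Mean post-treatment score (completers): "
      f"{mean(float(row['post_score']) for row in completers):.2f}")
```

That is the whole trick behind open source: more eyes on the raw material, fewer places for errors (or wishful storytelling) to hide.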
So, I say, democratize the DSM, force researchers to show all their cards if they are going to get published, and put aside pride for the sake of progress by creating an open-source data sharing medium that will utilize research to its fullest extent.
Or maybe we're doomed. Maybe, in the words of Cory Doctorow in his essay "Metacrap": "It's wishful thinking to believe that a group of people competing to advance their agendas will be universally pleased with any hierarchy of knowledge. The best that we can hope for is a detente in which everyone is equally miserable."
Monday, September 17, 2007
That's so meta.
Posted by Thrasher at 4:24 PM
6 comments:
Holy shit! Way to form an opinion! That was really something. I do find the idea of the DSM congress compelling, and I once proposed abolishing the APA Ethics Code for similar reasons, suggesting "consensus conferences" on critical issues and the publication of policy statements on the basis of the consensus. While I actually mostly like the structure of scientific articles, I also get the part about data sharing. In fact, many neuroscience journals are now doing precisely this. You can't publish in, for example, Nature Neuroscience, without handing over your dataset for public download. Finally, about the need for a new paradigm of data sharing via the internet, well, you simply must be right. I think it is inevitable, but things like this can be glacial.
wow--i don't normally leave comments for the other people in our class, but this post was really neat. i really enjoyed reading it! lots of really cool ideas. i'm excited to hear what you have to say during class. :)
also, i too felt way tricked by the "once upon a time" opening...
see ya!
"we are crying out for change in the direction of an abyss."
ohmahgaw! drop tha bomb, Cat.
Inspirational stuff! I say we render a Manifesto and nail 95 or so of them to the door of the APA (where is that, anyway?), or just post them online.
My only disagreement: I think the discussion sections are vital, as no one knows more about the specific research project (its context, the hypothesis that prompted it, the experiences involved in it) than the researchers who conducted it. Perhaps, though, a combination of thorough discussion (maybe in blog format?) and open-source datasets (neat idea!... though it would preclude lots of people from going to grad school, maybe) might be the ultimate 21st-century publication paradigm!
Matt, I totally agree! The discussion section and the intro are must-haves. After all, they're the fun part!
I miswrote what I was trying to say; see the above edit.
Thanks!
Check out the NASA CoLab Project here: http://colab.arc.nasa.gov/ Delia Santiago is working on this group... ask me if you'd like an introduction.
Also try googling for "NASA CoLab Group" and find lots of other blogs about sharing science data and research in isolation.
The CoLab group is trying to figure out how to get researchers to publish their raw data for the public to view and contribute to. The idea is that the public and other researchers will think of things that the PI never considered, and create something useful as a result.