
Research Data: Share Early, Share Often

timothy posted more than 2 years ago | from the coin-toss-1-coin-toss-2-coin-toss-3 dept.

Science 138

Shipud writes "Holland was recently in the news when a psychology professor at Tilburg University was found to have committed large-scale fraud over several years. Now, another Dutch psychologist is suggesting a way to avert these sorts of problems, namely by 'sharing early and sharing often,' since fraud may start with small indiscretions due to career-related pressure to publish. In Wicherts' study, he requested raw data from the authors of some 49 papers. He found that the authors' reluctance to share data was associated with 'more errors in the reporting of statistical results and with relatively weaker evidence (against the null hypothesis). The documented errors are arguably the tip of the iceberg of potential errors and biases in statistical analyses and the reporting of statistical results. It is rather disconcerting that roughly 50% of published papers in psychology contain reporting errors and that the unwillingness to share data was most pronounced when the errors concerned statistical significance.'"
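
For the curious, the "reporting errors" tallied in studies like this typically mean a reported p-value that does not match the reported test statistic and degrees of freedom. A minimal sketch of that kind of consistency check, with purely hypothetical numbers and tolerance, might look like this in Python:

    # Recompute the p-value implied by a reported t statistic and degrees of
    # freedom, then compare it with the p-value the paper reports. The example
    # values and tolerance below are hypothetical.
    from scipy import stats

    def check_t_report(t, df, reported_p, tol=0.005, two_sided=True):
        """Return (recomputed_p, consistent?) for a reported t-test result."""
        p = stats.t.sf(abs(t), df)
        if two_sided:
            p *= 2
        return p, abs(p - reported_p) <= tol

    # A paper reporting t(28) = 2.05 with p = .01: the recomputed two-sided p is
    # about .05, so the reported p-value is inconsistent with the test statistic.
    print(check_t_report(t=2.05, df=28, reported_p=0.01))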


Psychology (0, Troll)

oldhack (1037484) | more than 2 years ago | (#38281990)

What did you expect?

Re:Psychology (1)

Anonymous Coward | more than 2 years ago | (#38282120)

That's how Climategate started!!

Re:Psychology (2, Insightful)

rgbatduke (1231380) | more than 2 years ago | (#38282758)

And continues. Phil Jones, for example, has stonewalled requests for the raw data used to e.g. create HadCRUT3 etc, although recently it seems that one reason he hasn't shared it is that he lost it and literally can't share it. So we have a rather important temperature series, openly available on the web and used by many, many climate researchers and nobody can reconstruct it, including the original author. The problem continues -- it is like pulling teeth, getting members of the hockey team to share data and/or methods so anyone can check them.

Since the few times somebody has bulled through until they've succeeded, e.g. Steve McIntyre vs Michael Mann, what has been discovered is that the published result (the infamous MBH "hockey stick") is nothing but amplified, distorted white noise that has absolutely no correlation with the data used to produce it, let alone skill at reconstructing actual past temperatures, it doesn't bode well for the discipline.

I've recently written a guest article on WUWT calling for data/methods transparency in climate research. By transparent, I mean that you should not be allowed to publish a paper that could potentially influence lawmakers and public policy to the tune of hundreds of billions of dollars unless you simultaneously publish all contributory raw data (including any data you for any reason left out) and the actual computer code used to process it into figures and conclusions. Something this important needs full open source open data transparency even more than medical research (another discipline where reproducibility of results is abysmal, where there are vested interests galore, and where we spend/waste a phenomenal amount of both money and human morbidity and mortality on crap results).

rgb

Re:Psychology (4, Insightful)

sstamps (39313) | more than 2 years ago | (#38283176)

And continues. Phil Jones, for example, has stonewalled requests for the raw data used to e.g. create HadCRUT3 etc, although recently it seems that one reason he hasn't shared it is that he lost it and literally can't share it.

That is complete and utter bullshit.

First, he has never stonewalled requests for the raw data. It's been out there for ANYONE to obtain. The problem is that, for some of it, you have to PAY to get it, and UEA was forbidden by contract to give away said data for free because then people wouldn't PAY for it anymore. So, if you want to piss and moan about access to the raw data, then apply your angst and woe to the most responsible parties, the Met offices which want to profit from their weather data-gathering businesses.

Second, the "lost data" canard is a crock. Since the raw data is not owned or generated by UEA, but instead obtained from outside sources, they have NO mandate to keep the original raw data once they have processed it. They (and you and anyone else) can go and get it from the same sources at any time. Whip out your checkbook and get to it.

So we have a rather important temperature series, openly available on the web and used by many, many climate researchers and nobody can reconstruct it, including the original author. The problem continues -- it is like pulling teeth, getting members of the hockey team to share data and/or methods so anyone can check them.

You (and they) most certainly can get the original raw data and reconstruct it. There are literally mountains of data that have been released to the public on a large part of climate science. You just need to learn who and how to ask properly and, in some cases, how much it costs.

Here's [realclimate.org] a huge FREE repository of all kinds of climate-related data, from the climate scientists themselves.

Since the few times somebody has bulled through until they've succeeded, e.g. Steve McIntyre vs Michael Mann, what has been discovered is that the published result (the infamous MBH "hockey stick") is nothing but amplified, distorted white noise that has absolutely no correlation with the data used to produce it, let alone skill at reconstructing actual past temperatures, it doesn't bode well for the discipline.

Mann's work has been vindicated and replicated time and time again, McIntyre's (and others') quixotic attempts to discredit it notwithstanding.

I've recently written a guest article on WUWT..

That explains the ignorance of your previous comments a bit.

..calling for data/methods transparency in climate research. By transparent, I mean that you should not be allowed to publish a paper that could potentially influence lawmakers and public policy to the tune of hundreds of billions of dollars unless you simultaneously publish all contributory raw data (including any data you for any reason left out) and the actual computer code used to process it into figures and conclusions. Something this important needs full open source open data transparency even more than medical research (another discipline where reproducibility of results is abysmal, where there are vested interests galore, and where we spend/waste a phenomenal amount of both money and human morbidity and mortality on crap results).

In large part, this is precisely what happens, with a few exceptions. Those exceptions usually revolve around whether any kind of contract with private entities to obtain said data, or to develop software/hardware, is in effect that would preclude giving them away. That said, the research should (and usually does) document the specifications for said hardware/software, and include where the original data came from for anyone to pay to obtain it themselves.

As a software developer who actually writes software for scientific/academic research, I can assure you that I like to eat, too. That said, I think that critical research needs to plan projects in such a way that they can license all data and apparatus used in said research so they can share it with everyone. I think that is now happening in climate research a lot more than it used to, but it is by no means a common practice in most of academia. There's an awful lot of proprietary stuff out there; even so, its existence hasn't negatively impacted research as a whole, so I don't understand why it is such a problem now.

Re:Psychology (1, Informative)

mrcaseyj (902945) | more than 2 years ago | (#38283704)

sstamps wrote:
>First, he has never stonewalled requests for the raw data. It's been out there for ANYONE to obtain. The problem is that, for some of it, you have to PAY to get it, and UEA was forbidden by contract to give away said data for free...

No. Those who requested the data requested that if all the data couldn't be provided, then the freely available data should be provided. They were refused. When asked for a list of what data was used, but not the data itself, they refused. Even if the data is available for free on the net, how can the results be replicated if they will not say which data was used?

>Mann's work has been vindicated and replicated time and time again...

It has only been replicated by his buddies. It's like a study by an oil company being replicated by another oil company. There can be no vindication for trying to "hide the decline". It is a well established rule of science that you don't leave out data that casts doubt on your conclusion.

You've fallen for their story. Many of us used to think the alarmists were good willed, and we assumed they were honest. I still think they are good willed, but we now know they are not honest. They hide important information that casts doubt on their theories. And worse, when their colleagues are caught doing corrupt science, their community maintains a code of silence or defends the indefensible. This casts doubt on all the evidence brought by the entire climate science community.

Re:Psychology (5, Informative)

sstamps (39313) | more than 2 years ago | (#38285212)

No. Those who requested the data requested that if all the data couldn't be provided, then the freely available data should be provided. They were refused.

Bzzt. Wrong. Try again. [realclimate.org]

30. First, in answer to the question of whether the raw data are accessible and verifiable, Professor Jones told us that:
The simple answer is yes, most of the same basic data are available in the United States in something called the Global Historical Climatology Network. They have been downloadable there for a number of years so people have been able to take the data, do whatever method of assessment of the quality of the data and derive their own gridded product and compare that with other workers.

31. In addition, of course, there are the sources of the data, the weather stations, to which any individual is free to go and collect the data in the same way that CRU did. This is feasible because the list of stations that CRU used was published in 2008.

41. Professor Jones contested these claims. According to him, “The methods are published in the scientific papers; they are relatively simple and there is nothing that is rocket science in them”. He also noted: “We have made all the adjustments we have made to the data available in these reports; they are 25 years old now”. He added that the programme that produced the global temperature average had been available from the Met Office since December 2009.

51. Even if the data that CRU used were not publicly available—which they mostly are—or the methods not published—which they have been—its published results would still be credible: the results from CRU agree with those drawn from other international data sets; in other words, the analyses have been repeated and the conclusions have been verified.

When asked for a list of what data was used, but not the data itself, they refused. Even if the data is available for free on the net, how can the results be replicated if they will not say which data was used?

Jones PERSONALLY refused. The information about what data was used has been available since the original papers and research were performed! IT'S IN THE RESEARCH, DURRRR. Have you ever read any of it?

It has only been replicated by his buddies.

Bzzt! [globalwarmingart.com] Wrong. [skepticalscience.com] Try again. [berkeleyearth.org]

BEST was funded by the Koch brothers, owners of a giant oil/petrochemical company. Most DEFINITELY NOT "buddies" with Mann. Even still, being "buddies" in science doesn't mean diddly-squat; it's not about WHO you know, but WHAT you know, and HOW WELL you know it. So far, Mann's work has been REPEATEDLY vindicated.

There can be no vindication for trying to "hide the decline".

Ya know, for a minute there, I thought you might be trying to be genuinely serious and skeptical. Then you trot THAT out. /facepalm

It is a well established rule of science that you don't leave out data that casts doubt on your conclusion.

You are correct, it is, and the vast majority of climate scientists and their research faithfully follow that rule, no matter how many intellectually dishonest, ignorant, and gullible idiots falling for charlatans and snake oil salesmen like Watts, Michaels, Singer, et cetera ad nauseam, try to spin otherwise.

You've fallen for their story.

No, I've fallen for the FACTS of the matter. I've done my homework; I've looked beyond anyone's story; what's YOUR excuse?

Many of us used to think the alarmists were good willed, and we assumed they were honest. I still think they are good willed, but we now know they are not honest. They hide important information that casts doubt on their theories. And worse, when their colleagues are caught doing corrupt science, their community maintains a code of silence or defends the indefensible. This casts doubt on all the evidence brought by the entire climate science community.

Many of us so-called "alarmists" used to think that deniers were good-willed, and assumed that they were trying to be honest skeptics. Now, I know that they all are intellectually dishonest, if not downright maliciously ignorant. They cherry-pick the facts for the tiniest scrap of information to spin in support of their impaired mental state, and try to pass it off as if they weren't oblivious to the other 99.999% of the facts and evidence inconveniently contrary to it. Worse, when they are caught in their lies and ignorance, they NEVER, and I mean *NEVER*, admit fault and accept what they were wrong about. Worse still, they wait a few months, then trot out the SAME EXACT SHIT again and again, as if it was some startling new revelation that no one has ever possibly considered before, ESPECIALLY "those darn climate scientists; they NEVER think of this stuff". This casts doubt on their sanity as human beings, let alone their rationality as any kind of REAL skeptics.

Now, see how that works? Demonization works both ways, ya know.

I KNOW the scientists are beyond tired of it, and I can't blame them. I can imagine they feel like Copernicus or Galileo all over again, with the tidal wave of the ignorant masses constantly threatening to wash them "away" in one form or another.

Re:Psychology (0)

Anonymous Coward | more than 2 years ago | (#38284516)

Second, the "lost data" canard is a crock. Since the raw data is not owned or generated by UEA, but instead obtained from outside sources, they have NO mandate to keep the original raw data once they have processed it. They (and you and anyone else) can go and get it from the same sources at any time.

If there's a risk that they'd be charged again if they go and get it from the same sources, they're fools to delete it. And even if there isn't, it's common sense to hang onto your raw data in case you find a bug in your tools and want to reprocess it from scratch.

Re:Psychology (1)

sstamps (39313) | more than 2 years ago | (#38285572)

There is likely no risk; it is a digital data product, so it is licensed, and they probably pay a subscription fee so they can get any of the data (including more current data) at any time.

Re:Psychology (3, Interesting)

rgbatduke (1231380) | more than 2 years ago | (#38284610)

Hmmm, you really do need to read the Climategate 2 letters, don't you?

From message 4241.txt, a communication from Rob Wilson to Ed Cook (and others):

I first generated 1000 random time-series in Excel – I did not try and approximate the persistence structure in tree-ring data. The autocorrelation therefore of the time-series was close to zero, although it did vary between each time-series. Playing around therefore with the AR persistent structure of these time-series would make a difference. However, as these series are generally random white noise processes, I thought this would be a conservative test of any potential bias.

I then screened the time-series against NH mean annual temperatures and retained those series that correlated at the 90% C.L.

48 series passed this screening process.

Using three different methods, I developed a NH temperature reconstruction from these data:

1. simple mean of all 48 series after they had been normalised to their common period

2. Stepwise multiple regression

3. Principle component regression using a stepwise selection process.

The results are attached.

Interestingly, the averaging method produced the best results, although for each method there is a linear trend in the model residuals – perhaps an end-effect problem of over-fitting.

The reconstructions clearly show a ‘hockey-stick’ trend. I guess this is precisely the phenomenon that Macintyre has been going on about.


Surely this vindicates Mann -- by proving that it does indeed turn white noise into hockey sticks! Not only is Mann wrong, but the hockey team knows it perfectly well! There are letters where people openly lament being involved with the hockey stick type reconstructions (and other places, e.g. where they "hid the decline" in tree ring data) because they are terrible science and because they are openly worried that sooner or later people will catch on. As indeed they have, although they have won the PR war (another great Mann quote) to such an extent that even though they themselves know that the hockey stick is bogus and that white noise fit according to Mann's cherrypicking methodology will produce nothing but hockey sticks, it just won't die, will it? Thanks to people like you!
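
For anyone who wants to see the screening effect described in the email for themselves, here is a rough sketch in Python; the target series, window size, and threshold are invented for illustration and are not taken from any actual reconstruction:

    # Rough sketch of the screening procedure described in the email above:
    # generate pure white-noise "proxies", keep only those that correlate with a
    # target series that ramps up at the end, and average the survivors.
    import numpy as np

    rng = np.random.default_rng(0)
    n_years, n_series = 1000, 1000

    # Hypothetical "instrumental" target: flat, with an upward ramp in the last 150 years.
    target = np.zeros(n_years)
    target[-150:] = np.linspace(0.0, 1.0, 150)
    target += 0.1 * rng.standard_normal(n_years)

    # White-noise pseudo-proxies with no persistence.
    proxies = rng.standard_normal((n_series, n_years))

    # Screen: keep series whose correlation with the target over the calibration
    # window clears roughly the one-sided 90% significance level (r ~ 0.105 for n=150).
    calib = slice(-150, None)
    r = np.array([np.corrcoef(p[calib], target[calib])[0, 1] for p in proxies])
    kept = proxies[r > 0.105]

    # "Reconstruction" = simple mean of the retained series (method 1 in the email).
    recon = kept.mean(axis=0)
    print(f"{len(kept)} of {n_series} series passed screening")
    # recon hovers near zero for most of the record but rises across the calibration
    # window, i.e. a hockey-stick shape recovered from pure noise.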

We could review the specific Climategate 2 letters where Jones talks about deliberately trying not to give away data to the people who requested it (something I would call "stonewalling", except that the circumstance in question is a FOIA request that was only a missed deadline away from being "a crime" upon the release of the CG emails), or about the points where it turns out that he does a lousy job of keeping records (problems with Excel spreadsheets) and no longer can reproduce his own results because he doesn't know what data he used, if you like.

Or we could look at the many, many other places where internal communications show that the hockey team is well aware of many problems with their own results and consistently choose not to let the general public know about them lest we be led to doubt their conclusion. Then we could read Feynman's lovely article on "Cargo Cult Science": http://www.lhup.edu/~DSIMANEK/cargocul.htm [lhup.edu] . See how close you think the hockey team comes to Feynman's fairly modest standard for good, honest science, while reading Mann going on about the importance of winning the PR war, getting journal editors fired, and generally doing his very best to eliminate all challenge to his papers, or, if he can't manage that, eliminating the challengers themselves.

But really, read them yourself. Don't accept what people tell you about them, read them! Then tell me that this is honest science, well done.

rgb

More Errors in the Reporting of Statistical result (1)

Anonymous Coward | more than 2 years ago | (#38281994)

"It is rather disconcerting that roughly 50% of published papers in psychology contain reporting errors"

How many errors are present in this statement? Just saying.

First Post (-1, Offtopic)

Crash McBang (551190) | more than 2 years ago | (#38282002)

Got started early on my sharing!

You Mean... (3, Funny)

Anonymous Coward | more than 2 years ago | (#38282004)

"Trust me I'm a scientist" isn't good enough anymore?

Re:You Mean... (5, Funny)

Anonymous Coward | more than 2 years ago | (#38282256)

Did you know that you can just BUY labcoats?

Re:You Mean... (-1, Flamebait)

jimmerz28 (1928616) | more than 2 years ago | (#38282300)

Psychology isn't a science. It's a pseudoscience.

Re:You Mean... (3, Informative)

Trepidity (597) | more than 2 years ago | (#38282720)

A lot of these errors have been found in neuroscience journals, too, which fancies itself a harder science...

Re:You Mean... (4, Insightful)

jc42 (318812) | more than 2 years ago | (#38284844)

A lot of these errors have been found in neuroscience journals, too, which fancies itself a harder science...

Actually, this is mostly a special case of a problem that's recognized in most scientific fields: Much scientific work (experimental or observational) has a statistical component, and scientists generally don't have as good an understanding of statistics as their work requires.

Statistics shares a common problem with other basic subjects such as quantum theory, relativity, and chaos theory: they don't fit well with human "intuitive" concepts of how the world works. With quantum theory and relativity, this is fairly blatant, and people usually don't try to pretend to understand them until they've done some serious study. But with statistics (and chaos ;-), people tend to think they have at least a basic understanding of probability, and they also tend to think that that's all they need. They end up publishing data on the basis of output from packaged software that they don't understand well.

A while back, there was a discussion in a linguistic forum that I follow, about the Pirahã language which lacks words for numbers. As a way of explaining how people could survive without numbers, one contributor came up with an informative parallel: In the modern Western world, there are many important things (economics and climate are hot-topic examples) that can't be understood without an understanding of the important concepts of statistics. But one can easily argue that the dominant "modern" languages lack words for statistical concepts.

Nearly everyone will object that, for instance, English has well-known terms like "chance", "probability", "mean", "standard deviation", "correlation", etc. But, the author pointed out, these are "cargo-cult" terms, borrowed from an alien (i.e., scientific) language, with little or no actual understanding of their meanings by most of the native speakers of English. This is clear if you look for statistical terms in the English media, and figure out how they're being used. They are just magical terms used to sound convincing, but it's usually clear that the speaker/writer doesn't actually understand their technical meaning. Similarly, "quantum" is a common English word, but its common meaning is very nearly an antonym of the technical meaning in physics. Most English speakers have little or no understanding of the technical meanings of these terms.

In the case of statistical terms, scientists do tend to have taken a course or two in college. But understanding is low, barely above the common understanding used in the media and politics. So it's not surprising that a good number of papers in many scientific fields claim results that don't strictly follow from the data. If there is any sampling done to get the data (and there usually is), it's likely that the conclusions came partly from an interpretation of some software's output that is based on a misunderstanding of the statistical terminology.

Of course, when you get to the pseudo-sciences and the political arena, this process isn't accidental. Statistical buzz-words are often used as part of the psychological weaponry, to convince readers/listeners of whatever the writer/speaker is trying to convince them of. This is often done with malice aforethought, knowing that the public has almost no understanding of statistics.

Re:You Mean... (1)

rgbatduke (1231380) | more than 2 years ago | (#38282796)

Not biological psychology/neurophysiology. Not even all social psychology. The work on cognitive dissonance, for example, is pretty amazing and reproducible and explains so very much...

rgb

Re:You Mean... (0)

Anonymous Coward | more than 2 years ago | (#38282820)

++

Isn't it time we stopped pretending that a bunch of folks who couldn't conduct a controlled, double-blind study if their life depended on it are "scientists"?

Psychology is no more a science than homeopathy, phrenology or spirit-healing.

Re:You Mean... (2)

Eil (82413) | more than 2 years ago | (#38282876)

Psychology isn't a science. It's a pseudoscience.

Hey, there's a scientologist in our midst!

So if psychology isn't a science, then classical conditioning doesn't exist, despite the huge volume of evidence that says it does? There's no value in trying to understand how human reasoning and memory works? We can't learn anything at all about how brain damage causes changes in day-to-day behavior?

Fact: Anything researched and studied according to the scientific method is science. That there are some researchers who draw conclusions without appropriate methods or sufficient evidence, or that some areas are difficult to conclusively test does not cast the entire profession as pseudoscience.

Re:You Mean... (1, Insightful)

jimmerz28 (1928616) | more than 2 years ago | (#38283488)

Your "fact" is utterly incorrect. Psychology isn't a science like math, physics, chemistry, biology and computer science are sciences.

A science has laws with verifiable, reproducible outcomes that can be proven (psychology has theories of behavior, not laws). Look at Jung vs. Freud for a great example of why there are no laws of psychology: neither of them is wrong, but neither of them is right either, and that doesn't make a science.

Descartes used research and studied according to a scientific method to prove there was such a thing as a "mind", that didn't make him correct or the "science" of the mind an actual science.

Re:You Mean... (3)

king neckbeard (1801738) | more than 2 years ago | (#38283888)

The difference between a theory and a law isn't how verifiable a law is, but that theories attempt to explain why and laws do not. There is no explanation of why certain things happen in math and physics, so we have lots of laws in those fields. However, biology is far more abstracted from the fundamental truths of the universe, so it tends to have theories, since what is tested has explanations. Psychology is even more abstract, and thus would be even further down that line.

Re:You Mean... (1)

jimmerz28 (1928616) | more than 2 years ago | (#38285294)

Not sure where the tautology of the first sentence was supposed to go...I simply pointed out that when your field is filled with theories and does not contain any laws (which allow you to make verifiable claims) then your field is not a science.

It's a pseudoscience.

Similar to a pseudo-question, which is a question without an answer or for which any answer serves (e.g. Descartes' "Where are your thoughts?").

It looks like a question because it's imperative in form, but it is not a proper question. Just like psychology looks like a science in form, but is not.

Questions have verifiable answers, just like sciences have verifiable postulates (laws).

Re:You Mean... (1)

jimmerz28 (1928616) | more than 2 years ago | (#38285320)

And by "imperative" I meant "interrogative".

Re:You Mean... (1)

king neckbeard (1801738) | more than 2 years ago | (#38285522)

The first sentence was to address the common misconception that laws are more scientific or require more evidence than theories.

So, are you claiming that evolutionary biology is a pseudoscience? Really, I'm hard pressed to think of any laws that exist in biology at all, although I won't claim that there are none. However, I'm quite certain that you could have a biology class that is completely absent of those laws if they exist.

Re:You Mean... (1)

Anonymous Coward | more than 2 years ago | (#38284060)

After all the bullshit about "EVOLUTION IS JUST A THEORY LALALALALA", are you seriously dismissing a branch of science because it only has *theories* about how the world works? The process of science is the process of constructing and verifying theories. The defining feature of actually doing science is often being neither wrong nor right; in the science of physics, Newton's laws are a prime example of just that.

And are you seriously pointing to Jung and Freud as "a great example" of anything at all? No degree in Experimental Psychology worth more than the paper it is printed on will even mention either of them outside of a historical context. Their contribution to modern psychology and cognitive neuroscience is vanishingly small, besides justifying the actions of people who want to take money off you.

Re:You Mean... (1)

Belial6 (794905) | more than 2 years ago | (#38283770)

Well, child psychologists have come to a consensus that classical conditioning doesn't exist. Just look at the huge body of work that claims making a child uncomfortable will not discourage the activity that makes them uncomfortable.

Re:You Mean... (2)

BlueScreenO'Life (1813666) | more than 2 years ago | (#38283240)

Psychology is a vast field, and some claimed psychology teachings are indeed bullshit. For good examples of non-bullshit psychology, read Richard Wiseman's Quirkology [wordpress.com] and other works by Wiseman.

Re:You Mean... (1)

king neckbeard (1801738) | more than 2 years ago | (#38283636)

I'd say it's more of a protoscience than a pseudoscience.

Re:You Mean... (4, Insightful)

jc42 (318812) | more than 2 years ago | (#38282512)

"Trust me I'm a scientist" isn't good enough anymore?

Actually, in a very real sense, it never was. The story here is that those who were unwilling to let outsiders (i.e., independent researchers) study their data had a significant error rate. But this has generally been understood by scientists; it's why normal scientific procedure encourages getting second opinions from others outside the group.

If you don't want us seeing your data, that will normally be taken as a sign that you know or suspect that there are problems with your data. Attempts to block independent researchers from replicating the experiments or data collection (which is one of the main uses of patent law) are generally taken as an open admission that there's something wrong with your data.

"Trust me I'm a scientist" may sometimes work with the general public, but it really hasn't ever worked with scientists. A real scientist reacts to interesting scientific news with "Further research is needed", and applies for funding to carry out the research.

And Get Screwed by Better Funded Competitors (0)

Anonymous Coward | more than 2 years ago | (#38282042)

EOM

So who is going to pay for the costs? (1)

Anonymous Coward | more than 2 years ago | (#38282058)

It'll take money to store the data and make it available, and staff to manage it.

So who is going to accept an increase in taxes to allow this to happen?

Re:So who is going to pay for the costs? (2)

rgbatduke (1231380) | more than 2 years ago | (#38282864)

Are you kidding? What's the cost of storage on a webserver, per byte? Would that be "zero" compared to the size of any reasonable dataset in the discipline? It would. You could put up a single e.g. 10 TB server in a single lab for a few thousand dollars and it would cost a few hundred dollars a year to run and would handle all the data associated with all the publications in psychology in a decade.

What is expensive and wastes taxes is bozos who do crap research, publish the crap results, hide the crap data and crap methods, and are cited repeatedly in other people's work, a circle of error and corruption that often lasts for years before it is finally discovered and weeded out. We pay for that work already; we need to make people accountable for it by requiring data/methods transparency (if you are e.g. not privately funded). That way the bozos would have research careers that are either over instantly or they'd get so sharply corrected by their peers that they'd wake up and do things right.

rgb

Re:So who is going to pay for the costs? (1)

Anonymous Coward | more than 2 years ago | (#38285388)

My dissertation research data set is pushing half a terabyte, plus another half or so of transformed data in intermediate form. Mind you I'm in neuro, and we're a pretty data-driven lab. If you're curious, it's mostly confocal stacks, neuron recordings (each experiment is usually several hours, multiple channels, and sampled at 20kHz or more), integrator output, parameter-space vector data from optimizations, etc. Some of this *can* be regenerated from smaller seed data sets ... if you have access to a decent cluster and a LOT of free time on your hands, so we can't exactly dump it. Keep in mind, I'm ONE guy at a lab which has been doing this sort of work for decades. There's tons of data on other computers, on blu-ray and DVD and CD, DAT tape, various proprietary backup tape formats, and even a lot of old chart recorder scrolls.

A lot of psych labs are doing fMRI studies now and god knows how many gigabytes of data you get out of a day's worth of work with those. Some are working with computational models and by the time you finish exploring the dynamics of your parameter space and getting your fitter to work you can fill up a drive or two. Some work with video of experimental animals (which some bored undergrad sits around watching and counting the number of times the rat went in a circle) ... those can use up a lot of space too.

We do make our data available to anyone who asks, and increasingly try to do so online, but storage is a nontrivial problem. So is indexing. Your assumptions that you could "handle all the data associated with all the publications in psychology in a decade" on 10TB just show how woefully ignorant you are of how science these days works.

There's no such thing as mental illness. (-1, Offtopic)

Wonko the Sane (25252) | more than 2 years ago | (#38282064)

At least, not as it is currently understood [youtube.com] .

Re:There's no such thing as mental illness. (0)

Anonymous Coward | more than 2 years ago | (#38282966)

Wow, very convincing, though I wish it was shorter. I've thought for a while that most of the shrinks were in it to make $, not cure people. CBT may be legitimate, but most 'therapy' is BS.

Re:There's no such thing as mental illness. (1)

Arker (91948) | more than 2 years ago | (#38283160)

Sounds like this guy has been reading a little classic [yorku.ca] from the good Dr. Szasz.

Science is like any other job/craft in that... (5, Insightful)

Urthas (2349644) | more than 2 years ago | (#38282096)

...most people who do it are downright bad at it. That they might take more time and care to be good at it without the perpetual axe of publish-publish-publish and grant funding hanging over their heads is another issue altogether.

An Anecdote to Back This Up (3)

eldavojohn (898314) | more than 2 years ago | (#38282528)

...most people who do it are downright bad at it. That they might take more time and care to be good at it without the perpetual axe of publish-publish-publish and grant funding hanging over their heads is another issue altogether.

I agree and I can think of something to illustrate your point.

I was listening to a This American Life episode a few weeks [thisamericanlife.org] back and there was a story done on two people -- one a music professor and the other a respected oncologist -- who were investigating a long-defunct theory that certain electromagnetic wavelengths can kill cancer cells, and only cancer cells, leaving healthy cells completely fine. When left to run the test on his own, the music professor failed to maintain the control correctly, among other mistakes. But after being corrected by the respected researcher, they started getting positive sets of preliminary results. The respected researcher asked that the music professor not share this with anyone and not attach his name to it just yet.

Well, the music professor did not follow this advice because he was so excited about the preliminary results and had, I guess, sort of felt like the respected researcher had shortchanged him and suppressed him. What the music professor wanted to do was blow the lid off this thing with possibly flawed data, and he sent it to other oncologists with the original researcher's name attached to it -- possibly misrepresenting flawed data as solid results. Now I can see why a researcher might fly off the handle when data is released extremely early. They were having problems recreating their own findings (with a sham control), which caused the original researcher to want to keep this very much out of the public's eye. You might claim he was just trying to save himself embarrassment, but there's nothing embarrassing about finding out your hypothesis is wrong in science; I just think the best researchers avoid these "failures" and the subsequent investment of resources into them.

I think that scientists figure out how to create the most data and separate the wheat from the shaft over a very lengthy (think decades-long) process, whereas the first sign of a breakthrough might cause more inexperienced researchers to show the world. And the reason, as you mentioned, is probably the immediate funding they can get with it. But I think it badly neuters scientific news, the reward system and even the direction that research takes. And releasing and sharing early and often might just make everyone look bad when the whole background of the data is unknown to someone who receives it.

Re:An Anecdote to Back This Up (0)

Anonymous Coward | more than 2 years ago | (#38282822)

It is wheat from the chaff.. not shaft...

Re:An Anecdote to Back This Up (1)

CrackedButter (646746) | more than 2 years ago | (#38283318)

That was a good episode, I felt sorry for the scientist.

Re:Science is like any other job/craft in that... (0)

Anonymous Coward | more than 2 years ago | (#38282694)

So factors secondary to the actual production of scientific results directly influence how and when those results should be released and/or interpreted.

Sounds less like science and more like every Corporate, Government, higher education, and Public sector office space in existence.

Pure science, for its own sake, is devoid of these factors and exists only in a black box. Heisenberg was on to more than just physics!

Lie or Die (1, Interesting)

Chemisor (97276) | more than 2 years ago | (#38282126)

It is very difficult to make a man understand something when his job depends on not understanding it. If psychology research were made to adhere to any kind of stringent scientific standard, there would be no psychology research.

Re:Lie or Die (4, Funny)

ColdWetDog (752185) | more than 2 years ago | (#38282308)

It is very difficult to make a man understand something when his job depends on not understanding it. If psychology research were made to adhere to any kind of stringent scientific standard, there would be no psychology research.

Sounds like you have some issues with authority. Would you like to discuss it?

Re:Lie or Die (1)

rgbatduke (1231380) | more than 2 years ago | (#38282916)

Only if he can discuss it in properly controlled, double blind circumstances. For example, he can wait in a room until a man in a white lab coat enters to discuss it with him. Outside, the researcher can flip a coin to determine whether the individual in the lab coat is an actual psychologist or is a plumber or taxi cab driver. Afterwards he can be ordered to perform a really nasty task, such as cleaning up the urinals in a public restroom with a toothbrush, to determine whether or not he still has issues with authority.

rgb

Re:Lie or Die (1)

wintercolby (1117427) | more than 2 years ago | (#38283300)

Afterwards he can be ordered to perform a really nasty task, such as cleaning up the urinals in a public restroom with a toothbrush...

It's not actually really nasty until he has to brush his teeth with said toothbrush afterwards.

Re:Lie or Die (1)

rgbatduke (1231380) | more than 2 years ago | (#38284634)

Good, good, science in progress! We'll add that to the authoritative command!

Re:Lie or Die (1)

Droog57 (2516452) | more than 2 years ago | (#38282678)

Psychology is NOT science; see what Richard Feynman, a somewhat intelligent guy, had to say on the subject: "I would offer that very good minds can practice psychology, people with deep experience and wisdom and understanding. Psychology obviously has value to many, many people, and also makes deep metaphysical arguments about the world and our understanding of it, yet, its just not a science." Feynman assesses psychology as a cargo cult science: "(It) follows all the apparent precepts and forms of scientific investigation, but they're missing something ..."

Re:Lie or Die (2, Interesting)

Toonol (1057698) | more than 2 years ago | (#38282830)

I wonder if we just haven't quite mastered the techniques necessary to deal scientifically with highly complex systems. Psychology, economics, climatology, etc., all are theoretically understandable, but are so chaotic that our standard scientific methodology can't be applied... you can't, for instance, repeat an experiment. You can't isolate one changing variable.

Re:Lie or Die (0)

Anonymous Coward | more than 2 years ago | (#38285238)

Why on earth does ignorant crap like this get modded up? Are people really *that* ignorant about statistics and science?

Just as in any other field, the fact that you can't control all variables doesn't invalidate the process, it just means you have to be cautious in your experimental design and your analysis. But people repeat psychology experiments all the time. Hell, even you can do it if you like: count how many times your and your friends' dogs salivate when you ring a bell, condition them to associate the bell with food, then repeat the trial, and use an appropriate test to compare paired values. Have you controlled all your variables? No. Have you established that conditioning probably works? Yes. Is it science? Definitely.
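
To make the "appropriate test to compare paired values" concrete, here is a toy sketch with made-up salivation counts:

    # Paired comparison of the same dogs before and after conditioning.
    # The counts below are made up purely for illustration.
    from scipy import stats

    before = [2, 1, 3, 0, 2, 1, 2, 1]   # salivation events per trial, pre-conditioning
    after = [5, 4, 6, 3, 5, 4, 6, 4]    # same dogs, post-conditioning

    t_stat, p_value = stats.ttest_rel(after, before)
    print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
    # A small p-value is evidence that conditioning changed the response, even
    # though plenty of other variables were left uncontrolled.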

Re:Lie or Die (0)

Anonymous Coward | more than 2 years ago | (#38282878)

You're confusing psychology with counseling and the pseudoscientific models thereof.

Attempting to plumb the depths of a person's psyche to fix something is counseling. That's not science. Neither is attributing those depths to his relationship with his mother.

Demonstrating (to use a tangible example of a colleague) that one form of pain has quantitative, statistically significant, and reproducible impact on the threshold of perception of another form of pain is psychology. That's science.

Re:Lie or Die (4, Insightful)

Anonymous Coward | more than 2 years ago | (#38282938)

Sorry, but you're an intellectual bigot who resorts to citing well-known celebrities rather than actually researching what the content of a field actually is and making a principled argument. Unfortunately, your bigotry is only ameliorated by its ubiquity in communities such as Slashdot.

A number of points need to be made:

First, most people have a stereotyped idea of what psychology is, because they don't actually know what it is. It's the scientific study of human behavior and experience. If you think it's couches and Freud, you're uninformed. My guess is that Feynman took psychology courses and had his primary exposure to the field during the mid-20th century, when psychoanalysis was dominant in *one branch of psychology*, and isn't even dominant in that area anymore. Psychologists study molecular neurobiology, multivariate statistics, neurophysiology, immunology, and any other number of topics. Be prepared to argue that those fields aren't science (or math) if you're prepared to argue that psychology isn't a science.

Second, it's worth noting that this fraud case (and the way the story is framed) focuses on psychology, but similar problems happen in other fields. E.g.:

http://en.wikipedia.org/wiki/Controversy_over_the_discovery_of_Haumea
http://abcnews.go.com/Health/Wellness/chronic-fatigue-researcher-jailed-controversy/story?id=15076224

Finally, what would you propose to do instead? Study human behavior and experience nonscientifically? That's what you seem to be suggesting.

Re:Lie or Die (1)

Droog57 (2516452) | more than 2 years ago | (#38283306)

Awright!! A good, well-reasoned argument from someone with a vocabulary. However, I would suggest that the definition of a Science does not include Psychology, in which experimentation does not produce results that can be DISPROVED. I would also suggest that many branches of Medicine fall into the same category. That is why an MD has a "practice". Possibly drug testing may fall into the Science category, but not the Practice of Psychology. It's an ART, not a SCIENCE. Call me a stickler for details, but people who have a vested interest in promoting their chosen profession as a "Science" have included many crackpots over the years, and the Mental Health field is not exactly a beacon of integrity.

Re:Lie or Die (1)

Eil (82413) | more than 2 years ago | (#38283560)

Head back to Wikipedia for a bit... Feynman was not talking about all of psychology, but mostly parapsychology. Reading minds, bending keys, that kind of thing. He was also speaking at a time when non-religious (or loosely religious) mysticism was fairly common and even mainstream compared to today. Psychology is a much different field nowadays than it was almost 40 years ago.

Re:Lie or Die (1)

DriedClexler (814907) | more than 2 years ago | (#38285272)

It is very difficult to make a man understand something when his job depends on not understanding it.

Could you post the psychological research backing up this claim?

Fantastic (-1)

Anonymous Coward | more than 2 years ago | (#38282230)

Now do it with datasets from climatology papers.

In Wicherts' study, he requested raw data from the authors of some 49 papers. He found that the authors' reluctance to share data was associated with 'more errors in the reporting of statistical results

I wonder if the same thing applies to climatology papers?

(Ok Slashdorks, you may now recommence jacking off to your retarded global warming shit.)

Re:Fantastic (0)

Anonymous Coward | more than 2 years ago | (#38282290)

As much as I am skeptical of the feedback methods for their models, they seem pretty open with the data now.

Re:Fantastic (0)

Anonymous Coward | more than 2 years ago | (#38282860)

Dude, no. Ask for raw data, then ask again if it's really, really, raw.

You get what your reward ... (0)

Anonymous Coward | more than 2 years ago | (#38282870)

As much as I am skeptical of the feedback methods for their models, they seem pretty open with the data now.

I thought one of the issues during the tempest was that some original sensor data was not maintained and had been lost, and only interpreted or adjusted data was currently available?

A better way (3, Insightful)

Hentes (2461350) | more than 2 years ago | (#38282242)

Don't believe anything that hasn't been verified by an independent group of researchers.

Re:A better way (1)

ColdWetDog (752185) | more than 2 years ago | (#38282328)

Just don't believe anything.

very expensive in medicine (1)

peter303 (12292) | more than 2 years ago | (#38282352)

The so-called "4th-stage clinical trial" is to study patients after the drug is released to the public. There may be thousands of times more patients than in the first three stages. But it can cost eight figures to finish stage 3.

Sometimes its not an unwillingness... (3, Interesting)

DBCubix (1027232) | more than 2 years ago | (#38282254)

I do research in textual web mining and from time to time I have other researchers ask me for my collections, which I spider myself from copyrighted web sources. While my work is purely academic, I am covered by fair use. But since US intellectual property laws are obtuse and overbearing (imho), I cannot take the risk of sharing my collections with others for fear of running afoul of copyright law (since I can't control what is done with the collection once it is out of my hands, and how do I know they would use it in a manner consistent with fair use). So it may be less an unwillingness rooted in statistical fudging and more an unwillingness to become a target of copyright lawyers.

Re:Sometimes its not an unwillingness... (0)

Anonymous Coward | more than 2 years ago | (#38282330)

From the linked article:
(Note that sometimes lack of data sharing is due to legitimate considerations, such as being part of an ongoing study, or third-party proprietary rights. However, those were not considerations in 49 papers analyzed here.)

Re:Sometimes its not an unwillingness... (2)

ColdWetDog (752185) | more than 2 years ago | (#38282366)

I do research in textual web mining and from time to time I have other researchers ask me for my collections, which I spider myself from copyrighted web sources. While my work is purely academic, I am covered by fair use. But since US intellectual property laws are obtuse and overbearing (imho), I cannot take the risk of sharing my collections with others for fear of running afoul of copyright law (since I can't control what is done with the collection once it is out of my hands, and how do I know they would use it in a manner consistent with fair use). So it may be less an unwillingness rooted in statistical fudging and more an unwillingness to become a target of copyright lawyers.

Why would that be an issue? The onus would be on the people you share the data with to keep it in the fair use domain. An analogy would be a professor quoting some copyrighted text in a syllabus and then saying she couldn't give a copy of the syllabus to another professor (or student) because she can't control what they do with it.

Regrettably, consult an attorney (1)

perpenso (1613749) | more than 2 years ago | (#38282976)

I do research in textual web mining and from time to time I have other researchers ask me for my collections, which I spider myself from copyrighted web sources. While my work is purely academic, I am covered by fair use. But since US intellectual property laws are obtuse and overbearing (imho), I cannot take the risk of sharing my collections with others for fear of running afoul of copyright law (since I can't control what is done with the collection once it is out of my hands, and how do I know they would use it in a manner consistent with fair use). So it may be less an unwillingness rooted in statistical fudging and more an unwillingness to become a target of copyright lawyers.

Why would that be an issue? The onus would be on the people you share the data with to keep it in the fair use domain. An analogy would be a professor quoting some copyrighted text in a syllabus and then saying she couldn't give a copy of the syllabus to another professor (or student) because she can't control what they do with it.

There is a difference between copying a brief excerpt in a fair use context and copying the complete copyrighted work. The key point is that the fellow researchers want the complete data set, complete copies of copyrighted works. The original researcher is correct to fear legal consequences and regrettably should consult an attorney before sharing such a data set. Alternatively, the original researcher could have logged the URLs where the original data was found and provided these URLs to fellow researchers, who could then harvest their own copies. Admittedly, some content may have been taken down or changed.
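
As a small sketch of that alternative, the spider could keep a manifest of source URLs and retrieval dates to share instead of the documents themselves; the file name and fields below are illustrative, not from any particular project:

    # Append each fetched URL and a retrieval timestamp to a shareable manifest.
    import csv
    import datetime

    def log_fetched(manifest_path, url):
        """Record a fetched URL so others can re-harvest the source themselves."""
        stamp = datetime.datetime.now(datetime.timezone.utc).isoformat()
        with open(manifest_path, "a", newline="") as f:
            csv.writer(f).writerow([url, stamp])

    # Usage while spidering (hypothetical URL):
    # log_fetched("corpus_manifest.csv", "http://example.com/article/123")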

How do you prevent scooping? (4, Insightful)

svendsen (1029716) | more than 2 years ago | (#38282280)

One reason scientists don't share is that if the data gets out early and gets around (damn slutty data), other scientists might steal/copy/scoop/whatever the data. Unless there is a great way to prevent this, the suggestion proposed here will never go anywhere.

Re:How do you prevent scooping? (1)

Anonymous Coward | more than 2 years ago | (#38282358)

If you publish the data with the paper then that isn't a problem. If you want to publish the paper before you've finished analysing the data, that looks like a bigger cause for concern than someone else stealing the data.

Re:How do you prevent scooping? (0)

Anonymous Coward | more than 2 years ago | (#38282472)

If you publish the data with the paper then that isn't a problem. If you want to publish the paper before you've finished analysing the data, that looks like a bigger cause for concern than someone else stealing the data.

It seems you never considered that someone might want to publish more than one paper about a single data set.

Re:How do you prevent scooping? (3, Insightful)

sustik (90111) | more than 2 years ago | (#38282808)

And I believe that they should not have the "right" to publish without others trying as well. Yes, having a topic and milking it for the rest of your life sounds like a wet dream, but it is not in the interest of society, so why would that approach be encouraged/protected?

So publish your paper and disclose the data. Others after you will reference your work; in fact, even those who *just* use the data and otherwise have not much in common with your ideas will still have to cite your paper. Sounds great to me. Also remove the quantity thinking in publishing. One paper in 5 years that will be referenced for 50 years to come is way better than 10 papers in 5 years that are reshufflings of the same material and instantly forgotten.

I would replace the publish or perish with: be cited or perish.

In fact, too many publications can be taken as warning signs that:
1. There is little new material, but a lot of reuse of text.
2. The paper is not carefully written and so it is not understood by the field and so the same gets republished over and over.
3. Corners were cut regarding the experiments or methods, or reviewing related work etc. to save time.

Of course there are exceptions, and just because someone publishes a lot does not mean they are guilty of the above.

Re:How do you prevent scooping? (0)

Anonymous Coward | more than 2 years ago | (#38282362)

Maybe set your ego aside and let science progress at the speed that it wants to.

Re:How do you prevent scooping? (0)

Anonymous Coward | more than 2 years ago | (#38282426)

Problem is, there is currently no reward for "sharing data" (especially non-raw data); there are rewards for "publishing", "being cited", "getting grants", etc. So, if we let our data out and are prevented from publishing by people who are quicker to publish but put no effort into collecting the data, we will get fired / not get promoted / not get tenure / etc.

IMO, there should be separate rewards for publishing good hypotheses, good data, and good analyses, and let people do and be rewarded for what they do best.

Re:How do you prevent scooping? (0)

Anonymous Coward | more than 2 years ago | (#38282456)

That's lovely. Then you perish in the publish or perish world, as everyone else just has to use that data you spent months obtaining, and gets all the benefit from your hard work. Whilst you were getting the data they were publishing off some other poor sod, and at the end of the day you've got 1 pub to their 5 and no hope of a job.

If you want good science, stop demanding quantity over quality. Sure, quality is harder to measure, but until you get rid of the publish like crazy mentality you just won't see good work.

Re:How do you prevent scooping? (2)

sustik (90111) | more than 2 years ago | (#38282868)

I understand what you are saying. But consider: where does the publish or perish demand come from?

It seems it is perpetuated by academia itself to a great extent. A lot of them got pretty good at the game, and it is easier than writing really good papers. People in academia should promote and use "be cited or perish" instead (if a rigid measure is needed).

Re:How do you prevent scooping? (0)

Anonymous Coward | more than 2 years ago | (#38283468)

That just leads to citation farming, which is already ridiculous (try to get a paper through peer review without having 10 extra citations being 'helpfully' given by a reviewer). Or publications only in popular areas where you're guaranteed a lot of cross-citation, rather than neglected areas which might be fruitful.

Publish or perish comes from the funding agencies, not academia. It comes because they have no solid metric for science so they grab the easiest numbers they can - publication counts. I don't know anyone in research who thinks it's a good thing, it's certainly NOT coming from within.

Re:How do you prevent scooping? (2)

Toonol (1057698) | more than 2 years ago | (#38282896)

Whilst you were getting the data they were publishing off some other poor sod, and at the end of the day you've got 1 pub to their 5 and no hope of a job.

The problem there isn't with the science, or with sharing data; both are good. The problem is with the inane and counterproductive prestige game that science has become. Counting publications is a moronic method of measuring ability. Generating and making available a good data set is oftentimes MORE important than publishing a particular paper.

Re:How do you prevent scooping? (1)

Rostin (691447) | more than 2 years ago | (#38282914)

The speed that it wants to? You realize that science doesn't do itself, right? Ego has always been one of the chief reasons that human beings, who are to a man petty and selfish, do science. The Royal Society invented many practices like peer review that we now consider to be necessary components of the scientific process. One of the reasons that it established Philosophical Transactions, which was the very first modern scientific journal, was to resolve disputes about who did what first. If you were to somehow remove ego from the equation, I think we'd have very few working scientists. There just aren't that many people sufficiently motivated by pure altruism to do it for very long without recognition.

Re:How do you prevent scooping? (2, Insightful)

Anonymous Coward | more than 2 years ago | (#38282418)

Other scientists typically request data after publication of the findings. (How else would someone know what to ask for?) This suggestion only stresses that error-checking be encouraged after the current process of publication.
Note that this kind of error checking (after attempts to reproduce the findings failed) is what led Gordon Gallup to identify Marc Hauser's recently acknowledged academic fraud.
https://en.wikipedia.org/wiki/Marc_Hauser#Previous_controversy_over_an_experiment [wikipedia.org]

Re:How do you prevent scooping? (1)

rgbatduke (1231380) | more than 2 years ago | (#38282952)

You mean, a way like "requiring the authors to put their data and numerical methods up on a website no later than the date of publication of the paper"?

Unless their competitors are good at time travel, that seems as though it might be enough...

rgb

Re:How do you prevent scooping? (0)

Anonymous Coward | more than 2 years ago | (#38285462)

In radio astronomy, the way we deal with it is this: when you use a telescope at a public observatory, the data are yours to analyse and publish for 12-18 months. After that, the data are released to the public by the observatory, so anyone else can check your work.

What is this "share early, share often"? (1, Insightful)

Scareduck (177470) | more than 2 years ago | (#38282310)

The IPCC doesn't know about this. Or does this only apply to the "soft sciences"?

Re:What is this "share early, share often"? (3, Informative)

hexghost (444585) | more than 2 years ago | (#38282442)

What? The IPCC was just collecting already-published data; there were no 'new' studies done.

Careful - your bias is shining through.

Careful, YOUR bias is shining through (1)

Quila (201335) | more than 2 years ago | (#38282730)

Regardless of where the data came from and what agreements covered its release, the stated purpose of the people in the emails was to keep the data out of the hands of their opponents despite open freedom-of-information requests, and Phil Jones himself said he would be "hiding behind" the agreements in order to keep the data from being released. There was quite literally a conspiracy to evade not only normal scientific sharing of information, but legal freedom-of-information requests as well.

As scientists, they should be discredited and shamed.

As public employees, they should be fired or put in jail.

Re:What is this "share early, share often"? (1)

rgbatduke (1231380) | more than 2 years ago | (#38283038)

No, the IPCC doesn't collect data; it collects published results from data that remains occult even now. Indeed, Phil Jones appears to have lost the raw data from which e.g. HadCRUT3 was built. Not only is it impossible for anyone else to check his methods or reproduce his results from raw data, he can't reproduce his own results from raw data.

Not that it matters to the IPCC or anybody else who uses that data.

Look up the history of e.g. Steve Mcintyre's efforts to get actual data and methods out of any of the hockey team. Look up the comments in the Climategate 2 emails where they conspire to deliberately hide it, even from FOIA requests. That's the true basis of the cooked (up) data used by the IPCC, and hence by all the governments of the world, to make decisions involving tens to hundreds of billions of dollars now and trillions of dollars over the next few decades.

Transparency of data and methods should be legally mandated for any publicly funded published result used in any sort of political process involving the expenditure of vast amounts of taxpayer money.

rgb

Re:What is this "share early, share often"? (1)

Arker (91948) | more than 2 years ago | (#38283056)

What? The IPCC was just collecting already-published data; there were no 'new' studies done.

That's a very weak dodge. Metaresearch doesn't get some magical exemption from scientific procedure. Whether you are going out and collecting raw data to start with, or importing the results of a dozen earlier studies and going from there, all information necessary to replicate your results must be made openly available, or else you simply are not doing science.

mixed policy in space sciences (3, Informative)

peter303 (12292) | more than 2 years ago | (#38282404)

Some probes, like the Mars rovers, Cassini, and SOHO, post their data on the web within days. Others, like Kepler and ESA-Express, have posted very little of their data. The tradition is for Principal Investigators to embargo the data for one year.

Very Meta (1)

Kamiza Ikioi (893310) | more than 2 years ago | (#38282476)

Psychologist's statistical study suggesting that psychologists have possible psychological issues with sharing their psychological studies... perhaps this warrants a further psychological study of said psychologists?

Incentives and disincentives (5, Insightful)

br00tus (528477) | more than 2 years ago | (#38282482)

Einstein was unable to find a teaching post, and was working in a patent office when he published his annus mirabilis papers. Things have changed over the years, though. John Dewey discovered a century ago how children best learn - let the child direct his own learning, and have an adult facilitate it. This, of course, is not how children are taught. Things nowadays are very test-heavy, and becoming even more so, not as a means of helping students see what their deficiencies are, but as a punishment system - and the teachers and administrators are under the same punishment system. The carrot of reward is vague, ill-defined, and far off. It is a system designed to squelch the curiosity of the handful of students who are curious and want to learn. Businesses want in on the education gravy train, and all this charter school stuff is being embraced by both parties, which isn't surprising if you look at the funding behind it.

At the university, the financial incentives are all aligned so that publishing is a necessity. If you do not publish, you do not get tenure, and then all those years of work were for naught, as the academic career is over. And what gets published? An average series of experiments done by the scientific method usually leads either to inconclusive data and results or to a dead end. And what journal wants to publish those results after months of work? One of the most popular PhD Comics strips is this one [phdcomics.com]. It seems fairly obvious to me - the more financial incentives are tied to getting published, the more bogus studies are going to be published. As for honesty, integrity, and the like, these things will gradually give way for most people when they come into conflict with keeping a roof over one's head and food on the table.

Implementation is the problem (2)

macwhizkid (864124) | more than 2 years ago | (#38282534)

Ultimately, everyone agrees that open sharing of taxpayer-funded research data would be A Good Thing(TM). The problem is: how do you persuade people to actually do it? It's much like how advanced safety features on cars, free college tuition, and taxes on big banks sound like great ideas until you look at what they will actually cost to implement. Not just "cost" in terms of money for infrastructure development, data storage, and support, but in terms of persuading an entire culture to change its workflow.

In our lab, we already spend an extraordinary amount of time on administrative tasks only indirectly related to our research. Adding a mandatory data-sharing task, and fielding questions from random people who want to use the data, would be a serious additional chore. Then there's the embarrassment aspect... we actually had a project a couple of months ago where another group was doing an experiment we wanted to do, and they had software already written. So we thought, "great, we'll just ask them for the code". We fired off an email... and after a couple of weeks we finally got a reply to the effect of "this is actually my first program, and I don't feel comfortable sharing it." So we had to spend 2-3 months writing our own version to do exactly the same thing.

Re:Implementation is the problem (2)

Trepidity (597) | more than 2 years ago | (#38282814)

In the latter case, I think sometimes this is actually for the best. Even though it results in redundant coding, that's one form of replication. If everyone reused the same code written by one grad student long ago and never rewrote it (the grad student's first program, no less!), there would be a lot of reliance on that program doing what it says it does, and doing it correctly in all cases. Sure, you could run test cases, read through the code carefully, even try to formally verify it, but in my experience nobody does any of that: if another lab sends you a hairy pile of Fortran or Perl scripts or whatever, the common case is that you try to figure it out only to the extent of figuring out how to run it.
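As a purely hypothetical illustration of what "run test cases" could mean in practice (the function below is a placeholder, not anyone's actual analysis code), the bare minimum would be to feed an inherited routine a toy input whose answer is known by hand before trusting it on real data:

# Hypothetical sanity check for an inherited analysis routine.
# `mean_response` is only a stand-in; in reality it would be
# imported from the other lab's pile of scripts.

def mean_response(values):
    # placeholder standing in for the borrowed code under test
    return sum(values) / len(values)

def test_mean_response_on_known_input():
    # three observations whose mean is known by hand to be 2.0
    assert abs(mean_response([1.0, 2.0, 3.0]) - 2.0) < 1e-9

if __name__ == "__main__":
    test_mean_response_on_known_input()
    print("sanity check passed")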

Re:Implementation is the problem (1)

rgbatduke (1231380) | more than 2 years ago | (#38283166)

All true, so perhaps we should add a line or two limiting the scope of the rule, then make it law. If your research data is being used as the basis for major political policy decisions and the spending of unlimited amounts of taxpayer money (e.g. climate research), the public's need outweighs your own inconvenience or embarrassment. Cost we can ignore. Note well that in most enterprises putting data/methods up on a website should be almost free -- who doesn't have a website? Who cannot use archive/compression tools like gzip, tar, zip? Preserving a snapshot of data and methods is already part of responsible research; a law here would just be a matter of mandating that the publication archival snapshot be made public. If you don't have one, well, that suggests you aren't very professional, maybe not professional enough to deserve funding, since without it you couldn't necessarily reproduce your own published results, could you?

The law should apply to medical research and certain other scientific venues where huge amounts of money, public trust, or political decision-making are on the line. Not so much a legal mandate for people studying poison dart frogs in the rain forest canopy, although even for them it should be the expected standard of practice if the work is publicly funded. Anybody doing science should be archiving and/or maintaining a revision history of their work for themselves, and making an archival snapshot available that matches any actual accepted publication is cheap, easy, and the right thing to do.
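To give a sense of how cheap that archival snapshot is, here is a minimal sketch in Python (the directory names and paper identifier are made up; a plain tar command or a zip file would do just as well) that bundles a paper's data and analysis code into one compressed archive and records its SHA-256 checksum at publication time:

# Minimal sketch: bundle the data and code behind a paper into one
# compressed archive and record its checksum, so the publicly posted
# snapshot can be verified against the authors' copy. Directory
# names and the paper ID are illustrative only.
import hashlib
import tarfile
from pathlib import Path

def make_snapshot(paper_id, src_dirs=("data", "code")):
    archive = Path(f"{paper_id}_snapshot.tar.gz")
    with tarfile.open(archive, "w:gz") as tar:
        for d in src_dirs:
            tar.add(d)  # raw data plus the scripts that turn it into figures
    digest = hashlib.sha256(archive.read_bytes()).hexdigest()
    Path(f"{paper_id}_snapshot.sha256").write_text(f"{digest}  {archive}\n")
    return archive

if __name__ == "__main__":
    make_snapshot("example_paper_2011")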

rgb

No way... (0)

Anonymous Coward | more than 2 years ago | (#38282554)

In my field this won't work - you share either at conferences or via a paper. Most researchers also only share at conferences when they have the corresponding manuscript more or less accepted. It's just too risky and expensive to get scooped.

Re:No way... (1)

thedonger (1317951) | more than 2 years ago | (#38283448)

Sounds like paleo. My wife's lab is always covering up bones and such when people visit, because apparently in that field you can publish based solely on what you remember seeing at someone else's university. Maybe the more imaginary the discipline, the more likely shenanigans come into play? So: psychology, paleontology, string theory...

NSF requires sharing already (4, Informative)

Steavis (887731) | more than 2 years ago | (#38282596)

The NSF is now requiring this [nsf.gov] as part of grant applications. You have to have a data management plan that includes the public deposit of both the data and the results from grant-funded work. Other funding orgs are following suit.

This is a fairly major project at the university I work for, both from the in-process data management perspective (keeping field researchers from storing their only copies on thumbdrives and laptops) and from the long-term repository perspective for holding the data when the grant is completed (that's what I'm involved with).

Storage is cheap. Convincing university administrators to pay for keeping it accessible is another problem, but the NSF position is helping.

Re:NSF requires sharing already (1)

rgbatduke (1231380) | more than 2 years ago | (#38283216)

And this is the way everybody should be doing it, including work funded by NIH, NOAA, DOE, EPA, and DOD (well, not all of the DOD work, but some of it). It should be legally mandated for all granting agencies, especially agencies that fund research critical to public policy decisions, decisions on the spending of large amounts of tax money, human life and well-being, or technological advances that belong in the public domain because (after all) the public paid for them.

rgb

What's the Incentive? (1)

dcollins (135727) | more than 2 years ago | (#38282724)

Surely people aren't just going to turn over the means to get themselves charged with fraud out of the goodness of their hearts. Somehow this has to be made mandatory by the institutions or the publications in which they hope to present their work (as suggested in the second linked article, and as I understand some of the top medical journals do nowadays).

Re:What's the Incentive? (1)

Toonol (1057698) | more than 2 years ago | (#38282934)

Surely people aren't just going to turn over the means to get themselves charged with fraud out of the goodness of their hearts.

Well, maybe the scientists who aren't committing fraud will be happy to share their data... then the small percentage of scientists who refuse to will be shamed and/or ignored.

Hiding correlates positively with... (1)

retroworks (652802) | more than 2 years ago | (#38282818)

Having something to hide. In some cases it is error or bias. What other attributes count as "something to hide"? And why didn't the researchers disclose them? What didn't they know, and when did they not know it?

incentives (1)

superwiz (655733) | more than 2 years ago | (#38283276)

This does show that the pressure to overstate the certainty of results is more common in academia than is otherwise claimed. It is not limited to psychology. Human beings respond to incentives, and the lack of a requirement to publish data acts as an incentive to overstate certainty.

It's psychology. (0)

nedlohs (1335013) | more than 2 years ago | (#38283356)

In other words, made-up bullshit with no cohesive theory tying it all together. So made-up data is the least of their concerns...

so those are the stats on statistical errors? (0)

Anonymous Coward | more than 2 years ago | (#38284672)

well that is at least 50% made of iron...

Everybody knows that psychology is about as real a science as astrology anyway - one only needs to look at the farce that is the DSM, the psychology bible, to appreciate that.
