Sunday, April 30, 2017

If Science Is Your Religion, You Are On Shaky Ground

       Neo-neoconservative just published an article on how "when a scientific theory becomes a religion…then those with an opposing view become apostates." Her focus is on anthropogenic global warming (AGW), about which she writes:
Science, of course, is not a religion, and the history of science is littered with theories that have been considered proven and then are disproven. So scientists must remain skeptical and open to any evidence that would challenge their theories and their findings. That’s difficult enough to do when the topic is an abstract one with few practical applications. But when a topic is highly politicized (as with AGW), the difficulty increases exponentially and the public also becomes very much involved.
       If you have followed my blog for any real period of time, you're aware of the many shenanigans perpetrated by climate scientists. Watts Up With That this past week pointed to an article reporting that "Former Obama Official: Bureaucrats Manipulate Climate Stats To Influence Policy." But the problem goes beyond climate science. Basic scientific theories and research--some accepted as fact for decades--are overturned on a regular basis, especially in the psychology and medical fields.

       For instance, a recent study of a large group of participants found that those who limited their salt intake per government guidelines had higher blood pressure than those consuming higher levels of salt. In other words, "[c]onsuming fewer than 2,500 milligrams of sodium daily is actually associated with higher blood pressure, according to the Framingham Offspring Study report, given today." The American Heart Association (which was responsible for fake research on fat intake), on the other hand, has recommended that salt intake be limited to less than 2,300 milligrams per day. This isn't to say that going to the opposite extreme is recommended, but that a salt-restrictive diet is also dangerous. As another article notes:
      Although the findings appear to kick against the status quo, they are in line with other recent studies asking similar questions. Research has shown that there is a "J-shaped relationship" between cardiovascular risk and sodium. This means that low-sodium diets and very high-sodium diets both carry a higher risk of heart disease. 
      Many people in the United States sit in the middle of this curve, where the cardiovascular risk is at its lowest. 
       "We saw no evidence that a diet lower in sodium had any long-term beneficial effects on blood pressure. Our findings add to growing evidence that current recommendations for sodium intake may be misguided."
       Such a dramatic reversal should not be surprising. "Nearly all of our medical research is wrong," according to Quartz. The article explains:
       When it came to light that the biotechnology firm Amgen tried to reproduce 53 “landmark” cancer studies and managed to confirm only six, scientists were “shocked.” It was terrible news, but if we’re honest with ourselves, not entirely unexpected. The pernicious problem of irreproducible data has been discussed among scientists for decades. Bad science wastes a colossal amount of money, not only on the irreproducible studies themselves, but on misguided drug development and follow-up trials based on false information. And while unsound preclinical studies may not directly harm patients, there is an enormous opportunity cost when drug makers spend their time on wild goose chases. Discussions about irreproducibility usually end with shrugs, however—what can we do to combat such a deep-seated, systemic problem?
            Lack of reproducibility of biomedical research is not the result of an unusual level of mendacity among scientists. There are a few bad apples, but for the most part, scientists are idealistic and fervent about the pursuit of truth. The fault lies mainly with perverse incentives and lack of good management. Statisticians Stanley Young and Alan Karr aptly compare biomedical research to manufacturing before the advent of process control. Academic medical research functions as a gargantuan cottage industry, where the government gives money to individual investigators and programs—$30 billion annually in the US alone—and then nobody checks in on the manufacturing process until the final product is delivered. The final product isn’t a widget that can be inspected, but rather a claim by investigators that they ran experiments or combed through data and made whatever observations are described in their paper. The quality inspectors, whose job it is to decide whether the claims are interesting and believable, are peers of the investigators, which means that they can be friends, strangers, competitors, or enemies.
              Lack of process control leads to shoddy science in a number of ways. Many new investigators receive no standardized training. People who work in life sciences are generally not crackerjack mathematicians, and there’s no requirement to involve someone with a deep understanding of statistics. Principal investigators rarely supervise the experiments that their students and post-docs conduct alone in the lab in the dead of night, and so they have to rely on the integrity of people who are paid slave wages and whose only hope of future success is to produce the answers the boss hopes are true. The peer review process is corrupted by cronyism and petty squabbles. These are some of the challenges inherent in a loosely organized and largely unregulated industry, but these are not the biggest reasons why so much science is unreproducible. That has more to do with dumb luck.
      Some publishers are taking action. For instance, Retraction Watch (which monitors retractions of scientific papers) reported recently that Springer, one of the major publishers of scientific journals, is retracting 107 papers from one journal after discovering they had been accepted with fake peer reviews. By fake, they mean fraudulent: "To submit a fake review, someone (often the author of a paper) either makes up an outside expert to review the paper, or suggests a real researcher — and in both cases, provides a fake email address that comes back to someone who will invariably give the paper a glowing review."

             Some of the fake research is probably due to the money and prestige involved. For instance, the New York Times reports on one influential researcher who, so far, has escaped consequences for allegedly falsified research. According to the article:
      Dr. Carlo Croce is among the most prolific scientists in an emerging area of cancer research involving what is sometimes called the “dark matter” of the human genome. A department chairman at Ohio State University and a member of the National Academy of Sciences, Dr. Croce has parlayed his decades-long pursuit of cancer remedies into a research empire: He has received more than $86 million in federal grants as a principal investigator and, by his own count, more than 60 awards.
      The article goes on:
             Over the last several years, Dr. Croce has been fending off a tide of allegations of data falsification and other scientific misconduct, according to federal and state records, whistle-blower complaints and correspondence with scientific journals obtained by The New York Times.

             In 2013, an anonymous critic contacted Ohio State and the federal authorities with allegations of falsified data in more than 30 of Dr. Croce’s papers. Since 2014, another critic, David A. Sanders, a virologist who teaches at Purdue University, has made claims of falsified data and plagiarism directly to scientific journals where more than 20 of Dr. Croce’s papers have been published.
             “It’s a reckless disregard for the truth,” Dr. Sanders said in an interview.
             An increasing number of journals are posting notices of possible problems with Dr. Croce's work. But he is not the only one. "Findings of fraud in biomedical research have surged in recent years, whether from an actual increase in misconduct or from heightened caution inspired in part by an internet-age phenomenon: 'digital vigilantes' who post critiques of scientific papers on anonymous websites." Harvard and Duke Universities have been accused of fraudulently obtaining federal grants for research.

             Nor is Dr. Croce an isolated case. Ars Technica, in " 'Mindless Eating,' or how to send an entire life of research into question," reports:
      Tim van der Zee, one of the scientists participating in the ongoing examination into Wansink’s past, keeps a running account of what’s turned up so far. “To the best of my knowledge,” van der Zee writes in a blog post most recently updated on April 6, “there are currently 42 publications from Wansink which are alleged to contain minor to very serious issues, which have been cited over 3,700 times, are published in over 25 different journals, and in eight books, spanning over 20 years of research.”
      The article goes on to note that "[y]ou’ve probably come across Wansink’s ideas at some point. He researches how subtle changes in the environment can affect people’s eating behavior, and his findings have made a mark on popular diet wisdom." Some of the ideas from his research are using smaller plates to limit portion size, moving unhealthy snacks to hard-to-reach areas while putting healthy snacks out in the open, and using very-low-calorie snacks to curb hunger.


             A lot of the problem is just sloppy science, such as that reported by Slate in the article entitled "The Impostor Cell Line That Set Back Breast Cancer Research." The article reports:
             It’s an open secret among cancer scientists that a staggering number of cell lines used in studies—one 2007 paper estimated a fifth to more than a third—are later discovered to be contaminated or misidentified strains of the disease. 
             Researchers, in other words, often end up studying the wrong cancer. (HeLa cells, a cervical cancer–derived line of The Immortal Life of Henrietta Lacks fame, are the most common contaminators, in part because their ability to replicate indefinitely makes them fantastic for lab experiments). The mix-ups end up in tens of thousands of studies, costing billions of dollars and years of setbacks on the road to potential treatments. And the scientific community’s pressure to publish and general unwillingness to admit error have made the problem even worse. Biologists rush to research without authenticating their cells; some even dug in their heels after a strain they researched got unmasked as a wayward line. Gradually, a group of alarmed scientists began to coalesce with a mission to expose these shams. As of 2016, the International Cell Line Authentication Committee database had grown to 438 false cell lines, with no end in sight.
      The sad part is that I read about this same problem back in the 1980s, but apparently cancer researchers still have not learned.

      Update: Fixed a couple typos and a dropped link.
