Tag Archives: hype

Overpromising and underdelivering: genome, stem cells, gene therapy and nano food

When people talk about overpromising (aka hype or hyperbole) and science, they're usually referring to overexcited marketing collateral, public relations initiatives, or news media coverage. Scientists themselves don't tend to be identified as a source of hype even when that's clearly the case. That's right, scientists are people too, and sometimes they get carried away by their enthusiasms, as Emily Yoffe notes in her excellent Slate essay, The Medical Revolution: Where are the cures promised by stem cells, gene therapy, and the human genome? From Yoffe's essay,

Dr. J. William Langston has been researching Parkinson’s disease for 25 years. At one time, it seemed likely he’d have to find another disease to study, because a cure for Parkinson’s looked imminent. In the late 1980s, the field of regenerative medicine seemed poised to make it possible for doctors to put healthy tissue in a damaged brain, reversing the destruction caused by the disease.

Langston was one of many optimists. In 1999, the then-head of the National Institute of Neurological Disorders and Stroke, Dr. Gerald Fischbach, testified before the Senate that with “skill and luck,” Parkinson’s could be cured in five to 10 years. Now Langston, who is 67, doesn’t think he’ll see a Parkinson’s cure in his professional lifetime. He no longer uses “the C word” and acknowledges he and others were naive. [emphasis mine] He understands the anger of patients who, he says, “are getting quite bitter” that they remain ill, long past the time when they thought they would have been restored to health.

The disappointments are so acute in part because the promises have been so big. Over the past two decades, we’ve been told that a new age of molecular medicine—using gene therapy, stem cells, and the knowledge gleaned from unlocking the human genome—would bring us medical miracles. [emphasis mine] Just as antibiotics conquered infectious diseases and vaccines eliminated the scourges of polio and smallpox, the ability to manipulate our cells and genes is supposed to vanquish everything from terrible inherited disorders, such as Huntington’s and cystic fibrosis, to widespread conditions like cancer, diabetes, and heart disease.

Yoffe goes on to outline the problems that researchers encounter when trying to ‘fix’ what’s gone wrong.

Parkinson’s disease was long held out as the model for new knowledge and technologies eradicating illnesses. Instead, it has become the model for its unforeseen consequences. [emphasis mine]

Langston, head of the Parkinson’s Institute and Clinical Center, explains that scientists believed the damage to patients took place in a discrete part of the brain, the substantia nigra. “It was a small target. All we’d have to do was replace the missing cells, do it once, and that would cure the disease,” Langston says. “We were wrong about that. This disease hits many other areas of the brain. You can’t just put transplants here and there. The brain is not a pincushion.”

Diseases of all kinds have proven to be infinitely more complex than first realized. Disease is not 'cause and effect' driven so much as it is a process with an infinite number of potential inputs and any number of potential outcomes. Take, for example, gene therapy (note: the human genome project was supposed to yield gene therapies),

In some ways, gene therapy for boys with a deadly immune disorder, X-linked severe combined immune deficiency, also known as “bubble boy” disease, is the miracle made manifest. Inserting good genes into these children has allowed some to live normal lives. Unfortunately, within a few years of treatment, a significant minority have developed leukemia. The gene therapy, it turns out, activated existing cancer-causing genes in these children. This results in what the co-discoverer of the structure of DNA, James Watson, calls “the depressing calculus” of curing an invariably fatal disease—and hoping it doesn’t cause a sometimes-fatal one.

For me, it seems that the human genome project was akin to taking a clock apart. Looking at the constituent parts and replacing broken ones does not guarantee that you will be able to assemble a more efficient working version unless you know how the clock worked in the first place. We still don't understand how the basic parts, the genes, interact with each other, within their environment, or with external inputs.

The state of our ignorance is illustrated by the recent sequencing of the genome of Bishop Desmond Tutu and four Bushmen. Three of the Bushmen had a gene mutation associated with a liver disease that kills people while young. But the Bushmen are all over 80—which means either the variation doesn’t actually cause the disease, or there are other factors protecting the Bushmen.

As for the pressures acting on the scientists themselves,

There are forces, both external and internal, on scientists that almost require them to oversell. Without money, there’s no science. Researchers must constantly convince administrators who control tax dollars, investors, and individual donors that the work they are doing will make a difference. Nancy Wexler says that in order to get funding, “You have to promise cures, that you’ll meet certain milestones within a certain time frame.”

The infomercial-level hype for both gene therapy and stem cells is not just because scientists are trying to convince funders, but because they want to believe. [emphases mine]

As one of Yoffe's interview subjects points out, scientific advances involve a process dogged by failure and setbacks, one requiring an attitude of humility laced with patience, practiced over decades before an 'overnight success' occurs, if it ever does.

I was reminded of Yoffe’s article after reading a nano food article recently written by Kate Kelland for Reuters,

In a taste of things to come, food scientists say they have cooked up a way of using nanotechnology to make low-fat or fat-free foods just as appetizing and satisfying as their full-fat fellows.

The implications could be significant in combating the spread of health problems such as obesity, diabetes and heart disease.

There are two promising areas of research. First, researchers are looking at ways to slow digestion,

One thing they might look into is work by scientists at Britain’s Institute of Food Research (IFR), who said last month they had found an unexpected synergy that helped break down fat and might lead to new ways of slowing digestion, and ultimately to creating foods that made consumers feel fuller.

“Much of the fat in processed foods is eaten in the form of emulsions such as soups, yoghurt, ice cream and mayonnaise,” said the IFR’s Peter Wilde. “We are unpicking the mechanisms of digestion used to break them down so we can design fats in a rational way that are digested more slowly.”

The idea is that if digestion is slower, the final section of the intestine called the ileum will be put on its "ileal brake," sending a signal to the consumer that means they feel full even though they have eaten less fat.

This sounds harmless, and it's even possible it's a good idea, but then replacing diseased tissue with healthy tissue, as researchers tried with Parkinson's disease, seemed like a good idea too. Just how well is the digestive process understood?

As for the second promising area of research,

Experts see promise in another nano technique which involves encapsulating nutrients in bubble-like structures known as vesicles that can be engineered to break down and release their contents at specific stages in the digestive system.

According to Vic Morris, a nano expert at the IFR, this technique in a larger form, micro-encapsulation, was well established in the food industry. The major difference with nano-encapsulation was that the smaller size might be able to take nutrients further or deliver them to more appropriate places. [emphasis mine]

Researchers have been talking about encapsulating medicines and targeting them to more appropriate places and, as far as I'm aware, to no avail. I sense a little overenthusiasm on the experts' part. Kelland does try to counterbalance this by discussing other issues with nanofood, such as the food companies' secretiveness about their research, experts' concerns over nanoparticles, and public concerns over genetically modified food. Still, the allure of 'all you can eat with no consequences' is likely to overshadow any journalist's attempt at balanced reporting, with disappointment to follow when somebody realizes it's all much more complicated than we thought.

Dexter Johnson's Sept. 22, 2010 posting (Protein-based Nanotubes Pass Electrical Signals Between Cells) on his Nanoclast blog offers more proof that we still have a lot to learn about basic biological processes,

A few years back, scientists led by Hans-Hermann Gerdes at the University of Bergen noticed that there were nanoscale tubes connecting cells, sometimes over significant distances. This discovery launched a field known in the biological community as the "nanotube field."

Microbiologists remained somewhat skeptical about what this phenomenon was and weren't entirely pleased with some of the explanations offered because they seemed to fall outside "existing biological concepts."

So let's sum up. The team noticed nanotubes connecting cells over significant distances, which microbiologists had difficulty accepting because "they [seemed] to fall outside existing biological concepts." [emphasis mine] Now the team has published a paper which suggests that electrical signals pass through the nanotubes and that a 'gap junction' enables transmission to nonadjacent cells. (Dexter's description provides more technical detail in an accessible writing style.)

As Dexter notes,

Another key biological question it helps address–or complicate, as the case may be–is the complexity of the human brain. This research makes the brain drastically more complex than originally thought, according to Gerdes. [emphasis mine]

Getting back to where I started, scientists are people too. They have their enthusiasms, as well as pressure to get grants and produce results for governments and other investors, not to mention their own egos. And while I've focused on the biological and medical sciences in this article, I think that all the sciences yield more questions than answers and that everything is far more complicated and interconnected than we have yet realized.

Research and the 2010 Canadian federal budget; nanotechnology, hype, markets, and medicine; Visionaries in Bangalore; materials science and PBS offer a grant opportunity; To Think To Write To Publish for emerging science writers

It’s time for quiet appreciation as Rob Annan (Don’t leave Canada behind blog) points out in his breakdown of the 2010 Canadian federal budget’s allocation for research.  From the posting (Budget 2010 – A Qualified Success),

Last year’s cuts to the research granting councils, though relatively small, were magnified by their inclusion in a so-called “stimulus budget” full of spending increases in other areas.

This year, the opposite is true. Funding increases, though relatively small, are made more significant by the context of spending restraint evidenced elsewhere in the budget.

Rob goes through the budget allocations for each of the research funding agencies and provides a comparison with previous funding amounts. As he points out, it's not time to pop the champagne corks: this is a modest success, albeit at a time when many were expecting deep cuts. One comment from me: this increase is not a good reason to get complacent, run back to the research facilities, and effectively disappear from the public discourse. After all, there's another budget next year.

Pallab Chatterjee of the IEEE (Institute of Electrical and Electronics Engineers) recently made some comments (on EDN [Electronics Design, Strategy, News]) about nanotechnology and commercialization, focusing (somewhat) on nanomedicine. It caught my eye because Andrew Maynard (2020 Science blog) has written a piece on cancer and nanomedicine which poses some questions about nanomedicine hype. First, the comments from Chatterjee,

The Nanosys announcement heralds the arrival of nanotechnology products from other companies that will soon be entering the market and shows that the typical eight- to 10-year gestation period for breakthrough technologies to reach commercialization is now reaching an end. For example, nanomedicine is now emerging as a major topic of investigation. To help solidify the topics in this area and to determine the best direction for commercialization, the ASME (American Society of Mechanical Engineers) held the First Global Congress on NEMB (nanoengineering for medicine and biology), a three-day event that took place last month in Houston.

As nanomedicine products hit the commercial marketplace, you can expect hype. According to Andrew (Nanotechnology and cancer treatment: Do we need a reality check?), government agencies have already been on a ‘hype’ trail of sorts (from 2020 Science),

Cancer treatment has been a poster-child for nanotechnology for almost as long as I’ve been involved with the field. As far back as in 1999, a brochure on nanotechnology published by the US government described future “synthetic anti-body-like nanoscale drugs or devices that might seek out and destroy malignant cells wherever they might be in the body.” Over the intervening decade, nanotechnology has become a cornerstone of the National Cancer Institute’s fight against cancer, and has featured prominently in the US government’s support for nanotechnology research and development.

Andrew goes on to quote various experts in the field discussing what they believe can be accomplished. These comments are hopeful and measured, and they stand in stark contrast to what I imagine will occur once nanomedicine products seriously enter the marketplace. Take, for example, Michael Berger's (Nanowerk) comments about the wildly overhyped nanotechnology market valuations. From Berger's 2007 article (Debunking the trillion dollar nanotechnology market size hype),

There seems to be an arms race going on among nanotechnology investment and consulting firms as to who can come up with the highest figure for the size of the “nanotechnology market”. The current record stands at $2.95 trillion by 2015. The granddaddy of the trillion-dollar forecasts of course is the National Science Foundation’s (NSF) “$1 trillion by 2015”, which inevitably gets quoted in many articles, business plans and funding applications.

The problem with these forecasts is that they are based on a highly inflationary data collection and compilation methodology. The result is that the headline figures – $1 trillion!, $2 trillion!, $3 trillion! – are more reminiscent of supermarket tabloids than serious market research. Some would call it pure hype. This type of market size forecast leads to misguided expectations because few people read the entire report and in the end only the misleading trillion-dollar headline figure gets quoted out of context, even by people who should know better, and finally achieves a life by itself.

The figures and comments Berger cites are still being used, which keeps his commentary relevant. In fact, given the psychology of how such claims become embedded, his comments can be applied to nanomedicine as well.

On a not entirely unrelated note, MIT’s (Massachusetts Institute of Technology) Technology Review Journal has organised a meeting in Bangalore which starts on Monday, March 8, 2010. From the news item on Business Standard,

Nearly a hundred of the world’s leading business and tech visionaries will discuss next generation technologies that are ready for the market in the annual Emerging Technologies Conference (Emtech) in Bangalore next week.

The two-day conference beginning March 8 is being held in India for the second year in succession in association with CyberMedia.

The conference, organised by the Massachusetts Institute of Technology’s Technology Review journal, will cover a variety of cutting edge topics ranging from green computing techniques, clean transport alternatives and smarter energy grid to the role that wireless can play in connecting India.

Special sessions on innovative diagnostics and neglected diseases will draw attention towards unheralded health care fields. A session on the future of nanotechnology will touch on new capabilities, giving people new ways to make things and heal bodies.

Finally, I got my monthly NISENet (Nanoscale Informal Science Education Network) newsletter and found a couple of opportunities (from the newsletter), one for materials scientists,

Making Stuff Grant Opportunity
The Materials Research Society and WGBH will be premiering Making Stuff, a four-part PBS series about materials science, in fall 2010 and are looking for outreach partners to organize and host events, demos, workshops, and science cafes in connection with the premiere.  They’ll provide outreach partners with a stipend as well as a resource toolkit.  One of the four episodes is focused on nanotechnology, and nano will be a common thread throughout the episodes. You can find lots more information, as well as the application form, here.  Applications are due April 1st.

and one for emerging science writers,

Calling all “next generation” science and tech writers!

Our partners at ASU asked us to pass along this writing and publishing fellowship opportunity to all of you. They’re now accepting applications for To Think-To Write-To Publish, an intensive two-day workshop followed by a three-day conference in Arizona for early career writers of any genre with an interest in science and technology. The deadline is March 15th, click here to download the flier.

If you are interested in NISENet or want to submit a haiku about nanotechnology (sadly the newsletter doesn’t feature one this month), their website is here.

Can you trust science and scientists?; nanoparticle sludge is a good thing

The recent kerfuffle about scientists, climate change, and hacked emails (see this story in the UK Guardian for more details) is oddly coincidental with a couple of articles I've read recently about trust, science, PR, and scientific claims.

Andrew Maynard (2020 Science) wrote Do scientists encourage misleading coverage? to explore some of the issues around how scientists get media coverage for their work as he examines a specific incident.

The easiest, simplest way to get coverage for anything is to make a dramatic statement, e.g., "First xxxx in history"; "Huge losses xxxx"; "xxx possibly fatal"; etc. This can lead to overblown claims and/or a snarky, combative communications style. Maynard's example features overblown claims about possible future applications of a very exciting development. The serious work was published in Nature Physics, but someone at the university wrote up a news release and produced a video featuring the overblown claims as part of the institution's science outreach. Some of this more dramatic material has been picked up and reproduced elsewhere for general consumption.

The reality is that any scientific endeavour occurs over a long period of time and there are many twists and turns before there is any relative certainty about the discovery and/or the implications for any applications that may arise from it.

In the case of climate change, there is strong evidence but as in any other scientific endeavour there are uncertainties. These uncertainties are integral to how science is practiced because our understanding is constantly being refined (theoretically anyway).

The campaign in the popular media to raise concern about climate change is often quite dramatic and has stripped away much of the uncertainty inherent in scientific data. The campaign has been quite successful, but presenting the evidence for climate change as irrefutable created an opening: opponents were able to capitalize on anomalies and on the uncertainty that is inherent in the practice of science. Interestingly, the opponents are just as dramatic and insist their material is just as irrefutable. So, who do you trust? It's a pretty basic issue and one that keeps recurring.

The point that Maynard, and Matthew Nisbet (Framing Science blog) in his posting Two articles on prediction and hype in science, both make is that scientists need to be mindful when trying to engage the public. Giving in to the impulse to exaggerate or overstate a conclusion for a headline (I do sympathize with that impulse) will do more damage than good to the public's trust.

Now for something completely different. As more products with nanoparticles enter the marketplace, there's increasing concern about what happens to the particles as they are washed off athletic gear, cleaning products, and your body (after using beauty and cosmetic products), among other sources. According to a newly published paper, scientists may have found a way to remove nanoparticles from wastewater. From the news item on Nanowerk,

The new study, details of which are published in Environmental Science & Technology (“Fate of Silica Nanoparticles in Simulated Primary Wastewater Treatment”), simulated primary sewage treatment to show that coating silica nanoparticles with a detergent-like material (called a surfactant) made the nanoparticles interact with components of the sewage to form a solid sludge. This sludge can be separated from the wastewater and disposed of. In contrast, uncoated nanoparticles stayed dispersed in the wastewater and were therefore likely to continue through the effluent stream and potentially on into the environment.

Assuming that nanoparticles entering the environment in substantial quantities is not a good thing, I hope scientists find a way to deal with them; this research certainly seems promising.