Monthly Archives: October 2012

Commercializing nanotechnology talk at Simon Fraser University in downtown Vancouver (Canada)

Professor Geoffrey Ozin will be giving a free talk titled, Commercializing Nanotechnology: An Evening with Geoffrey Ozin from Opalux, at Simon Fraser University’s Segal Graduate School of Business, 500 Granville Street, Vancouver (Room 2800) from 5-6 pm PST on Monday, Oct. 22, 2012. From the event page,

You are cordially invited to hear Professor Geoffrey Ozin, co-founder of Opalux – a global leader in photonic colour technology research and development, speak about his experiences in advancing and commercializing nanomaterials and Opalux’s strategy in overcoming challenges to commercialize their photonic colour technology platform.

Professor Ozin, a Tier 1 Canada Research Chair and Distinguished University Professor at the University of Toronto, is considered to be the father of Nanochemistry. His career’s work, which include pioneering studies of new classes of nanomaterials, mesoporous materials, photonic crystals and most recently nanomachines, epitomizes how leading-edge research in Nanochemistry can be most effectively directed towards solving contemporary challenges in Nanotechnology and how these contributions have brought true benefit and well being to mankind.

… Professor Ozin co-founded Opalux Inc in 2006 to commercialize his inventions of photonic ink and elastic ink, two new and exciting photonic crystal technologies.

Opalux has been developing a platform of technologies using active polymer-based materials that can respond to an array of stimuli such as pressure, stretching, heat, humidity, and electrical current/voltage. By exploiting the many advantages of photonic color, Opalux has invented a new color display technology that stands apart with its unique combination of brightness, energy efficiency, form factor, customizability, and economy.
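As an aside for readers wondering how a photonic crystal's colour can respond to stretching, swelling, or an electrical stimulus: in an opal-type photonic crystal the reflected wavelength follows the standard Bragg-Snell relation (a textbook formula, not a description of Opalux's proprietary designs),

```latex
\lambda_{\max} \;=\; 2\, d_{111} \sqrt{\,n_{\mathrm{eff}}^{2} - \sin^{2}\theta\,}
```

where d111 is the spacing of the crystal planes, n_eff is the effective refractive index of the structure, and θ is the angle of incidence. Any stimulus that changes the lattice spacing (stretching, swelling) or the effective index (electrical or chemical) shifts the reflected colour.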

Opalux was mentioned here in my Jan. 31, 2011 posting. Given the current low rate of commercializing nanotechnology in Vancouver and BC, I imagine Ozin’s talk is causing some excitement. Opalux’s website is here.

ETA Oct. 18, 2012 10 am PST: I sent Dr. Ozin a few questions about himself and his talk. Here are the questions and answers (which arrived via Blackberry less than 20 minutes after I sent the email):

  • What brings you to Vancouver? Were you specially invited by the Segal business school to talk about commercializing nanotechnology?

  Yes

  • Could you describe your business experience? (Is this the first time you’ve commercialized a technology?)

 Yes

  •  Can you offer a preview of what you’ll be talking about on Monday, Oct. 22, 2012?

Idea to Innovation
Lab to Market
Material to Manufacturing

Thank you, Dr. Ozin, for taking the time to answer and for replying so quickly.

Antioxidant-like carbon nanoparticles could help heal traumatic brain injuries

The research sounds exciting but all of the testing has taken place in laboratories on animal models (rats). The Oct. 18, 2012 news item on Azonano describes why the research team wanted to test antioxidant-like carbon nanoparticles for use with traumatic brain injury (TBI) patients,

Thomas Kent, James Tour and colleagues explain that TBI disrupts the supply of oxygen-rich blood to the brain. With the brain so oxygen-needy — accounting for only 2 percent of a person’s weight, but claiming 20 percent of the body’s oxygen supply — even a mild injury, such as a concussion, can have serious consequences. Reduced blood flow and resuscitation result in a build-up of free-radicals, which can kill brain cells. Despite years of far-ranging efforts, no effective treatment has emerged for TBI. That’s why the scientists tried a new approach, based on nanoparticles so small that 1000 would fit across the width of a human hair.
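As a rough check of that size comparison (my own arithmetic, assuming a typical human hair is about 100 micrometres wide),

```latex
\frac{100\ \mu\text{m}}{1000} \;=\; \frac{100{,}000\ \text{nm}}{1000} \;=\; 100\ \text{nm per particle}
```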

The American Chemical Society (ACS) Oct. 17, 2012 news release, which originated the news item, provides a few details about the research,

They [the research team]  describe development and successful laboratory tests of nanoparticles, called PEG-HCCs. In laboratory rats, the nanoparticles acted like antioxidants, rapidly restoring blood flow to the brain following resuscitation after TBI. “This finding is of major importance for improving patient health under clinically relevant conditions during resuscitative care, and it has direct implications for the current [TBI] war-fighter victims in the Afghanistan and Middle East theaters,” they say.

The abstract for the paper gives more insight,

Injury to the neurovasculature is a feature of brain injury and must be addressed to maximize opportunity for improvement. Cerebrovascular dysfunction, manifested by reduction in cerebral blood flow (CBF), is a key factor that worsens outcome after traumatic brain injury (TBI), most notably under conditions of hypotension. We report here that a new class of antioxidants, poly(ethylene glycol)-functionalized hydrophilic carbon clusters (PEG-HCCs), which are nontoxic carbon particles, rapidly restore CBF in a mild TBI/hypotension/resuscitation rat model when administered during resuscitation—a clinically relevant time point. Along with restoration of CBF, there is a concomitant normalization of superoxide and nitric oxide levels. Given the role of poor CBF in determining outcome, this finding is of major importance for improving patient health under clinically relevant conditions during resuscitative care, and it has direct implications for the current TBI/hypotension war-fighter victims in the Afghanistan and Middle East theaters. The results also have relevancy in other related acute circumstances such as stroke and organ transplantation.

I notice this treatment has shown some success for mild TBI/hypotension when applied during the resuscitation phase, and that the testing, as I mentioned earlier, has been done on rats. For anyone who wants more information about this promising treatment,

Antioxidant Carbon Particles Improve Cerebrovascular Dysfunction Following Traumatic Brain Injury by Brittany R. Bitner, Daniela C. Marcano, Jacob M. Berlin, Roderic H. Fabian, Leela Cherian, James C. Culver, Mary E. Dickinson, Claudia S. Robertson, Robia G. Pautler, Thomas A. Kent, and James M. Tour. ACS Nano, 2012, 6 (9), pp 8007–8014 DOI: 10.1021/nn302615f

The article is behind a paywall and I notice it was published online Aug. 6, 2012. It looks like the ACS may have tried to publicize this at the time of publication and decided to try again now in the hope of getting more publicity for this work.

Purpose in nature (and the universe): even scientists believe

An intriguing research article titled, Professional Physical Scientists Display Tenacious Teleological Tendencies: Purpose-Based Reasoning as a Cognitive Default, is behind a paywall, making it difficult to do much more than comment on the Oct. 17, 2012 news item (on ScienceDaily),

A team of researchers in Boston University’s Psychology Department has found that, despite years of scientific training, even professional chemists, geologists, and physicists from major universities such as Harvard, MIT, and Yale cannot escape a deep-seated belief that natural phenomena exist for a purpose.

Although purpose-based “teleological” explanations are often found in religion, such as in creationist accounts of Earth’s origins, they are generally discredited in science. When physical scientists have time to ruminate about the reasons why natural objects and events occur, they explicitly reject teleological accounts, instead favoring causal, more mechanical explanations. However, the study by lead author Deborah Kelemen, associate professor of psychology, and collaborators Joshua Rottman and Rebecca Seston finds that when scientists are required to think under time pressure, an underlying tendency to find purpose in nature is revealed.

“It is quite surprising what these studies show,” says Kelemen. “Even though advanced scientific training can reduce acceptance of scientifically inaccurate teleological explanations, it cannot erase a tenacious early-emerging human tendency to find purpose in nature. It seems that our minds may be naturally more geared to religion than science.”

I did find the abstract for the paper,

… In Study 2, we explored this further and found that the teleological tendencies of professional scientists did not differ from those of humanities scholars. Thus, although extended education appears to produce an overall reduction in inaccurate teleological explanation, specialization as a scientist does not, in itself, additionally ameliorate scientifically inaccurate purpose-based theories about the natural world. A religion-consistent default cognitive bias toward teleological explanation tenaciously persists and may have subtle but profound consequences for scientific progress.

Here’s the full citation for the paper if you want to examine it yourself,

Professional Physical Scientists Display Tenacious Teleological Tendencies: Purpose-Based Reasoning as a Cognitive Default by Deborah Kelemen, Joshua Rottman, and Rebecca Seston. Journal of Experimental Psychology: General, Oct. 15, 2012.

What I find particularly intriguing about this work is that it helps to provide an explanation for a phenomenon I’ve observed at science conferences, at science talks, and in science books. The phenomenon is a tendency to ignore a particular set of questions (how did it start? where did it come from? etc.) when discussing nature or, indeed, the universe.

I noticed the tendency again last night (Oct. 16, 2012) at the CBC (Canadian Broadcasting Corporation) Massey Lecture being given by Neil Turok, director of Canada’s Perimeter Institute for Theoretical Physics, and held in Vancouver (Canada). The event was mentioned in my Oct. 12, 2012 posting (scroll down 2/3 of the way).

During this third lecture (What Banged?) in a series of five Massey lectures, Turok asked the audience (there were roughly 800 people by my count) to imagine a millimetre-sized ball of light as the starting point for the universe. He never did tell us where this ball of light came from. The entire issue of how it all started (What Banged?) was avoided. Turok’s avoidance is not unusual. Somehow the question is always set aside, while the scientist jumps into the part of the story she or he can or wants to explain.


Interestingly, Turok has given the What Banged? talk previously, in 2008 in Waterloo, Ontario. Judging from this description of the 2008 What Banged? talk, he did modify the presentation for last night,

The evidence that the universe emerged 14 billion years ago from an event called ‘the big bang’ is overwhelming. Yet the cause of this event remains deeply mysterious. In the conventional picture, the ‘initial singularity’ is unexplained. It is simply assumed that the universe somehow sprang into existence full of ‘inflationary’ energy, blowing up the universe into the large, smooth state we observe today. While this picture is in excellent agreement with current observations, it is both contrived and incomplete, leading us to suspect that it is not the final word. In this lecture, the standard inflationary picture will be contrasted with a new view of the initial singularity suggested by string and M-theory, in which the bang is a far more normal, albeit violent, event which occurred in a pre-existing universe. [emphasis mine] According to the new picture, a cyclical model of the universe becomes feasible in which one bang is followed by another, in a potentially endless series of cosmic cycles. The presentation will also review exciting recent theoretical developments and forthcoming observational tests which could distinguish between the rival inflationary and cyclical hypotheses.

Even this explanation doesn’t really answer the question. If there is, as suggested, a pre-existing universe, where did that come from? At the end of last night’s lecture, Turok seemed to be suggesting some kind of endless loop where past, present, and future are linked, which still raises the question: where did it all come from?

I can certainly understand how scientists who are trained to avoid teleological explanations (with their religious overtones) would want to avoid or rush over any question that might occasion just such an explanation.

Last night, the whole talk was a physics and history of physics lesson for ‘dummies’ that didn’t quite manage to be ‘dumb’ enough for me and didn’t really deliver on the promise in this description, from the Oct. 16, 2012 posting by Brian Lynch on the Georgia Straight website,

Don’t worry if your grasp of relativistic wave equations isn’t what it once was. The Waterloo, Ontario–based physicist is speaking the language of the general public here. Even though his subject dwarfs pretty much everything else, the focus of the series as a whole is human in scale. Turok sees our species as standing on the brink of a scientific revolution, where we can understand “how our ideas regarding our place in the universe may develop, and how our very nature may change.” [emphasis mine]

Perhaps Turok is building up to a discussion about “our place in the universe” and “how our very nature may change” sometime in the next two lectures.

Mathematical theorems as spiritual practice

Alex Bellos in an Oct. 16, 2012 article for the UK’s Guardian newspaper discusses a unique practice combining spirituality and mathematics (Note: I have removed a link),

… one of the most intriguing practices in the history of mathematics.

Between the seventeenth and nineteenth centuries, the Japanese used to hang up pictures of maths theorems at their shrines.

Called “sangaku”, the pictures were both religious offerings and public announcements of the latest discoveries.

It’s a little like as if Isaac Newton had decided to hang up his monographs at the local church instead of publishing them in books.

More than 700 sangaku are known to have survived, and the above shape is a detail from the oldest one that exists in its complete form.

Here’s a picture of a sangaku that Bellos took while in Japan to make a documentary on numeracy for BBC Radio 4,

Picture: Alex Bellos

The purpose of a sangaku was threefold: to show off mathematical accomplishment, to thank Buddha and to pray for more mathematical knowledge.

There are more images and details in Bellos’ article about this intriguing practice. I look forward to hearing more about Bellos’ documentary, Land of the Rising Sums, due to be broadcast Monday, Oct. 29, 2012 on BBC Radio 4 from 11 – 11:30 am GMT.

Looking at glass on the molecular scale

What glass is doing at the molecular scale as it cools isn’t at all transparent, and scientists have long been curious about the transition from the liquid to the glass state. According to an Oct. 15, 2012 posting by Carol Clark for Emory University’s eScienceCommons, a team from Emory University (and New York University) has cracked this mystery. First, here’s more about the mystery (from Clark’s article),

Scientists fully understand the process of water turning to ice. As the temperature cools, the movement of the water molecules slows. At 32 F, the molecules lock into crystal lattices, solidifying into ice. In contrast, the molecules of glasses do not crystallize. The movement of the glass molecules slows as the temperature cools, but they never lock into crystal patterns. Instead, they jumble up and gradually become glassier, or more viscous. No one understands exactly why.

The phenomenon leaves physicists to ponder the molecular question of whether glass is a solid, or merely an extremely slow-moving liquid.

This purely technical physics question has stoked a popular misconception: That the glass in the windowpanes of some centuries-old buildings is thicker at the bottom because the glass flowed downward over time.

“The real reason the bottom is thicker is because they hadn’t yet learned how to make perfectly flat panes of glass,” Weeks says [Emory physicist Eric Weeks]. “For practical purposes, glass is a solid and it will not flow, even over centuries. But there is a kernel of truth in this urban legend: Glasses are different than other solid materials.”

Speaking more technically about the transition,

“Cooling a glass from a liquid into a highly viscous state fundamentally changes the nature of particle diffusion,” says Emory physicist Eric Weeks, whose lab conducted the research. “We have provided the first direct observation of how the particles move and tumble through space during this transition, a key piece to a major puzzle in condensed matter physics.”

Weeks specializes in “soft condensed materials,” substances that cannot be pinned down on the molecular level as a solid or liquid, including everyday substances such as toothpaste, peanut butter, shaving cream, plastic and glass.

The scientists have prepared a video animation of what they believe is occurring as glass cools (no sound),

Here’s what the movie depicts (from the Clark article),

The movie and data from the experiment provide the first clear picture of the particle dynamics for glass formation. As the liquid grows slightly more viscous, both rotational and directional particle motion slows. The amount of rotation and the directional movements of the particles remain correlated.

“Normally, these two types of motion are highly coupled,” Weeks says. “This remains true until the system reaches a viscosity on the verge of being glass. Then the rotation and directional movements become decoupled: The rotation starts slowing down more.”

He uses a gridlocked parking lot as an analogy for how the particles are behaving. “You can’t turn your car around, because it’s not a sphere shape and you would bump into your neighbors. You have to wait until a car in front of you moves, and then you can drive a bit in that direction. This is directional movement, and if you can make a bunch of these, you may eventually be able to turn your car. But turning in a crowded parking lot is still much harder than moving in a straight line.”
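For anyone curious about how such a decoupling is actually measured, here is a minimal sketch in Python (my own illustration, not the Emory/NYU analysis code) of the two standard quantities one would compute from particle trajectories: the translational mean-squared displacement and an orientational correlation function. The trajectory arrays below are synthetic stand-ins for real particle-tracking data.

```python
import numpy as np

def mean_squared_displacement(positions, lag):
    """Translational mean-squared displacement at a given lag (in frames).
    positions: array of shape (n_frames, n_particles, 3)."""
    disp = positions[lag:] - positions[:-lag]
    return np.mean(np.sum(disp ** 2, axis=-1))

def orientational_correlation(orientations, lag):
    """Average of u(t + lag) . u(t) for unit orientation vectors; a simple
    measure of how far particles have tumbled over the lag time.
    orientations: array of shape (n_frames, n_particles, 3)."""
    dots = np.sum(orientations[lag:] * orientations[:-lag], axis=-1)
    return np.mean(dots)

# Synthetic stand-in data: random-walk positions and slowly wandering orientations.
rng = np.random.default_rng(0)
n_frames, n_particles = 500, 50
positions = np.cumsum(rng.normal(scale=0.05, size=(n_frames, n_particles, 3)), axis=0)
orientations = np.array([0.0, 0.0, 1.0]) + np.cumsum(
    rng.normal(scale=0.02, size=(n_frames, n_particles, 3)), axis=0)
orientations /= np.linalg.norm(orientations, axis=-1, keepdims=True)

for lag in (1, 10, 100):
    print(f"lag {lag:3d} frames: MSD = {mean_squared_displacement(positions, lag):.4f}, "
          f"<u(t+lag).u(t)> = {orientational_correlation(orientations, lag):.3f}")
```

In the supercooled samples described above, both quantities slow down together until the system nears the glass transition; the decoupling Weeks describes would show up as the orientational correlation decaying more slowly than the translational motion predicts.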

There’s more about the work and team in Clark’s article. H/T to the Oct. 16, 2012 news item on Nanowerk for alerting me to this work. You can find the article the researchers have written at the Proceedings of the National Academy of Sciences (PNAS),

Decoupling of rotational and translational diffusion in supercooled colloidal fluids by Kazem V. Edmond, Mark T. Elsesser, Gary L. Hunter, David J. Pine, and Eric R. Weeks. PNAS, published online before print October 15, 2012, doi: 10.1073/pnas.1203328109

The article is behind a paywall.

NASA, nano, and the race to space

“NASA’s Relationship with Nanotechnology: Past, Present and Future Challenges” has just been published by the Baker Institute for Public Policy at Rice University (located in Texas). The paper claims that the US National Aeronautics and Space Administration (NASA) needs to invest more money in nanotechnology research or risk being eclipsed by other countries in the ‘race to space’.

The Oct. 16, 2012 news release from Rice University provides more information,

The paper sheds light on a broad field that holds tremendous potential for improving space flight by reducing the weight of spacecraft and developing smaller and more accurate sensors.

This area of research, however, saw a dramatic cutback from 2004 to 2007, when NASA reduced annual nanotechnology R&D expenditures from $47 million to $20 million. NASA is the only U.S. federal agency to scale back investment in this area, the authors found, and it’s part of an overall funding trend at NASA. From 2003 to 2010, while the total federal science research budget remained steady between $60 billion and $65 billion (in constant 2012 dollars), NASA’s research appropriations decreased more than 75 percent, from $6.62 billion to $1.55 billion.

“The United States currently lacks a national space policy that ensures the continuity of research and programs that build on existing capabilities to explore space, and that has defined steps for human and robotic exploration of low-Earth orbit, the moon and Mars,” Matthews said [Kirstin Matthews, one of researchers and a co-author]. “With Congress and the president wrestling over the budget each year, it is vital that NASA present a clear plan for science and technology R&D that is linked to all aspects of the agency. This includes connecting R&D, with nanotechnology as a lead area, to applications related to the agency’s missions.”
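As a quick check of the quoted budget figures (my own arithmetic, in the constant 2012 dollars used in the news release),

```latex
\frac{6.62 - 1.55}{6.62} \approx 0.77 \quad\text{(roughly a 77\% drop in NASA research appropriations)},
\qquad
\frac{47 - 20}{47} \approx 0.57 \quad\text{(roughly a 57\% drop in nanotechnology R\&D)}
```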

H/T to R&D magazine, where I first saw the news item that led me to the Rice University news release and paper.

I have read the paper, which was written by the research team of Baker Institute science and technology policy fellow Kirstin Matthews, current Rice graduate student Kenneth Evans and former graduate students Padraig Moloney and Brent Carey, and found that much of the reasoning is based on the notion that nanotechnology research is fundamental to winning the ‘space race’. Strikingly, there is very little attempt to explain or justify this reasoning. It’s a little disconcerting and reminds me of joining a conversation that’s been in progress for some time and where the context has long been established, leaving the new participant struggling to catch up and in the position of asking ‘dumb’ questions. For example, how important is leading the ‘race to space’?

In general, this paper seems to reflect a fairly high level of anxiety about US scientific superiority (from the news release),

The authors said that to effectively engage in new technology R&D, NASA should strengthen its research capacity and expertise by encouraging high-risk, high-reward projects to help support and shape the future of U.S. space exploration.

“Failure to make these changes, especially in a political climate of flat or reduced funding, poses substantial risk that the United States will lose its leadership role in space to other countries — most notably China, Germany, France, Japan and Israel — that make more effective use of their R&D investments,” Matthews said.

I sometimes think the current US interest in space exploration is a way of harkening back to the glory days of the 1960s, when US scientific superiority was unassailable. Much of this superiority was based on the US successfully beating the Soviet Union in a race to place ‘a man on the moon’.

Harvard researchers look deeply into oily puddles as they rethink thin films and optical loss

For centuries it was thought that thin-film interference effects, such as those that cause oily pavements to reflect a rainbow of swirling colors, could not occur in opaque materials. Harvard physicists have now discovered that even very “lossy” thin films, if atomically thin, can be tailored to reflect a particular range of dramatic and vivid colors.

from the Oct. 14, 2012 news release on EurekAlert (also available on the Harvard School of Engineering and Applied Sciences [SEAS] news page),

The discovery is the latest to emerge from the laboratory of Federico Capasso, Robert L. Wallace Professor of Applied Physics and Vinton Hayes Senior Research Fellow in Electrical Engineering at SEAS, whose research group most recently produced ultrathin flat lenses and needle light beams that skim the surface of metals. The common thread in Capasso’s recent work is the manipulation of light at the interface of materials that are engineered at the nanoscale, a field referred to as nanophotonics. Graduate student and lead author Mikhail A. Kats carried that theme into the realm of color.

“In my group, we frequently reexamine old phenomena, where you think everything’s already known,” Capasso says. “If you have perceptive eyes, as many of my students do, you can discover exciting things that have been overlooked. In this particular case there was almost a bias among engineers that if you’re using interference, the waves have to bounce many times, so the material had better be transparent. What Mikhail’s done—and it’s admittedly simple to calculate—is to show that if you use a light-absorbing film like germanium, much thinner than the wavelength of light, then you can still see large interference effects.”

The result is a structure made of only two elements, gold and germanium (or many other possible pairings), that shines in whatever color one chooses.

These are gold films colored with nanometer-thick layers of germanium. Credit: Photo courtesy of Mikhail Kats, Romain Blanchard, and Patrice Genevet

The Oct. 14, 2012 news item on ScienceDaily notes,

“We are all familiar with the phenomenon that you see when there’s a thin film of gasoline on the road on a wet day, and you see all these different colors,” explains Capasso.

Those colors appear because the crests and troughs in the light waves interfere with each other as they pass through the oil into the water below and reflect back up into the air. Some colors (wavelengths) get a boost in brightness (amplitude), while other colors are lost.

That’s essentially the same effect that Capasso and Kats are exploiting, with coauthors Romain Blanchard and Patrice Genevet. The absorbing germanium coating traps certain colors of light while flipping the phase of others so that the crests and troughs of the waves line up closely and reflect one pure, vivid color.

“Instead of trying to minimize optical losses, we use them as an integral part of the design of thin-film coatings,” notes Kats. “In our design, reflection and absorption cooperate to give the maximum effect.”

Most astonishingly, though, a difference of only a few atoms’ thickness across the coating is sufficient to produce the dramatic color shifts. The germanium film is applied through standard manufacturing techniques — lithography and physical vapor deposition, which the researchers compare to stenciling and spray-painting — so with only a minimal amount of material (a thickness between 5 and 20 nanometers), elaborate colored designs can easily be patterned onto any surface, large or small.

“Just by changing the thickness of that film by about 15 atoms, you can change the color,” says Capasso. “It’s remarkable.”
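For readers who want to play with the effect, here is a minimal sketch (my own, using the standard single-film Airy reflection formula at normal incidence, not the Capasso group's code) showing how the reflectance of a lossy film on a metal shifts as the film thickness changes by a few nanometres. The complex refractive indices are illustrative placeholders, not measured values for germanium and gold.

```python
import numpy as np

def fresnel_r(n1, n2):
    """Fresnel reflection coefficient at normal incidence (complex indices allowed)."""
    return (n1 - n2) / (n1 + n2)

def film_reflectance(wavelength_nm, thickness_nm, n_ambient, n_film, n_substrate):
    """Reflectance of a single (possibly lossy) film on a substrate,
    using the standard Airy summation of multiple reflections."""
    r01 = fresnel_r(n_ambient, n_film)
    r12 = fresnel_r(n_film, n_substrate)
    beta = 2 * np.pi * n_film * thickness_nm / wavelength_nm  # phase (and attenuation) in the film
    r = (r01 + r12 * np.exp(2j * beta)) / (1 + r01 * r12 * np.exp(2j * beta))
    return abs(r) ** 2

# Illustrative placeholder indices (NOT measured values) for a germanium-like
# absorbing film on a gold-like substrate in the visible range.
n_film = 4.0 + 2.0j
n_substrate = 0.5 + 2.3j

for d in (7, 10, 15, 20):  # film thicknesses in nanometres
    reflectances = [film_reflectance(w, d, 1.0, n_film, n_substrate) for w in (450, 550, 650)]
    print(f"d = {d:2d} nm -> R(450, 550, 650 nm) = "
          + ", ".join(f"{r:.2f}" for r in reflectances))
```

Running the sketch should show the reflectance dips moving across the visible spectrum as the film thickness steps from 7 to 20 nm, which is the qualitative behaviour described in the news release.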

I will never look at another oily puddle the same way again.

Designers, manufacturers, research institutes, end-users, and more boost European nano competitiveness with CORONA

The European Union CORONA project brings together a multidisciplinary team dedicated to “Customer-Oriented Product Engineering of Micro and Nano Devices,” according to the project’s home page. The Oct. 15, 2012 news item on Nanowerk fills in a few details,

An EU-funded project to improve and strengthen Europe’s competitiveness in micro and nano devices has resulted in the successful development of a customer-oriented engineering methodology that will ultimately benefit a wide range of European industries that depend on these technologies.

Micro and nano devices are used by many industries in the manufacture of their products, including the automotive, consumer products, and medical applications sectors. Boosting competitiveness in micro and nano devices, improving quality and providing new functions will therefore add value right along the European manufacturing chain.

The project – CORONA – brought together designers, manufacturers, tool providers, research institutes and end-users to tackle the technological challenges faced by the industry. Their overall goal was to reduce development times, crucial for competitiveness in this field since success relies very much on fast time-to-market.

I have looked at the CORONA website and was not able to get details about ongoing or completed engineering projects. There have been some workshops, although the most recent listed on the website was in May 2011.

You say nanocrystalline cellulose, I say cellulose nanocrystals; CelluForce at conferences in Japan and the UK

In reading the Oct. 14, 2012 news release from CelluForce about its presence at conferences in Japan and in the UK, I was interested to note the terminology being used,

CelluForce, the world leader in the commercial development of NanoCrystalline Cellulose (NCC), also referred to as Cellulose Nanocrystals (CNC), [emphases mine] is participating in two upcoming industry conferences: the ‘Nanocellulose Summit 2012’ in Kyoto, Japan on October 15, 2012, and ‘Investing in Cellulose 2012’, in London, UK, on November 5, 2012.

All of the materials from Canadian companies and not-for-profits have used the term nanocrystalline cellulose (NCC) exclusively, until now. I gather there’ve been some international discussions regarding terminology and that the term cellulose nanocrystals (CNC) is, at the least, a synonym if not the preferred term.

Here’s more about the conference in Japan (from the CelluForce news release),

The 209th Symposium on Sustainable Humanosphere: ‘Nanocellulose Summit 2012’ welcomes the world’s top scientists and large research project leaders involved with nanocellulose to present on each country’s current status and prospects concerning nanocellulose research and industrialization.

What: CelluForce – What do we do?

Who: Richard Berry, Vice President and Chief Technology Officer, CelluForce

When: Monday, October 15, 2012, 4 p.m. JST

Where: Kyoto Terrsa Venue, Shinmachi Kujo Minami-ku, Kyoto, Japan (Kyoto Citizen’s Amenity Plaza)

I found out a little more about the conference Dr. Richard Berry will be attending from the Nanocellulose Summit 2012 webpage on the Kyoto University website,

The world’s top scientists and large research project leaders involved with nanocellulose (cellulose nanofiber (CNF) [sic] and cellulose nanocrystal (CNC or NCC) ) brought together. They will talk about each country’s current status and prospects concerning nanocellulose research and industrialization.

You can find more details, including the agenda, on the conference webpage.

Here’s more about the investment-oriented conference taking place in the UK,

In its second edition, ‘Investing in Cellulose 2012’ is a global conference on specialty cellulose, organized by CelCo. The company focuses primarily on the specialty cellulose business including the organization of cellulose training courses as well as advisory and consultancy to the industry.

What: Nanocrystalline technologies: Bringing Innovation to the Market

Who: Jean Moreau, President and CEO, CelluForce

When: Monday, November 5, 2012, 2:30 p.m. BST

Where: The Royal Horseguards Hotel, 2 Whitehall Court Whitehall, London SW1A 2EJ, United Kingdom

I have found an ‘Investing in Cellulose 2012‘ conference webpage (of sorts) on the CelCo website (Note: I have removed some of the formatting),

Based on the success of 2011 specialty cellulose conference and encouraged by a 92% return intention response we are pleased to announce that Investing in Cellulose -2012 Conference will take place in London on November 5th.

A cocktail will kick off the event the preceding night and close around 18:00 of November 5th.

So please SAVE THE DATE in your calendar and contact us HERE

 We have taken into account your wishes and suggestions for this second year event and some of the changes will include:

  • Antitrust lawyer attending meeting allowing larger participation esp. from USA.
  • New topics to allow ether and viscose market to be better covered. Technology section during the day.
  • Seat in lunch accommodations and air condition.
  • Larger china representation.
  • More downstream value chain participation.

We will share later this year the Agenda but feel free to let us know if there were any particular topics you would like us to cover or you would like to present.

The most I could find out about the UK conference organizer is that CelCo Cellulose Consulting is a Swiss company founded by two partners.