Monthly Archives: December 2010

Canadian helps to revise periodic table of elements

A professor (Michael Wieser) at the University of Calgary is making a bit of a splash, so to speak, with his contributions to the changes being made to the periodic table of elements. According to the Dec. 15, 2010 news item on the CBC News website,

Science’s ubiquitous periodic table of the elements is getting a fresh face courtesy of a team led by an Alberta researcher.

As part of the revamp, the atomic weights of at least 10 elements — among them oxygen, carbon and nitrogen — are to be restated, said Michael Wiesner [sic], an associate professor at the University of Calgary.

The update is meant to better reflect how the elements vary in the natural world.

To start with, an international group of scientists will restate the weights of 10 elements, classifying them as a low and a high, known as an interval. The interval varies depending on where the elements are found in nature.

“These are the 10 where we’ve completed the review,” Wieser said on Tuesday. “There’s another series we’re working on right now.”

Apparently, this is the first revision of this type to the table since Mendeleev developed it in 1869 (there have been many additions and rearrangements over the years). (The table is attributed to Dmitri Mendeleev although the history of its development is a little more complicated than I have time for here. Sam Kean’s book, The Disappearing Spoon: And Other True Tales of Madness, Love, and the History of the World from the Periodic Table of the Elements, goes into more detail about it all.)

The implications of these 2010 changes are quite interesting,

Wiesner [sic], who is secretary of the Commission on Isotopic Abundances and Weights for the International Union of Pure and Applied Chemistry, has co-authored a paper outlining the revisions in the journal Pure and Applied Chemistry.

“People have used atomic weight data to look at nuclear processes occurring in the solar system … we can say something about the formation of the solar system and the planets,” he said.

“People are probably comfortable with having a single value for the atomic weight, but that is not the reality for our natural world.”

As noted in the Dec. 15, 2010 news item on physorg.com, an impact will be felt in the classrooms,

“Though this change offers significant benefits in the understanding of chemistry, one can imagine the challenge now to educators and students who will have to select a single value out of an interval when doing chemistry calculations,” says Dr. Fabienne Meyers, associate director of IUPAC.

Not all elements will undergo changes (from physorg.com),

Elements with only one stable isotope do not exhibit variations in their atomic weights. For example, the standard atomic weights for fluorine, aluminum, sodium and gold are constant, and their values are known to better than six decimal places.
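
To make the ‘interval’ idea concrete, here’s a minimal sketch of what it means for an everyday chemistry calculation. The hydrogen and oxygen bounds below are quoted from memory of the IUPAC 2009/2010 report, so treat them as illustrative and check the Pure and Applied Chemistry paper for the official figures.

```python
# Standard atomic weights for two of the revised elements, expressed as
# (lower bound, upper bound) intervals rather than single values.
# Values quoted from memory of the IUPAC report -- illustrative only.
hydrogen = (1.00784, 1.00811)
oxygen = (15.99903, 15.99977)

def molar_mass_interval(composition):
    """Return the (low, high) molar mass for a list of (interval, count) pairs."""
    low = sum(count * interval[0] for interval, count in composition)
    high = sum(count * interval[1] for interval, count in composition)
    return low, high

# Water, H2O: any 'single value' a student picks has to fall inside this range.
low, high = molar_mass_interval([(hydrogen, 2), (oxygen, 1)])
print(f"H2O molar mass lies between {low:.5f} and {high:.5f} g/mol")
# The conventional single values (H = 1.008, O = 15.999) give 18.015 g/mol,
# comfortably inside the interval. Single-isotope elements such as fluorine
# keep one fixed value, so nothing changes for them.
```

As I understand it, IUPAC also publishes conventional single values for the interval elements precisely so that students and industry have something simple to plug into calculations, which should take some of the sting out of the classroom challenge Meyers describes.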

I think someone got a little overexcited about this,

For the first time in history, a change will be made to the atomic weights of some elements listed on the Periodic table of the chemical elements posted on walls of chemistry classrooms and on the inside covers of chemistry textbooks worldwide. [emphasis mine]

The periodic table of elements is an intellectual construct developed in the mid-19th century. For me and most folks, science provides our best guesses but very rarely any certainties. Gravity is a law of physics at the macro level (unless someone manages to prove differently), but in the quantum world a whole other set of rules appears to apply, and the experimental evidence so far bears that out.

Elsevier and Google; scientific publishing

Due to my interest in communication, I have from time to time commented on or drawn attention to developments in publishing (scientific and otherwise) and ebooks. Earlier this month, Google announced the launch of its ebook store, and now Elsevier, a major publisher of scientific, technical, and medical information, has announced that it will be using Google’s ebook store as a new distribution channel. From the Dec. 10, 2010 news item on Nanowerk,

Elsevier, the world-leading publisher of scientific, technical and medical information products and services, announced today that it is participating in the recently launched Google eBooks store by including a large selection of Elsevier’s eBook titles. Elsevier regards Google eBooks as a valuable new distribution channel to increase reach and accessibility of its scientific and professional ebook content in the United States.

“Selling a substantial part of our Science & Technology ebooks through Google eBooks will significantly add to the reach and accessibility of our content,” said Suzanne BeDell, Managing Director of Science & Technology Books at Elsevier. “The platform contains one of the largest ebook collections in the world and is compatible with a wide range of devices such as laptops, smartphones, e-readers and tablets. We are therefore confident that our partnership with Google will prove an important step in reaching our objective to provide universal access to scientific content.”

Presumably ‘adding accessibility’ as BeDell puts it means that the books will be significantly cheaper. (I still remember the shock I experienced at discovering the costs of academic texts. Years later, I am still recovering.)

I’m not sure that buyers will own the ebooks. It is possible for an ebook to be removed without notice if you buy from Amazon, as I noted in my Sept. 10, 2010 posting (part 2 of a three-part series on e-readers).

If you’re interested in the Google part of the story, here’s an article by E. B. Boyd for Fast Company,

If you stroll on over to your corner bookstore this week and ask the person behind the counter about Google’s new ebookstore, which launches today, you probably won’t be greeted with the kind of teeth-gnashing that has accompanied other digital developments, like Amazon’s online bookstore or the advent of proprietary e-readers. Instead, you might actually be greeted with some excitement and delight. That’s because Google is taking a different approach to selling e-books than Amazon or Barnes & Noble. Rather than create a closed system that leaves others out in the cold, Google is actually partnering with independent bookstores to sell its wares–and share the profits.

There are a few reasons Google is going a different way. The ebookstore emerged from the Google Books program, which didn’t start out as a potential revenue stream. Instead, the company’s book-scanning project was simply a program to help the company fulfill its mission to make all of the world’s information accessible. Since so much information is contained in books, the company wanted to make sure that if you were using Google Search to look for a particular topic, it would be able to point you to books containing information about that topic, in addition to relevant web pages. Then, as Google Books began partnering with publishers and contemplating a program to sell books in addition to just making them searchable, it made a philosophical decision that brick-and-mortar bookstores are critical to the literary ecosystem. “A huge amount of books are bought because people go into a physical bookstore and say, ‘Hey, I want this, I want that,’” Google Books engineering director Dan Clancy told an audience at the Computer History Museum last year.

Here’s a response from some of the bookstore owners (from the article),

Bookstores seem to be cautiously optimistic about the Google program. A person who answered the phone at St. Mark’s Bookshop in New York said, “We’re looking forward to it,” before referring Fast Company to the ABA. “We’re really pleased,” said Mark LaFramboise, a buyer at Washington D.C.’s Politics and Prose. “We’ve been waiting for this for a long time.”

Darin Sennett, director of strategic partnerships at the famous Powell’s book shop in Portland, Oregon, is particularly excited about Google’s technological model. The Kindle, the Nook, and the Sony eReader all use the traditional approach to e-books: They sell DRM-protected files that customers download to devices and which must be read with specific e-reading software. Google, however, is using the cloud. Its e-books will be stored on Google servers, and readers who’ve purchased them will access their books via a browser. [emphasis mine] Unlike in the Kindle system, where Kindle e-books can only be read on Kindle devices, Google e-books will be able to be read on any device that has a browser. Until now, independent bookstores have been effectively shut out of devices like the iPad and smartphones (which are emerging as many customers’ reading platforms of choice) because the e-books available from other distributors were either not compatible with those devices or the formatting was so clunky as to make them effectively unreadable.

Certainly, this sounds a lot better from the bookseller’s and reader’s perspectives. I’m glad to see that people at one of my favourite bookstores (Powell’s) are so enthusiastic, but I do note that the books are stored on Google’s servers, which means they can be removed or even altered quite easily. On the plus side, the books can be downloaded in either PDF or ePub format. All in all, bravo!

Arxis, healing with liquid bone

I spotted this Dec. 8, 2010 news item about liquid bone on the Azonano website,

Here’s the vision: an elderly woman comes into the emergency room after a fall. She has broken her hip. The orthopaedic surgeon doesn’t come with metal plates or screws or shiny titanium ball joints.

Instead, she pulls out a syringe filled with a new kind of liquid that will solidify in seconds and injects it into the break. Over time, new bone tissue will take its place, encouraged by natural growth factors embedded in the synthetic molecules of the material.

Although still early in its development, the liquid is real. In the Brown engineering lab of professor Thomas Webster it’s called TBL, for the novel DNA-like “twin-base linker” molecules that give it seemingly ideal properties. The biotech company Audax Medical Inc., based in Littleton, Mass., announced on Dec. 7 an exclusive license of the technology from Brown. It brands the technology as Arxis and sees similar potential for repairing broken vertebrae.

In chasing down more information about this particular liquid bone technology, I went to Brown University’s website to find an article by David Orenstein,

In some of his work, Webster employs nanotechnology to try to bridge metals to bone better than traditional bone cement. But TBL is an entirely new material, co-developed with longtime colleague and chemist Hicham Fenniri at the University of Alberta. [emphasis mine] Fenniri synthesized the molecules, while Webster’s research has focused on ensuring that TBL becomes viable material for medical use.

The molecules are artificial, but made from elements that are no strangers to the body: carbon, nitrogen, and oxygen. At room temperature their aggregate form is a liquid, but the material they form solidifies at body temperature. The molecules look like nanoscale tubes (billionths of a meter wide), and when they come together, it is in a spiraling ladder-shaped arrangement reminiscent of DNA or collagen. That natural structure makes it easy to integrate with bone tissue.

Yes, there is a University of Alberta connection! In fact, Fenniri (his university webpage is here) also works for Canada’s National Institute of Nanotechnology (NINT) in the Supramolecular Nanoscale Assembly group (webpage here). Why isn’t NINT making some sort of an announcement about this? (I digress.)

Back to the bone. You can see a video demonstration of the liquid bone by visiting the Orenstein article on the Brown University website. The following image is also from the Orenstein article,

Buttressing bones: Twin-base linker molecules, top left, self-assemble into six-molecule rings. Stacked in a tube shape, the rings of molecules not only provide a new scaffold for bone growth, but can also store growth factors and helpful drugs inside. Credit: Websterlab/Brown University

While this is a promising development, there have yet to be any clinical trials,

In the space within the nanotubes, the team, which includes graduate student Linlin Sun, has managed to stuff in various drugs including antibiotics, anti-inflammatory agents, and bone growth factors, which the tubes release over the course of months. Even better, different recipes of TBL, or Arxis, can be chemically tuned to become as hard as bone or as soft as cartilage, and can solidify in seconds or minutes, as needed. Once it is injected, nothing else is needed.

“We really like the fact that it doesn’t need anything other than temperature to solidify,” Webster said. Other compounds that people have developed require exposure to ultraviolet light and cannot therefore be injected through a tiny syringe hole. They require larger openings to be created.

For all of TBL’s apparent benefits, they have only been demonstrated in cow bone fragments in incubators on the lab bench top, Webster said. TBL still needs to be proven in vivo and, ultimately, in human trials.

I gather it will be years before we can expect to experience the scenario (breaking a hip and being injected with liquid bone) that opened this posting.

Collaborative nano research

The journal Nature published a study about a trend toward collaborative nanotechnology research in its Dec. 2, 2010 online edition (note: there’s a paywall and I don’t usually link to articles behind one). From the Dec. 9, 2010 news item on Nanowerk,

Despite their initial focus on national economic competitiveness, the nanotechnology research initiatives now funded by more than 60 countries have become increasingly collaborative, with nearly a quarter of all papers co-authored by researchers across borders.

Researchers from the two leading producers of nanotechnology papers – China and the United States – have become each nation’s most frequent international co-authors. Though Chinese and U.S. researchers now publish roughly the same number of nanotechnology papers, the U.S. retains a lead in the quality of publications – as measured by the number of early citations.

“Despite ten years of emphasis by governments on national nanotechnology initiatives, we find that patterns of nanotechnology research collaboration and funding transcend country boundaries,” said Phillip Shapira, study co-author and a professor in the School of Public Policy at the Georgia Institute of Technology. “For example, we found that U.S. and Chinese researchers have developed a relatively high level of collaboration in nanotechnology research. Each country is the other’s leading collaborator in nanotechnology R&D.”

I’m not convinced that the number of early citations is a good indicator of quality, and I have a couple of questions. First, are papers published in prestigious journals like Science, Nature, etc. more likely to be cited early? Also, are the Chinese papers being published in English or in Chinese first?

Despite my reservations about this ‘quality issue’, I do find the research quite illuminating. More from the news item,

They [the study’s authors] found that although researchers from 152 nations were represented in the survey, just 15 countries represented 90 percent of the papers. The top four countries by author affiliation were the United States (23 percent), China (22 percent), Germany (8 percent) and Japan (8 percent). Papers authored by researchers from more than one nation – which constituted 23 percent of those examined – were assigned to more than one country.

Though the United States and China now produce approximately the same number of papers, the U.S. maintains significant advantages.

“Compared with Chinese counterparts, papers authored by U.S. researchers still have a substantial lead in terms of citation quality and U.S. corporate activity in nanotechnology innovation remains rather larger,” Shapira said. “However, Chinese quality is improving and an increasing number of Chinese companies are becoming engaged in developing and commercializing nano-enabled products.”
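
One methodological detail from the passage above is worth spelling out: because papers with authors from more than one nation were assigned to every country involved, the country shares don’t have to sum to 100 percent. Here’s a toy illustration (my own, not data from the study) of that “whole counting” approach.

```python
# Toy example of "whole counting": a multi-nation paper is credited to every
# country on its author list, so country shares can sum past 100 percent.
# The three papers below are made up for illustration.
papers = [
    {"US"},            # single-country paper
    {"China"},         # single-country paper
    {"US", "China"},   # cross-border collaboration, counted for both
]

counts = {}
for countries in papers:
    for country in countries:
        counts[country] = counts.get(country, 0) + 1

total = len(papers)
for country, n in sorted(counts.items()):
    print(f"{country}: {n} of {total} papers = {n/total:.0%}")
# China: 2 of 3 papers = 67%
# US: 2 of 3 papers = 67%
# The shares add to 133% -- which is why the study's country percentages
# (23% US, 22% China, 8% Germany, 8% Japan, ...) need not sum to 100%.
```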

Shapira and study collaborator Jue Wang, an assistant professor at Florida International University, had some other interesting findings,

The study also found that sponsors concentrating their funding in fewer institutions had lower research impact as measured by early citation counts.

“Our starting hypothesis is that when groups from multiple institutions vie for funding, there is increased competition, review processes are less partial, and there are more opportunities to select the most improving projects,” Shapira explained.

With increasing budget pressures, growth in nanotechnology funding appears unlikely. How should countries invest their limited funding for greatest benefit?

“One way would be to foster more high-quality international collaborations, perhaps by opening funding competitions to international researchers and by offering travel and mobility awards for domestic researchers to increase alliances with colleagues in other countries,” the researchers suggested in their paper.

Montreal nanotech company bought by US Versatilis

According to a Dec. 8, 2010 news item on Azonano,

Versatilis LLC, a high technology development Company based here, announced today that it has acquired the technology assets of Nanometrix, Inc. of Montreal, Canada.

The acquisition included all of the intellectual property and the associated inventory, production and laboratory equipment owned by Nanometrix. Price and terms were not disclosed. Versatilis intends to combine the acquired technology with several of its own developing technology and to immediately relaunch the business, eventually spinning off a separate subsidiary under a new name. The business will provide a novel engineered system solution for emerging nanotechnology applications that can enable next generation products based on so-called “Macroelectronics” or large area, flexible electronics. These include novel solar cells and solid state lighting panels in large area, flexible forms.

It seems like a typical Canadian success story: grow the company until it’s viable and then sell it ASAP to a foreign company. (Sarcasm intended.)

If you’re interested, the company’s website is here.

EyeSwipe nano: harbinger for the future?

The promise is always that technology will make it better—whatever it may be. Austin Carr’s article on the Fast Company website, EyeSwipe Nano: Cheap, Dollar Bill-Size Iris Scanner Replaces Card Reader Apps, touts a new iris scanner whose name includes the word nano for the same reason Apple calls one of its products the iPod Nano—good marketing. From the article,

“This is going to put the ability to do a biometric scan in the hands of virtually everyone in the world for a price that is comparable and competitive to card readers,” says company [Hoyos Corporation] CDO Jeff Carter, explaining that orders at volume will make the Nano a sub-thousand dollar product. “The Nano has roughly the footprint of a dollar bill, and I think it’s going to allow us to target virtually everything–any applications where you’d have a typical card reader, whether entry to office buildings or banks or apartments.”

Carter says the company’s scanners have already received “tons and tons of business” from around the globe, and pre-orders for the Nano, which begin today, will ship by the end of January. Between the EyeSwipe Nano and EyeSwipe Mini, it almost feels the company is modeling its products and names after Apple’s–don’t the iPod Nano and EyeSwipe Nano have a similar ring? And perhaps it’s no coincidence: Thanks to the device’s shrinking size, Carter says the company’s next step is entering the mobile space, allowing Big Brother to scan your eyes on the go.

Carter is not fantasizing: a city in Mexico (Leon) is already testing installations of earlier and current models of the company’s scanners. From an Aug. 18, 2010 article by Austin Carr on the Fast Company website,

Biometrics R&D firm Global Rainmakers Inc. (GRI) [now called Hoyos Corporation] announced today that it is rolling out its iris scanning technology to create what it calls “the most secure city in the world.” In a partnership with Leon — one of the largest cities in Mexico, with a population of more than a million — GRI will fill the city with eye-scanners. That will help law enforcement revolutionize the way we live — not to mention marketers. [emphases mine]

“In the future, whether it’s entering your home, opening your car, entering your workspace, getting a pharmacy prescription refilled, or having your medical records pulled up, everything will come off that unique key that is your iris,” says Jeff Carter, CDO of Global Rainmakers. Before coming to GRI, Carter headed a think tank partnership between Bank of America, Harvard, and MIT. “Every person, place, and thing on this planet will be connected [to the iris system] within the next 10 years,” he says.

Carter has more to say,

Leon is the first step. To implement the system, the city is creating a database of irises. Criminals will automatically be enrolled, their irises scanned once convicted. Law-abiding citizens will have the option to opt-in.

When these residents catch a train or bus, or take out money from an ATM, they will scan their irises, rather than swiping a metro or bank card. Police officers will monitor these scans and track the movements of watch-listed individuals. “Fraud, which is a $50 billion problem, will be completely eradicated,” says Carter. Not even the “dead eyeballs” seen in Minority Report could trick the system, he says. “If you’ve been convicted of a crime, in essence, this will act as a digital scarlet letter. If you’re a known shoplifter, for example, you won’t be able to go into a store without being flagged. For others, boarding a plane will be impossible.”

It’s definitely an eye-opening view of the future (pun intended). As for not being able to trick the system—there’s always, always a way. And there’s always a way to abuse it.

Carter’s belief that most people will want to opt in to the system (i.e., voluntarily have their irises scanned) may well prove correct. After all, great swathes of the population have already opted into points systems (handing over personal data for tracking purposes) so they can get reduced prices on groceries, airline miles, etc.

My apologies for arsenic blooper

I made a mistake when reporting on NASA and the ‘arsenic’ bacterium. Apparently, the research methodology was problematic and the conclusion that the bacterium can substitute arsenic for phosphorus in its DNA is not supported by the evidence as presented.

Martin Robbins at the Lay Scientist blog (one of The Guardian’s science blogs) has posted an analysis of how this ‘media storm’ occurred. The article which started it all appeared in a well-respected, peer-reviewed journal, Science (published by the American Association for the Advancement of Science). From Robbins’s Dec. 8, 2010 posting,

Should the paper have been published in the first place? Carl Zimmer’s blog post for Slate collects the responses of numerous scientists to the work, including the University of Colorado’s Shelley Copley declaring that: “This paper should not have been published.”

There are two distinct questions here to tease apart: ‘should the paper have been published?’ and ‘should it have been published in Science?’

To the first question I would say ‘yes’. Peer review isn’t supposed to be about declaring whether a paper is definitely right and therefore fit for publication on that basis. The purpose of publishing a paper is to submit ideas for further discussion and debate, with peer review serving as a fairly loose filter to weed out some of the utter crap. The contribution a paper makes to science goes far beyond such trivialities as whether or not it’s actually right.

Wolfe-Simon et al’s paper might be wrong, but it has also sparked an interesting and useful debate on the evidence and methodology required to make claims about this sort of thing, and the next paper on this subject that comes along will hopefully be a lot stronger as a result of this public criticism. You could argue on that basis that its publication is useful.

I would argue that the real bone of contention is whether it should have been published in Science – after all, if it had appeared in the Journal of Speculative Biological Hypotheses (and not been hyped) nobody would have given a crap. On this I’m not really qualified to comment, but what I can say is that given the wealth of scientists coming forward to criticize the work, it’s remarkable that the journal found three willing to pass it.

Robbins goes on to analyze how the embargo that Science applied to the story about the article (a story under embargo is considered confidential until a prescribed date) affected coverage in the mainstream and other media. He also notes the impact that bloggers had on the story,

The quality, accuracy and context of material available on leading blogs exceeded that of much of mainstream media reporting by light years. While newspapers ran away with the story, it was left to bloggers like Ed Yong, Carl Zimmer, Lewis Dartnell and Phil Plait to put things into perspective.

But more importantly it turns out that peer review is being done on blogs. John Hawks and Alex Bradley – both scientists with relevant expertise – found methodological problems. Rosie Redfield, a microbiology professor at the University of British Colombia [sic], wrote an extensive and detailed take-down of the paper on her blog that morphed into a letter to Science, which I sincerely hope they publish.

Robbins does not suggest that the blogosphere is the perfect place for peer review, only that it played an important role with this research. There is much more to the posting and I do encourage you to read it.

I did look at Rosie Redfield’s postings about the paper. Of the two, I found her Dec. 4, 2010 posting provides the more accessible analysis of the methodological issues. Her Dec. 8, 2010 posting is her submission to Science about the matter.

I do apologize for getting caught up in the frenzy.

More bimetallic nanoparticles

Two days ago, I noted that I’d never encountered bimetallic nanoparticles before reading about the ‘Christmas decorations’ created by a Mexico/US research team (my Dec. 6, 2010 posting). Live and learn. Here’s another bimetallic (gold and silver again) nanoparticle news item on Nanowerk,

Shrink Nanotechnologies, Inc. (“Shrink”), an innovative nanotechnology company developing products and licensing opportunities in the solar energy industry, medical diagnostics and sensors and biotechnology research and development tools businesses, announced today that Shrink’s MetalFluor™ technology was studied, reported on and made the front cover of the November issue of Applied Physics Letters (“Bimetallic nanopetals for thousand-fold fluorescence enhancements”). [the article is behind a paywall]

I was most interested to note that at least one of the authors is a researcher associated with the company that issued the news release trumpeting the article in Applied Physics Letters. From the news item on Nanowerk,

The Company’s technology and the work being performed by Dr. Michelle Khine, our scientific founder, continues to gain high praise from leading academic journals. [emphases mine] The studies relate to potential commercial applications of this technology. Of note, the article states, “Because we have a range of nanostructure and nanogap sizes, we can ensure that we can achieve huge fluorescent enhancements on our substrate. These advantages show great potential for low-cost biomedical sensing at single molecular levels at physiological concentrations.” The Company believes that this article is further evidence that certain medical diagnostics tests, a multi-billion dollar annual industry in the United States alone, can provide physicians, patients and other medical professionals with better results using lower quantities of specimens using MetalFluor™ technologies.

Here’s more about possible uses for the technology cited in the Applied Physics Letters article (citation: Bimetallic nanopetals for thousand-fold fluorescence enhancements by Chi-Cheng Fu, Giulia Ossato, Maureen Long, Michelle A. Digman, Ajay Gopinathan, Luke P. Lee, Enrico Gratton, and Michelle Khine in vol. 97, issue no. 20, Nov. 15, 2010),

Our method can be easily integrated with microfluidic devices to combine with high throughput lab-on-chip techniques. Importantly, because of–not in spite of–the “variability” in our substrate, we do not need to choose an esoteric dye such that it would match our plasmon resonance. Because we have a range of nanostructure and nanogap sizes, we can ensure that we can achieve huge fluorescence enhancements on our substrate. These advantages show great potential for low-cost biomedical sensing at single molecular levels at physiological concentrations.

The company Khine founded is very interesting from an organizational perspective (from the news item on Nanowerk),

Shrink is a first of its kind FIGA™ organization. FIGA companies bring together diverse contributions from leaders in the worlds of finance, industry, government and academia. [emphases mine] Shrink’s solutions, including its diverse polymer substrates, nano-devices and biotech research tools, among others, are designed to be ultra-functional and mechanically superior in the solar energy, environmental detection, stem cell and biotechnology markets. The Company’s products are based on a pre-stressed plastic called NanoShrink™, and on a patent-pending manufacturing process called the ShrinkChip Manufacturing Solution™. Shrink’s unique materials and manufacturing solution represents a new paradigm in the rapid design, low-cost fabrication and manufacture of nano-scale devices for numerous significant markets.

I can’t make much of this academic/business hybrid but I am intrigued and will watch its progress with some interest. You can visit the Shrink Nanotechnologies website here.