Tag Archives: nanotechnology

FrogHeart one of 50 Forward Thinking Nanotech Blogs

I got an email yesterday from Carolyn Friedman notifying me that FrogHeart was on a list of 50 Forward Thinking Nanotech Blogs. I was a little mystified when I noted that the list is on a site called Becoming a Computer Technician, but when I looked at the list closely I recognized many of the other blogs and happily discovered a few intriguing ones. They offer descriptions of each blog and organize the list into categories. FrogHeart is no. 35 and is in the ‘fan’ category. I’m both chuffed and re-energized; maybe one day I’ll achieve the professional category (fingers crossed). Thank you to Carolyn Friedman and the other folks at the website.

Nanotechnology and the Council of Canadian Academies assessment report

I started discussing the Council of Canadian Academies and its mid-term assessment report (Review of the Council of Canadian Academies; Report from the External Evaluation Panel 2010) yesterday and will finish today with my thoughts on the assessment of the Council’s nanotechnology report and its impact.

Titled Small is Different: A Science Perspective on the Regulatory Challenges of the Nanoscale (2008), the Council’s report is one of the best I’ve read. I highly recommend it to anyone who wants an introduction to some of the issues (and I was much struck by its omission from the list of suggested nanotechnology readings that Peter Julian [Canadian MP] offered in part 2 of his interview). Interestingly, the Council’s nanotechnology report is Case Study No. 3 in the mid-term expert panel assessment report’s Annex 6 (p. 33 in the print version and p. 37 in the PDF).

Many respondents were concerned that Health Canada has made no response to, or use of, this report. However, Health Canada respondents were highly enthusiastic about the assessment and the ways in which it is being used to inform the department’s many – albeit still entirely internal – regulatory development activities: “We’ve all read it and used it. The fact that we haven’t responded to the outside is actually a reflection of how busy we’ve been responding to the file on the inside!” [emphases mine]

The report has been particularly valuable in providing a framework to bring together Health Canada’s five – very different – regulatory regimes to identify a common approach and priorities. The sponsor believes the report’s findings have been well-incorporated into its draft working definition of nanomaterials, [emphasis mine] its work with Canadian and international standards agencies, its development of a regulatory framework to address shorter- and longer-term needs, and its creation of a research agenda to aid the development of the science needed to underpin the regulation of nanomaterials in Canada.

I think the next time somebody confronts me as to why I haven’t responded externally to some notice (e.g., paid my strata fees), I’ll assure them that I’ve been ‘responding on the inside’. (Sometimes I cannot resist the low-hanging fruit and I just have to take a bite.)

As for the second paragraph, where they claim that Health Canada has incorporated suggestions from the report into its nanomaterials definition: that’s all well and good, but the thinking is changing and Health Canada doesn’t seem to be responding (or even to be aware of the fact). Take a look at the proposed definition in the current draft bill before the US Senate, where, in addition to size, they mention shape, reactivity, and more, as compared to Health Canada’s 1 to 100 nm size definition. (See details in this posting from earlier in the week where I compare the proposed US and Canadian definitions.)

Additionally, I think they need to find ways to measure impact that are quantitative as well as qualitative, and the qualitative approach itself needs to be revised. Quantitative measures could include the number of reports disseminated in print and online, social networking efforts (if any), the number of times reports are mentioned in the media, etc. They may also want to limit the number of case studies in future reports so they can provide more depth. The comment about the ‘internal’ impact could have been described at greater length. How have the five different Health Canada regulatory regimes come together? Has something substantive occurred?

Finally, it’s hard to know if Julian’s failure to mention the Council’s report in his list of nanotechnology readings is a simple failure of memory or a reflection of the Council’s “invisibility”. I’m inclined to believe that it’s the latter.

NANO Magazine’s April 2010 issue country focus: Canada

I’m a little late to the party but the month isn’t over yet, so today I’m going to focus on NANO Magazine‘s April 2010 issue, or more specifically its article about Canada and its nanotechnology scene. The magazine (available both in print and online) has selected Canada for this issue’s country focus. From the editorial in the April 2010 issue (no. 17),

The featured country in this issue is Canada, notable for its well funded facilities and research that is aggressively focused on industrial applications. Although having no unifying national nanotechnology initiative, there are many extremely well-funded organisations with world class facilities that are undertaking important nano-related research. Ten of these centres are highlighted, along with a new network that will research into innovative plastics and manufacturing processes, and added value can be gained in this field – with the economic future benefit for Canada firmly in mind!

It’s always an eye-opening experience to see yourself as others see you. I had no idea Canadian research was “aggressively focused on industrial applications.” My view as a Canadian who can only see it from the inside reveals a scattered landscape with a few pockets of concentrated effort. It’s very difficult to obtain a national perspective as communication from the various pockets is occasional, hard to understand and/or interpret at times, and not easily accessible. Some of these Canadian nanotechnology groups (in government agencies, research facilities, civil society groups, etc.) seem downright secretive.

As for the ‘aggressive focus on industrial applications’ by Canadians, it’s an observation I found interesting and could not have made myself, for two reasons. The first I’ve already noted (the difficulty of obtaining the appropriate perspective from the inside). Secondly, it seems to me that the pursuit of industrial applications is a global obsession not confined to the field of nanotechnology, so I’m not able to establish a basepoint for comparison, which made the comment quite a revelation. Still, it should be noted that NANO Magazine itself seems to have a very strong bias towards commercialization and business interests.

The editorial’s comment about having “no unifying national nanotechnology initiative” I can heartily second, although the phrase brings the US National Nanotechnology Initiative strongly to mind; I think a plan (any kind of plan) would do just as well.

The article, written by Fraser Shand and titled Innovation finds new energy in Western Canada, provides a bit of word play that only a Canadian, or someone who knows that the province of Alberta has substantive oil reserves (albeit in the sands), would be able to appreciate. Kudos to whoever came up with the title. Very well done!

I have to admit to being a bit puzzled here, as I’m not sure if Shand’s article is the sole article about the Canadian nanotechnology scene (it profiles only the province of Alberta) or if there are other articles profiling the pockets of nanotechnology research found largely in Quebec, Ontario, and British Columbia, with smaller pockets in other provinces. I apologize for giving short shrift to six provinces but, as I’ve noted, information is difficult to come by and most of what I can obtain is from the four provinces mentioned.

From the article,

Steeped in a pioneering spirit and enriched by ingenuity, one of the most exciting, modern day outposts on the nanotechnology frontier is located on the prairies of Western Canada. The province of Alberta is home to some of Canada’s most significant nanotechnology assets and has quickly become a world-destination for nanotechnology research, product development and commercialization.

While Alberta is rooted in the traditional resource sectors of energy, agriculture and forestry, it is dedicated to innovation. The Government of Alberta launched its nanotechnology strategy in 2007, committing $130 million to growth and development over five years. It also created a dedicated team.

Shand goes on to note Canada’s National Institute for Nanotechnology (NINT), located in Edmonton, Alberta’s capital city, and its role in attracting world-class researchers (see News Flash below). Other than the brief mention of a federal institution, the focus remains unrelentingly on Alberta, and this is surprising since the title misled me into believing that the article would concern itself with Western Canada, which arguably includes the prairie provinces (Manitoba and Saskatchewan) and British Columbia.

Meanwhile, the editorial led me to believe that I would find a national perspective with mention of 10 research centres somewhere in the April 2010 issue. If they are hiding part of the issue, I wish they’d note that somewhere easily visible (front page?) on their website and clarify the situation.

If this is the magazine’s full profile of the Canadian nanotechnology scene, they’ve either come to the conclusion that the only worthwhile work is being done in Alberta (I’m making an inference) or they found the process of gathering information about the other nanotechnology research pockets so onerous that they simply ignored them in favour of pulling a coherent article together.

I have been viewing the site on a regular basis since I heard about the April 2010 issue and this is the only time I’ve seen an article about Canada made available. They seem to have a policy of rotating the articles they make available for free access.

One other thing: a Nanotechnology Asset Map of Alberta is going to be fully accessible sometime in May 2010. I gather some of the folks from the now defunct Nanotech BC organization advised the folks at nanoAlberta on developing the tool after the successful BC Nanotechnology Asset Map was printed in 2008 (?). I’m pleased to see the Alberta map is online, which will make updating a much easier task, and it gives a very handy visual representation that is difficult to achieve in print. You can see Alberta’s beta version at nanoAlberta. Scroll down and look to the left of the screen in the sidebar for a link to the asset map.

I have to give props to the people in the province of Alberta who have supported nanotechnology research and commercialization efforts tirelessly. They enticed the federal government into building NINT in Edmonton by offering to pay a substantive percentage of the costs and have since created several centres for commercialization and additional research as noted in Shand’s article. Bravo!

News Flash: I just (in the last five minutes, i.e., 11:05 am PT) received this notice about the University of Alberta and nanotechnology. From the Eureka Alert notice,

A University of Alberta-led research team has taken a major step forward in understanding how T cells are activated in the course of an immune response by combining nanotechnology and cell biology. T cells are the all important trigger that starts the human body’s response to infection.

Christopher Cairo and his team are studying how one critical trigger for the body’s T cell response is switched on. Cairo looked at the molecule known as CD45 and its function in T cells. The activation of CD45 is part of a chain of events that allows the body to produce T cells that target an infection and, just as importantly, shut down overactive T cells that could lead to damage.

Cairo and crew are working on a national/international team that includes: “mathematician Dan Coombs (University of British Columbia), biochemist Jon Morrow (Yale University Medical School) and biophysicist David Golan (Harvard Medical School).” Their paper is being published in the April issue of the Journal of Biological Chemistry.

Now back to my regular programming: I should also mention Nano Québec, which I believe was the first provincial organization founded in Canada, circa 2005, to support nanotechnology research and commercialization efforts. French language site / English language site

NaNO Ontario has recently organized itself as the Nanotechnology Network of Ontario.

Unfortunately, Nanotech BC no longer exists.

If you know of any other provincial nanotechnology organizations, please do let me know.

Comparing nanomaterials definitions: US and Canada

In light of yesterday’s (April 26, 2010) posting about Health Canada and their nanomaterials definition, Andrew Maynard’s April 23, 2010 post at 2020 Science (blog) is quite timely. Andrew has some details about new nanomaterials definitions being proposed in both the US Senate and House of Representatives so that their Toxic Substances Control Act can be amended. From Andrew’s posting, an excerpt about the proposed House bill,

The House draft document is a little more explicit. It recommends amending section 3(2) of the original act with:

“(C) For purposes of this Act, such term may include more than 1 form of a substance with a particular molecular identity as described in sub-paragraph (A) if the Administrator has determined such forms to be different substances, based on variations in the substance characteristics. New forms of existing chemical substances so determined shall be considered new chemical substances.” (page 6)

with the clarification that

“The term ‘substance characteristic’ means, with respect to a particular chemical substance, the physical and chemical characteristics that may vary for such substance, and whose variation may bear on the toxicological properties of the chemical substance, including—

(A) chemical structure and composition

(B) size or size distribution

(C) shape

(D) surface structure

(E) reactivity; and

(F) other characteristics and properties that may bear on toxicological properties” (page 11)

Both the Senate bill and the House discussion document provide EPA with the authority to regulate any substance that presents a new or previously unrecognized risk to human health as a new substance. This is critical to ensuring the safety of engineered nanomaterials, where risk may depend on more than just the chemistry of the substance. But it also creates a framework for regulating any new material that presents a potential risk – whether it is a new chemical, a relatively simple nanomaterial, a more complex nanomaterial – possibly one that changes behavior in response to its environment, or a novel material that has yet to be invented. In other words, these provisions effectively future-proof the new regulation.

I prefer the definition in the draft House of Representatives bill to Health Canada’s because of its specificity and its future-oriented approach. Contrast their specificity with this from the Interim Policy Statement on Health Canada’s Working Definition for Nanomaterials:

Health Canada considers any manufactured product, material, substance, ingredient, device, system or structure to be nanomaterial if:

1. It is at or within the nanoscale in at least one spatial dimension, or;

2. It is smaller or larger than the nanoscale in all spatial dimensions and exhibits one or more nanoscale phenomena.

For the purposes of this definition:

* The term “nanoscale” means 1 to 100 nanometres, inclusive;

* The term “nanoscale phenomena” means properties of the product, material, substance, ingredient, device, system or structure which are attributable to its size [emphasis mine] and distinguishable from the chemical or physical properties of individual atoms, individual molecules and bulk material; and,

* The term “manufactured” includes engineering processes and control of matter and processes at the nanoscale.
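Read literally, that two-pronged working definition can be sketched as a small Python function. This is purely my own toy illustration of the logic (the function and parameter names are invented, not anything Health Canada provides):

```python
def is_nanomaterial(dimensions_nm, exhibits_nanoscale_phenomena=False):
    """Sketch of Health Canada's two-pronged working definition.

    dimensions_nm: the material's spatial dimensions, in nanometres.
    exhibits_nanoscale_phenomena: True if the material shows properties
    attributable to its size and distinct from those of its individual
    atoms, individual molecules, or bulk form.
    """
    NANOSCALE_MIN, NANOSCALE_MAX = 1, 100  # "nanoscale" is 1-100 nm, inclusive

    # Criterion 1: at or within the nanoscale in at least one spatial dimension
    if any(NANOSCALE_MIN <= d <= NANOSCALE_MAX for d in dimensions_nm):
        return True

    # Criterion 2: outside the nanoscale in all dimensions,
    # but exhibiting one or more nanoscale phenomena
    return exhibits_nanoscale_phenomena
```

Note how the second criterion sweeps in materials of any size at all, as long as someone judges them to exhibit a “nanoscale phenomenon” — which is where the vagueness I complain about below comes in.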

You’ll notice the House of Representatives’ draft bill offers five specific elements in its description (chemical structure and composition, size or size distribution [emphasis mine], shape, surface structure, and reactivity) plus a catch-all for other characteristics and properties that may bear on toxicological properties. So in the US they include elements that have been identified as possibly being a problem and leave the door open for future discovery.

The proposed legislation has another feature, Andrew notes that,

Both the Senate bill and the House discussion document provide EPA with the authority [emphasis mine] to regulate any substance that presents a new or previously unrecognized risk to human health as a new substance. This is critical to ensuring the safety of engineered nanomaterials, where risk may depend on more than just the chemistry of the substance. But it also creates a framework for regulating any new material that presents a potential risk – whether it is a new chemical, a relatively simple nanomaterial, a more complex nanomaterial – possibly one that changes behavior in response to its environment, or a novel material that has yet to be invented. In other words, these provisions effectively future-proof the new regulation.

As far as I can recall, Peter Julian’s (MP – NDP) tabled draft bill for nanotechnology regulation in Canada does not offer this kind of ‘future-proofing’, although it could be added if the bill is ever brought forward for debate in the House of Commons. Given the paucity of public and political discussion on nanotechnology (and science in general) in Canada, I doubt any politician could offer those kinds of amendments to Julian’s proposed bill.

As for Canada’s proposed nanomaterials reporting plan/inventory/scheme, the vagueness of Health Canada’s proposed definition makes compliance difficult. Let me illustrate what I mean, and explain why I highlighted ‘size distribution’ in the House of Representatives draft bill, by first discussing Michael Berger’s article on Nanowerk about environment, health and safety (EHS) research into the toxicological properties of nanomaterials. From Berger’s article,

“What we found in our work is that nanomaterials purchased from commercial sources may not be as well characterized as indicated by the manufacturer,” Vicki H. Grassian, a professor in the Department of Chemistry at the University of Iowa, tells Nanowerk. “For example, it might be stated that a certain nanoparticle is being sold as 30 nm in diameter and, although ’30 nm’ might be close to the average diameter, there is usually a range of particle sizes that can extend from as small as 5 nm to as large as 300 nm. [emphases mine]”

That’s size distribution, and it reveals two problems with a reporting plan/inventory/scheme that uses a definition setting size within a fixed range. (Julian’s bill has the same problem, although his range is 1 to 1000 nm.) First, what happens if you have something that’s 1001 nm? This inflexible and unswerving focus on size will frustrate the intent of both the reporting plan and Julian’s proposed legislation. Second, how can a business supply the information being requested when manufacturers offer such a wide distribution of sizes in products where a uniform size is claimed? Are businesses going to be asked to measure the nanomaterials themselves? Two or three years or more after they received the products?
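To see concretely why a hard size band is awkward, here’s a small hypothetical Python helper (the function name and the example batch are mine, not drawn from any regulation or from Grassian’s data) that classifies a batch of measured particle diameters against the 1–100 nm nanoscale:

```python
def size_band_report(measured_sizes_nm, lower=1, upper=100):
    """Summarize how a batch of measured particle diameters sits relative
    to a fixed regulatory size band (defaults to the 1-100 nm nanoscale)."""
    inside = [s for s in measured_sizes_nm if lower <= s <= upper]
    return {
        "fraction_inside": len(inside) / len(measured_sizes_nm),
        "min_nm": min(measured_sizes_nm),
        "max_nm": max(measured_sizes_nm),
        # True when the batch has particles both inside and outside the band,
        # i.e. a single yes/no size classification doesn't fit the batch
        "straddles_band": 0 < len(inside) < len(measured_sizes_nm),
    }

# A batch sold as "30 nm" whose real diameters run from 5 nm to 300 nm,
# along the lines of the Grassian example:
batch = [5, 12, 25, 28, 30, 31, 35, 60, 150, 300]
report = size_band_report(batch)
```

For this made-up batch, eight of the ten particles sit inside the band and two fall outside it, so `straddles_band` comes back true: the batch as a whole is neither cleanly ‘nano’ nor cleanly not.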

Then Berger’s article moves onto another issue,

Reporting their findings in a recent paper in Environmental Toxicology and Chemistry (“Commercially manufactured engineered nanomaterials for environmental and health studies: Important insights provided by independent characterization”), among other problems Grassian and first author Heaweon Park also discuss the issue of batch-to-batch variability during the production of nanoparticles and that some nanomaterials which were being sold as having spherical morphology could contain mixed morphologies such as spheres and rods [emphases mine].

That’s right: you may not be getting the same shape of nanoparticle in your batch. This variability should not pose a problem for the proposed reporting plan/inventory/scheme, since shape is not mentioned in Health Canada’s definition, but it could bear on toxicology issues, which is why a plan/inventory/scheme is being proposed in the first place.

Interestingly, the only ‘public consultation’ meeting that Health Canada/Environment Canada has held appears to have taken place in 2007 with none since and none planned for the future (see my April 26, 2010 posting).

Apparently, 3000 stakeholders have been contacted and asked for responses. I do wonder if an organization like Nano Québec has been contacted and counted not as a single stakeholder but as representing its membership (e.g. 500 members = 500 stakeholders?), whatever the numbers may be. There is, of course, a specific Health Canada website for this interim definition where anyone can offer comments. It takes time to write a submission, and I’m not sure how much time anyone has to devote to it, which is why meetings can be very effective for information gathering, especially in a field like nanotechnology where the thinking changes so quickly. 2007 seems like a long time ago.

Finally, Dexter Johnson on his Nanoclast blog is offering more perspective on the recent Andrew Schneider/National Nanotechnology Initiative dust up. Yes, he gave me a shout out (and I’m chuffed), and he puts the issues together to provide a different perspective on journalistic reporting of environment, health and safety issues as they relate to nanotechnology, along with some of the issues associated with toxicology research.

Dr. Wei Lu, the memristor, and the cat brain; military surveillance takes a Star Trek: Next Generation turn with a medieval twist; archiving tweets; patents and innovation

Last week I featured the ‘memristor’ story mentioning that much of the latest excitement was set off by Dr. Wei Lu’s work at the University of Michigan (U-M). While HP Labs was the center for much of the interest, it was Dr. Lu’s work (published in Nano Letters which is available behind a paywall) that provoked the renewed interest. Thanks to this news item on Nanowerk, I’ve now found more details about Dr. Lu and his team’s work,

U-M computer engineer Wei Lu has taken a step toward developing this revolutionary type of machine that could be capable of learning and recognizing, as well as making more complex decisions and performing more tasks simultaneously than conventional computers can.

Lu previously built a “memristor,” a device that replaces a traditional transistor and acts like a biological synapse, remembering past voltages it was subjected to. Now, he has demonstrated that this memristor can connect conventional circuits and support a process that is the basis for memory and learning in biological systems.

Here’s where it gets interesting,

In a conventional computer, logic and memory functions are located at different parts of the circuit and each computing unit is only connected to a handful of neighbors in the circuit. As a result, conventional computers execute code in a linear fashion, line by line, Lu said. They are excellent at performing relatively simple tasks with limited variables.

But a brain can perform many operations simultaneously, or in parallel. That’s how we can recognize a face in an instant, but even a supercomputer would take much, much longer and consume much more energy in doing so.

So far, Lu has connected two electronic circuits with one memristor. He has demonstrated that this system is capable of a memory and learning process called “spike timing dependent plasticity.” This type of plasticity refers to the ability of connections between neurons to become stronger based on when they are stimulated in relation to each other. Spike timing dependent plasticity is thought to be the basis for memory and learning in mammalian brains.

“We show that we can use voltage timing to gradually increase or decrease the electrical conductance in this memristor-based system. In our brains, similar changes in synapse conductance essentially give rise to long term memory,” Lu said.
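The “voltage timing” idea Lu describes maps onto the textbook spike timing dependent plasticity rule: conductance goes up when the pre-synaptic spike precedes the post-synaptic one, down when the order is reversed, with the size of the change falling off as the spikes get further apart. Here’s a toy Python sketch of that rule — my own illustration with invented parameter values, not a model of Lu’s actual device:

```python
import math

def stdp_update(conductance, dt_ms, a_plus=0.02, a_minus=0.02, tau_ms=20.0):
    """Toy spike-timing-dependent plasticity rule.

    dt_ms: post-synaptic spike time minus pre-synaptic spike time.
    Pre-before-post (dt > 0) strengthens the connection; post-before-pre
    (dt < 0) weakens it; the change decays exponentially with |dt|.
    """
    if dt_ms > 0:
        conductance += a_plus * math.exp(-dt_ms / tau_ms)
    elif dt_ms < 0:
        conductance -= a_minus * math.exp(dt_ms / tau_ms)
    return max(conductance, 0.0)  # conductance cannot go negative
```

The gradual, timing-dependent increase or decrease in conductance is what lets the memristor stand in for a synapse whose strength encodes memory.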

Do visit Nanowerk for the full explanation provided by Dr. Lu, if you’re so inclined. In one of my earlier posts about this I speculated that the work was being funded by DARPA (Defense Advanced Research Projects Agency), which is part of the US Dept. of Defense. Happily, I found this at the end of today’s news item,

Lu said an electronic analog of a cat brain would be able to think intelligently at the cat level. For example, if the task were to find the shortest route from the front door to the sofa in a house full of furniture, and the computer knows only the shape of the sofa, a conventional machine could accomplish this. But if you moved the sofa, it wouldn’t realize the adjustment and find a new path. That’s what engineers hope the cat brain computer would be capable of. The project’s major funder, the Defense Advanced Research Projects Agency [emphasis mine], isn’t interested in sofas. But this illustrates the type of learning the machine is being designed for.

I previously mentioned the story here on April 8, 2010 and provided links that led to other aspects of the story as I and others have covered it.

Military surveillance

Named after Argos Panoptes, the sentry with 100 eyes in Greek mythology, the ‘Panoptes’ platform has two new applications being announced by researchers in a news item on Azonano,

Researchers are expanding new miniature camera technology for military and security uses so soldiers can track combatants in dark caves or urban alleys, and security officials can unobtrusively identify a subject from an iris scan.

The two new surveillance applications both build on “Panoptes,” a platform technology developed under a project led by Marc Christensen at Southern Methodist University in Dallas. The Department of Defense is funding development of the technology’s first two extension applications with a $1.6 million grant.

The following image, which accompanies the article at the Southern Methodist University (SMU) website, features an individual who suggests a combination of the Geordi character in Star Trek: The Next Generation, with his ‘sensing visor’, and a medieval knight in full armour with his helmet’s visor down.

Soldier wearing helmet with hi-res "eyes" courtesy of Southern Methodist University Research

From the article on the SMU site,

“The Panoptes technology is sufficiently mature that it can now leave our lab, and we’re finding lots of applications for it,” said ‘Marc’ Christensen [project leader], an expert in computational imaging and optical interconnections. “This new money will allow us to explore Panoptes’ use for non-cooperative iris recognition systems for Homeland Security and other defense applications. And it will allow us to enhance the camera system to make it capable of active illumination so it can travel into dark places — like caves and urban areas.”

Well, there’s nothing like some non-cooperative iris scanning. In fact, you won’t know that the scanning is taking place if they’re successful with their newest research, which suggests the panopticon, Jeremy Bentham’s 18th century concept for a prison where surveillance takes place without the prisoners being aware of it (Wikipedia essay here).

Archiving tweets

The US Library of Congress has just announced that it will be saving (archiving) all the ‘tweets’ that have been sent since Twitter launched four years ago. From the news item on physorg.com,

“Library to acquire ENTIRE Twitter archive — ALL public tweets, ever, since March 2006!” the Washington-based library, the world’s largest, announced in a message on its Twitter account at Twitter.com/librarycongress.

“That’s a LOT of tweets, by the way: Twitter processes more than 50 million tweets every day, with the total numbering in the billions,” Matt Raymond of the Library of Congress added in a blog post.

Raymond highlighted the “scholarly and research implications” of acquiring the micro-blogging service’s archive.

He said the messages being archived include the first-ever “tweet,” sent by Twitter co-founder Jack Dorsey, and the one that ran on Barack Obama’s Twitter feed when he was elected president.

Meanwhile, Google made an announcement about another Twitter-related development: Google Replay, their real-time search function, which will give you the specific tweets made on a particular date. Dave Bruggeman at the Pasco Phronesis blog offers more information and a link to the beta version of Google Replay.

Patents and innovation

I find it interesting that countries and international organizations use the number of patents filed as one indicator for scientific progress while studies indicate that the opposite may be true. This news item on Science Daily strongly suggests that there are some significant problems with the current system. From the news item,

As single-gene tests give way to multi-gene or even whole-genome scans, exclusive patent rights could slow promising new technologies and business models for genetic testing even further, the Duke [Institute for Genome Sciences and Policy] researchers say.

The findings emerge from a series of case studies that examined genetic risk testing for 10 clinical conditions, including breast and colon cancer, cystic fibrosis and hearing loss. …

In seven of the conditions, exclusive licenses have been a source of controversy. But in no case was the holder of exclusive patent rights the first to market with a test.

“That finding suggests that while exclusive licenses have proven valuable for developing drugs and biologics that might not otherwise be developed, in the world of gene testing they are mainly a tool for clearing the field of competition [emphasis mine], and that is a sure-fire way to irritate your customers, both doctors and patients,” said Robert Cook-Deegan, director of the IGSP Center for Genome Ethics, Law & Policy.

This isn’t an argument against the entire patenting system but rather the use of exclusive licenses.

Math, science and the movies; research on the African continent; diabetes and mice in Canada; NANO Magazine and Canada; poetry on Bowen Island, April 17, 2010

About 10 years ago, I got interested in how the arts and sciences can inform each other when I was trying to organize an art/science event which never did get off the ground (although I still harbour hopes for it one day).  It all came back to me when I read Dave Bruggeman’s (Pasco Phronesis blog) recent post about a new Creative Science Studio opening at the School of Cinematic Arts at the University of Southern California (USC). From Dave’s post,

It [Creative Science Studio] will start this fall at USC, where its School of Cinematic Arts makes heavy use of its proximity to Hollywood, and builds on its history of other projects that use science, technology and entertainment in other areas of research.

The studio will not only help studios improve the depiction of science in the products of their students, faculty and alumni (much like the Science and Entertainment Exchange), but help scientists create entertaining outreach products. In addition, science and engineering topics will be incorporated into the School’s curriculum and be supported in faculty research.

This announcement reminds me a little bit of an IBM/USC initiative in 2008 (from the news item on Nanowerk),

For decades Hollywood has looked to science for inspiration, now IBM researchers are looking to Hollywood for new ideas too.

The entertainment industry has portrayed possible future worlds through science fiction movies – many created by USC’s famous alumni – and IBM wants to tap into that creativity.

At a kickoff event at the USC School of Cinematic Arts, five of IBM’s top scientists met with students and alumni of the school, along with other invitees from the entertainment industry, to “Imagine the World in 2050.” The event is the first phase of an expected collaboration between IBM and USC to explore how combining creative vision and insight with science and technology trends might fuel novel solutions to the most pressing problems and opportunities of our time.

It’s interesting to note that the inspiration is two-way if the two announcements are taken together. The creative people can have access to the latest science and technology work for their pieces and scientists can explore how an idea or solution to a problem that exists in a story might be made real.

I’ve also noted that the first announcement suggests the Creative Science Studio will be able to “help scientists create entertaining outreach products.” My only caveat is that scientists too often believe that science communication means they do all the communicating while we members of the public receive their knowledge enthusiastically and uncritically.

Moving on to the math that I mentioned in the headline, there’s an announcement of a new paper that discusses the use of mathematics in cinematic special effects. (I believe the word cinematic is starting to include games and other media in addition to movies.) From the news item on physorg.com,

The use of mathematics in cinematic special effects is described in the article “Crashing Waves, Awesome Explosions, Turbulent Smoke, and Beyond: Applied Mathematics and Scientific Computing in the Visual Effects Industry”, which will appear in the May 2010 issue of the NOTICES OF THE AMS [American Mathematical Society]. The article was written by three University of California, Los Angeles, mathematicians who have made significant contributions to research in this area: Aleka McAdams, Stanley Osher, and Joseph Teran.

Mathematics provides the language for expressing physical phenomena and their interactions, often in the form of partial differential equations. These equations are usually too complex to be solved exactly, so mathematicians have developed numerical methods and algorithms that can be implemented on computers to obtain approximate solutions. The kinds of approximations needed to, for example, simulate a firestorm, were in the past computationally intractable. With faster computing equipment and more-efficient architectures, such simulations are feasible today—and they drive many of the most spectacular feats in the visual effects industry.
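To make the quoted description a little more concrete, here is a tiny illustrative sketch (mine, not from the article) of the kind of numerical method the authors mean: an explicit finite-difference approximation of the one-dimensional heat equation, u_t = α·u_xx. Production fluid and smoke solvers work on the same principle, just in three dimensions and at vastly larger scale.

```python
# Illustrative only: approximate the 1-D heat equation u_t = alpha * u_xx
# on a rod with fixed (zero) endpoints, using an explicit finite-difference
# scheme. Real visual-effects solvers apply the same idea in 3-D.

def heat_step(u, alpha, dx, dt):
    """Advance the solution one time step; endpoints are held at zero."""
    new = u[:]
    for i in range(1, len(u) - 1):
        # Central difference approximates the second spatial derivative.
        new[i] = u[i] + alpha * dt / dx**2 * (u[i + 1] - 2 * u[i] + u[i - 1])
    return new

def simulate(n=21, steps=200, alpha=1.0, dx=0.05, dt=0.001):
    # Start with a single spike of "heat" in the middle of the rod.
    u = [0.0] * n
    u[n // 2] = 1.0
    for _ in range(steps):
        u = heat_step(u, alpha, dx, dt)
    return u

if __name__ == "__main__":
    u = simulate()
    print(max(u))  # the spike diffuses outward, so the peak shrinks below 1.0
```

Note the stability condition the quote hints at when it calls such problems “computationally intractable”: with an explicit scheme, α·dt/dx² must stay at or below 0.5 (here it is 0.4), which forces tiny time steps and is one reason effects-scale simulations only became feasible with faster hardware and better algorithms.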

This news item too brought back memories. There was a Canadian animated film, Ryan, which both won an Academy Award and involved significant collaboration between a mathematician and an animator. From the 2005 MITACS (Mathematics of Information Technology and Complex Systems) newsletter, Student Notes:

Karan Singh is an Associate Professor at the University of Toronto, where he co-directs the graphics and HCI lab, DGP. His research interests are in artist driven interactive graphics encompassing geometric modeling, character animation and non-photorealistic rendering. As a researcher at Alias (1995-1999), he architected facial and character animation tools for Maya (Technical Oscar 2003). He was involved with conceptual design and reverse engineering software at Paraform (Academy award for technical achievement 2001) and currently as Chief Scientist for Geometry Systems Inc. He has worked on numerous film and animation projects and most recently was the R+D Director for the Oscar winning animation Ryan (2005).

Someone at Student Notes (SN) goes on to interview Dr. Singh (here’s an excerpt),

SN: Some materials discussing the film Ryan mention the term “psychorealism”. What does this term mean? What problems does the transition from realism to psychorealism pose for the animator, or the computer graphics designer?

KS: Psychorealism is a term coined by Chris [Landreth, film animator] to refer to the glorious complexity of the human psyche depicted through the visual medium of art and animation. The transition is not a problem, psychorealism is stylistic, just a facet to the look and feel of an animation. The challenge lies in the choice and execution of the metaphorical imagery that the animator makes.

Both the article and Dr. Singh’s page are well worth checking out, if the links between mathematics and visual imagery interest you.

Research on the African continent

Last week I received a copy of the Thomson Reuters Global Research Report Africa. My hat’s off to the authors, Jonathan Adams, Christopher King, and Daniel Hook, for acknowledging that Africa is a continent with many countries, many languages, and many cultures. From the report (you may need to register at the site to gain access to it, but the only contact I ever get is a copy of their newsletter alerting me to a new report and other incidental info), p. 3,

More than 50 nations, hundreds of languages, and a welter of ethnic and cultural diversity. A continent possessed of abundant natural resources but also perennially wracked by a now-familiar litany of post-colonial woes: poverty, want, political instability and corruption, disease, and armed conflicts frequently driven by ethnic and tribal divisions but supplied by more mature economies. OECD’s recent African Economic Outlook sets out in stark detail the challenge, and the extent to which current global economic problems may make this worse …

While they cover the usual challenges, the authors go on to add this contrasting information.

Yet the continent is also home to a rich history of higher education and knowledge creation. The University of Al-Karaouine, at Fez in Morocco, was founded in CE 859 as a madrasa and is identified by many as the oldest degree-awarding institution in the world. It was followed in 970 by Al-Azhar University in Egypt. While it was some centuries before the curriculum expanded from religious instruction into the sciences, this makes a very early marker for learning. Today, the Association of African Universities lists 225 member institutions in 44 countries and, as Thomson Reuters data demonstrate, African research has a network of ties to the international community.

A problem for Africa as a whole, as it has been for China and India, is the hemorrhage of talent. Many of its best students take their higher degrees at universities in Europe, Asia and North America. Too few return.

I can’t speak to the details in the report, which appears to be a consolidation of information available in various reports from international organizations. Personally, I find these consolidations very helpful, as I would never have the time to track all of this down myself. The authors have also created a graphic illustrating research relationships. I had to read the analysis to understand the graphic, but I found the idea itself quite engaging, and I can see (pun!) that as one becomes more visually literate with this type of graphic, it could be a very useful tool for grasping complex information quickly.

Diabetes and mice

Last week, I missed this notice about a Canadian nanotechnology effort at the University of Calgary. From the news item on Nanowerk,

Using a sophisticated nanotechnology-based “vaccine,” researchers were able to successfully cure mice with type 1 diabetes and slow the onset of the disease in mice at risk for the disease. The study, co-funded by the Juvenile Diabetes Research Foundation (JDRF), provides new and important insights into understanding how to stop the immune attack that causes type 1 diabetes, and could even have implications for other autoimmune diseases.

The study, conducted at the University of Calgary in Alberta, Canada, was published today [April 8, 2010?] in the online edition of the scientific journal Immunity.

NANO Magazine

In more recent news, NANO Magazine’s new issue (no. 17) features a country focus on Canada. From the news item on Nanowerk,

In a special bumper issue of NANO Magazine we focus on two topics – textiles and nanomedicine. We feature articles about textiles from Nicholas Kotov and Kay Obendorf, articles about nanomedicine from the London Centre for Nanotechnology and Hans Hofstraat of Philips Healthcare, and an interview with Peter Singer. NANO Magazine Issue 17 is essential reading, www.nanomagazine.co.uk.

The featured country in this issue is Canada [emphasis mine], notable for its well funded facilities and research that is aggressively focused on industrial applications. Although having no unifying national nanotechnology initiative, there are many extremely well-funded organisations with world class facilities that are undertaking important nano-related research.

I hope I get a chance to read this issue.

Poetry on Bowen Island

Heather Haley, a local Vancouver, BC area, poet is hosting a special event this coming Saturday at her home on Bowen Island. From the news release,

VISITING POETS Salon & Reading

Josef & Heather’s Place
Bowen Island, BC
7:30 PM
Saturday, April 17, 2010

PENN KEMP, inimitable sound poet from London, Ontario

The illustrious CATHERINE OWEN from Vancouver, BC

To RSVP and get directions please email hshaley@emspace.com

Free Admission
Snacks & beverages-BYOB

Please come on over to our place on the sunny south slope to welcome these fabulous poets, hear their marvelous work, *see* their voices right here on Bowen Island!

London, ON performer and playwright PENN KEMP has published twenty-five books of poetry and drama, had six plays and ten CDs produced as well as Canada’s first poetry CD-ROM and several videopoems.  She performs in festivals around the world, most recently in Britain, Brazil and India. Penn is the Canada Council Writer-in-Residence at UWO for 2009-10.  She hosts an eclectic literary show, Gathering Voices, on Radio Western, CHRWradio.com/talk/gatheringvoices.  Her own project for the year is a DVD devoted to Ecco Poetry, Luminous Entrance: a Sound Opera for Climate Change Action, which has just been released.
CATHERINE OWEN is a Vancouver writer who will be reading from her latest book Frenzy (Anvil Press 09) which she has just toured across the entirety of Canada. Her work has appeared in international magazines, seen translation into three languages and been nominated for honours such as the BC Book Prize and the CBC Award. She plays bass and sings in a couple of metal bands and runs her own tutoring and editing business.

I have seen one of Penn Kemp’s video poems. It was at least five years ago and it still resonates with me. Guess what? I highly recommend going if you can. If you’re curious about Heather and her work, go here.