Tag Archives: Boston University

From monitoring glucose in kidneys to climate change in trees

That headline is almost poetic, but I admit it’s a bit of a stretch, rhyme-wise (kidneys/trees). In any event, a Feb. 6, 2015 news item on Azonano describes research into monitoring the effects of climate change on trees,

Serving as a testament to the far-reaching impact of Governor Andrew M. Cuomo’s commitment to maintaining New York State’s global leadership in nanotechnology innovation, SUNY Polytechnic Institute’s Colleges of Nanoscale Science and Engineering (SUNY Poly CNSE) today announced the National Science Foundation (NSF) has awarded $837,000 to support development of a first of its kind nanoscale sensor to monitor the effects of climate change on trees.

A Feb. 5, 2015 SUNY Poly CNSE news release, which originated the news item, provides more details including information about the sensor’s link to measuring glucose in kidneys,

The NSF grant was generated through the Instrument Development for Biological Research (IDBR) program, which provides funds to develop new classes of devices for bio-related research. The NANAPHID, a novel aphid-like nanosensor, will provide real-time measurements of carbohydrates in live plant tissue. Carbohydrate levels in trees are directly connected to plant productivity, such as maple sap production and survival. The NANAPHID will enable researchers to determine the effects of a variety of environmental changes including temperature, precipitation, carbon dioxide, soil acidity, pests and pathogens. The nanosensor can also provide real-time monitoring of sugar concentration levels, which are of significant importance in maple syrup production and apple and grape farming.

“The technology for the NANAPHID is rooted in a nanoscale sensor SUNY Poly CNSE developed to monitor glucose levels in human kidneys being prepared for transplant. Our team determined that certain adjustments would enable the sensor to provide similar monitoring for plants, and provide a critical insight to the effects of climate change on the environment,” said Dr. James Castracane, professor and head of the Nanobioscience Constellation at SUNY Polytechnic Institute. “This is a perfect example of the cycle of innovation made possible through the ongoing nanotechnology research and development at SUNY Poly CNSE’s NanoTech Complex.”

“This new sensor will be used in several field experiments on measuring sensitivity of boreal forest to climate warming. Questions about forest response to rising air and soil temperatures are extremely important for forecasting future atmospheric carbon dioxide levels, climate change and forest health,” said Dr. Andrei Lapenas, principal investigator and associate professor of climatology at the University at Albany. “At the same time, we already see some potential commercial application for NANAPHID-type sensors in agriculture, food industry and other fields. Our collaboration with SUNY Poly CNSE has been extremely productive and I look forward to continuing our work together.”

The NANAPHID project began in 2014 with a $135,000 SUNY Research Foundation Network of Excellence grant. SUNY Poly CNSE will receive $400,000 of the NSF award for the manufacturing aspects of the sensor array development and testing. The remaining funds will be shared between Dr. Lapenas and researchers Dr. Ruth Yanai (ESF), Dr. Thomas Horton (ESF), and Dr. Pamela Templer (Boston University) for data collection and analysis.

“With current technology, analyzing carbohydrates in plant tissues requires hours in the lab or more than $100 a sample if you want to send them out. And you can’t sample the same tissue twice, the sample is destroyed in the analysis,” said Dr. Yanai. “The implantable device will be cheap to produce and will provide continuous monitoring of sugar concentrations, which is orders of magnitude better in both cost and in the information provided. Research questions we never dreamed of asking before will become possible, like tracking changes in photosynthate over the course of a day or along the stem of a plant, because it’s a nondestructive assay.”

“I see incredible promise for the NANAPHID device in plant ecology. We can use the sensors at the root tip where plants give sugars to symbiotic fungi in exchange for soil nutrients,” said Dr. Horton. “Some fungi are believed to be significant carbon sinks because they produce extensive fungal networks in soils and we can use the sensors to compare the allocation of photosynthate to roots colonized by these fungi versus the allocation to less carbon demanding fungi. Further, the vast majority of these symbiotic fungi cannot be cultured in lab. These sensors will provide valuable insights into plant-microbe interactions under field conditions.”

“The creation of this new sensor will make understanding the effects of a variety of environmental changes, including climate change, on the health and productivity of forests much easier to measure,” said Dr. Templer. “For the first time, we will be able to measure concentrations of carbohydrates in living trees continuously and in real-time, expanding our ability to examine controls on photosynthesis, sap flow, carbon sequestration and other processes in forest ecosystems.”

Fascinating, eh? I wonder who made the connection between human kidneys and plants, and how that person made it.

Ferroelectric switching in the lung, heart, and arteries

A June 23, 2014 University of Washington (state) news release (also on EurekAlert) describes how the human body (and other biological tissue) is capable of generating ferroelectricity,

University of Washington researchers have shown that a favorable electrical property is present in a type of protein found in organs that repeatedly stretch and retract, such as the lungs, heart and arteries. These findings are the first that clearly track this phenomenon, called ferroelectricity, occurring at the molecular level in biological tissues.

The news release gives a brief description of ferroelectricity and describes the research team’s latest work with biological tissues,

Ferroelectricity is a response to an electric field in which a molecule switches from having a positive to a negative charge. This switching process in synthetic materials serves as a way to power computer memory chips, display screens and sensors. This property only recently has been discovered in animal tissues and researchers think it may help build and support healthy connective tissues in mammals.

A research team led by Li first discovered ferroelectric properties in biological tissues in 2012, then in 2013 found that glucose can suppress this property in the body’s connective tissues, wherever the protein elastin is present. But while ferroelectricity is a proven entity in synthetic materials and has long been thought to be important in biological functions, its actual existence in biology hasn’t been firmly established.

This study proves that ferroelectric switching happens in the biological protein elastin. When the researchers looked at the base structures within the protein, they saw similar behavior to the unit cells of solid-state materials, where ferroelectricity is well understood.

“When we looked at the smallest structural unit of the biological tissue and how it was organized into a larger protein fiber, we then were able to see similarities to the classic ferroelectric model found in solids,” Li said.

The researchers wanted to establish a more concrete, precise way of verifying ferroelectricity in biological tissues. They used small samples of elastin taken from a pig’s aorta and poled the tissues using an electric field at high temperatures. They then measured the current with the poling field removed and found that the current switched direction when the poling electric field was switched, a sign of ferroelectricity.

They did the same thing at room temperature using a laser as the heat source, and the current also switched directions.

Then, the researchers tested for this behavior on the smallest-possible unit of elastin, called tropoelastin, and again observed the phenomenon. They concluded that this switching property is “intrinsic” to the molecular make-up of elastin.

The next step is to understand the biological and physiological significance of this property, Li said. One hypothesis is that if ferroelectricity helps elastin stay flexible and functional in the body, a lack of it could directly affect the hardening of arteries.

“We may be able to use this as a very sensitive technique to detect the initiation of the hardening process at a very early stage when no other imaging technique will be able to see it,” Li said.

The team also is looking at whether this property plays a role in normal biological functions, perhaps in regulating the growth of tissue.

Co-authors are Pradeep Sharma at the University of Houston, Yanhang Zhang at Boston University, and collaborators at Nanjing University and the Chinese Academy of Sciences.

Here’s a link to and a citation for the research paper,

Ferroelectric switching of elastin by Yuanming Liu, Hong-Ling Cai, Matthew Zelisko, Yunjie Wang, Jinglan Sun, Fei Yan, Feiyue Ma, Peiqi Wang, Qian Nataly Chen, Hairong Zheng, Xiangjian Meng, Pradeep Sharma, Yanhang Zhang, and Jiangyu Li. Proceedings of the National Academy of Sciences (PNAS) doi: 10.1073/pnas.1402909111

This paper is behind a paywall.

I think this is a new practice: the paper includes a paragraph on the significance of the work (follow the link to the paper),

Ferroelectricity has long been speculated to have important biological functions, although its very existence in biology has never been firmly established. Here, we present, to our knowledge, the first macroscopic observation of ferroelectric switching in a biological system, and we elucidate the origin and mechanism underpinning ferroelectric switching of elastin. It is discovered that the polarization in elastin is intrinsic at the monomer level, analogous to the unit cell level polarization in classical perovskite ferroelectrics. Our findings settle a long-standing question on ferroelectric switching in biology and establish ferroelectricity as an important biophysical property of proteins. We believe this is a critical first step toward resolving its physiological significance and pathological implications.

Producing stronger silk musically

Markus Buehler and his interdisciplinary team (my previous posts on their work include Gossamer silk that withstands hurricane force winds and Music, math, and spiderwebs) have synthesized a new material based on spider silk. From the Nov. 28, 2012 news item on ScienceDaily,

Pound for pound, spider silk is one of the strongest materials known: Research by MIT’s [Massachusetts Institute of Technology] Markus Buehler has helped explain that this strength arises from silk’s unusual hierarchical arrangement of protein building blocks.

Now Buehler — together with David Kaplan of Tufts University and Joyce Wong of Boston University — has synthesized new variants on silk’s natural structure, and found a method for making further improvements in the synthetic material.

And an ear for music, it turns out, might be a key to making those structural improvements.

Here’s Buehler describing the work in an MIT video clip,

The Nov. 28, 2012 MIT news release by David Chandler provides more details,

Buehler’s previous research has determined that fibers with a particular structure — highly ordered, layered protein structures alternating with densely packed, tangled clumps of proteins (ABABAB) — help to give silk its exceptional properties. For this initial attempt at synthesizing a new material, the team chose to look instead at patterns in which one of the structures occurred in triplets (AAAB and BBBA).

Making such structures is no simple task. Kaplan, a chemical and biomedical engineer, modified silk-producing genes to produce these new sequences of proteins. Then Wong, a bioengineer and materials scientist, created a microfluidic device that mimicked the spider’s silk-spinning organ, which is called a spinneret.

Even after the detailed computer modeling that went into it, the outcome came as a bit of a surprise, Buehler says. One of the new materials produced very strong protein molecules — but these did not stick together as a thread. The other produced weaker protein molecules that adhered well and formed a good thread. “This taught us that it’s not sufficient to consider the properties of the protein molecules alone,” he says. “Rather, [one must] think about how they can combine to form a well-connected network at a larger scale.”

The different levels of silk’s structure, Buehler says, are analogous to the hierarchical elements that make up a musical composition — including pitch, range, dynamics and tempo. The team enlisted the help of composer John McDonald, a professor of music at Tufts, and MIT postdoc David Spivak, a mathematician who specializes in a field called category theory. Together, using analytical tools derived from category theory to describe the protein structures, the team figured out how to translate the details of the artificial silk’s structure into musical compositions.

The differences were quite distinct: The strong but useless protein molecules translated into music that was aggressive and harsh, Buehler says, while the ones that formed usable fibers sound much softer and more fluid.
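The team’s actual translation used category theory, which I won’t attempt here, but the basic idea of mapping a hierarchical block sequence onto musical parameters can be sketched in a few lines. This is purely my own toy illustration, not the researchers’ method: the pitch assignments (C4 for an A block, G4 for a B block) are arbitrary choices of mine, and I’m letting runs of repeated blocks become longer notes, loosely echoing the pitch/tempo hierarchy described above.

```python
from itertools import groupby

def sequence_to_notes(seq, pitch_map=None):
    """Map a protein block sequence (e.g. 'AAAB') to (MIDI pitch, duration) pairs.

    Each distinct block letter gets an arbitrary pitch; consecutive repeats
    of the same block are merged into one longer note (run-length encoding).
    """
    pitch_map = pitch_map or {"A": 60, "B": 67}  # C4 and G4, arbitrary choices
    return [(pitch_map[block], len(list(run))) for block, run in groupby(seq)]

# The alternating natural-silk pattern yields a rapid back-and-forth figure:
print(sequence_to_notes("ABABAB"))  # [(60, 1), (67, 1), (60, 1), (67, 1), (60, 1), (67, 1)]

# The triplet variant yields a sustained note followed by a short one:
print(sequence_to_notes("AAAB"))    # [(60, 3), (67, 1)]
```

Even this crude mapping shows how the two sequence families would sound structurally different, which is the intuition behind using musical contrast to compare candidate materials.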

Combining materials modeling with mathematical and musical tools, Buehler says, could provide a much faster way of designing new biosynthesized materials, replacing the trial-and-error approach that prevails today. Genetically engineering organisms to produce materials is a long, painstaking process, he says, but this work “has taught us a new approach, a fundamental lesson” in combining experiment, theory and simulation to speed up the discovery process.

Materials produced this way — which can be done under environmentally benign, room-temperature conditions — could lead to new building blocks for tissue engineering or other uses, Buehler says: scaffolds for replacement organs, skin, blood vessels, or even new materials for use in civil engineering.

It may be that the complex structures of music can reveal the underlying complex structures of biomaterials found in nature, Buehler says. “There might be an underlying structural expression in music that tells us more about the proteins that make up our bodies. After all, our organs — including the brain — are made from these building blocks, and humans’ expression of music may inadvertently include more information than we are aware of.”

“Nobody has tapped into this,” he says, adding that with the breadth of his multidisciplinary team, “We could do this — making better bio-inspired materials by using music, and using music to better understand biology.”

At the end of Chandler’s news release there’s a notice about a summer course with Markus Buehler,

For those interested in the work Professor Buehler is doing, you may also be interested to know that he is offering a short course on campus this summer called Materials By Design.

Materials By Design
June 17-20, 2013

Through lectures and hands-on labs, participants will learn how materials failure, studied from a first principles perspective, can be applied in an effective “learning-from-failure approach” to design and make novel materials. Participants will also learn how superior material properties in nature and biology can be mimicked in bioinspired materials for applications in new technology. This course will be of interest to scientists, engineers, managers, and policy makers working in the area of materials design, development, manufacturing, and testing. [emphasis mine]

I wasn’t expecting to see managers and policy makers as possible students for this course.

By the way, Buehler is not the only scientist to make a connection between music and biology (although he seems to be the only one using the concept for applications); there’s also geneticist and biophysicist Mae-Wan Ho and her notion of quantum jazz. From the Quantum Jazz Biology article by David Reilly in the June 23, 2010 Isis Report,

I use the analogy of ‘quantum jazz’ to express the quantum coherence of the organism. It goes through a fantastic range of space and time scales, from the tiniest atom or subatomic particle to the whole organism and beyond. Organisms communicate with other organisms, and are attuned to natural rhythms, so they have circadian rhythms, annual rhythms, and so on. At the other extreme, you have very fast reactions that take place in femtoseconds. And all these rhythms are coordinated, there is evidence for that.

Purpose in nature (and the universe): even scientists believe

An intriguing research article, titled Professional Physical Scientists Display Tenacious Teleological Tendencies: Purpose-Based Reasoning as a Cognitive Default, is behind a paywall, making it difficult to do much more than comment on the Oct. 17, 2012 news item (on ScienceDaily),

A team of researchers in Boston University’s Psychology Department has found that, despite years of scientific training, even professional chemists, geologists, and physicists from major universities such as Harvard, MIT, and Yale cannot escape a deep-seated belief that natural phenomena exist for a purpose.

Although purpose-based “teleological” explanations are often found in religion, such as in creationist accounts of Earth’s origins, they are generally discredited in science. When physical scientists have time to ruminate about the reasons why natural objects and events occur, they explicitly reject teleological accounts, instead favoring causal, more mechanical explanations. However, the study by lead author Deborah Kelemen, associate professor of psychology, and collaborators Joshua Rottman and Rebecca Seston finds that when scientists are required to think under time pressure, an underlying tendency to find purpose in nature is revealed.

“It is quite surprising what these studies show,” says Kelemen. “Even though advanced scientific training can reduce acceptance of scientifically inaccurate teleological explanations, it cannot erase a tenacious early-emerging human tendency to find purpose in nature. It seems that our minds may be naturally more geared to religion than science.”

I did find the abstract for the paper,

… In Study 2, we explored this further and found that the teleological tendencies of professional scientists did not differ from those of humanities scholars. Thus, although extended education appears to produce an overall reduction in inaccurate teleological explanation, specialization as a scientist does not, in itself, additionally ameliorate scientifically inaccurate purpose-based theories about the natural world. A religion-consistent default cognitive bias toward teleological explanation tenaciously persists and may have subtle but profound consequences for scientific progress.

Here’s the full citation for the paper if you want to examine it yourself,

Professional Physical Scientists Display Tenacious Teleological Tendencies: Purpose-Based Reasoning as a Cognitive Default by Deborah Kelemen, Joshua Rottman, and Rebecca Seston. Journal of Experimental Psychology: General, Oct. 15, 2012.

What I find particularly intriguing about this work is that it helps to provide an explanation for a phenomenon I’ve observed at science conferences, in science talks, and in science books: a tendency to ignore a particular set of questions (How did it start? Where did it come from?) when discussing nature or, indeed, the universe.

I noticed the tendency again last night (Oct. 16, 2012) at the CBC (Canadian Broadcasting Corporation) Massey Lecture being given by Neil Turok, director of the Canadian Perimeter Institute for Theoretical Physics, and held in Vancouver (Canada). The event was mentioned in my Oct. 12, 2012 posting (scroll down 2/3 of the way).

During this third lecture (What Banged?) in a series of five Massey lectures, Turok asked the audience (roughly 800 people by my count) to imagine a millimetre ball of light as the starting point for the universe. He never did tell us where this ball of light came from. The entire issue of how it all started (What Banged?) was avoided. Turok’s avoidance is not unusual: somehow the question is always set aside while the scientist jumps into the part of the story she or he can, or wants to, explain.


Interestingly, Turok has given the What Banged? talk previously in 2008 in Waterloo, Ontario. According to this description of the 2008 What Banged? talk, he did modify the presentation for last night,

The evidence that the universe emerged 14 billion years ago from an event called ‘the big bang’ is overwhelming. Yet the cause of this event remains deeply mysterious. In the conventional picture, the ‘initial singularity’ is unexplained. It is simply assumed that the universe somehow sprang into existence full of ‘inflationary’ energy, blowing up the universe into the large, smooth state we observe today. While this picture is in excellent agreement with current observations, it is both contrived and incomplete, leading us to suspect that it is not the final word. In this lecture, the standard inflationary picture will be contrasted with a new view of the initial singularity suggested by string and M-theory, in which the bang is a far more normal, albeit violent, event which occurred in a pre-existing universe. [emphasis mine] According to the new picture, a cyclical model of the universe becomes feasible in which one bang is followed by another, in a potentially endless series of cosmic cycles. The presentation will also review exciting recent theoretical developments and forthcoming observational tests which could distinguish between the rival inflationary and cyclical hypotheses.

Even this explanation doesn’t really answer the question. If there is, as suggested, a pre-existing universe, where did that come from? At the end of last night’s lecture, Turok seemed to be suggesting some kind of endless loop where past, present, and future are linked, which still raises the question: where did it all come from?

I can certainly understand how scientists who are trained to avoid teleological explanations (with their religious overtones) would want to avoid or rush over any question that might occasion just such an explanation.

Last night, the whole talk was a physics and history-of-physics lesson for ‘dummies’ that didn’t quite manage to be ‘dumb’ enough for me, and it didn’t really deliver on the promise in this description from the Oct. 16, 2012 posting by Brian Lynch on the Georgia Straight website,

Don’t worry if your grasp of relativistic wave equations isn’t what it once was. The Waterloo, Ontario–based physicist is speaking the language of the general public here. Even though his subject dwarfs pretty much everything else, the focus of the series as a whole is human in scale. Turok sees our species as standing on the brink of a scientific revolution, where we can understand “how our ideas regarding our place in the universe may develop, and how our very nature may change.” [emphasis mine]

Perhaps Turok is building up to a discussion about “our place in the universe” and “how our very nature may change,” sometime in the next two lectures.

Organ chips for DARPA (Defense Advanced Research Projects Agency)

The Wyss Institute will receive up to $37M US for a project that integrates ten different organ-on-a-chip projects into one system. From the July 24, 2012 news release on EurekAlert,

With this new DARPA funding, Institute researchers and a multidisciplinary team of collaborators seek to build 10 different human organs-on-chips, to link them together to more closely mimic whole body physiology, and to engineer an automated instrument that will control fluid flow and cell viability while permitting real-time analysis of complex biochemical functions. As an accurate alternative to traditional animal testing models that often fail to predict human responses, this instrumented “human-on-a-chip” will be used to rapidly assess responses to new drug candidates, providing critical information on their safety and efficacy.

This unique platform could help ensure that safe and effective therapeutics are identified sooner, and ineffective or toxic ones are rejected early in the development process. As a result, the quality and quantity of new drugs moving successfully through the pipeline and into the clinic may be increased, regulatory decision-making could be better informed, and patient outcomes could be improved.

Jesse Goodman, FDA Chief Scientist and Deputy Commissioner for Science and Public Health, commented that the automated human-on-chip instrument being developed “has the potential to be a better model for determining human adverse responses. FDA looks forward to working with the Wyss Institute in its development of this model that may ultimately be used in therapeutic development.”

Wyss Founding Director, Donald Ingber, M.D., Ph.D., and Wyss Core Faculty member, Kevin Kit Parker, Ph.D., will co-lead this five-year project.

I note that Kevin Kit Parker was mentioned in an earlier posting today (July 26, 2012) titled, Medusa, jellyfish, and tissue engineering, and Donald Ingber in my Dec.1e, 2011 posting about Shrilk and insect skeletons.

As for the Wyss Institute, here’s a description from the news release,

The Wyss Institute for Biologically Inspired Engineering at Harvard University (http://wyss.harvard.edu) uses Nature’s design principles to develop bioinspired materials and devices that will transform medicine and create a more sustainable world. Working as an alliance among Harvard’s Schools of Medicine, Engineering, and Arts & Sciences, and in partnership with Beth Israel Deaconess Medical Center, Boston Children’s Hospital, Brigham and Women’s Hospital, Dana Farber Cancer Institute, Massachusetts General Hospital, the University of Massachusetts Medical School, Spaulding Rehabilitation Hospital, Tufts University, and Boston University, the Institute crosses disciplinary and institutional barriers to engage in high-risk research that leads to transformative technological breakthroughs. By emulating Nature’s principles for self-organizing and self-regulating, Wyss researchers are developing innovative new engineering solutions for healthcare, energy, architecture, robotics, and manufacturing. These technologies are translated into commercial products and therapies through collaborations with clinical investigators, corporate alliances, and new start-ups.

I hadn’t thought of an organ-on-a-chip as particularly bioinspired so I’ll have to think about that one for a while.

Billions lost to patent trolls; US White House asks for comments on intellectual property (IP) enforcement; and more on IP

It becomes clear after a time that science, intellectual property (patents, copyright, and trademarks), and business interests are intimately linked, which is why I include items on the topic of intellectual property (where I am developing some strong opinions). As for business topics, I am more neutral, as my understanding of business is quite limited.

All of this is to explain why I’m taking ‘another kick at the IP (intellectual property) can’. I’m going to start with patents and move on to copyright.

A June 26, 2012 news item from BBC News online highlights the costs associated with patent trolls,

The direct cost of actions taken by so-called “patent trolls” totalled $29bn (£18.5bn) in the US in 2011, according to a study by Boston University.

It analysed the effect of intellectual rights claims made by organisations that own and license patents without producing related goods of their own.

Such bodies say they help spur on innovation by ensuring inventors are compensated for their creations.

But the study’s authors said society lost more than it gained.

A June 27, 2012 commentary by Mike Masnick for Techdirt provides more detail,

The report then goes further to try to figure out whether the trolls are actually benefiting innovation and getting more money to inventors, as the trolls and their supporters like to claim. Unfortunately, the research shows quite a different story — with very little of the money actually flowing back to either inventors or actual innovation. In other words, we’re talking about a pretty massive economic dead-weight loss here. Money flowing from actual innovators and creators… to lawyers, basically. Innovators grow the economy. Lawyers do not.

Masnick’s commentary includes a table from the report showing how the costs have grown nearly fivefold, from approximately $6 billion in 2005 to approximately $29 billion in 2011.

The researchers are James E. Bessen and Michael J. Meurer at Boston University, and the open access report, The Direct Costs from NPE [non-practicing entities] Disputes, is available from the Social Science Research Network.

Interestingly, the Boston University study was released the same day that the US White House’s Intellectual Property Enforcement Coordinator, Victoria Espinel, announced she wanted comments about US IP enforcement efforts (from Espinel’s June 25, 2012 blog posting),

Today my office is starting the process of gathering input for the Administration’s new strategy for intellectual property enforcement. The overarching objective of the Strategy is to improve the effectiveness of the U.S. Government’s efforts to protect our intellectual property here and overseas. I want to make sure as many people as possible are aware that we are working on this so we can get the very best thoughts and recommendations possible. Part of the process of gathering public input is to publish a “Federal Register Notice” where we formally ask the public to give us their ideas. We will read all of your submissions – and we will make them publicly available so everyone can see them.

You can do so by following this link to Regulations.gov where you will find more details for submitting your strategy recommendations beginning today.

I believe that essential to the development of an effective enforcement strategy, is ensuring that any approaches that are considered to be particularly effective as well as any concerns with the present approach to intellectual property enforcement are understood by policymakers. [emphasis Mike Masnick of Techdirt] Recommendations may include, but need not be limited to: legislation, regulation, guidance, executive order, Presidential memoranda, or other executive action, including, but not limited to, changes to agency policies, practices or methods.

Beyond recommendations for government action as part of the next Strategy, we are looking for information on and recommendations for combating emerging or future threats to American innovation and economic competitiveness posed by violations of intellectual property rights. Additionally, it would be useful to the development of the Strategy to receive submissions from the public identifying threats to public health and safety posed by intellectual property infringement, [emphasis mine] in the U.S. and internationally as well as information relating to the costs to the U.S. economy resulting from infringement of intellectual property rights.

Aside: That bit about public health and safety being endangered by infringement is going to have to be explained to me. Moving along, Mike Masnick’s June 26, 2012 commentary about this matter on Techdirt includes an exhortation to participate,

I will be submitting my own thoughts, which I will also publish here, but for those thinking about what to say, I would focus on this sentence above [emphasized in the previous excerpt from the Espinel posting “I believe that essential …”]. Historically, many of the government’s approaches have not been at all effective, and have created a number of significant problems — most of which have been ignored by the government (either willfully or through ignorance). This really is a chance to provide examples of why the current policy is not effective (and will never be effective if it keeps on the current path) as well as the “concerns” with the current approach, such as the criminalization of expressive behavior and the outright censorship of media publications.

Meanwhile, we here in Canada are focused on copyright.

Michael Geist (the Canadian copyright guru) notes in his June 26, 2012 posting (Note: I have removed some links.),

Brian Brett, the former Chair of the Writers’ Union of Canada and an award winning author, has issued an explosive public letter that “breaks the ‘cone of silence’ that has obscured for too long some of the ugly practices of Access Copyright.”

You can get an idea why Geist described the letter as “explosive” from this excerpt (from the June 26, 2012 commentary in the Georgia Straight),

As a former Chair of the Writers’ Union of Canada (I’ve been a member more than thirty years), I have been asked to sign a letter to educational institutions supporting Access Copyright’s efforts to obtain collective licensing agreements with those institutions. I will not sign. I believe the time has come for action, not words. …

For the first time in history it has become too complex and expensive to quote the music of our era for many young writers. Writers are being charged exorbitantly for quoting other writers in their poems, fictions, and essays; yet are losing their own rights and income. Meanwhile, the Canadian Government has made legislation favouring educational institutions and media empires (at the expense of creators) in the name of supporting our nation’s culture.

As we earnestly discuss these issues, but do nothing to protect ourselves, we are seeing the rights of creators to fair compensation eroded to the point of where many are at risk of receiving nothing for their work.

Access Copyright, created specifically to collect fair compensation for creators, is central to this discussion. While I believe that educational institutions must pay writers, and will eventually pay them, it’s also necessary to call out the ugly regime of Access Copyright, which is collecting our copyright income. …

6. Access Copyright rewards textbook companies who demand that authors relinquish their copyright to their work by paying them both the publisher and creator copyright payment. Academic authors often consider textbook authorship crucial to tenure. Thus academic authors are open to being pressured by publishers out of their copyright. In effect Access Copyright is encouraging textbook publishers to undermine copyright by demanding a creators’ total copyright, and doubling the publisher’s payment for this ugly practice.

So, the academics who write those science and math (and other subject) texts are being pressured by financially motivated publishers to give up copyright while they are also being pressured to publish for the well-being of their careers. Nicely done, Access Copyright! (sarcasm)

While I suspect that I don’t agree with Brett on some issues, I do believe that content creators should receive some financial benefit from their work.

On a more hopeful note, the recent passage of Bill C-11 (Copyright) has some very good things indeed (from the June 21, 2012 commentary by Leigh Beadon on Techdirt [Note: I have removed a link.]),

Michael Geist has an excellent summary of C-11 with a comparison to previous phases of copyright law in Canada. The victories for smarter copyright law in C-11 sound almost like fantasy when compared to the American copyright debate. They include:

  • New fair dealing provisions (our version of fair use) to cover educational uses, plus parody and satire
  • New backup, format-shifting and time-shifting allowances that remove previous restrictions on networked DVRs and internet TV services (similar to those that have suffered in American courts)
  • Explicit copyright exceptions for “user-generated content”, aimed at protecting non-commercial fan-art and remixes
  • A bunch of explicit exceptions for schools, such as the right to stage public performances
  • A notice-and-notice system, not a notice-and-takedown system
  • A $5,000 cap on statutory damages for all non-commercial infringement

Sadly, there is the issue of the ‘digital lock’ provision, which was rammed through Parliament despite almost universal condemnation from Canadians of all walks of life. Geist provides much more detail about this issue than I can; he offers two postings outlining Canada’s Justice Dept. discussion of the digital lock provisions (June 25, 2012 posting), the Competition Bureau’s (June 26, 2012 posting), and possible issues with constitutional rights.

On a much happier note for me personally is a recent Federal Court of Canada ruling about linking and posting, from the June 25, 2012 posting on the Michael Geist blog (Note: I have removed links.),

The Federal Court of Canada has issued an important decision involving copyright and posting content online. The case involves a lawsuit launched by Richard Warman and the National Post against Mark and Constance Fournier, who run the FreeDominion website. Warman and the National Post sued the site over the appearance of two articles and an inline link to photograph that appeared on the forum. The court dismissed all three claims.

While the first claim (Warman’s article) was dismissed on the basis that it took too long to file the lawsuit, the legal analysis on the National Post claim involving an article by Jonathan Kay assesses the copyright implications of posting several paragraphs from an article online. In this case, the article was 11 paragraphs long.  The reproduction on the Free Dominion site included the headline, three complete paragraphs and part of a fourth. The court ruled that this amount of copying did not constitute a “substantial part” of the work and therefore there was no infringement. The court added that in the alternative, the reproduction of the work was covered by fair dealing, concluding that a large and liberal interpretation of news reporting would include posts to the discussion forum.  The decision then includes an analysis of the six factor test and concludes that the use was fair.

So I can link to and quote from Canadian publications in peace, for now. (Great news!)

There is some additional analysis of the ruling in a (h/t) June 26, 2012 posting by Leigh Beadon on the Techdirt website.

No grand thoughts here. I just find this very fluid situation with regard to intellectual property important as I believe the outcomes will affect us all in many ways, including how we practice science.

US soldiers get batteries woven into their clothes

Last time I wrote about soldiers, equipment, and energy-efficiency (April 5, 2012 posting) the soldiers in question were British. Today’s posting focuses on US soldiers. From the May 7, 2012 news item on Nanowerk,

U.S. soldiers are increasingly weighed down by batteries to power weapons, detection devices and communications equipment. So the Army Research Laboratory has awarded a University of Utah-led consortium almost $15 million to use computer simulations to help design materials for lighter-weight, energy efficient devices and batteries.

“We want to help the Army make advances in fundamental research that will lead to better materials to help our soldiers in the field,” says computing Professor Martin Berzins, principal investigator among five University of Utah faculty members who will work on the project. “One of Utah’s main contributions will be the batteries.”

Of the five-year Army grant of $14,898,000, the University of Utah will retain $4.2 million for research plus additional administrative costs. The remainder will go to members of the consortium led by the University of Utah, including Boston University, Rensselaer Polytechnic Institute, Pennsylvania State University, Harvard University, Brown University, the University of California, Davis, and the Polytechnic University of Turin, Italy.

The new research effort is based on the idea that by using powerful computers to simulate the behavior of materials on multiple scales – from the atomic and molecular nanoscale to the large or “bulk” scale – new, lighter, more energy efficient power supplies and materials can be designed and developed. Improving existing materials also is a goal.

“We want to model everything from the nanoscale to the soldier scale,” Berzins says. “It’s virtual design, in some sense.”

“Today’s soldier enters the battle space with an amazing array of advanced electronic materials devices and systems,” the University of Utah said in its grant proposal. “The soldier of the future will rely even more heavily on electronic weaponry, detection devices, advanced communications systems and protection systems. Currently, a typical infantry soldier might carry up to 35 pounds of batteries in order to power these systems, and it is clear that the energy and power requirements for future soldiers will be much greater.” [emphasis mine]

“These requirements have a dramatic adverse effect on the survivability and lethality of the soldier by reducing mobility as well as the amount of weaponry, sensors, communication equipment and armor that the soldier can carry. Hence, the Army’s desire for greater lethality and survivability of its men and women in the field is fundamentally tied to the development of devices and systems with increased energy efficiency as well as dramatic improvement in the energy and power density of [battery] storage and delivery systems.”

Up to 35 lbs. of batteries? I’m trying to imagine what the rest of the equipment would weigh. In any event, they seem to be more interested in adding to the weaponry than reducing weight. At least, that’s how I understand “greater *lethality.” Nice of them to mention greater survivability too.

The British project is more modest: they are weaving e-textiles that harvest energy, allowing British soldiers to carry fewer batteries. I believe field trials were scheduled for May 2012.

* Correction: leathility changed to lethality on July 31, 2013.

Talking nano

I’ve come across a couple of interesting blog postings and a podcast about the journalistic, marketing, and communication problems posed by nanotechnology. First, here’s my take as informed by reading the postings and listening to the podcast. The journalistic issue is that nanotechnology is one of those science stories that are tough to sell: if people don’t understand at least some of the underlying scientific principles, nanotechnology is very hard to discuss without a lot of ‘educational detail’, and that kind of detail can limit your potential audience. You can find another perspective on this by Howard Lovy here.

From a marketing communications or public relations perspective, there’s a lot of promising research that suggests beneficial applications and/or potentially serious risks. It’s hard to tell if the word nano will be perceived as good, bad, or descriptive (e.g. electronic is a neutral description whereas atomic and nuclear have accrued negative connotations). Here’s another take on the issue.

Making the whole writing/journalism/marketing communication/activism (aside: activists also want to stake out nanotechnology territory) thing even harder is the fact that the (generally accepted but not official) definition of nanotechnology is a measurement. That definition is still debated within the scientific community (some don’t accept it), and it doesn’t mean much to most people outside the scientific community. As for why it matters? We need ways to discuss things that affect us, and it seems that if scientists have their way, nanotechnology will. For more about why it’s important to find ways to talk about nanotechnology, go here for a podcast interview with Stine Grodal, a professor at Boston University.