Tag Archives: Russia

May 16, 2018: UNESCO’s (United Nations Educational, Scientific and Cultural Organization) First International Day of Light

Courtesy: UNESCO

From a May 11, 2018 United Nations Educational, Scientific and Cultural Organization (UNESCO) press release (received via email),

UNESCO will welcome leading scientists on 16 May 2018 for the first edition of the International Day of Light (2:30-8:00 pm) to celebrate the role light plays in our daily lives. Researchers and intellectuals will examine how light-based technologies can help meet pressing challenges in diverse areas such as medicine, education, agriculture and energy.

UNESCO Director-General Audrey Azoulay will open this event, which will feature renowned scientists, including:

  • Kip Thorne, 2017 Nobel Prize in Physics, California Institute of Technology (United States of America).
  • Claude Cohen-Tannoudji, 1997 Nobel Prize in Physics, Collège de France.
  • Khaled Toukan, Director of the Synchrotron-light for Experimental Science and Applications in the Middle East (SESAME) based in Allan, Jordan.

The programme of keynotes and roundtables will address many key issues including science policy, our perception of the universe, and international cooperation, through contributions from experts and scientists from around the world.

The programme also includes cultural events, an illumination of UNESCO Headquarters, a photonics science show and an exhibit on the advances of light-based technologies and art.

The debates that flourished in 2015, in the framework of the International Year of Light, highlighted the importance of light sciences and light-based technologies in achieving the United Nations Sustainable Development Goals. Several thousand events were held in 147 countries during the Year, which was placed under the auspices of UNESCO.

The proclamation of 16 May as the International Day of Light was supported by UNESCO’s Executive Board following a proposal by Ghana, Mexico, New Zealand and the Russian Federation, and approved by the UNESCO General Conference in November 2017.


I have taken a look at the programme, which is pretty interesting. Unfortunately, I can’t excerpt parts of it for inclusion here as very odd things happen when I attempt to ‘copy and paste’. On the plus side, there’s a bit more information about this ‘new day’ on its event page,

Light plays a central role in our lives. On the most fundamental level, through photosynthesis, light is at the origin of life itself. The study of light has led to promising alternative energy sources, lifesaving medical advances in diagnostics technology and treatments, light-speed internet and many other discoveries that have revolutionized society and shaped our understanding of the universe. These technologies were developed through centuries of fundamental research on the properties of light – starting with Ibn Al-Haytham’s seminal work, Kitab al-Manazir (Book of Optics), published in 1015 and including Einstein’s work at the beginning of the 20th century, which changed the way we think about time and light.

The International Day of Light celebrates the role light plays in science, culture and art, education, and sustainable development, and in fields as diverse as medicine, communications, and energy. The Day will allow many different sectors of society worldwide to participate in activities that demonstrate how science, technology, art and culture can help achieve the goals of UNESCO – building the foundation for peaceful societies.

The International Day of Light is celebrated on 16 May each year, the anniversary of the first successful operation of the laser in 1960 by physicist and engineer, Theodore Maiman. This day is a call to strengthen scientific cooperation and harness its potential to foster peace and sustainable development.

Happy International Day of Light on Wednesday, May 16, 2018!

“Living” bandages made from biocompatible anti-burn nanofibers

A February 16, 2018 news item on Nanowerk announces research from a Russian team about their work on “living” bandages,

In regenerative medicine, and particularly in burn therapy, the effective regeneration of damaged skin tissue and the prevention of scarring are usually the main goals. Scars form when skin is badly damaged, whether through a cut, burn, or a skin problem such as acne or fungal infection.

Scar tissue mainly consists of irreversible collagen and significantly differs from the tissue it replaces, having reduced functional properties. For example, scars on skin are more sensitive to ultraviolet radiation, are not elastic, and the sweat glands and hair follicles are not restored in the area.

A solution to this medical problem was proposed by researchers from the Inorganic Nanomaterials Laboratory at NUST MISIS [National University of Science and Technology, formerly Moscow Institute of Steel and Alloys State Technological University], led by senior researcher Anton Manakhov, PhD. The team of nanotechnology scientists has managed to create multi-layer ‘bandages’ made of biodegradable fibers and multifunctional bioactive nanofilms, which prevent scarring and accelerate tissue regeneration.

A February 14, 2018 NUST MISIS press release, which originated the news item, provides more detail,

An antibacterial effect, added by introducing silver nanoparticles or antibiotics, together with the increased biological activity provided by hydrophilic surface groups (-COOH) and blood plasma proteins, gives the material its unique healing properties.

A significant acceleration of the healing process, the successful regeneration of normal skin covering tissue, and the prevention of scarring on the site of burnt or damaged skin have been observed when applying these bandages made of the developed material to an injured area. The antibacterial components of multifunctional nanofibers decrease inflammation, and the blood plasma with an increased platelet level — vital and multi-purposed for every element in the healing process — stimulates the regeneration of tissues. The bandages should not be removed or changed during treatment as it may cause additional pain to the patient. After a certain period of time, the biodegradable fiber simply “dissolves” without any side effects.

“With the help of chemical bonds, we were able to create a stable layer containing blood plasma components (growth factors, fibrinogens, and other important proteins that promote cell growth) on a polycaprolactone base. The base fibers were synthesized by electrospinning. Then, with the help of plasma treatment, to increase the material’s hydrophilic properties, a polymer layer containing carboxyl groups was applied to the surface. The resulting layer was enriched with antibacterial and protein components”, noted Elizabeth Permyakova, one of the project members and laboratory scientists.

The researchers have made images of their work available including this one,

Courtesy NUST MISIS [downloaded from http://en.misis.ru/university/news/science/2018-02/5219/]

There doesn’t appear to be an accompanying published paper.

Interstellar fullerenes

This work from Russia on fullerenes (also known as buckminsterfullerenes, C60, and/or buckyballs) is quite interesting and dates back more than a year. I’m not sure why the work is being publicized now but nanotechnology and interstellar space are not covered here often enough so, here goes (from a January 29, 2018 Kazan Federal University press release, also on EurekAlert; Note: Links have been removed),

Here’s a link to and a citation for the paper,

C60+ – looking for the bucky-ball in interstellar space by G. A. Galazutdinov, V. V. Shimansky, A. Bondar, G. Valyavin, J. Krełowski. Monthly Notices of the Royal Astronomical Society, Volume 465, Issue 4, 11 March 2017, Pages 3956–3964, https://doi.org/10.1093/mnras/stw2948 Published: 22 December 2016

This paper is behind a paywall.

h/t January 29, 2018 news item on Nanowerk

Are copper nanoparticles good candidates for synthesizing medicine?

This research appears to be a collaboration between Russian and Indian scientists. From a December 5, 2017 news item on Nanowerk (Note: A link has been removed),

Chemists at Ural Federal University, with colleagues from India, demonstrated the effectiveness of copper nanoparticles as a catalyst through an analysis of 48 organic synthesis reactions (Coordination Chemistry Reviews, “Copper nanoparticles as inexpensive and efficient catalyst: A valuable contribution in organic synthesis”).

One of the advantages of the catalyst is its insolubility in traditional organic solvents. This makes copper nanoparticles a valuable alternative to heavy metal catalysts, for example palladium, which is currently used for the synthesis of many pharmaceuticals and is toxic for cells.

“Copper nanoparticles are an ideal variant of a heterophasic catalyst, since they exist in a wide variety of geometric shapes and sizes, which directly affects the surface of effective mass transfer, so reactions in the presence of this catalyst are characterized by shorter reaction times, selectivity and better yields,” says co-author Grigory Zyryanov, Doctor of Chemistry, Associate Professor of the Department of Organic and Biomolecular Chemistry of UrFU.

A December 11, 2017 (there can be a gap between distributing a press release and posting it on the home website) Ural Federal University press release, which originated the news item, makes the case for copper nanoparticles as catalytic agents,

Copper nanoparticles are inexpensive since there are many simple ways to obtain them from cheap raw materials, and these methods are constantly being refined. As a result, it is possible to obtain a highly porous catalyst structure based on copper nanoparticles with pore sizes from several tens to several hundred nanometers. Due to the small particle size, the catalytic surface area is enormous. Moreover, because copper nanoparticles are insoluble, the reactions they catalyze take place on the surface of the catalyst. After the reaction is completed, the copper nanoparticles, which do not interact with the solvents, are easily removed, which guarantees the absence of catalyst admixture in the final product. These catalysts are already in demand for organic synthesis by the methods of “green chemistry”, whose main principles are simplicity, low cost, production safety, and recyclability of the catalysts.

One of the promising areas of application for the copper nanoparticle catalyst is, first of all, the creation of medical products using cross-coupling reactions. In 2010, for work in the field of palladium-catalyzed cross-coupling reactions, the Nobel Prize in Chemistry was awarded to scientists from Japan and the USA: Richard Heck, Ei-ichi Negishi and Akira Suzuki. Despite worldwide recognition, palladium-catalyzed cross-coupling reactions are undesirable for the synthesis of most medications due to the toxicity of palladium to living cells and the lack of methods for reliably removing palladium traces from the final product. In addition to toxicity, the high cost of palladium-based catalysts, as well as of platinum, another catalyst used in pharmaceuticals, makes the use of copper nanoparticles economically and environmentally justified.

Here’s a link to and a citation for the paper,

Copper nanoparticles as inexpensive and efficient catalyst: A valuable contribution in organic synthesis by Nisha Kant Ojha, Grigory V. Zyryanov, Adinath Majee, Valery N. Charushin, Oleg N. Chupakhin, Sougata Santra. Coordination Chemistry Reviews, Volume 353, 15 December 2017, Pages 1-57, https://doi.org/10.1016/j.ccr.2017.10.004

This paper is behind a paywall.

Editing the genome with CRISPR (clustered regularly interspaced short palindromic repeats)-carrying nanoparticles

MIT (Massachusetts Institute of Technology) researchers have developed a new nonviral means of delivering CRISPR (clustered regularly interspaced short palindromic repeats)-Cas9 gene therapy according to a November 13, 2017 news item on Nanowerk,

In a new study, MIT researchers have developed nanoparticles that can deliver the CRISPR genome-editing system and specifically modify genes in mice. The team used nanoparticles to carry the CRISPR components, eliminating the need to use viruses for delivery.

Using the new delivery technique, the researchers were able to cut out certain genes in about 80 percent of liver cells, the best success rate ever achieved with CRISPR in adult animals.


A November 13, 2017 MIT news release (also on EurekAlert), which originated the news item, provides more details about the research and a good description of and comparison between using a viral system and using a nanoparticle-based system to deliver CRISPR-CAS9,

“What’s really exciting here is that we’ve shown you can make a nanoparticle that can be used to permanently and specifically edit the DNA in the liver of an adult animal,” says Daniel Anderson, an associate professor in MIT’s Department of Chemical Engineering and a member of MIT’s Koch Institute for Integrative Cancer Research and Institute for Medical Engineering and Science (IMES).

One of the genes targeted in this study, known as Pcsk9, regulates cholesterol levels. Mutations in the human version of the gene are associated with a rare disorder called dominant familial hypercholesterolemia, and the FDA recently approved two antibody drugs that inhibit Pcsk9. However, these antibodies need to be taken regularly, and for the rest of the patient’s life, to provide therapy. The new nanoparticles permanently edit the gene following a single treatment, and the technique also offers promise for treating other liver disorders, according to the MIT team.

Anderson is the senior author of the study, which appears in the Nov. 13 [2017] issue of Nature Biotechnology. The paper’s lead author is Koch Institute research scientist Hao Yin. Other authors include David H. Koch Institute Professor Robert Langer of MIT, professors Victor Koteliansky and Timofei Zatsepin of the Skolkovo Institute of Science and Technology [Russia], and Professor Wen Xue of the University of Massachusetts Medical School.

Targeting disease

Many scientists are trying to develop safe and efficient ways to deliver the components needed for CRISPR, which consists of a DNA-cutting enzyme called Cas9 and a short RNA that guides the enzyme to a specific area of the genome, directing Cas9 where to make its cut.

In most cases, researchers rely on viruses to carry the gene for Cas9, as well as the RNA guide strand. In 2014, Anderson, Yin, and their colleagues developed a nonviral delivery system in the first-ever demonstration of curing a disease (the liver disorder tyrosinemia) with CRISPR in an adult animal. However, this type of delivery requires a high-pressure injection, a method that can also cause some damage to the liver.

Later, the researchers showed they could deliver the components without the high-pressure injection by packaging messenger RNA (mRNA) encoding Cas9 into a nanoparticle instead of a virus. Using this approach, in which the guide RNA was still delivered by a virus, the researchers were able to edit the target gene in about 6 percent of hepatocytes, which is enough to treat tyrosinemia.

While that delivery technique holds promise, in some situations it would be better to have a completely nonviral delivery system, Anderson says. One consideration is that once a particular virus is used, the patient will develop antibodies to it, so it couldn’t be used again. Also, some patients have pre-existing antibodies to the viruses being tested as CRISPR delivery vehicles.

In the new Nature Biotechnology paper, the researchers came up with a system that delivers both Cas9 and the RNA guide using nanoparticles, with no need for viruses. To deliver the guide RNAs, they first had to chemically modify the RNA to protect it from enzymes in the body that would normally break it down before it could reach its destination.

The researchers analyzed the structure of the complex formed by Cas9 and the RNA guide, or sgRNA, to figure out which sections of the guide RNA strand could be chemically modified without interfering with the binding of the two molecules. Based on this analysis, they created and tested many possible combinations of modifications.

“We used the structure of the Cas9 and sgRNA complex as a guide and did tests to figure out we can modify as much as 70 percent of the guide RNA,” Yin says. “We could heavily modify it and not affect the binding of sgRNA and Cas9, and this enhanced modification really enhances activity.”

Reprogramming the liver

The researchers packaged these modified RNA guides (which they call enhanced sgRNA) into lipid nanoparticles, which they had previously used to deliver other types of RNA to the liver, and injected them into mice along with nanoparticles containing mRNA that encodes Cas9.

They experimented with knocking out a few different genes expressed by hepatocytes, but focused most of their attention on the cholesterol-regulating Pcsk9 gene. The researchers were able to eliminate this gene in more than 80 percent of liver cells, and the Pcsk9 protein was undetectable in these mice. They also found a 35 percent drop in the total cholesterol levels of the treated mice.

The researchers are now working on identifying other liver diseases that might benefit from this approach, and advancing these approaches toward use in patients.

“I think having a fully synthetic nanoparticle that can specifically turn genes off could be a powerful tool not just for Pcsk9 but for other diseases as well,” Anderson says. “The liver is a really important organ and also is a source of disease for many people. If you can reprogram the DNA of your liver while you’re still using it, we think there are many diseases that could be addressed.”

“We are very excited to see this new application of nanotechnology open new avenues for gene editing,” Langer adds.

The research was funded by the National Institutes of Health (NIH), the Russian Scientific Fund, the Skoltech Center, and the Koch Institute Support (core) Grant from the National Cancer Institute.

Here’s a link to and a citation for the paper,

Structure-guided chemical modification of guide RNA enables potent non-viral in vivo genome editing by Hao Yin, Chun-Qing Song, Sneha Suresh, Qiongqiong Wu, Stephen Walsh, Luke Hyunsik Rhym, Esther Mintzer, Mehmet Fatih Bolukbasi, Lihua Julie Zhu, Kevin Kauffman, Haiwei Mou, Alicia Oberholzer, Junmei Ding, Suet-Yan Kwan, Roman L Bogorad, Timofei Zatsepin, Victor Koteliansky, Scot A Wolfe, Wen Xue, Robert Langer, & Daniel G Anderson. Nature Biotechnology doi:10.1038/nbt.4005 Published online: 13 November 2017

This paper is behind a paywall.

Limitless energy and the International Thermonuclear Experimental Reactor (ITER)

Over 30 years in the dreaming, the International Thermonuclear Experimental Reactor (ITER) is now said to be halfway through construction. A December 6, 2017 ITER press release (received via email) makes the joyful announcement,

WORLD’S MOST COMPLEX MACHINE IS 50 PERCENT COMPLETED
ITER is proving that fusion is the future source of clean, abundant, safe and economic energy

The International Thermonuclear Experimental Reactor (ITER), a project to prove that fusion power can be produced on a commercial scale and is sustainable, is now 50 percent built to initial operation. Fusion is the same energy source from the Sun that gives the Earth its light and warmth.

ITER will use hydrogen fusion, controlled by superconducting magnets, to produce massive heat energy. In the commercial machines that will follow, this heat will drive turbines to produce electricity with these positive benefits:

* Fusion energy is carbon-free and environmentally sustainable, yet much more powerful than fossil fuels. A pineapple-sized amount of hydrogen offers as much fusion energy as 10,000 tons of coal (see the sketch after this list for a rough check of that claim).

* ITER uses two forms of hydrogen fuel: deuterium, which is easily extracted from seawater; and tritium, which is bred from lithium inside the fusion reactor. The supply of fusion fuel for industry and megacities is abundant, enough for millions of years.

* When the fusion reaction is disrupted, the reactor simply shuts down, safely and without external assistance. Tiny amounts of fuel are used, about 2-3 grams at a time, so there is no physical possibility of a meltdown accident.

* Building and operating a fusion power plant is targeted to be comparable to the cost of a fossil fuel or nuclear fission plant. But unlike today’s nuclear plants, a fusion plant will not have the costs of high-level radioactive waste disposal. And unlike fossil fuel plants, fusion will not have the environmental cost of releasing CO2 and other pollutants.
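
The pineapple claim is checkable, at least roughly. Here is a minimal back-of-envelope sketch in Python; the energy densities are standard textbook values I’m supplying, not figures from the press release, and “pineapple-sized” is taken to mean roughly one kilogram of fuel:

```python
# Rough check of the "pineapple of hydrogen ~ 10,000 tons of coal" claim.
# Assumed inputs (NOT from the press release):
#   - deuterium-tritium fusion releases ~3.4e14 J per kg of fuel
#   - burning coal releases ~2.9e7 J per kg
#   - a "pineapple-sized amount" is taken as ~1 kg of fuel
DT_FUSION_J_PER_KG = 3.4e14
COAL_J_PER_KG = 2.9e7
FUEL_KG = 1.0

coal_equivalent_tonnes = FUEL_KG * DT_FUSION_J_PER_KG / COAL_J_PER_KG / 1000
print(f"~{coal_equivalent_tonnes:,.0f} tonnes of coal")  # prints ~11,724
```

Under those assumptions the answer lands near 12,000 tonnes, the same order of magnitude as the release’s 10,000-ton figure.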

ITER is the most complex science project in human history. The hydrogen plasma will be heated to 150 million degrees Celsius, ten times hotter than the core of the Sun, to enable the fusion reaction. The process happens in a donut-shaped reactor, called a tokamak(*), which is surrounded by giant magnets that confine and circulate the superheated, ionized plasma, away from the metal walls. The superconducting magnets must be cooled to minus 269°C, as cold as interstellar space.

The ITER facility is being built in Southern France by a scientific partnership of 35 countries. ITER’s specialized components, roughly 10 million parts in total, are being manufactured in industrial facilities all over the world. They are subsequently shipped to the ITER worksite, where they must be assembled, piece-by-piece, into the final machine.

Each of the seven ITER members (the European Union, China, India, Japan, Korea, Russia, and the United States) is fabricating a significant portion of the machine. This adds to ITER’s complexity.

In a message dispatched on December 1 [2017] to top-level officials in ITER member governments, the ITER project reported that it had completed 50 percent of the “total construction work scope through First Plasma” (**). First Plasma, scheduled for December 2025, will be the first stage of operation for ITER as a functional machine.

“The stakes are very high for ITER,” writes Bernard Bigot, Ph.D., Director-General of ITER. “When we prove that fusion is a viable energy source, it will eventually replace burning fossil fuels, which are non-renewable and non-sustainable. Fusion will be complementary with wind, solar, and other renewable energies.

“ITER’s success has demanded extraordinary project management, systems engineering, and almost perfect integration of our work.

“Our design has taken advantage of the best expertise of every member’s scientific and industrial base. No country could do this alone. We are all learning from each other, for the world’s mutual benefit.”

The ITER 50 percent milestone is getting significant attention.

“We are fortunate that ITER and fusion has had the support of world leaders, historically and currently,” says Director-General Bigot. “The concept of the ITER project was conceived at the 1985 Geneva Summit between Ronald Reagan and Mikhail Gorbachev. When the ITER Agreement was signed in 2006, it was strongly supported by leaders such as French President Jacques Chirac, U.S. President George W. Bush, and Indian Prime Minister Manmohan Singh.

“More recently, President Macron and U.S. President Donald Trump exchanged letters about ITER after their meeting this past July. One month earlier, President Xi Jinping of China hosted Russian President Vladimir Putin and other world leaders in a showcase featuring ITER and fusion power at the World EXPO in Astana, Kazakhstan.

“We know that other leaders have been similarly involved behind the scenes. It is clear that each ITER member understands the value and importance of this project.”

Why use this complex manufacturing arrangement?

More than 80 percent of the cost of ITER, about $22 billion or EUR 18 billion, is contributed in the form of components manufactured by the partners. Many of these massive components of the ITER machine must be precisely fitted; for example, 17-meter-high magnets with less than a millimeter of tolerance. Each component must be ready on time to fit into the Master Schedule for machine assembly.

Members asked for this deal for three reasons. First, it means that most of the ITER costs paid by any member are actually paid to that member’s companies; the funding stays in-country. Second, the companies working on ITER build new industrial expertise in major fields, such as electromagnetics, cryogenics, robotics, and materials science. Third, this new expertise leads to innovation and spin-offs in other fields.

For example, expertise gained working on ITER’s superconducting magnets is now being used to map the human brain more precisely than ever before.

The European Union is paying 45 percent of the cost; China, India, Japan, Korea, Russia, and the United States each contribute 9 percent. All members share in ITER’s technology; they receive equal access to the intellectual property and innovation that comes from building ITER.
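
As a quick check on that funding arithmetic, here is a minimal Python sketch using only figures quoted in this press release (the roughly EUR 18 billion total mentioned above and the stated percentages); note that the rounded shares sum to 99 percent, an artifact of rounding rather than a missing member:

```python
# Back-of-envelope breakdown of ITER cost shares, using only the
# press release's figures (~EUR 18 billion total, rounded percentages).
TOTAL_EUR_BN = 18.0

shares = {"European Union": 0.45}
for member in ("China", "India", "Japan", "Korea", "Russia", "United States"):
    shares[member] = 0.09

for member, share in shares.items():
    print(f"{member:>14}: {share:4.0%}  ~EUR {share * TOTAL_EUR_BN:.1f} billion")

# The rounded shares total 99%, not 100%, an artifact of rounding.
print(f"Total: {sum(shares.values()):.0%}")
```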

When will commercial fusion plants be ready?

ITER scientists predict that fusion plants will start to come on line as soon as 2040. The exact timing, according to fusion experts, will depend on the level of public urgency and political will that translates to financial investment.

How much power will they provide?

The ITER tokamak will produce 500 megawatts of thermal power. This size is suitable for studying a “burning” or largely self-heating plasma, a state of matter that has never been produced in a controlled environment on Earth. In a burning plasma, most of the plasma heating comes from the fusion reaction itself. Studying the fusion science and technology at ITER’s scale will enable optimization of the plants that follow.

A commercial fusion plant will be designed with a slightly larger plasma chamber, for 10-15 times more electrical power. A 2,000-megawatt fusion electricity plant, for example, would supply 2 million homes.

How much would a fusion plant cost and how many will be needed?

The initial capital cost of a 2,000-megawatt fusion plant will be in the range of $10 billion. These capital costs will be offset by extremely low operating costs, negligible fuel costs, and infrequent component replacement costs over the 60-year-plus life of the plant. Capital costs will decrease with large-scale deployment of fusion plants.

At current electricity usage rates, one fusion plant would be more than enough to power a city the size of Washington, D.C. The entire D.C. metropolitan area could be powered with four fusion plants, with zero carbon emissions.
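
Those household numbers are easy to reproduce. Here is a minimal Python sketch using only the round figures quoted in this release; the per-home load it implies is an average, not peak demand:

```python
# Arithmetic behind "a 2,000-megawatt fusion electricity plant, for example,
# would supply 2 million homes," using the release's round numbers.
PLANT_OUTPUT_MW = 2_000
HOMES_SERVED = 2_000_000
PLANT_COST_USD = 10e9          # "in the range of $10 billion"

# 2,000 MW / 2,000,000 homes = 1 kW average electrical load per home
avg_load_kw = PLANT_OUTPUT_MW * 1_000 / HOMES_SERVED
print(f"Implied average load per home: {avg_load_kw:.1f} kW")

# Capital cost spread across the homes served
print(f"Capital cost per home served: ${PLANT_COST_USD / HOMES_SERVED:,.0f}")
```

By this arithmetic, each plant works out to about $5,000 of capital cost per home served, before the low operating costs the release describes.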

“If fusion power becomes universal, the use of electricity could be expanded greatly, to reduce the greenhouse gas emissions from transportation, buildings and industry,” predicts Dr. Bigot. “Providing clean, abundant, safe, economic energy will be a miracle for our planet.”

*     *     *

FOOTNOTES:

* “Tokamak” is a word of Russian origin meaning a toroidal or donut-shaped magnetic chamber. Tokamaks have been built and operated for the past six decades. They are today’s most advanced fusion device design.

** “Total construction work scope,” as used in ITER’s project performance metrics, includes design, component manufacturing, building construction, shipping and delivery, assembly, and installation.

It is an extraordinary project on many levels, as Henry Fountain notes in a March 27, 2017 article for the New York Times (Note: Links have been removed),

At a dusty construction site here amid the limestone ridges of Provence, workers scurry around immense slabs of concrete arranged in a ring like a modern-day Stonehenge.

It looks like the beginnings of a large commercial power plant, but it is not. The project, called ITER, is an enormous, and enormously complex and costly, physics experiment. But if it succeeds, it could determine the power plants of the future and make an invaluable contribution to reducing planet-warming emissions.

ITER, short for International Thermonuclear Experimental Reactor (and pronounced EAT-er), is being built to test a long-held dream: that nuclear fusion, the atomic reaction that takes place in the sun and in hydrogen bombs, can be controlled to generate power.

ITER will produce heat, not electricity. But if it works — if it produces more energy than it consumes, which smaller fusion experiments so far have not been able to do — it could lead to plants that generate electricity without the climate-affecting carbon emissions of fossil-fuel plants or most of the hazards of existing nuclear reactors that split atoms rather than join them.

Success, however, has always seemed just a few decades away for ITER. The project has progressed in fits and starts for years, plagued by design and management problems that have led to long delays and ballooning costs.

ITER is moving ahead now, with a director-general, Bernard Bigot, who took over two years ago after an independent analysis that was highly critical of the project. Dr. Bigot, who previously ran France’s atomic energy agency, has earned high marks for resolving management problems and developing a realistic schedule based more on physics and engineering and less on politics.

The site here is now studded with tower cranes as crews work on the concrete structures that will support and surround the heart of the experiment, a doughnut-shaped chamber called a tokamak. This is where the fusion reactions will take place, within a plasma, a roiling cloud of ionized atoms so hot that it can be contained only by extremely strong magnetic fields.

Here’s a rendering of the proposed reactor,

Source: ITER Organization

It seems the folks at the New York Times decided to remove the notes which help make sense of this image. However, it does get the idea across.

If I read the article rightly, the official cost in March 2017 was around EUR 22 billion and more will likely be needed. You can read Fountain’s article for more information about fusion and ITER or go to the ITER website.

I could have sworn a local (Vancouver area) company called General Fusion was involved in the ITER project but I can’t track down any sources for confirmation. The sole connection I could find is in a documentary about fusion technology.

Here’s a little context for the film from a July 4, 2017 General Fusion news release (Note: A link has been removed),

A new documentary featuring General Fusion has captured the exciting progress in fusion across the public and private sectors.

Let There Be Light made its international premiere at the South By Southwest (SXSW) music and film festival in March [2017] to critical acclaim. The film was quickly purchased by Amazon Video, where it will be available for more than 70 million users to stream.

Let There Be Light follows scientists at General Fusion, ITER and Lawrenceville Plasma Physics in their pursuit of a clean, safe and abundant source of energy to power the world.

The feature length documentary has screened internationally across Europe and North America. Most recently it was shown at the Hot Docs film festival in Toronto, where General Fusion founder and Chief Scientist Dr. Michel Laberge joined fellow fusion physicist Dr. Mark Henderson from ITER at a series of Q&A panels with the filmmakers.

Laberge and Henderson were also interviewed by the popular CBC radio science show Quirks and Quarks, discussing different approaches to fusion, its potential benefits, and the challenges it faces.

It has yet to be confirmed when the film will be released for streaming; check Amazon Video for details.

You can find out more about General Fusion here.

Brief final comment

ITER is a breathtaking effort but if you’ve read about other large-scale projects such as building a railway across the Canadian Rocky Mountains, establishing telecommunications in an astonishing number of countries around the world, getting someone to the moon, eliminating smallpox, building the pyramids, etc., this mix of triumph and trouble seems to be standard operating procedure, both for the successes I’ve described and for the failures we’ve forgotten. Where ITER will finally rest on the continuum between success and failure is yet to be determined, but the problems experienced so far are not necessarily a predictor.

I wish the engineers, scientists, visionaries, and others great success with finding better ways to produce energy.

Gold nanoparticles used to catalyze biofuel waste and create a useful additive

This work is the result of an international collaboration including Russia (from a May 23, 2017 news item on Nanowerk),

Gold nanoparticles serve as catalysts for obtaining valuable chemical products based on glycerol. Scientists from Tomsk Polytechnic University and their international colleagues are developing gold catalysts to recycle one of the main byproducts of biofuel production. The obtained products are in high demand in medicine, agriculture, cosmetic industry and other sectors.

Scientists from the University of Milano (Italy), the National Autonomous University of Mexico, the Institute of Catalysis and Petrochemistry of Madrid (Spain) and the University of Porto (Portugal) take part in the study of gold nanoparticles.

A May 23, 2017 Tomsk Polytechnic University press release, which originated the news item, expands on the theme,

‘Today the production of biofuels is an important area in many countries. They can be obtained from a great variety of biomasses. In Latin America it is orange and tangerine peel as well as banana skin. In the USA biofuels are produced from corn, and in the central part of Russia and Europe from rapeseed (Brassica napus). When processing these plants into biofuels, a large amount of glycerol is formed. Its esters constitute the basis of oils and fats. Glycerol is widely used in the cosmetic industry as a product in its own right. However, much more glycerol is obtained in the production of biofuels – many thousands of tons a year. As a result, unused glycerol merely becomes waste,’ says Alexey Pestryakov, Head of the Department of Physical and Analytical Chemistry, describing the problem. ‘Now, a lot of research groups are engaged in this issue as to how to transform excess glycerol into other useful products. Along with our foreign colleagues, we have proposed catalysts based on gold nanoparticles.’

The authors of the research note that catalytic oxidation on gold is one of the most effective techniques to obtain from glycerol such useful products as aldehydes, esters, carboxylic acids and other substances.

‘All these substances are products of fine organic chemistry and are in demand in a wide range of industries, first of all, in the pharmaceutical and cosmetic industries. In agriculture they are applied as part of different feed additives, veterinary drugs, fertilizers, plant treatment products, etc.

Thus, unused glycerol, after being processed, will find further application,’ sums up Alexey Pestryakov.

Gold catalysts are super active. They can enter into chemical reactions with other substances at room temperature (other catalysts need to be heated), and in some cases even below zero. However, gold can be a catalyst only at the nanolevel.

‘If you take a piece of gold, even very tiny, there will be no chemical reaction. In order to make gold become chemically active, the size of its particle should be less than two nanometers. Only then it gets its amazing properties,’ explains the scientist.

As a catalyst gold was discovered not so long ago, in the early 1990s, by Japanese chemists.

To date, TPU scientists and their colleagues are not the only ones who develop such catalysts.

Unlike their counterparts, the gold catalysts developed at TPU are more stable (they retain their activity longer).

‘A great challenge in this area is that gold catalysts are very rapidly deactivated, not only during work, but even during storage. Our objective is to ensure their longer shelf life. It is also important to use oxygen as an oxidizer, since toxic and corrosive peroxide compounds are often used for such purposes,’ says Alexey Pestryakov.

Here’s a link to and a citation for the paper,

More Insights into Support and Preparation Method Effects in Gold Catalyzed Glycerol Oxidation by Nina Bogdanchikova, Inga Tuzovskaya, Laura Prati, Alberto Villa, Alexey Pestryakov, Mario Farías. Current Organic Synthesis, Volume 14, Issue 3, 2017, Pages 377–382, DOI: 10.2174/1570179413666161031114833

This paper is behind a paywall. (Scroll down the page to find the article.)

‘Mother of all bombs’ is a nanoweapon?

According to physicist Louis A. Del Monte, in an April 14, 2017 opinion piece for HuffingtonPost.com, the ‘mother of all bombs’ is a nanoweapon (Note: Links have been removed),

The United States military dropped its largest non-nuclear bomb, the GBU-43/B Massive Ordnance Air Blast Bomb (MOAB), nicknamed the “mother of all bombs,” on an ISIS cave and tunnel complex in the Achin District of the Nangarhar province, Afghanistan [on Thursday, April 13, 2017]. The Achin District is the center of ISIS activity in Afghanistan. This was the first use in combat of the GBU-43/B Massive Ordnance Air Blast (MOAB).

… Although it carries only about 8 tons of explosives, the explosive mixture delivers a destructive impact equivalent of 11 tons of TNT.

There is little doubt the United States Department of Defense is likely using nanometals, such as nanoaluminum (alternately spelled nano-aluminum) mixed with TNT, to enhance the detonation properties of the MOAB. The use of nanoaluminum mixed with TNT has been known to boost the explosive power of TNT since the early 2000s. If true, this means that the largest known United States non-nuclear bomb is a nanoweapon. When most of us think about nanoweapons, we think small, essentially invisible weapons, like nanobots (i.e., tiny robots made using nanotechnology). That can often be the case. But, as defined in my recent book, Nanoweapons: A Growing Threat to Humanity (Potomac 2017), “Nanoweapons are any military technology that exploits the power of nanotechnology.” This means even the largest munition, such as the MOAB, is a nanoweapon if it uses nanotechnology.

… The explosive is H6, which is a mixture of five ingredients (by weight):

  • 44.0% RDX & nitrocellulose (RDX is a well know explosive, more powerful that TNT, often used with TNT and other explosives. Nitrocellulose is a propellant or low-order explosive, originally known as gun-cotton.)
  • 29.5% TNT
  • 21.0% powdered aluminum
  • 5.0% paraffin wax as a phlegmatizing (i.e., stabilizing) agent.
  • 0.5% calcium chloride (to absorb moisture and eliminate the production of gas)

Note, the TNT and powdered aluminum account for over half the explosive payload by weight. It is highly likely that the “powdered aluminum” is nanoaluminum, since nanoaluminum can enhance the destructive properties of TNT. This argues that H6 is a nano-enhanced explosive, making the MOAB a nanoweapon.
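
Taking Del Monte’s figures at face value, a quick Python sanity check confirms the ‘over half’ claim and the MOAB’s TNT-equivalence factor:

```python
# Sanity check of the H6 recipe and MOAB figures quoted above.
h6_composition = {                  # percent by weight
    "RDX & nitrocellulose": 44.0,
    "TNT": 29.5,
    "powdered aluminum": 21.0,
    "paraffin wax": 5.0,
    "calcium chloride": 0.5,
}
assert abs(sum(h6_composition.values()) - 100.0) < 1e-9  # adds up to 100%

tnt_plus_al = h6_composition["TNT"] + h6_composition["powdered aluminum"]
print(f"TNT + aluminum: {tnt_plus_al}% of the mix")  # 50.5%, i.e. "over half"

# ~8 tons of explosive delivering the equivalent of 11 tons of TNT
print(f"MOAB TNT-equivalence factor: {11 / 8:.3f}")  # 1.375
```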

The United States GBU-43/B Massive Ordnance Air Blast Bomb (MOAB) was the largest non-nuclear bomb known until Russia detonated the Aviation Thermobaric Bomb of Increased Power, termed the “father of all bombs” (FOAB), in 2007. It is reportedly four times more destructive than the MOAB, even though it carries only 7 tons of explosives versus the 8 tons of the MOAB. Interestingly, the Russians claim to achieve the more destructive punch using nanotechnology.

If you have the time, I encourage you to read the piece in its entirety.

Nanozymes as an antidote for pesticides

Should you have concerns about exposure to pesticides or chemical warfare agents (timely given events in Syria as per this April 4, 2017 news item on CBC [Canadian Broadcasting Corporation] online), scientists at Lomonosov Moscow State University have developed a possible antidote according to a March 8, 2017 news item on phys.org,

Members of the Faculty of Chemistry of the Lomonosov Moscow State University have developed novel nanosized agents that could be used as efficient protective and antidote modalities against the impact of neurotoxic organophosphorus compounds such as pesticides and chemical warfare agents. …

A March 7, 2017 Lomonosov Moscow State University press release on EurekAlert, which originated the news item, describes the work in detail,

A group of scientists from the Faculty of Chemistry under the leadership of Prof. Alexander Kabanov has focused their research, supported by a “megagrant”, on nanoparticle-based delivery of enzymes capable of destroying toxic organophosphorous compounds. Development of the first nanosized drugs started more than 30 years ago, and by the 1990s the first nanomedicines for cancer treatment had entered the market. These first medicines were based on liposomes – spherical vesicles made of lipid bilayers. The new technology, developed by Kabanov and his colleagues, uses an enzyme, synthesized at the Lomonosov Moscow State University, encapsulated into a biodegradable polymer coat based on an amino acid (glutamic acid).

Alexander Kabanov, Doctor of Chemistry, Professor at the Eshelman School of Pharmacy of the University of North Carolina (USA) and the Faculty of Chemistry, M. V. Lomonosov Moscow State University, and one of the authors of the article, explains: “At the end of the ’80s my team (at that time in Moscow) and, independently, Japanese colleagues led by Prof. Kazunori Kataoka from Tokyo began using polymer micelles for small molecule delivery. Soon the nanomedicine field ‘exploded’. Currently hundreds of laboratories across the globe work in this area, applying a wide variety of approaches to the creation of such nanosized agents. A medicine on the basis of polymeric micelles, developed by the Korean company Samyang Biopharm, was approved for human use in 2006.”

After moving to the USA in 1994, Professor Kabanov’s team focused on the development of polymer micelles that could incorporate biopolymers through electrostatic interactions. Initially the chemists were interested in using micelles for RNA and DNA delivery, but later they began actively applying this approach to the delivery of proteins and, namely, enzymes to the brain and other organs.

Alexander Kabanov says: “At the time I worked at the University of Nebraska Medical Center in Omaha (USA), and by 2010 we had a lot of results in this area. That’s why, when my colleague from the Chemical Enzymology Department of the Lomonosov Moscow State University, Prof. Natalia Klyachko, suggested I apply for a megagrant, the research theme of the new laboratory was quite obvious: to use our delivery approach, which we’ve called a “nanozyme”, for the “improvement” of enzymes developed by colleagues at the Lomonosov Moscow State University for further medical application.”

The scientists, together with a group of enzymologists from the Lomonosov Moscow State University under the leadership of Elena Efremenko, Doctor of Biological Sciences, chose organophosphorus hydrolase as one of the enzymes to be delivered. Organophosphorus hydrolase is capable of degrading toxic pesticides and chemical warfare agents at a very high rate. However, it has disadvantages: because of its bacterial origin, it provokes an immune response when delivered to a mammalian organism. Moreover, organophosphorus hydrolase is quickly removed from the body. The chemists have solved this problem with the help of a “self-assembly” approach: as a result of including the organophosphorus hydrolase enzyme in nanozyme particles, the immune response becomes weaker and, on the contrary, both the storage stability of the enzyme and its lifetime after delivery to an organism considerably increase. Rat experiments have proved that such a nanozyme efficiently protects organisms against lethal doses of highly toxic pesticides and even chemical warfare agents, such as VX nerve gas.

Alexander Kabanov summarizes: “The simplicity of our approach is very important. You can get an organophosphorus hydrolase nanozyme by simple mixing of aqueous solutions of an enzyme and a safe biocompatible polymer. The nanozyme self-assembles due to electrostatic interaction between the protein (enzyme) and the polymer”.

According to the scientist, the simplicity and technological effectiveness of the approach, along with the promising results of the animal experiments, give hope that this modality could also be successful in clinical use.

Members of the Faculty of Chemistry of the Lomonosov Moscow State University, along with scientists from the 27th Central Research Institute of the Ministry of Defense of the Russian Federation, the Eshelman School of Pharmacy of the University of North Carolina at Chapel Hill (USA) and the University of Nebraska Medical Center (UNMC), have taken part in the Project.

Here’s a link to and a citation for the paper,

A simple and highly effective catalytic nanozyme scavenger for organophosphorus neurotoxins by Elena N. Efremenko, Ilya V. Lyagin, Natalia L. Klyachko, Tatiana Bronich, Natalia V. Zavyalova, Yuhang Jiang, Alexander V. Kabanov. Journal of Controlled Release, Volume 247, 10 February 2017, Pages 175–181, http://dx.doi.org/10.1016/j.jconrel.2016.12.037

This paper is behind a paywall.

China, US, and the race for artificial intelligence research domination

John Markoff and Matthew Rosenberg have written a fascinating analysis of the competition between the US and China regarding technological advances, specifically in the field of artificial intelligence. While the focus of the Feb. 3, 2017 NY Times article is military, the authors make it easy to extrapolate and apply the concepts to other sectors,

Robert O. Work, the veteran defense official retained as deputy secretary by President Trump, calls them his “A.I. dudes.” The breezy moniker belies their serious task: The dudes have been a kitchen cabinet of sorts, and have advised Mr. Work as he has sought to reshape warfare by bringing artificial intelligence to the battlefield.

Last spring, he asked, “O.K., you guys are the smartest guys in A.I., right?”

No, the dudes told him, “the smartest guys are at Facebook and Google,” Mr. Work recalled in an interview.

Now, increasingly, they’re also in China. The United States no longer has a strategic monopoly on the technology, which is widely seen as the key factor in the next generation of warfare.

The Pentagon’s plan to bring A.I. to the military is taking shape as Chinese researchers assert themselves in the nascent technology field. And that shift is reflected in surprising commercial advances in artificial intelligence among Chinese companies. [emphasis mine]

Having read Marshall McLuhan (de rigueur for any Canadian pursuing a [sociology-based] degree in communications anytime from the 1960s into the late 1980s [at least]), I took the movement of technology from military research to consumer applications as the standard pattern. Television is a classic example but there are many others including modern plastic surgery. The first time I encountered the reverse (consumer-based technology being adopted by the military) was in a 2004 exhibition, “Massive Change: The Future of Global Design,” produced by Bruce Mau for the Vancouver (Canada) Art Gallery.

Markoff and Rosenberg develop their thesis further (Note: Links have been removed),

Last year, for example, Microsoft researchers proclaimed that the company had created software capable of matching human skills in understanding speech.

Although they boasted that they had outperformed their United States competitors, a well-known A.I. researcher who leads a Silicon Valley laboratory for the Chinese web services company Baidu gently taunted Microsoft, noting that Baidu had achieved similar accuracy with the Chinese language two years earlier.

That, in a nutshell, is the challenge the United States faces as it embarks on a new military strategy founded on the assumption of its continued superiority in technologies such as robotics and artificial intelligence.

First announced last year by Ashton B. Carter, President Barack Obama’s defense secretary, the “Third Offset” strategy provides a formula for maintaining a military advantage in the face of a renewed rivalry with China and Russia.

As consumer electronics manufacturing has moved to Asia, both Chinese companies and the nation’s government laboratories are making major investments in artificial intelligence.

The advance of the Chinese was underscored last month when Qi Lu, a veteran Microsoft artificial intelligence specialist, left the company to become chief operating officer at Baidu, where he will oversee the company’s ambitious plan to become a global leader in A.I.

The authors note some recent military moves (Note: Links have been removed),

In August [2016], the state-run China Daily reported that the country had embarked on the development of a cruise missile system with a “high level” of artificial intelligence. The new system appears to be a response to a missile the United States Navy is expected to deploy in 2018 to counter growing Chinese military influence in the Pacific.

Known as the Long Range Anti-Ship Missile, or L.R.A.S.M., it is described as a “semiautonomous” weapon. According to the Pentagon, this means that though targets are chosen by human soldiers, the missile uses artificial intelligence technology to avoid defenses and make final targeting decisions.

The new Chinese weapon typifies a strategy known as “remote warfare,” said John Arquilla, a military strategist at the Naval Postgraduate School in Monterey, Calif. The idea is to build large fleets of small ships that deploy missiles, to attack an enemy with larger ships, like aircraft carriers.

“They are making their machines more creative,” he said. “A little bit of automation gives the machines a tremendous boost.”

Whether or not the Chinese will quickly catch the United States in artificial intelligence and robotics technologies is a matter of intense discussion and disagreement in the United States.

Markoff and Rosenberg return to the world of consumer electronics as they finish their article on AI and the military (Note: Links have been removed),

Moreover, while there appear to be relatively cozy relationships between the Chinese government and commercial technology efforts, the same cannot be said about the United States. The Pentagon recently restarted its beachhead in Silicon Valley, known as the Defense Innovation Unit Experimental facility, or DIUx. It is an attempt to rethink bureaucratic United States government contracting practices in terms of the faster and more fluid style of Silicon Valley.

The government has not yet undone the damage to its relationship with the Valley brought about by Edward J. Snowden’s revelations about the National Security Agency’s surveillance practices. Many Silicon Valley firms remain hesitant to be seen as working too closely with the Pentagon out of fear of losing access to China’s market.

“There are smaller companies, the companies who sort of decided that they’re going to be in the defense business, like a Palantir,” said Peter W. Singer, an expert in the future of war at New America, a think tank in Washington, referring to the Palo Alto, Calif., start-up founded in part by the venture capitalist Peter Thiel. “But if you’re thinking about the big, iconic tech companies, they can’t become defense contractors and still expect to get access to the Chinese market.”

Those concerns are real for Silicon Valley.

If you have the time, I recommend reading the article in its entirety.

Impact of the US regime on thinking about AI?

A March 24, 2017 article by Daniel Gross for Slate.com hints that at least one high-level official in the Trump administration may be a little naïve in his understanding of AI and its impending impact on US society (Note: Links have been removed),

Treasury Secretary Steven Mnuchin is a sharp guy. He’s a (legacy) alumnus of Yale and Goldman Sachs, did well on Wall Street, and was a successful movie producer and bank investor. He’s good at, and willing to, put other people’s money at risk alongside some of his own. While he isn’t the least qualified person to hold the post of treasury secretary in 2017, he’s far from the best qualified. For in his 54 years on this planet, he hasn’t expressed or displayed much interest in economic policy, or in grappling with the big picture macroeconomic issues that are affecting our world. It’s not that he is intellectually incapable of grasping them; they just haven’t been in his orbit.

Which accounts for the inanity he uttered at an Axios breakfast Friday morning about the impact of artificial intelligence on jobs.

“It’s not even on our radar screen…. 50-100 more years” away, he said. “I’m not worried at all” about robots displacing humans in the near future, he said, adding: “In fact I’m optimistic.”

A.I. is already affecting the way people work, and the work they do. (In fact, I’ve long suspected that Mike Allen, Mnuchin’s Axios interlocutor, is powered by A.I.) I doubt Mnuchin has spent much time in factories, for example. But if he did, he’d see that machines and software are increasingly doing the work that people used to do. They’re not just moving goods through an assembly line, they’re soldering, coating, packaging, and checking for quality. Whether you’re visiting a GE turbine plant in South Carolina, or a cable-modem factory in Shanghai, the thing you’ll notice is just how few people there actually are. It’s why, in the U.S., manufacturing output rises every year while manufacturing employment is essentially stagnant. It’s why it is becoming conventional wisdom that automation is destroying more manufacturing jobs than trade. And now we are seeing the prospect of dark factories, which can run without lights because there are no people in them, starting to become a reality. The integration of A.I. into factories is one of the reasons Trump’s promise to bring back manufacturing employment is absurd. You’d think his treasury secretary would know something about that.

It goes far beyond manufacturing, of course. Programmatic advertising buying, Spotify’s recommendation engines, chatbots on customer service websites, Uber’s dispatching system—all of these are examples of A.I. doing the work that people used to do. …

Adding to Mnuchin’s lack of credibility on the topic of jobs and robots/AI, Matthew Rozsa’s March 28, 2017 article for Salon.com features a study from the US National Bureau of Economic Research (Note: Links have been removed),

A new study by the National Bureau of Economic Research shows that every fully autonomous robot added to an American factory has reduced employment by an average of 6.2 workers, according to a report by BuzzFeed. The study also found that for every fully autonomous robot per thousand workers, the employment rate dropped by 0.18 to 0.34 percentage points and wages fell by 0.25 to 0.5 percentage points.
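
To see what those estimates mean in practice, here is a minimal sketch applying them to a hypothetical labor market. The workforce size and robot density below are invented purely for illustration; the coefficients are the study’s, and note that the per-robot figure and the percentage-point figures come from different levels of analysis in the paper:

```python
# Illustrative application of the NBER estimates quoted above.
# WORKFORCE and ROBOTS_PER_THOUSAND are hypothetical inputs.
WORKFORCE = 100_000
ROBOTS_PER_THOUSAND = 2.0
robots = int(ROBOTS_PER_THOUSAND * WORKFORCE / 1_000)  # 200 robots

# Estimate 1: each fully autonomous robot reduces employment by ~6.2 workers
print(f"{robots} robots -> ~{robots * 6.2:,.0f} fewer jobs")

# Estimate 2: per robot per thousand workers, the employment rate drops
# 0.18-0.34 percentage points and wages fall 0.25-0.5 percentage points
emp_lo, emp_hi = ROBOTS_PER_THOUSAND * 0.18, ROBOTS_PER_THOUSAND * 0.34
wage_lo, wage_hi = ROBOTS_PER_THOUSAND * 0.25, ROBOTS_PER_THOUSAND * 0.50
print(f"Employment rate: -{emp_lo:.2f} to -{emp_hi:.2f} percentage points")
print(f"Wages: -{wage_lo:.2f} to -{wage_hi:.2f} percentage points")
```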

I can’t help wondering whether the US Secretary of the Treasury’s obliviousness to what is going on in the workplace is representative of other top-tier officials such as the Secretary of Defense, Secretary of Labor, etc. What is going to happen to US research in fields such as robotics and AI?

I have two more questions: in future, what happens to research which contradicts or makes a top-tier Trump government official look foolish? Will it be suppressed?

You can find the report “Robots and Jobs: Evidence from US Labor Markets” by Daron Acemoglu and Pascual Restrepo. NBER (US National Bureau of Economic Research) WORKING PAPER SERIES (Working Paper 23285) released March 2017 here. The introduction featured some new information for me; the term ‘technological unemployment’ was introduced in 1930 by John Maynard Keynes.

Moving from a wholly US-centric view of AI

Naturally, in a discussion about AI, it’s all about the US and the country considered its chief science rival, China, with a mention of its old rival, Russia. Europe did rate a mention, albeit as a totality. Having recently found out that Canadians were pioneers in a very important aspect of AI, machine learning, I feel obliged to mention it. You can find more about Canadian AI efforts in my March 24, 2017 posting (scroll down about 40% of the way) where you’ll find a very brief history and mention of the funding for the newly launched Pan-Canadian Artificial Intelligence Strategy.

If any of my readers have information about AI research efforts in other parts of the world, please feel free to write them up in the comments.