Tag Archives: Russia

Are copper nanoparticles good candidates for synthesizing medicine?

This research appears to be a collaboration between Russian and Indian scientists. From a December 5, 2017 news item on Nanowerk (Note: A link has been removed),

Chemists at Ural Federal University, together with colleagues from India, demonstrated the effectiveness of copper nanoparticles as a catalyst through an analysis of 48 organic synthesis reactions (Coordination Chemistry Reviews, “Copper nanoparticles as inexpensive and efficient catalyst: A valuable contribution in organic synthesis”).

One of the advantages of the catalyst is its insolubility in traditional organic solvents. This makes copper nanoparticles a valuable alternative to heavy metal catalysts, for example palladium, which is currently used for the synthesis of many pharmaceuticals and is toxic for cells.

“Copper nanoparticles are an ideal variant of a heterophasic catalyst, since they exist in a wide variety of geometric shapes and sizes, which directly affects the effective mass-transfer surface, so reactions in the presence of this catalyst are characterized by shorter reaction times, better selectivity and higher yields,” says co-author Grigory Zyryanov, Doctor of Chemistry, Associate Professor of the Department of Organic and Biomolecular Chemistry of UrFU.

A December 11, 2017 (there can be a gap between distributing a press release and posting it on the home website) Ural Federal University press release, which originated the news item, makes the case for copper nanoparticles as catalytic agents,

Copper nanoparticles are inexpensive, since there are many simple ways to obtain them from cheap raw materials and these methods are constantly being refined. As a result, it is possible to obtain a highly porous catalyst structure based on copper nanoparticles with pore sizes from several tens to several hundred nanometers. Due to the small particle size, the catalytic surface area is enormous. Moreover, because copper nanoparticles are insoluble, the reactions they catalyze proceed on the surface of the catalyst. After the reaction is completed, the copper nanoparticles, which do not interact with the solvents, are easily removed, which guarantees the absence of catalyst admixture in the final product. These catalysts are already in demand for organic synthesis by the methods of “green chemistry,” whose main principles are simplicity, low cost, safety of production, and recyclability of the catalysts.

One of the promising areas of application for the copper nanoparticle catalyst is, first of all, the creation of medical products using cross-coupling reactions. In 2010, the Nobel Prize in Chemistry was awarded for work in the field of palladium-catalyzed cross-coupling reactions to scientists from Japan and the USA: Richard Heck, Ei-ichi Negishi and Akira Suzuki. Despite worldwide recognition, palladium-catalyzed cross-coupling reactions are undesirable for the synthesis of most medications due to the toxicity of palladium to living cells and the lack of methods for reliably removing palladium traces from the final product. In addition to toxicity, the high cost of catalysts based on palladium, as well as of platinum, another catalyst used for pharmaceuticals, makes the use of copper nanoparticles economically and environmentally justified.

Here’s a link to and a citation for the paper,

Copper nanoparticles as inexpensive and efficient catalyst: A valuable contribution in organic synthesis by Nisha Kant Ojha, Grigory V. Zyryanov, Adinath Majee, Valery N. Charushin, Oleg N. Chupakhin, Sougata Santra. Coordination Chemistry Reviews Volume 353, 15 December 2017, Pages 1-57 https://doi.org/10.1016/j.ccr.2017.10.004

This paper is behind a paywall.

Editing the genome with CRISPR (clustered regularly interspaced short palindromic repeats)-carrying nanoparticles

MIT (Massachusetts Institute of Technology) researchers have developed a new nonviral means of delivering CRISPR (clustered regularly interspaced short palindromic repeats)-Cas9 gene therapy, according to a November 13, 2017 news item on Nanowerk,

In a new study, MIT researchers have developed nanoparticles that can deliver the CRISPR genome-editing system and specifically modify genes in mice. The team used nanoparticles to carry the CRISPR components, eliminating the need to use viruses for delivery.

Using the new delivery technique, the researchers were able to cut out certain genes in about 80 percent of liver cells, the best success rate ever achieved with CRISPR in adult animals.

In a new study, MIT researchers have developed nanoparticles that can deliver the CRISPR genome-editing system and specifically modify genes, eliminating the need to use viruses for delivery. Image: MIT News

A November 13, 2017 MIT news release (also on EurekAlert), which originated the news item, provides more details about the research and a good description of and comparison between using a viral system and using a nanoparticle-based system to deliver CRISPR-CAS9,

“What’s really exciting here is that we’ve shown you can make a nanoparticle that can be used to permanently and specifically edit the DNA in the liver of an adult animal,” says Daniel Anderson, an associate professor in MIT’s Department of Chemical Engineering and a member of MIT’s Koch Institute for Integrative Cancer Research and Institute for Medical Engineering and Science (IMES).

One of the genes targeted in this study, known as Pcsk9, regulates cholesterol levels. Mutations in the human version of the gene are associated with a rare disorder called dominant familial hypercholesterolemia, and the FDA recently approved two antibody drugs that inhibit Pcsk9. However, these antibodies need to be taken regularly, and for the rest of the patient’s life, to provide therapy. The new nanoparticles permanently edit the gene following a single treatment, and the technique also offers promise for treating other liver disorders, according to the MIT team.

Anderson is the senior author of the study, which appears in the Nov. 13 [2017] issue of Nature Biotechnology. The paper’s lead author is Koch Institute research scientist Hao Yin. Other authors include David H. Koch Institute Professor Robert Langer of MIT, professors Victor Koteliansky and Timofei Zatsepin of the Skolkovo Institute of Science and Technology [Russia], and Professor Wen Xue of the University of Massachusetts Medical School.

Targeting disease

Many scientists are trying to develop safe and efficient ways to deliver the components needed for CRISPR, which consists of a DNA-cutting enzyme called Cas9 and a short RNA that guides the enzyme to a specific area of the genome, directing Cas9 where to make its cut.

In most cases, researchers rely on viruses to carry the gene for Cas9, as well as the RNA guide strand. In 2014, Anderson, Yin, and their colleagues developed a nonviral delivery system in the first-ever demonstration of curing a disease (the liver disorder tyrosinemia) with CRISPR in an adult animal. However, this type of delivery requires a high-pressure injection, a method that can also cause some damage to the liver.

Later, the researchers showed they could deliver the components without the high-pressure injection by packaging messenger RNA (mRNA) encoding Cas9 into a nanoparticle instead of a virus. Using this approach, in which the guide RNA was still delivered by a virus, the researchers were able to edit the target gene in about 6 percent of hepatocytes, which is enough to treat tyrosinemia.

While that delivery technique holds promise, in some situations it would be better to have a completely nonviral delivery system, Anderson says. One consideration is that once a particular virus is used, the patient will develop antibodies to it, so it couldn’t be used again. Also, some patients have pre-existing antibodies to the viruses being tested as CRISPR delivery vehicles.

In the new Nature Biotechnology paper, the researchers came up with a system that delivers both Cas9 and the RNA guide using nanoparticles, with no need for viruses. To deliver the guide RNAs, they first had to chemically modify the RNA to protect it from enzymes in the body that would normally break it down before it could reach its destination.

The researchers analyzed the structure of the complex formed by Cas9 and the RNA guide, or sgRNA, to figure out which sections of the guide RNA strand could be chemically modified without interfering with the binding of the two molecules. Based on this analysis, they created and tested many possible combinations of modifications.

“We used the structure of the Cas9 and sgRNA complex as a guide and did tests to figure out that we can modify as much as 70 percent of the guide RNA,” Yin says. “We could heavily modify it and not affect the binding of sgRNA and Cas9, and this enhanced modification really enhances activity.”

Reprogramming the liver

The researchers packaged these modified RNA guides (which they call enhanced sgRNA) into lipid nanoparticles, which they had previously used to deliver other types of RNA to the liver, and injected them into mice along with nanoparticles containing mRNA that encodes Cas9.

They experimented with knocking out a few different genes expressed by hepatocytes, but focused most of their attention on the cholesterol-regulating Pcsk9 gene. The researchers were able to eliminate this gene in more than 80 percent of liver cells, and the Pcsk9 protein was undetectable in these mice. They also found a 35 percent drop in the total cholesterol levels of the treated mice.

The researchers are now working on identifying other liver diseases that might benefit from this approach, and advancing these approaches toward use in patients.

“I think having a fully synthetic nanoparticle that can specifically turn genes off could be a powerful tool not just for Pcsk9 but for other diseases as well,” Anderson says. “The liver is a really important organ and also is a source of disease for many people. If you can reprogram the DNA of your liver while you’re still using it, we think there are many diseases that could be addressed.”

“We are very excited to see this new application of nanotechnology open new avenues for gene editing,” Langer adds.

The research was funded by the National Institutes of Health (NIH), the Russian Scientific Fund, the Skoltech Center, and the Koch Institute Support (core) Grant from the National Cancer Institute.

Here’s a link to and a citation for the paper,

Structure-guided chemical modification of guide RNA enables potent non-viral in vivo genome editing by Hao Yin, Chun-Qing Song, Sneha Suresh, Qiongqiong Wu, Stephen Walsh, Luke Hyunsik Rhym, Esther Mintzer, Mehmet Fatih Bolukbasi, Lihua Julie Zhu, Kevin Kauffman, Haiwei Mou, Alicia Oberholzer, Junmei Ding, Suet-Yan Kwan, Roman L Bogorad, Timofei Zatsepin, Victor Koteliansky, Scot A Wolfe, Wen Xue, Robert Langer, & Daniel G Anderson. Nature Biotechnology doi:10.1038/nbt.4005 Published online: 13 November 2017

This paper is behind a paywall.

Limitless energy and the International Thermonuclear Experimental Reactor (ITER)

Over 30 years in the dreaming, the International Thermonuclear Experimental Reactor (ITER) is now said to be halfway to completing construction. A December 6, 2017 ITER press release (received via email) makes the joyful announcement,

WORLD’S MOST COMPLEX MACHINE IS 50 PERCENT COMPLETED
ITER is proving that fusion is the future source of clean, abundant, safe and economic energy

The International Thermonuclear Experimental Reactor (ITER), a project to prove that fusion power can be produced sustainably on a commercial scale, is now 50 percent built to initial operation. Fusion is the same energy source that powers the Sun, giving the Earth its light and warmth.

ITER will use hydrogen fusion, controlled by superconducting magnets, to produce massive heat energy. In the commercial machines that will follow, this heat will drive turbines to produce electricity with these positive benefits:

* Fusion energy is carbon-free and environmentally sustainable, yet much more powerful than fossil fuels. A pineapple-sized amount of hydrogen offers as much fusion energy as 10,000 tons of coal.

* ITER uses two forms of hydrogen fuel: deuterium, which is easily extracted from seawater; and tritium, which is bred from lithium inside the fusion reactor. The supply of fusion fuel for industry and megacities is abundant, enough for millions of years.

* When the fusion reaction is disrupted, the reactor simply shuts down, safely and without external assistance. Tiny amounts of fuel are used, about 2-3 grams at a time, so there is no physical possibility of a meltdown accident.

* Building and operating a fusion power plant is targeted to be comparable to the cost of a fossil fuel or nuclear fission plant. But unlike today’s nuclear plants, a fusion plant will not have the costs of high-level radioactive waste disposal. And unlike fossil fuel plants, fusion will not have the environmental cost of releasing CO2 and other pollutants.
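The pineapple-versus-coal comparison in the list above can be sanity-checked with back-of-envelope arithmetic. The sketch below is mine, not ITER's, and assumes standard textbook values: a deuterium-tritium reaction releases about 17.6 MeV while consuming roughly 5 atomic mass units of fuel, and bituminous coal yields about 29 MJ/kg.

```python
# Rough check: how much D-T fusion fuel carries the energy of 10,000 tons of coal?
EV_TO_J = 1.602e-19      # joules per electronvolt
AMU_TO_KG = 1.661e-27    # kilograms per atomic mass unit

# One deuterium-tritium reaction releases ~17.6 MeV and consumes ~5 amu of fuel.
energy_per_reaction = 17.6e6 * EV_TO_J                 # ~2.8e-12 J
fuel_mass_per_reaction = 5.03 * AMU_TO_KG              # ~8.4e-27 kg
fusion_energy_density = energy_per_reaction / fuel_mass_per_reaction  # ~3.4e14 J/kg

coal_energy_density = 29e6       # J/kg, typical bituminous coal
coal_mass_kg = 10_000 * 1_000    # 10,000 metric tons

equivalent_fuel_kg = coal_mass_kg * coal_energy_density / fusion_energy_density
print(f"{equivalent_fuel_kg:.2f} kg of D-T fuel ~ 10,000 tons of coal")
```

The answer comes out just under a kilogram, roughly the mass of a pineapple, so the press release's comparison is at least the right order of magnitude.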

ITER is the most complex science project in human history. The hydrogen plasma will be heated to 150 million degrees Celsius, ten times hotter than the core of the Sun, to enable the fusion reaction. The process happens in a donut-shaped reactor, called a tokamak(*), which is surrounded by giant magnets that confine and circulate the superheated, ionized plasma, away from the metal walls. The superconducting magnets must be cooled to minus 269°C, as cold as interstellar space.

The ITER facility is being built in Southern France by a scientific partnership of 35 countries. ITER’s specialized components, roughly 10 million parts in total, are being manufactured in industrial facilities all over the world. They are subsequently shipped to the ITER worksite, where they must be assembled, piece-by-piece, into the final machine.

Each of the seven ITER members (the European Union, China, India, Japan, Korea, Russia, and the United States) is fabricating a significant portion of the machine. This adds to ITER’s complexity.

In a message dispatched on December 1 [2017] to top-level officials in ITER member governments, the ITER project reported that it had completed 50 percent of the “total construction work scope through First Plasma” (**). First Plasma, scheduled for December 2025, will be the first stage of operation for ITER as a functional machine.

“The stakes are very high for ITER,” writes Bernard Bigot, Ph.D., Director-General of ITER. “When we prove that fusion is a viable energy source, it will eventually replace burning fossil fuels, which are non-renewable and non-sustainable. Fusion will be complementary with wind, solar, and other renewable energies.

“ITER’s success has demanded extraordinary project management, systems engineering, and almost perfect integration of our work.

“Our design has taken advantage of the best expertise of every member’s scientific and industrial base. No country could do this alone. We are all learning from each other, for the world’s mutual benefit.”

The ITER 50 percent milestone is getting significant attention.

“We are fortunate that ITER and fusion has had the support of world leaders, historically and currently,” says Director-General Bigot. “The concept of the ITER project was conceived at the 1985 Geneva Summit between Ronald Reagan and Mikhail Gorbachev. When the ITER Agreement was signed in 2006, it was strongly supported by leaders such as French President Jacques Chirac, U.S. President George W. Bush, and Indian Prime Minister Manmohan Singh.

“More recently, President Macron and U.S. President Donald Trump exchanged letters about ITER after their meeting this past July. One month earlier, President Xi Jinping of China hosted Russian President Vladimir Putin and other world leaders in a showcase featuring ITER and fusion power at the World EXPO in Astana, Kazakhstan.

“We know that other leaders have been similarly involved behind the scenes. It is clear that each ITER member understands the value and importance of this project.”

Why use this complex manufacturing arrangement?

More than 80 percent of the cost of ITER, about $22 billion or EUR 18 billion, is contributed in the form of components manufactured by the partners. Many of these massive components of the ITER machine must be precisely fitted; for example, 17-meter-high magnets with less than a millimeter of tolerance. Each component must be ready on time to fit into the Master Schedule for machine assembly.

Members asked for this deal for three reasons. First, it means that most of the ITER costs paid by any member are actually paid to that member’s companies; the funding stays in-country. Second, the companies working on ITER build new industrial expertise in major fields, such as electromagnetics, cryogenics, robotics, and materials science. Third, this new expertise leads to innovation and spin-offs in other fields.

For example, expertise gained working on ITER’s superconducting magnets is now being used to map the human brain more precisely than ever before.

The European Union is paying 45 percent of the cost; China, India, Japan, Korea, Russia, and the United States each contribute 9 percent. All members share in ITER’s technology; they receive equal access to the intellectual property and innovation that come from building ITER.
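The funding split quoted above can be turned into rough per-member contributions. A minimal sketch, using the press release's rounded percentages (which sum to 99 percent because of rounding) and the roughly EUR 18 billion value of in-kind contributions given earlier:

```python
# Rough per-member value of ITER contributions, from the release's rounded shares.
TOTAL_EUR_BILLION = 18.0   # approximate total value quoted in the release

shares_pct = {"European Union": 45.0}
for member in ("China", "India", "Japan", "Korea", "Russia", "United States"):
    shares_pct[member] = 9.0

contributions = {m: TOTAL_EUR_BILLION * pct / 100 for m, pct in shares_pct.items()}

# The rounded percentages sum to 99, not 100 -- an artifact of rounding.
print(f"shares sum to {sum(shares_pct.values()):.0f}%")
for member, eur in contributions.items():
    print(f"{member}: ~EUR {eur:.2f}B")
```

On these rounded figures the EU's share works out to roughly EUR 8.1 billion and each of the other six members' to roughly EUR 1.6 billion.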

When will commercial fusion plants be ready?

ITER scientists predict that fusion plants will start to come on line as soon as 2040. The exact timing, according to fusion experts, will depend on the level of public urgency and political will that translates to financial investment.

How much power will they provide?

The ITER tokamak will produce 500 megawatts of thermal power. This size is suitable for studying a “burning” or largely self-heating plasma, a state of matter that has never been produced in a controlled environment on Earth. In a burning plasma, most of the plasma heating comes from the fusion reaction itself. Studying the fusion science and technology at ITER’s scale will enable optimization of the plants that follow.

A commercial fusion plant will be designed with a slightly larger plasma chamber, for 10-15 times more electrical power. A 2,000-megawatt fusion electricity plant, for example, would supply 2 million homes.
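The home-count figure above implies an average household demand, which is easy to back out. This sketch is my own arithmetic on the release's numbers, not part of the release:

```python
# What average household demand does "2,000 MW supplies 2 million homes" imply?
plant_output_mw = 2000          # electrical output of the example fusion plant
homes_supplied = 2_000_000      # homes the release says it would supply

avg_kw_per_home = plant_output_mw * 1000 / homes_supplied   # average kW per home
annual_kwh_per_home = avg_kw_per_home * 24 * 365            # kWh per home per year

print(f"implied average draw: {avg_kw_per_home:.1f} kW per home")
print(f"implied annual use: {annual_kwh_per_home:,.0f} kWh per home")
```

The implied 1 kW average (about 8,760 kWh per year) is close to typical US residential consumption, so the release's figure is plausible.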

How much would a fusion plant cost and how many will be needed?

The initial capital cost of a 2,000-megawatt fusion plant will be in the range of $10 billion. These capital costs will be offset by extremely low operating costs, negligible fuel costs, and infrequent component replacement costs over the 60-year-plus life of the plant. Capital costs will decrease with large-scale deployment of fusion plants.

At current electricity usage rates, one fusion plant would be more than enough to power a city the size of Washington, D.C. The entire D.C. metropolitan area could be powered with four fusion plants, with zero carbon emissions.

“If fusion power becomes universal, the use of electricity could be expanded greatly, to reduce the greenhouse gas emissions from transportation, buildings and industry,” predicts Dr. Bigot. “Providing clean, abundant, safe, economic energy will be a miracle for our planet.”

*     *     *

FOOTNOTES:

* “Tokamak” is a word of Russian origin meaning a toroidal or donut-shaped magnetic chamber. Tokamaks have been built and operated for the past six decades. They are today’s most advanced fusion device design.

** “Total construction work scope,” as used in ITER’s project performance metrics, includes design, component manufacturing, building construction, shipping and delivery, assembly, and installation.

It is an extraordinary project on many levels as Henry Fountain notes in a March 27, 2017 article for the New York Times (Note: Links have been removed),

At a dusty construction site here amid the limestone ridges of Provence, workers scurry around immense slabs of concrete arranged in a ring like a modern-day Stonehenge.

It looks like the beginnings of a large commercial power plant, but it is not. The project, called ITER, is an enormous, and enormously complex and costly, physics experiment. But if it succeeds, it could determine the power plants of the future and make an invaluable contribution to reducing planet-warming emissions.

ITER, short for International Thermonuclear Experimental Reactor (and pronounced EAT-er), is being built to test a long-held dream: that nuclear fusion, the atomic reaction that takes place in the sun and in hydrogen bombs, can be controlled to generate power.

ITER will produce heat, not electricity. But if it works — if it produces more energy than it consumes, which smaller fusion experiments so far have not been able to do — it could lead to plants that generate electricity without the climate-affecting carbon emissions of fossil-fuel plants or most of the hazards of existing nuclear reactors that split atoms rather than join them.

Success, however, has always seemed just a few decades away for ITER. The project has progressed in fits and starts for years, plagued by design and management problems that have led to long delays and ballooning costs.

ITER is moving ahead now, with a director-general, Bernard Bigot, who took over two years ago after an independent analysis that was highly critical of the project. Dr. Bigot, who previously ran France’s atomic energy agency, has earned high marks for resolving management problems and developing a realistic schedule based more on physics and engineering and less on politics.

The site here is now studded with tower cranes as crews work on the concrete structures that will support and surround the heart of the experiment, a doughnut-shaped chamber called a tokamak. This is where the fusion reactions will take place, within a plasma, a roiling cloud of ionized atoms so hot that it can be contained only by extremely strong magnetic fields.

Here’s a rendering of the proposed reactor,

Source: ITER Organization

It seems the folks at the New York Times decided to remove the notes which help make sense of this image. However, it does get the idea across.

If I read the article rightly, the official cost in March 2017 was around EUR 22 billion, and more will likely be needed. You can read Fountain’s article for more information about fusion and ITER or go to the ITER website.

I could have sworn a local (Vancouver area) company called General Fusion was involved in the ITER project but I can’t track down any sources for confirmation. The sole connection I could find is in a documentary about fusion technology,

Here’s a little context for the film from a July 4, 2017 General Fusion news release (Note: A link has been removed),

A new documentary featuring General Fusion has captured the exciting progress in fusion across the public and private sectors.

Let There Be Light made its international premiere at the South By Southwest (SXSW) music and film festival in March [2017] to critical acclaim. The film was quickly purchased by Amazon Video, where it will be available for more than 70 million users to stream.

Let There Be Light follows scientists at General Fusion, ITER and Lawrenceville Plasma Physics in their pursuit of a clean, safe and abundant source of energy to power the world.

The feature length documentary has screened internationally across Europe and North America. Most recently it was shown at the Hot Docs film festival in Toronto, where General Fusion founder and Chief Scientist Dr. Michel Laberge joined fellow fusion physicist Dr. Mark Henderson from ITER at a series of Q&A panels with the filmmakers.

Laberge and Henderson were also interviewed by the popular CBC radio science show Quirks and Quarks, discussing different approaches to fusion, its potential benefits, and the challenges it faces.

It is yet to be confirmed when the film will be released for streaming; check Amazon Video for details.

You can find out more about General Fusion here.

Brief final comment

ITER is a breathtaking effort, but if you’ve read about other large-scale projects, such as building a railway across the Canadian Rocky Mountains, establishing telecommunications in an astonishing number of countries around the world, getting someone to the moon, eliminating smallpox, or building the pyramids, its troubles seem like standard operating procedure, both for the successes I’ve described and for the failures we’ve forgotten. Where ITER will finally rest on the continuum between success and failure is yet to be determined, but the problems experienced so far are not necessarily a predictor.

I wish the engineers, scientists, visionaries, and others great success with finding better ways to produce energy.

Gold nanoparticles used to catalyze biofuel waste and create a useful additive

This work is the result of an international collaboration including Russia (from a May 23, 2017 news item on Nanowerk),

Gold nanoparticles serve as catalysts for obtaining valuable chemical products based on glycerol. Scientists from Tomsk Polytechnic University and their international colleagues are developing gold catalysts to recycle one of the main byproducts of biofuel production. The obtained products are in high demand in medicine, agriculture, cosmetic industry and other sectors.

Scientists from the University of Milano (Italy), the National Autonomous University of Mexico, the Institute of Catalysis and Petrochemistry of Madrid (Spain) and the University of Porto (Portugal) take part in the study of gold nanoparticles.

A May 23, 2017 Tomsk Polytechnic University press release, which originated the news item, expands on the theme,

‘Today the production of biofuels is an important area in many countries. They can be obtained from a great variety of biomasses. In Latin America it is orange and tangerine peel as well as banana skin. In the USA biofuels are produced from corn, and in the central part of Russia and Europe from rape (Brassica napus). When these plants are processed into biofuels, a large amount of glycerol is formed. Its esters constitute the basis of oils and fats. Glycerol is widely used in the cosmetic industry as a product in its own right. However, much more glycerol is obtained in the production of biofuels – many thousands of tons a year. As a result, unused glycerol merely becomes waste,’ says Alexey Pestryakov, Head of the Department of Physical and Analytical Chemistry, describing the problem. ‘Now, a lot of research groups are working on how to transform excess glycerol into other useful products. Along with our foreign colleagues we have proposed catalysts based on gold nanoparticles.’

The authors of the research note that catalytic oxidation over gold is one of the most effective techniques for obtaining useful products such as aldehydes, esters, carboxylic acids and other substances from glycerol.

‘All these substances are products of fine organic chemistry and are in demand in a wide range of industries, first of all, in the pharmaceutical and cosmetic industries. In agriculture they are applied as part of different feed additives, veterinary drugs, fertilizers, plant treatment products, etc.

Thus, unused glycerol after being processed will further be applied,’ sums up Alexey Pestryakov.

Gold catalysts are extremely active. They can enter into chemical reactions with other substances at room temperature (other catalysts need to be heated), in some cases even below zero. However, gold can act as a catalyst only at the nanolevel.

‘If you take a piece of gold, even very tiny, there will be no chemical reaction. In order to make gold become chemically active, the size of its particle should be less than two nanometers. Only then it gets its amazing properties,’ explains the scientist.

Gold’s catalytic properties were discovered relatively recently, in the early 1990s, by Japanese chemists.

To date, TPU scientists and their colleagues are not the only ones who develop such catalysts.

Unlike their counterparts, the gold catalysts developed at TPU are more stable (they retain their activity longer).

‘A great challenge in this area is that gold catalysts are deactivated very rapidly, not only during use but even during storage. Our objective is to ensure their longer shelf life. It is also important to use oxygen as an oxidizer, since toxic and corrosive peroxide compounds are often used for such purposes,’ says Alexey Pestryakov.

Here’s a link to and a citation for the paper,

More Insights into Support and Preparation Method Effects in Gold Catalyzed Glycerol Oxidation by Nina Bogdanchikova, Inga Tuzovskaya, Laura Prati, Alberto Villa, Alexey Pestryakov, Mario Farías. Current Organic Synthesis Volume 14, Issue 3, 2017, Pages 377-382 DOI: 10.2174/1570179413666161031114833

This paper is behind a paywall. (Scroll down the page to find the article.)

‘Mother of all bombs’ is a nanoweapon?

According to physicist Louis A. Del Monte, in an April 14, 2017 opinion piece for the Huffington Post, the ‘mother of all bombs’ is a nanoweapon (Note: Links have been removed),

The United States military dropped its largest non-nuclear bomb, the GBU-43/B Massive Ordnance Air Blast Bomb (MOAB), nicknamed the “mother of all bombs,” on an ISIS cave and tunnel complex in the Achin District of the Nangarhar province, Afghanistan [on Thursday, April 13, 2017]. The Achin District is the center of ISIS activity in Afghanistan. This was the first use in combat of the GBU-43/B Massive Ordnance Air Blast (MOAB).

… Although it carries only about 8 tons of explosives, the explosive mixture delivers a destructive impact equivalent of 11 tons of TNT.

There is little doubt that the United States Department of Defense is using nanometals, such as nanoaluminum (alternately spelled nano-aluminum) mixed with TNT, to enhance the detonation properties of the MOAB. Nanoaluminum mixed with TNT has been known since the early 2000s to boost the explosive power of the TNT. If true, this means that the largest known United States non-nuclear bomb is a nanoweapon. When most of us think about nanoweapons, we think small, essentially invisible weapons, like nanobots (i.e., tiny robots made using nanotechnology). That can often be the case. But, as defined in my recent book, Nanoweapons: A Growing Threat to Humanity (Potomac 2017), “Nanoweapons are any military technology that exploits the power of nanotechnology.” This means even the largest munition, such as the MOAB, is a nanoweapon if it uses nanotechnology.

… The explosive is H6, which is a mixture of five ingredients (by weight):

  • 44.0% RDX & nitrocellulose (RDX is a well-known explosive, more powerful than TNT, often used with TNT and other explosives. Nitrocellulose is a propellant or low-order explosive, originally known as gun-cotton.)
  • 29.5% TNT
  • 21.0% powdered aluminum
  • 5.0% paraffin wax as a phlegmatizing (i.e., stabilizing) agent.
  • 0.5% calcium chloride (to absorb moisture and eliminate the production of gas)

Note, the TNT and powdered aluminum account for over half the explosive payload by weight. It is highly likely that the “powdered aluminum” is nanoaluminum, since nanoaluminum can enhance the destructive properties of TNT. This argues that H6 is a nano-enhanced explosive, making the MOAB a nanoweapon.
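Del Monte's H6 recipe and tonnage figures can be combined into rough component masses. The sketch below assumes the MOAB's filler is exactly 8 metric tons of H6 in the proportions he lists; both numbers are his, and the equivalence factor is simply his 11-ton TNT figure divided by the 8-ton payload.

```python
# Component masses for ~8 metric tons of H6, using Del Monte's recipe.
H6_RECIPE_PCT = {                 # percent by weight
    "RDX & nitrocellulose": 44.0,
    "TNT": 29.5,
    "powdered aluminum": 21.0,
    "paraffin wax": 5.0,
    "calcium chloride": 0.5,
}

payload_kg = 8 * 1000             # "about 8 tons of explosives"
masses_kg = {name: payload_kg * pct / 100 for name, pct in H6_RECIPE_PCT.items()}

# TNT plus aluminum is indeed just over half the payload, as the article notes.
tnt_plus_al_pct = H6_RECIPE_PCT["TNT"] + H6_RECIPE_PCT["powdered aluminum"]

# Destructive effect of 11 tons of TNT from 8 tons of explosive.
tnt_equivalence = 11 / 8

print(f"TNT + aluminum: {tnt_plus_al_pct:.1f}% of payload "
      f"({masses_kg['TNT'] + masses_kg['powdered aluminum']:.0f} kg)")
print(f"implied TNT-equivalence factor: {tnt_equivalence:.3f}")
```

The recipe percentages sum to exactly 100, and the TNT-plus-aluminum fraction works out to 50.5 percent, matching the "over half the explosive payload by weight" claim.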

The United States GBU-43/B Massive Ordnance Air Blast Bomb (MOAB) was the largest non-nuclear bomb known until Russia detonated the Aviation Thermobaric Bomb of Increased Power, termed the “father of all bombs” (FOAB), in 2007. It is reportedly four times more destructive than the MOAB, even though it carries only 7 tons of explosives versus the 8 tons of the MOAB. Interestingly, the Russians claim to achieve the more destructive punch using nanotechnology.

If you have the time, I encourage you to read the piece in its entirety.

Nanozymes as an antidote for pesticides

Should you have concerns about exposure to pesticides or chemical warfare agents (timely given events in Syria, as per this April 4, 2017 news item on CBC [Canadian Broadcasting Corporation] online), scientists at Lomonosov Moscow State University have developed a possible antidote, according to a March 8, 2017 news item on phys.org,

Members of the Faculty of Chemistry of the Lomonosov Moscow State University have developed novel nanosized agents that could be used as efficient protective and antidote modalities against the impact of neurotoxic organophosphorus compounds such as pesticides and chemical warfare agents. …

A March 7, 2017 Lomonosov Moscow State University press release on EurekAlert, which originated the news item, describes the work in detail,

A group of scientists from the Faculty of Chemistry under the leadership of Prof. Alexander Kabanov has focused their research, supported by a “megagrant,” on the nanoparticle-based delivery to an organism of enzymes capable of destroying toxic organophosphorous compounds. Development of the first nanosized drugs started more than 30 years ago, and already in the 1990s the first nanomedicines for cancer treatment entered the market. The first such medicines were based on liposomes – spherical vesicles made of lipid bilayers. The new technology, developed by Kabanov and his colleagues, uses an enzyme, synthesized at the Lomonosov Moscow State University, encapsulated into a biodegradable polymer coat based on an amino acid (glutamic acid).

Alexander Kabanov, Doctor of Chemistry, Professor at the Eshelman School of Pharmacy of the University of North Carolina (USA) and the Faculty of Chemistry, M. V. Lomonosov Moscow State University, one of the authors of the article, explains: “At the end of the 1980s my team (at that time in Moscow) and, independently, Japanese colleagues led by Prof. Kazunori Kataoka from Tokyo began using polymer micelles for small-molecule delivery. Soon the nanomedicine field ‘exploded.’ Currently hundreds of laboratories across the globe work in this area, applying a wide variety of approaches to the creation of such nanosized agents. A medicine on the basis of polymeric micelles, developed by the Korean company Samyang Biopharm, was approved for human use in 2006.”

After moving to the USA in 1994, Professor Kabanov’s team focused on the development of polymer micelles that could incorporate biopolymers through electrostatic interactions. Initially the chemists were interested in using micelles for RNA and DNA delivery, but later on scientists started actively utilizing this approach for the delivery of proteins and, namely, enzymes, to the brain and other organs.

Alexander Kabanov says: “At the time I worked at the University of Nebraska Medical Center in Omaha (USA), and by 2010 we had a lot of results in this area. That’s why, when my colleague from the Chemical Enzymology Department of the Lomonosov Moscow State University, Prof. Natalia Klyachko, suggested that I apply for a megagrant, the research theme of the new laboratory was quite obvious: to use our delivery approach, which we’ve called a ‘nanozyme,’ for the ‘improvement’ of enzymes developed by colleagues at the Lomonosov Moscow State University for further medical application.”

Scientists, together with a group of enzymologists from the Lomonosov Moscow State University under the leadership of Elena Efremenko, Doctor of Biological Sciences, have chosen organophosphorus hydrolase as one of the delivered enzymes. Organophosphorus hydrolase is capable of degrading toxic pesticides and chemical warfare agents at a very high rate. However, it has disadvantages: because of its bacterial origin, an immune response is observed when it is delivered to a mammalian organism. Moreover, organophosphorus hydrolase is quickly removed from the body. Chemists have solved this problem with the help of a “self-assembly” approach: as a result of the inclusion of the organophosphorus hydrolase enzyme in nanozyme particles, the immune response becomes weaker and, on the contrary, both the storage stability of the enzyme and its lifetime after delivery to an organism considerably increase. Rat experiments have proved that such a nanozyme efficiently protects organisms against lethal doses of highly toxic pesticides and even chemical warfare agents, such as the VX nerve agent.

Alexander Kabanov summarizes: “The simplicity of our approach is very important. You could get an organophosphorus hydrolase nanozyme by simply mixing aqueous solutions of an enzyme and a safe biocompatible polymer. This nanozyme self-assembles due to the electrostatic interaction between a protein (enzyme) and the polymer.”

According to the scientist, the simplicity and technological effectiveness of the approach, along with the promising results of the animal experiments, bring hope that this modality could also be successful in clinical use.

Members of the Faculty of Chemistry of the Lomonosov Moscow State University, along with scientists from the 27th Central Research Institute of the Ministry of Defense of the Russian Federation, the Eshelman School of Pharmacy of the University of North Carolina at Chapel Hill (USA) and the University of Nebraska Medical Center (UNMC), have taken part in the project.

Here’s a link to and a citation for the paper,

A simple and highly effective catalytic nanozyme scavenger for organophosphorus neurotoxins by Elena N. Efremenko, Ilya V. Lyagin, Natalia L. Klyachko, Tatiana Bronich, Natalia V. Zavyalova, Yuhang Jiang, Alexander V. Kabanov. Journal of Controlled Release, Volume 247, 10 February 2017, Pages 175–181. DOI: http://dx.doi.org/10.1016/j.jconrel.2016.12.037

This paper is behind a paywall.

China, US, and the race for artificial intelligence research domination

John Markoff and Matthew Rosenberg have written a fascinating analysis of the competition between US and China regarding technological advances, specifically in the field of artificial intelligence. While the focus of the Feb. 3, 2017 NY Times article is military, the authors make it easy to extrapolate and apply the concepts to other sectors,

Robert O. Work, the veteran defense official retained as deputy secretary by President Trump, calls them his “A.I. dudes.” The breezy moniker belies their serious task: The dudes have been a kitchen cabinet of sorts, and have advised Mr. Work as he has sought to reshape warfare by bringing artificial intelligence to the battlefield.

Last spring, he asked, “O.K., you guys are the smartest guys in A.I., right?”

No, the dudes told him, “the smartest guys are at Facebook and Google,” Mr. Work recalled in an interview.

Now, increasingly, they’re also in China. The United States no longer has a strategic monopoly on the technology, which is widely seen as the key factor in the next generation of warfare.

The Pentagon’s plan to bring A.I. to the military is taking shape as Chinese researchers assert themselves in the nascent technology field. And that shift is reflected in surprising commercial advances in artificial intelligence among Chinese companies. [emphasis mine]

Having read Marshall McLuhan (de rigueur for any Canadian pursuing a degree in communications [sociology-based] anytime from the 1960s into the late 1980s [at least]), I took the movement of technology from military research to consumer applications as a standard. Television is a classic example but there are many others including modern plastic surgery. The first time I encountered the reverse (consumer-based technology being adopted by the military) was in a 2004 exhibition “Massive Change: The Future of Global Design” produced by Bruce Mau for the Vancouver (Canada) Art Gallery.

Markoff and Rosenberg develop their thesis further (Note: Links have been removed),

Last year, for example, Microsoft researchers proclaimed that the company had created software capable of matching human skills in understanding speech.

Although they boasted that they had outperformed their United States competitors, a well-known A.I. researcher who leads a Silicon Valley laboratory for the Chinese web services company Baidu gently taunted Microsoft, noting that Baidu had achieved similar accuracy with the Chinese language two years earlier.

That, in a nutshell, is the challenge the United States faces as it embarks on a new military strategy founded on the assumption of its continued superiority in technologies such as robotics and artificial intelligence.

First announced last year by Ashton B. Carter, President Barack Obama’s defense secretary, the “Third Offset” strategy provides a formula for maintaining a military advantage in the face of a renewed rivalry with China and Russia.

As consumer electronics manufacturing has moved to Asia, both Chinese companies and the nation’s government laboratories are making major investments in artificial intelligence.

The advance of the Chinese was underscored last month when Qi Lu, a veteran Microsoft artificial intelligence specialist, left the company to become chief operating officer at Baidu, where he will oversee the company’s ambitious plan to become a global leader in A.I.

The authors note some recent military moves (Note: Links have been removed),

In August [2016], the state-run China Daily reported that the country had embarked on the development of a cruise missile system with a “high level” of artificial intelligence. The new system appears to be a response to a missile the United States Navy is expected to deploy in 2018 to counter growing Chinese military influence in the Pacific.

Known as the Long Range Anti-Ship Missile, or L.R.A.S.M., it is described as a “semiautonomous” weapon. According to the Pentagon, this means that though targets are chosen by human soldiers, the missile uses artificial intelligence technology to avoid defenses and make final targeting decisions.

The new Chinese weapon typifies a strategy known as “remote warfare,” said John Arquilla, a military strategist at the Naval Post Graduate School in Monterey, Calif. The idea is to build large fleets of small ships that deploy missiles, to attack an enemy with larger ships, like aircraft carriers.

“They are making their machines more creative,” he said. “A little bit of automation gives the machines a tremendous boost.”

Whether or not the Chinese will quickly catch the United States in artificial intelligence and robotics technologies is a matter of intense discussion and disagreement in the United States.

Markoff and Rosenberg return to the world of consumer electronics as they finish their article on AI and the military (Note: Links have been removed),

Moreover, while there appear to be relatively cozy relationships between the Chinese government and commercial technology efforts, the same cannot be said about the United States. The Pentagon recently restarted its beachhead in Silicon Valley, known as the Defense Innovation Unit Experimental facility, or DIUx. It is an attempt to rethink bureaucratic United States government contracting practices in terms of the faster and more fluid style of Silicon Valley.

The government has not yet undone the damage to its relationship with the Valley brought about by Edward J. Snowden’s revelations about the National Security Agency’s surveillance practices. Many Silicon Valley firms remain hesitant to be seen as working too closely with the Pentagon out of fear of losing access to China’s market.

“There are smaller companies, the companies who sort of decided that they’re going to be in the defense business, like a Palantir,” said Peter W. Singer, an expert in the future of war at New America, a think tank in Washington, referring to the Palo Alto, Calif., start-up founded in part by the venture capitalist Peter Thiel. “But if you’re thinking about the big, iconic tech companies, they can’t become defense contractors and still expect to get access to the Chinese market.”

Those concerns are real for Silicon Valley.

If you have the time, I recommend reading the article in its entirety.

Impact of the US regime on thinking about AI?

A March 24, 2017 article by Daniel Gross for Slate.com hints that at least one high-level official in the Trump administration may be a little naïve in his understanding of AI and its impending impact on US society (Note: Links have been removed),

Treasury Secretary Steven Mnuchin is a sharp guy. He’s a (legacy) alumnus of Yale and Goldman Sachs, did well on Wall Street, and was a successful movie producer and bank investor. He’s good at putting other people’s money at risk alongside some of his own, and willing to do so. While he isn’t the least qualified person to hold the post of treasury secretary in 2017, he’s far from the best qualified. For in his 54 years on this planet, he hasn’t expressed or displayed much interest in economic policy, or in grappling with the big-picture macroeconomic issues that are affecting our world. It’s not that he is intellectually incapable of grasping them; they just haven’t been in his orbit.

Which accounts for the inanity he uttered at an Axios breakfast Friday morning about the impact of artificial intelligence on jobs.

“It’s not even on our radar screen…. 50-100 more years” away, he said. “I’m not worried at all” about robots displacing humans in the near future, he said, adding: “In fact I’m optimistic.”

A.I. is already affecting the way people work, and the work they do. (In fact, I’ve long suspected that Mike Allen, Mnuchin’s Axios interlocutor, is powered by A.I.) I doubt Mnuchin has spent much time in factories, for example. But if he did, he’d see that machines and software are increasingly doing the work that people used to do. They’re not just moving goods through an assembly line, they’re soldering, coating, packaging, and checking for quality. Whether you’re visiting a GE turbine plant in South Carolina, or a cable-modem factory in Shanghai, the thing you’ll notice is just how few people there actually are. It’s why, in the U.S., manufacturing output rises every year while manufacturing employment is essentially stagnant. It’s why it is becoming conventional wisdom that automation is destroying more manufacturing jobs than trade. And now dark factories, which can run without lights because there are no people in them, are starting to become a reality. The integration of A.I. into factories is one of the reasons Trump’s promise to bring back manufacturing employment is absurd. You’d think his treasury secretary would know something about that.

It goes far beyond manufacturing, of course. Programmatic advertising buying, Spotify’s recommendation engines, chatbots on customer service websites, Uber’s dispatching system—all of these are examples of A.I. doing the work that people used to do. …

Adding to Mnuchin’s lack of credibility on the topic of jobs and robots/AI, Matthew Rozsa’s March 28, 2017 article for Salon.com features a study from the US National Bureau of Economic Research (Note: Links have been removed),

A new study by the National Bureau of Economic Research shows that every fully autonomous robot added to an American factory has reduced employment by an average of 6.2 workers, according to a report by BuzzFeed. The study also found that for every fully autonomous robot per thousand workers, the employment rate dropped by 0.18 to 0.34 percentage points and wages fell by 0.25 to 0.5 percentage points.
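To make those elasticities concrete, here is a small sketch (my own, not from the study) that converts the quoted per-robot ranges into an implied employment-rate drop for a hypothetical workforce; the function name and the example inputs are invented for illustration:

```python
# Ranges quoted above from the NBER working paper (Acemoglu & Restrepo, 2017):
# per one robot per 1,000 workers, the employment rate dropped 0.18-0.34
# percentage points and wages fell 0.25-0.5 percent.
EMPLOYMENT_DROP_PP = (0.18, 0.34)  # percentage points per robot per 1,000 workers

def implied_effects(robots, workers):
    """Return the (low, high) implied employment-rate drop in percentage points."""
    density = robots / (workers / 1000)  # robots per thousand workers
    return tuple(round(density * r, 3) for r in EMPLOYMENT_DROP_PP)

# Hypothetical example: 2 robots in a local labor market of 1,000 workers.
print(implied_effects(2, 1000))  # (0.36, 0.68)
```

This is a back-of-the-envelope reading of the reported coefficients, not a reproduction of the paper's econometrics.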

I can’t help wondering whether the US Secretary of the Treasury’s obliviousness to what is going on in the workplace is representative of other top-tier officials such as the Secretary of Defense, Secretary of Labor, etc. What is going to happen to US research in fields such as robotics and AI?

I have two more questions: in future, what happens to research which contradicts a top-tier Trump government official or makes one look foolish? Will it be suppressed?

You can find the report “Robots and Jobs: Evidence from US Labor Markets” by Daron Acemoglu and Pascual Restrepo, NBER (US National Bureau of Economic Research) Working Paper Series (Working Paper 23285), released March 2017, here. The introduction featured some new information for me: the term ‘technological unemployment’ was introduced in 1930 by John Maynard Keynes.

Moving from a wholly US-centric view of AI

Naturally in a discussion about AI, it’s all about the US and the country considered its chief science rival, China, with a mention of their old rival, Russia. Europe did rate a mention, albeit as a totality. Having recently found out that Canadians were pioneers in a very important aspect of AI, machine learning, I feel obliged to mention it. You can find more about Canadian AI efforts in my March 24, 2017 posting (scroll down about 40% of the way), where you’ll find a very brief history and mention of the funding for the newly launching Pan-Canadian Artificial Intelligence Strategy.

If any of my readers have information about AI research efforts in other parts of the world, please feel free to write them up in the comments.

International news bits: Israel and Germany and Cuba and Iran

I have three news bits today.

Germany

From a Nov. 14, 2016 posting by Lynn L. Bergeson and Carla N. Hutton for The National Law Review (Note: A link has been removed),

The German Federal Ministry of Education and Research (BMBF) recently published an English version of its Action Plan Nanotechnology 2020. Based on the success of the Action Plan Nanotechnology over the previous ten years, the federal government will continue the Action Plan Nanotechnology for the next five years.  Action Plan Nanotechnology 2020 is geared towards the priorities of the federal government’s new “High-Tech Strategy” (HTS), which has as its objective the solution of societal challenges by promoting research.  According to Action Plan Nanotechnology 2020, the results of a number of research projects “have shown that nanomaterials are not per se linked with a risk for people and the environment due to their nanoscale properties.”  Instead, this is influenced more by structure, chemical composition, and other factors, and is thus dependent on the respective material and its application.

A Nov. 16, 2016 posting on Out-Law.com provides mores detail about the plan (Note: A link has been removed),

Eight ministries have been responsible for producing a joint plan on nanotechnology every five years since 2006, the Ministry said. The ministries develop a common approach that pools strategies for action and fields of application for nanotechnology, it [Germany’s Federal Ministry of Education and Research] said.

The German public sector currently spends more than €600 million a year on nanotechnology related developments, and 2,200 organisations from industry, services, research and associations are registered in the Ministry’s nanotechnology competence map, the report said.

“There are currently also some 1,100 companies in Germany engaged [in] the use of nanotechnology in the fields of research and development as well as the marketing of commercial products and services. The proportion of SMEs [small to medium enterprises?] is around 75%,” it said.

Nanotechnology-based product innovations play “an increasingly important role in many areas of life, such as health and nutrition, the workplace, mobility and energy production”, and the plan “thus pursues the objective of continuing to exploit the opportunities and potential of nanotechnology in Germany, without disregarding any potential risks to humans and the environment”, the Ministry said.

Technology law expert Florian von Baum of Pinsent Masons, the law firm behind Out-Law.com, said: “The action plan aims to achieve and secure Germany’s critical lead in the still new nanotechnology field and to recognise and use the full potential of nanotechnology while taking into account possible risks and dangers of this new technology.”

…

“With the rapid pace of development and the new applications that emerge every day, the government needs to ensure that the dangers and risks are sufficiently recognised and considered. Nanotechnology will provide great and long-awaited breakthroughs in health and ecological areas, but ethical, legal and socio-economic issues must be assessed and evaluated at all stages of the innovation chain,” von Baum said.

You can find Germany’s Action Plan Nanotechnology 2020 here, all 64 pp. of it.

Israel and Germany

A Nov. 16, 2016 article by Shoshanna Solomon for The Times of Israel announces a new joint (Israel-Germany) nanotechnology fund,

Israel and Germany have set up a new three-year, €30 million plan to promote joint nanotechnology initiatives and are calling on companies and entities in both countries to submit proposals for funding for projects in this field.

“Nanotech is the industry of the future in global hi-tech and Israel has set a goal of becoming a leader of this field, while cooperating with leading European countries,” Ilan Peled, manager of Technological Infrastructure Arena at the Israel Innovation Authority, said in a statement announcing the plan.

In the past decade nanotechnology, seen by many as the tech field of the future, has focused mainly on research. Now, however, Israel’s Innovation Authority, which has set up the joint program with Germany, believes the next decade will focus on the application of this research into products — and countries are keen to set up the right ecosystem that will draw companies operating in this field to them.

Over the last decade, the country has focused on creating a “robust research foundation that can support a large industry,” the authority said, with six academic research institutes that are among the world’s most advanced.

In addition, the authority said, there are about 200 new startups that were established over the last decade in the field, many in the development stage.

I know it’s been over 70 years since the events of World War II but this does seem like an unexpected coupling. It is heartening to see that people can resolve the unimaginable within the space of a few generations.

Iran and Cuba

A Nov. 16, 2016 Mehr News Agency press release announces a new laboratory in Cuba,

Iran is ready to build a laboratory center equipped with nanotechnology in one of the nano institutes in Cuba, Iran’s VP for Science and Technology Sorena Sattari said Tuesday [Nov. 15, 2016].

Sorena Sattari, Vice-President for Science and Technology, made the remark in a meeting with Fidel Castro Diaz-Balart, scientific adviser to the Cuban president, in Tehran on Tuesday [November 15, 2016], adding that Iran is also ready to present Cuba with a gift package including educational services related to how to operate the equipment at the lab.

During the meeting, Sattari noted Iran’s various technological achievements including exports of biotechnological medicine to Russia, the extensive nanotechnology plans for high school and university students as well as companies, the presence of about 160 companies active in the field of nanotechnology and the country’s achievements in the field of water treatment.

“We have sealed good nano agreements with Cuba, and are ready to develop our technological cooperation with this country in the field of vaccines and recombinant drugs,” he said.

Sattari maintained that the biggest e-commerce company in the Middle East is situated in Iran, adding “the company, which was only established six years ago, now sells over $3.5 million in a day, and is even bigger than similar companies in Russia.”

The Cuban official, for his part, welcomed any kind of cooperation with Iran, and thanked the Islamic Republic for its generous proposal on establishing a nanotechnology laboratory in his country.

This coupling is not quite so unexpected as Iran has been cozying up to all kinds of countries in its drive to establish itself as a nanotechnology leader.

A computer that intuitively predicts a molecule’s chemical properties

First, we have emotional artificial intelligence from MIT (Massachusetts Institute of Technology) with their Kismet [emotive AI] project and now we have intuitive computers according to an Oct. 14, 2016 news item on Nanowerk,

Scientists from Moscow Institute of Physics and Technology (MIPT)’s Research Center for Molecular Mechanisms of Aging and Age-Related Diseases together with Inria research center, Grenoble, France have developed a software package called Knodle to determine an atom’s hybridization, bond orders and functional groups’ annotation in molecules. The program streamlines one of the stages of developing new drugs.

An Oct. 14, 2016 Moscow Institute of Physics and Technology press release (also on EurekAlert), which originated the news item, expands on the theme,

Imagine that you were to develop a new drug. Designing a drug with predetermined properties is called drug-design. Once a drug has entered the human body, it needs to take effect on the cause of a disease. On a molecular level this is a malfunction of some proteins and their encoding genes. In drug-design these are called targets. If a drug is antiviral, it must somehow prevent the incorporation of viral DNA into human DNA. In this case the target is viral protein. The structure of the incorporating protein is known, and we also even know which area is the most important – the active site. If we insert a molecular “plug” then the viral protein will not be able to incorporate itself into the human genome and the virus will die. It boils down to this: you find the “plug” – you have your drug.

But how can we find the molecules required? Researchers use an enormous database of substances for this. There are special programs capable of finding a needle in a haystack; they use quantum chemistry approximations to predict the place and force of attraction between a molecular “plug” and a protein. However, databases only store the shape of a substance; information about atom and bond states is also needed for an accurate prediction. Determining these states is what Knodle does. With the help of the new technology, the search area can be reduced from hundreds of thousands to just a hundred. These one hundred can then be tested to find drugs such as Raltegravir – which has actively been used for HIV treatment since 2011.

From science lessons at school, everyone is used to seeing organic substances as letters with sticks (structural formulas), knowing that in actual fact there are no sticks. Every stick is a bond between electrons which obeys the laws of quantum chemistry. In the case of a simple molecule, an experienced chemist intuitively knows the hybridization of every atom (the number of neighboring atoms it is connected to) and, after a few hours with reference books, can re-establish all the bonds. They can do this because they have seen hundreds and hundreds of similar substances and know that if oxygen is “sticking out like this,” it almost certainly has a double bond. In their research, Maria Kadukova, a MIPT student, and Sergei Grudinin, a researcher from the Inria research center located in Grenoble, France, decided to pass this intuition on to a computer by using machine learning.

Compare “A solid hollow object with a handle, opening at the top and an elongation at the side, at the end of which there is another opening” and “A vessel for the preparation of tea”. Both of them describe a teapot rather well, but the latter is simpler and more believable. The same is true for machine learning: the best algorithm for learning is the simplest. This is why the researchers chose to use a nonlinear support vector machine (SVM), a method which has proven itself in recognizing handwritten text and images. On the input it was given the positions of neighboring atoms, and on the output it produced the hybridization.

Good learning needs a lot of examples, and the scientists provided this using 7,605 substances with known structures and atom states. “This is the key advantage of the program we have developed; learning from a larger database gives better predictions. Knodle is now one step ahead of similar programs: it has a margin of error of 3.9%, while for the closest competitor this figure is 4.7%,” explains Maria Kadukova. And that is not the only benefit. The software package can easily be modified for a specific problem. For example, Knodle does not currently work with substances containing metals, because those kinds of substances are rather rare. But if it turns out that a drug for Alzheimer’s is much more effective if it contains a metal, the only thing needed to adapt the program is a database of metallic substances. We are now left to wonder what new drug will be found to treat a previously incurable disease.
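The press release doesn’t include Knodle’s code or its actual descriptors, so the sketch below only illustrates the general idea of learning an atom’s hybridization from its local structure. For a dependency-free illustration, a 1-nearest-neighbor classifier stands in for the nonlinear SVM the authors used, and the tiny “training set” (neighbor counts and mean bond lengths in ångströms) is invented:

```python
import math

# Invented training examples: (number of bonded neighbors,
# mean bond length in angstroms) -> carbon hybridization label.
# A real system like Knodle learns from thousands of known structures.
TRAINING = [
    ((4, 1.54), "sp3"),  # e.g. carbon in ethane
    ((3, 1.40), "sp2"),  # e.g. aromatic carbon
    ((2, 1.20), "sp"),   # e.g. carbon in an alkyne
]

def predict_hybridization(neighbors, mean_bond_len):
    """1-nearest-neighbor stand-in for Knodle's nonlinear SVM."""
    def dist(features):
        n, length = features
        # Scale bond-length differences so both features matter comparably.
        return math.hypot(n - neighbors, (length - mean_bond_len) * 10)
    return min(TRAINING, key=lambda example: dist(example[0]))[1]

print(predict_hybridization(4, 1.52))  # sp3
print(predict_hybridization(2, 1.21))  # sp
```

The point is the shape of the pipeline (local geometric features in, discrete atom state out), not the classifier itself; swapping in a kernel SVM trained on thousands of annotated structures is what gives Knodle its reported accuracy.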

Scientists from MIPT's Research Center for Molecular Mechanisms of Aging and Age-Related Diseases together with Inria research center, Grenoble, France have developed a software package called Knodle to determine an atom's hybridization, bond orders and functional groups' annotation in molecules. The program streamlines one of the stages of developing new drugs. Credit: MIPT Press Office


Here’s a link to and a citation for the paper,

Knodle: A Support Vector Machines-Based Automatic Perception of Organic Molecules from 3D Coordinates by Maria Kadukova and Sergei Grudinin. J. Chem. Inf. Model., 2016, 56 (8), pp 1410–1419 DOI: 10.1021/acs.jcim.5b00512 Publication Date (Web): July 13, 2016

Copyright © 2016 American Chemical Society

This paper is behind a paywall.

Mechanism behind interaction of silver nanoparticles with the cells of the immune system

Scientists have come to a better understanding of the mechanism affecting silver nanoparticle toxicity according to an Aug. 30, 2016 news item on Nanowerk (Note: A link has been removed),

A senior fellow at the Faculty of Chemistry, MSU (Lomonosov Moscow State University), Vladimir Bochenkov together with his colleagues from Denmark succeeded in deciphering the mechanism of interaction of silver nanoparticles with the cells of the immune system. The study is published in the journal Nature Communications (“Dynamic protein coronas revealed as a modulator of silver nanoparticle sulphidation in vitro”).

‘Currently, a large number of products contain silver nanoparticles: antibacterial drugs, toothpaste, polishes, paints, filters, packaging, and medical and textile items. These products work because of the capacity of silver to dissolve under oxidation and form Ag+ ions with germicidal properties. At the same time, there are in vitro research data showing that silver nanoparticles are toxic to various organs, including the liver, brain and lungs. In this regard, it is essential to study the processes occurring with silver nanoparticles in biological environments, and the factors affecting their toxicity,’ says Vladimir Bochenkov.

Caption: Increased intensity of the electric field near the silver nanoparticle surface in the excitation of plasmon resonance. Credit: Vladimir Bochenkov


An Aug. 30, 2016 MSU press release on EurekAlert, which originated the news item, provides more information about the research,

The study is devoted to the protein corona – a layer of adsorbed protein molecules which forms on the surface of silver nanoparticles during their contact with a biological environment, for example blood. The protein corona masks the nanoparticles and largely determines their fate: the speed of elimination from the body, the ability to penetrate a particular cell type, the distribution between organs, etc.

According to the latest research, the protein corona consists of two layers: a hard corona of protein molecules tightly bound to the silver nanoparticles, and a soft corona of weakly bound protein molecules in dynamic equilibrium with the solution. Hitherto the soft corona has been studied very little because of experimental difficulties: when nanoparticles are separated from the protein solution, the weakly bound proteins easily desorb (leave the particle and remain in the solution), leaving only the hard corona on the nanoparticle surface.

The size of the studied silver nanoparticles was 50-88 nm, and the diameter of the proteins making up the corona was 3-7 nm. The scientists managed to study silver nanoparticles with their protein corona in situ, without removing them from the biological environment. Localized surface plasmon resonance was used to probe the environment near the surface of the silver nanoparticles, which made it possible to investigate the functions of the soft corona.

‘In this work we showed that the corona may affect the ability of the nanoparticles to dissolve into silver cations Ag+, which determine the toxic effect. In the absence of a soft corona (the protein layer that rapidly exchanges molecules with the environment), the silver cations bind to sulfur-containing amino acids in the serum medium, particularly cysteine and methionine, and precipitate as Ag2S nanocrystals in the hard corona,’ says Vladimir Bochenkov.

Ag2S (silver sulfide) readily forms on silver surfaces even in air in the presence of trace hydrogen sulfide. Sulfur is also part of many biomolecules in the body, prompting silver to react and be converted into the sulfide. Because of its low solubility, the formation of Ag2S nanocrystals reduces the bioavailability of the Ag+ ions, lowering the toxicity of the silver nanoparticles essentially to zero. With a sufficient amount of amino-acid sulfur sources available for reaction, all of the potentially toxic silver is converted into nontoxic, insoluble sulfide. The scientists showed that this is what happens in the absence of a soft corona.
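To get a feel for just how low the solubility of Ag2S is, here is a back-of-the-envelope equilibrium calculation (not from the paper; the solubility product used is a commonly quoted literature value and is an assumption here):

```python
# Rough estimate of free Ag+ in equilibrium with solid Ag2S.
# Dissolution: Ag2S(s) <-> 2 Ag+ + S2-, so Ksp = [Ag+]^2 [S2-].
KSP_AG2S = 6e-51  # mol^3/L^3; commonly cited literature value (assumption)

# If s mol/L of Ag2S dissolves, then [Ag+] = 2s and [S2-] = s,
# giving Ksp = (2s)^2 * s = 4 s^3.
s = (KSP_AG2S / 4) ** (1 / 3)  # molar solubility of Ag2S
ag_plus = 2 * s                # equilibrium free Ag+ concentration

print(f"Molar solubility of Ag2S: {s:.2e} M")   # on the order of 1e-17 M
print(f"Equilibrium [Ag+]:        {ag_plus:.2e} M")
```

A free-silver concentration on the order of 10^-17 M illustrates why sulfidation effectively removes bioavailable Ag+ — in a real serum medium, complexation and other equilibria would modify the exact number.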

In the presence of a soft corona, the Ag2S nanocrystals form in smaller quantities or not at all. The scientists attribute this to the weakly bound protein molecules carrying Ag+ ions away from the nanoparticles into the solution before the sulfide can crystallize. The soft-corona proteins thus act as ‘vehicles’ for the silver ions.

This effect, the scientists believe, should be taken into account when analyzing the stability of silver nanoparticles in a protein environment and when interpreting the results of toxicity studies. Viability studies of immune-system cells (macrophages of the murine J774 line) confirmed that sulfidation (in the absence of a soft corona) reduces the cytotoxicity of silver nanoparticles.

Vladimir Bochenkov’s task was to simulate the plasmon resonance spectra of the studied systems and to create a theoretical model allowing quantitative determination of the silver sulfide content in situ around the nanoparticles by following the changes in the absorption bands of the experimental spectra. Since the frequency of the plasmon resonance is sensitive to changes in the dielectric constant near the nanoparticle surface, changes in the absorption spectra carry information about the amount of silver sulfide formed.

Knowledge of the mechanisms of formation and the dynamics of the protein corona, together with information about its composition and structure, is extremely important for understanding the toxicity and hazards of nanoparticles for the human body. In the future, protein corona formation could be used to deliver drugs in the body, including for the treatment of cancer. For this it would be enough to select a protein corona composition that enables the silver nanoparticles to penetrate only cancer cells and kill them.

Here’s a link to and a citation for the paper describing this fascinating work,

Dynamic protein coronas revealed as a modulator of silver nanoparticle sulphidation in vitro by Teodora Miclăuş, Christiane Beer, Jacques Chevallier, Carsten Scavenius, Vladimir E. Bochenkov, Jan J. Enghild, & Duncan S. Sutherland. Nature Communications 7, Article number: 11770 doi:10.1038/ncomms11770 Published 09 June 2016

This paper is open access.