Brain stuff: quantum entanglement and a multi-dimensional universe

I have two brain news bits: one about neural networks and quantum entanglement, and another about multi-dimensional structures in the brain’s networks.

Quantum entanglement and neural networks

A June 13, 2017 news item on phys.org describes how machine learning can be used to solve problems in physics (Note: Links have been removed),

Machine learning, the field that’s driving a revolution in artificial intelligence, has cemented its role in modern technology. Its tools and techniques have led to rapid improvements in everything from self-driving cars and speech recognition to the digital mastery of an ancient board game.

Now, physicists are beginning to use machine learning tools to tackle a different kind of problem, one at the heart of quantum physics. In a paper published recently in Physical Review X, researchers from JQI [Joint Quantum Institute] and the Condensed Matter Theory Center (CMTC) at the University of Maryland showed that certain neural networks—abstract webs that pass information from node to node like neurons in the brain—can succinctly describe wide swathes of quantum systems.

An artist’s rendering of a neural network with two layers. At the top is a real quantum system, like atoms in an optical lattice. Below is a network of hidden neurons that capture their interactions (Credit: E. Edwards/JQI)

A June 12, 2017 JQI news release by Chris Cesare, which originated the news item, describes how neural networks can represent quantum entanglement,

Dongling Deng, a JQI Postdoctoral Fellow who is a member of CMTC and the paper’s first author, says that researchers who use computers to study quantum systems might benefit from the simple descriptions that neural networks provide. “If we want to numerically tackle some quantum problem,” Deng says, “we first need to find an efficient representation.”

On paper and, more importantly, on computers, physicists have many ways of representing quantum systems. Typically these representations comprise lists of numbers describing the likelihood that a system will be found in different quantum states. But it becomes difficult to extract properties or predictions from a digital description as the number of quantum particles grows, and the prevailing wisdom has been that entanglement—an exotic quantum connection between particles—plays a key role in thwarting simple representations.

The neural networks used by Deng and his collaborators—CMTC Director and JQI Fellow Sankar Das Sarma and Fudan University physicist and former JQI Postdoctoral Fellow Xiaopeng Li—can efficiently represent quantum systems that harbor lots of entanglement, a surprising improvement over prior methods.

What’s more, the new results go beyond mere representation. “This research is unique in that it does not just provide an efficient representation of highly entangled quantum states,” Das Sarma says. “It is a new way of solving intractable, interacting quantum many-body problems that uses machine learning tools to find exact solutions.”

Neural networks and their accompanying learning techniques powered AlphaGo, the computer program that beat some of the world’s best Go players last year (and the top player this year). The news excited Deng, an avid fan of the board game. Last year, around the same time as AlphaGo’s triumphs, a paper appeared that introduced the idea of using neural networks to represent quantum states, although it gave no indication of exactly how wide the tool’s reach might be. “We immediately recognized that this should be a very important paper,” Deng says, “so we put all our energy and time into studying the problem more.”

The result was a more complete account of the capabilities of certain neural networks to represent quantum states. In particular, the team studied neural networks that use two distinct groups of neurons. The first group, called the visible neurons, represents real quantum particles, like atoms in an optical lattice or ions in a chain. To account for interactions between particles, the researchers employed a second group of neurons—the hidden neurons—which link up with visible neurons. These links capture the physical interactions between real particles, and as long as the number of connections stays relatively small, the neural network description remains simple.

Specifying a number for each connection and mathematically forgetting the hidden neurons can produce a compact representation of many interesting quantum states, including states with topological characteristics and some with surprising amounts of entanglement.
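The networks described above are known in the machine-learning literature as restricted Boltzmann machines (RBMs). As a rough sketch (not the authors' code; all parameters below are arbitrary illustrative values), "mathematically forgetting" the hidden neurons means summing them out analytically, which leaves a compact formula for the amplitude of each visible spin configuration:

```python
import itertools
import numpy as np

def rbm_amplitude(spins, a, b, W):
    """Unnormalized RBM wavefunction amplitude psi(s) for visible spins s.

    Summing out ('forgetting') the hidden units analytically gives
    psi(s) = exp(a . s) * prod_j 2*cosh(b_j + sum_i W_ij * s_i).
    """
    s = np.asarray(spins, dtype=float)
    theta = b + W.T @ s  # effective field felt by each hidden unit
    return np.exp(a @ s) * np.prod(2 * np.cosh(theta))

# Toy system: 3 visible spins, 2 hidden units, small random couplings.
rng = np.random.default_rng(0)
n_vis, n_hid = 3, 2
a = rng.normal(scale=0.1, size=n_vis)           # visible biases
b = rng.normal(scale=0.1, size=n_hid)           # hidden biases
W = rng.normal(scale=0.1, size=(n_vis, n_hid))  # connections

# Amplitudes over all 2^3 configurations; squaring and normalizing
# gives the probability of observing each configuration.
configs = list(itertools.product([-1, 1], repeat=n_vis))
amps = np.array([rbm_amplitude(c, a, b, W) for c in configs])
probs = amps**2 / np.sum(amps**2)
```

The compactness claim shows up in the parameter count: the description needs only n_vis + n_hid + n_vis × n_hid numbers, while the number of quantum states grows as 2^n_vis.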

Beyond its potential as a tool in numerical simulations, the new framework allowed Deng and collaborators to prove some mathematical facts about the families of quantum states represented by neural networks. For instance, neural networks with only short-range interactions—those in which each hidden neuron is only connected to a small cluster of visible neurons—have a strict limit on their total entanglement. This technical result, known as an area law, is a research pursuit of many condensed matter physicists.

These neural networks can’t capture everything, though. “They are a very restricted regime,” Deng says, adding that they don’t offer an efficient universal representation. If they did, they could be used to simulate a quantum computer with an ordinary computer, something physicists and computer scientists think is very unlikely. Still, the collection of states that they do represent efficiently, and the overlap of that collection with other representation methods, is an open problem that Deng says is ripe for further exploration.

Here’s a link to and a citation for the paper,

Quantum Entanglement in Neural Network States by Dong-Ling Deng, Xiaopeng Li, and S. Das Sarma. Phys. Rev. X 7, 021021 – Published 11 May 2017

This paper is open access.

Blue Brain and the multidimensional universe

Blue Brain is a Swiss government brain research initiative which officially came to life in 2006 although the initial agreement between the École Polytechnique Fédérale de Lausanne (EPFL) and IBM was signed in 2005 (according to the project’s Timeline page). Moving on, the project’s latest research reveals something astounding (from a June 12, 2017 Frontiers Publishing press release on EurekAlert),

For most people, it is a stretch of the imagination to understand the world in four dimensions but a new study has discovered structures in the brain with up to eleven dimensions – ground-breaking work that is beginning to reveal the brain’s deepest architectural secrets.

Using algebraic topology in a way that it has never been used before in neuroscience, a team from the Blue Brain Project has uncovered a universe of multi-dimensional geometrical structures and spaces within the networks of the brain.

The research, published today in Frontiers in Computational Neuroscience, shows that these structures arise when a group of neurons forms a clique: each neuron connects to every other neuron in the group in a very specific way that generates a precise geometric object. The more neurons there are in a clique, the higher the dimension of the geometric object.
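In topological terms, a clique of n all-to-all connected neurons corresponds to an (n−1)-dimensional simplex, which is where the dimensions come from. Here is a minimal illustration of the counting idea (this treats connections as undirected for simplicity, whereas the study works with directed cliques, and it is not the Blue Brain code):

```python
from itertools import combinations

def all_cliques(nodes, edges):
    """Enumerate every all-to-all connected subset of nodes.

    A clique of n neurons corresponds to an (n-1)-dimensional simplex.
    """
    edge_set = {tuple(sorted(e)) for e in edges}
    cliques = []
    for size in range(1, len(nodes) + 1):
        for subset in combinations(nodes, size):
            # every pair in the subset must be connected
            if all(tuple(sorted(p)) in edge_set
                   for p in combinations(subset, 2)):
                cliques.append(subset)
    return cliques

# Toy network of 4 neurons: a fully connected triangle {0, 1, 2}
# plus a neuron 3 linked only to neuron 2.
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
cliques = all_cliques(range(4), edges)
top_dimension = max(len(c) for c in cliques) - 1  # the triangle: a 2-simplex
```

Counting cliques this way scales badly with network size; part of the computational feat in the study was doing this kind of enumeration on a reconstruction containing tens of thousands of neurons.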

“We found a world that we had never imagined,” says neuroscientist Henry Markram, director of Blue Brain Project and professor at the EPFL in Lausanne, Switzerland, “there are tens of millions of these objects even in a small speck of the brain, up through seven dimensions. In some networks, we even found structures with up to eleven dimensions.”

Markram suggests this may explain why it has been so hard to understand the brain. “The mathematics usually applied to study networks cannot detect the high-dimensional structures and spaces that we now see clearly.”

If 4D worlds stretch our imagination, worlds with 5, 6 or more dimensions are too complex for most of us to comprehend. This is where algebraic topology comes in: a branch of mathematics that can describe systems with any number of dimensions. The mathematicians who brought algebraic topology to the study of brain networks in the Blue Brain Project were Kathryn Hess from EPFL and Ran Levi from Aberdeen University.

“Algebraic topology is like a telescope and microscope at the same time. It can zoom into networks to find hidden structures – the trees in the forest – and see the empty spaces – the clearings – all at the same time,” explains Hess.

In 2015, Blue Brain published the first digital copy of a piece of the neocortex – the most evolved part of the brain and the seat of our sensations, actions, and consciousness. In this latest research, using algebraic topology, multiple tests were performed on the virtual brain tissue to show that the multi-dimensional brain structures discovered could never be produced by chance. Experiments were then performed on real brain tissue in the Blue Brain’s wet lab in Lausanne confirming that the earlier discoveries in the virtual tissue are biologically relevant and also suggesting that the brain constantly rewires during development to build a network with as many high-dimensional structures as possible.

When the researchers presented the virtual brain tissue with a stimulus, cliques of progressively higher dimensions assembled momentarily to enclose high-dimensional holes that the researchers refer to as cavities. “The appearance of high-dimensional cavities when the brain is processing information means that the neurons in the network react to stimuli in an extremely organized manner,” says Levi. “It is as if the brain reacts to a stimulus by building then razing a tower of multi-dimensional blocks, starting with rods (1D), then planks (2D), then cubes (3D), and then more complex geometries with 4D, 5D, etc. The progression of activity through the brain resembles a multi-dimensional sandcastle that materializes out of the sand and then disintegrates.”
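These “cavities” are what algebraic topologists count with Betti numbers. A toy sketch of the one-dimensional case, assuming an undirected graph whose cliques are filled in as simplices (the study itself uses directed complexes and much higher dimensions; this is not the published analysis code):

```python
from itertools import combinations
import numpy as np

def betti_1(vertices, edges):
    """First Betti number: the count of 1-dimensional 'cavities'
    (loops that are not filled in by triangles)."""
    verts = sorted(vertices)
    edges = sorted(tuple(sorted(e)) for e in edges)
    edge_set = set(edges)
    triangles = [t for t in combinations(verts, 3)
                 if all(p in edge_set for p in combinations(t, 2))]
    # Boundary matrix d1: edges -> vertices.
    vidx = {v: i for i, v in enumerate(verts)}
    d1 = np.zeros((len(verts), len(edges)))
    for j, (u, v) in enumerate(edges):
        d1[vidx[u], j], d1[vidx[v], j] = -1.0, 1.0
    # Boundary matrix d2: triangles -> edges.
    eidx = {e: i for i, e in enumerate(edges)}
    d2 = np.zeros((len(edges), max(len(triangles), 1)))
    for j, (x, y, z) in enumerate(triangles):
        d2[eidx[(x, y)], j] = 1.0
        d2[eidx[(x, z)], j] = -1.0
        d2[eidx[(y, z)], j] = 1.0
    rank = np.linalg.matrix_rank
    # b1 = dim(ker d1) - rank(d2): independent loops minus filled ones.
    return int(len(edges) - rank(d1) - (rank(d2) if triangles else 0))

# A loop of 4 neurons with no chords encloses one unfilled cycle.
holes = betti_1([0, 1, 2, 3], [(0, 1), (1, 2), (2, 3), (3, 0)])
```

Adding a chord or a filling triangle closes the cavity; higher-dimensional Betti numbers are computed the same way, from boundary matrices one dimension up.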

The big question these researchers are asking now is whether the intricacy of tasks we can perform depends on the complexity of the multi-dimensional “sandcastles” the brain can build. Neuroscience has also been struggling to find where the brain stores its memories. “They may be ‘hiding’ in high-dimensional cavities,” Markram speculates.


About Blue Brain

The aim of the Blue Brain Project, a Swiss brain initiative founded and directed by Professor Henry Markram, is to build accurate, biologically detailed digital reconstructions and simulations of the rodent brain, and ultimately, the human brain. The supercomputer-based reconstructions and simulations built by Blue Brain offer a radically new approach for understanding the multilevel structure and function of the brain. http://bluebrain.epfl.ch

About Frontiers

Frontiers is a leading community-driven open-access publisher. By taking publishing entirely online, we drive innovation with new technologies to make peer review more efficient and transparent. We provide impact metrics for articles and researchers, and merge open access publishing with a research network platform – Loop – to catalyse research dissemination, and popularize research to the public, including children. Our goal is to increase the reach and impact of research articles and their authors. Frontiers has received the ALPSP Gold Award for Innovation in Publishing in 2014. http://www.frontiersin.org.

Here’s a link to and a citation for the paper,

Cliques of Neurons Bound into Cavities Provide a Missing Link between Structure and Function by Michael W. Reimann, Max Nolte, Martina Scolamiero, Katharine Turner, Rodrigo Perin, Giuseppe Chindemi, Paweł Dłotko, Ran Levi, Kathryn Hess, and Henry Markram. Front. Comput. Neurosci., 12 June 2017 | https://doi.org/10.3389/fncom.2017.00048

This paper is open access.

Nanotechnology, math, cancer, and a boxing metaphor

Violent metaphors in medicine are not unusual, although the reference is usually to war rather than boxing, as it is in this news from the University of Waterloo (Canada). Still, it seems counterintuitive to link violence so closely with healing, but the practice is well entrenched and attempts to counteract it seem to be a ‘losing battle’ (pun intended).

Credit: Gabriel Picolo “2-in-1 punch.” Courtesy: University of Waterloo

A June 23, 2016 news item on ScienceDaily describes a new approach to cancer therapy,

Math, biology and nanotechnology are becoming strange, yet effective bedfellows in the fight against cancer treatment resistance. Researchers at the University of Waterloo and Harvard Medical School have engineered a revolutionary new approach to cancer treatment that pits a lethal combination of drugs together into a single nanoparticle.

Their work, published online on June 3, 2016 in the leading nanotechnology journal ACS Nano, describes a new method of shrinking tumors and preventing resistance in aggressive cancers by activating two drugs within the same cell at the same time.

A June 23, 2016 University of Waterloo news release (also on EurekAlert), which originated the news item, provides more information,

Every year thousands of patients die from recurrent cancers that have become resistant to therapy, resulting in one of the greatest unsolved challenges in cancer treatment. By tracking the fate of individual cancer cells under pressure of chemotherapy, biologists and bioengineers at Harvard Medical School studied a network of signals and molecular pathways that allow the cells to generate resistance over the course of treatment.

Using this information, a team of applied mathematicians led by Professor Mohammad Kohandel at the University of Waterloo developed a mathematical model incorporating algorithms that define the phenotypic cell-state transitions of cancer cells in real time while under attack by an anticancer agent. The mathematical simulations enabled them to define the exact molecular behavior and pathways of signals that allow cancer cells to survive treatment over time.

They discovered that the PI3K/AKT kinase, which is often over-activated in cancers, enables cells to undergo a resistance program when pressured with the cytotoxic chemotherapy known as Taxanes, which are conventionally used to treat aggressive breast cancers. This revolutionary window into the life of a cell reveals that vulnerabilities to small molecule PI3K/AKT kinase inhibitors exist, and can be targeted if they are applied in the right sequence with combinations of other drugs.

Previous theories of drug resistance have relied on the hypothesis that only certain, “privileged” cells can overcome therapy. The mathematical simulations demonstrate that, under the right conditions and signaling events, any cell can develop a resistance program.
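That last point — that any cell can develop resistance given the right signaling — can be illustrated with a toy two-state Markov model of phenotypic switching. The transition probabilities below are invented for illustration only; the model in the paper is far more detailed:

```python
import numpy as np

# Two phenotypic states: index 0 = drug-sensitive, 1 = drug-tolerant.
# Hypothetical per-treatment-step switching probabilities under drug pressure.
P = np.array([[0.95, 0.05],   # sensitive -> sensitive / tolerant
              [0.02, 0.98]])  # tolerant  -> sensitive / tolerant

state = np.array([1.0, 0.0])  # begin with an entirely sensitive population
for _ in range(200):          # iterate treatment steps toward steady state
    state = state @ P

# The chain converges to the same stationary distribution regardless of
# the starting population: a tolerant subpopulation emerges from ordinary
# cells, with no 'privileged' subset required.
tolerant_fraction = state[1]  # stationary value is 0.05/(0.05+0.02) = 5/7
```

Because the stationary distribution is independent of the initial state, the emergence of a tolerant fraction here is a property of the transition dynamics themselves, not of any special starting cells — which is the intuition behind the quoted claim.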

“Only recently have we begun to appreciate how important mathematics and physics are to understanding the biology and evolution of cancer,” said Professor Kohandel. “In fact, there is now increasing synergy between these disciplines, and we are beginning to appreciate how critical this information can be to create the right recipes to treat cancer.”

Although previous studies explored the use of drug combinations to treat cancer, the one-two punch approach is not always successful. In the new study, led by Professor Aaron Goldman, a faculty member in the division of Engineering in Medicine at Brigham and Women’s Hospital, the scientists realized a major shortcoming of the combination therapy approach is that both drugs need to be active in the same cell, something that current delivery methods can’t guarantee.

“We were inspired by the mathematical understanding that a cancer cell rewires the mechanisms of resistance in a very specific order and time-sensitive manner,” said Professor Goldman. “By developing a 2-in-1 nanomedicine, we could ensure the cell that was acquiring this new resistance saw the lethal drug combination, shutting down the survival program and eliminating the evidence of resistance. This approach could redefine how clinicians deliver combinations of drugs in the clinic.”

The approach the bioengineers took was to build a single nanoparticle, inspired by computer models, that exploits a technique known as supramolecular chemistry. This nanotechnology enables scientists to build cholesterol-tethered drugs together from “Tetris-like” building blocks that self-assemble, incorporating multiple drugs into stable, individual nano-vehicles that target tumors through the leaky vasculature. This 2-in-1 strategy ensures that resistance to therapy never has a chance to develop, bringing together the right recipe to destroy surviving cancer cells.

Using mouse models of aggressive breast cancer, the scientists confirmed the predictions from the mathematical model that both drugs must be deterministically delivered to the same cell.

Here’s a link to and a citation for the paper,

Rationally Designed 2-in-1 Nanoparticles Can Overcome Adaptive Resistance in Cancer by Aaron Goldman, Ashish Kulkarni, Mohammad Kohandel, Prithvi Pandey, Poornima Rao, Siva Kumar Natarajan, Venkata Sabbisetti, and Shiladitya Sengupta. ACS Nano, Article ASAP DOI: 10.1021/acsnano.6b00320 Publication Date (Web): June 03, 2016

Copyright © 2016 American Chemical Society

This paper is behind a paywall.

The researchers have made this illustration of their work available,

Courtesy: American Chemical Society

Banksy and the mathematicians

Assuming you’ve heard of Banksy (if not, he’s an internationally known graffiti artist), then you understand that no one knows his real name for certain although there are strong suspicions, as of 2008, that he is Robin Gunningham. It seems the puzzle has aroused scientific curiosity according to a March 4, 2016 article by Jill Lawless on CBC (Canadian Broadcasting Corporation) News online,

Elusive street artist Banksy may have been unmasked — by mathematics.

Scientists have applied a type of modelling used to track down criminals and map disease outbreaks to identify the graffiti artist, whose real name has never been confirmed.

The technique, known as geographic profiling, is used by police forces to narrow down lists of suspects by calculating from multiple crime sites where the offender most likely lives.

The March 3, 2016 article in The Economist about the Banksy project describes the model used to derive his identity in more detail,

Their system, Dirichlet process mixture modelling, is more sophisticated than the criminal geographic targeting (CGT) currently favoured by crime-fighters. CGT is based on a simple assumption: that crimes happen near to where those responsible reside. Plot out an incident map and the points should surround the criminal like a doughnut (malefactors tend not to offend on their own doorsteps, but nor do they stray too far). The Dirichlet model allows for more than one “source”—a place relevant to a suspect such as home, work or a frequent pit stop on a commute—but makes no assumptions about their number; it automatically parses the mess of crime sites into clusters of activity.

Then, for each site, it calculates the probability that the given array of activity, and the way it is clustered, would result from any given source. Through a monumental summing of probabilities across each and every possible combination of sources, the model spits out the most likely ones, with considerable precision—down to 50 metres or so in some cases.
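The flavour of the calculation can be sketched with a much simpler single-source version of the distance-decay (“doughnut”) idea — this is not the published Dirichlet process mixture model, and both the kernel and the coordinates are invented for illustration:

```python
import numpy as np

def geoprofile(sites, grid_x, grid_y, buffer=1.0, decay=0.5):
    """Score candidate anchor points for a set of incident sites.

    Hypothetical kernel: the log-likelihood peaks at distance `buffer`
    from each site (offenders avoid their own doorstep) and falls off
    beyond it, producing the 'doughnut' shape described above.
    """
    score = np.zeros((len(grid_y), len(grid_x)))
    for sx, sy in sites:
        for i, gy in enumerate(grid_y):
            for j, gx in enumerate(grid_x):
                d = np.hypot(gx - sx, gy - sy)
                score[i, j] += -decay * (d - buffer) ** 2
    return score

# Toy incidents roughly encircling the point (5, 5).
sites = [(4, 5), (6, 5), (5, 4), (5, 6)]
xs = ys = np.linspace(0, 10, 101)
score = geoprofile(sites, xs, ys)
i, j = np.unravel_index(np.argmax(score), score.shape)
best = (xs[j], ys[i])  # most likely anchor point
```

The Dirichlet process mixture generalizes this by allowing an unknown number of anchor points and inferring the clustering of sites and the sources jointly, which is what lets it separate home from workplace from commute.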

While this seems like harmless mathematical modeling, Banksy’s lawyers were sufficiently concerned about how this work would be promoted that they contacted the publisher, according to Jonathan Webb’s March 3, 2016 article for BBC (British Broadcasting Corporation) News online,

A study that tests the method of geographical profiling on Banksy has appeared, after a delay caused by an intervention from the artist’s lawyers.

Scientists at Queen Mary University of London found that the distribution of Banksy’s famous graffiti supported a previously suggested real identity.

The study was due to appear in the Journal of Spatial Science a week ago.

The BBC understands that Banksy’s legal team contacted QMUL staff with concerns about how the study was to be promoted.

Those concerns apparently centred on the wording of a press release, which has now been withdrawn.

Taylor and Francis, which publishes the journal, said that the research paper itself had not been questioned. It appeared online on Thursday [March 3, 2016] unchanged, after being placed “on hold” while conversations between lawyers took place.

The scientists conducted the study to demonstrate the wide applicability of geoprofiling – but also out of interest, said biologist Steve Le Comber, “to see whether it would work”.

The criminologist and former detective who pioneered geoprofiling, Canadian Dr Kim Rossmo [emphasis mine] – now at Texas State University in the US – is a co-author on the paper.

The researchers say their findings support the use of such profiling in counter-terrorism, based on the idea that minor “terrorism-related acts” – like graffiti – could help locate bases before more serious incidents unfold.

I believe the biologist Steve Le Comber is interested in applying the technique to epidemiology (study of patterns in health and disease in various populations). As for Dr. Rossmo, he featured in one of the more bizarre incidents in Vancouver Police Department (VPD) history as described in the Kim Rossmo entry on Wikipedia (Note: Links have been removed),

D. Kim Rossmo is a Canadian criminologist specializing in geographic profiling. He joined the Vancouver Police Department as a civilian employee in 1978 and became a sworn officer in 1980. In 1987 he received a master’s degree in criminology from Simon Fraser University and in 1995 became the first police officer in Canada to obtain a doctorate in criminology.[1] His dissertation research resulted in a new criminal investigative methodology called geographic profiling, based on Rossmo’s formula. This technology was integrated into a specialized crime analysis software product called Rigel. The Rigel product is developed by the software company Environmental Criminology Research Inc. (ECRI), which Rossmo co-founded.[2]

In 1995, he was promoted to detective inspector and founded a geographic profiling section within the Vancouver Police Department. In 1998, his analysis of cases of missing sex trade workers determined that a serial killer was at work, a conclusion ultimately vindicated by the arrest and conviction of Robert Pickton in 2002. A retired Vancouver police staff sergeant has claimed that animosity toward Rossmo delayed the arrest of Pickton, leaving him free to carry out additional murders.[3] His analytic results were not accepted at the time and after a dispute with senior members of the department he left in 2001. His unsuccessful lawsuit against the Vancouver Police Board for wrongful dismissal exposed considerable apparent dysfunction within that department.[1]

It’s still boggles my mind and the reporters covering story that the VPD would dismiss someone who was being lauded internationally for his work and had helped the department solve a very nasty case. In any event, Dr. Rossmo is now at Texas State University.

Getting back to Banksy and geographic profiling, here’s a link to and a citation for the paper,

Tagging Banksy: using geographic profiling to investigate a modern art mystery by Michelle V. Hauge, Mark D. Stevenson, D. Kim Rossmo & Steven C. Le Comber. Journal of Spatial Science DOI: 10.1080/14498596.2016.1138246 Published online: 03 Mar 2016

This paper is behind a paywall.

For anyone curious about Banksy’s work, here’s an image from this Wikipedia entry,

Stencil on the waterline of The Thekla, an entertainment boat in central Bristol – (wider view). The section of the hull with this picture has now been removed and is on display at the M Shed museum. The image of Death is based on a nineteenth-century etching illustrating the pestilence of The Great Stink.[19] Artist: Banksy – Photographed by Adrian Pingstone

#BCTECH: being at the Summit (Jan. 18-19, 2016)

#BCTECH Summit 2016, a joint event between the province of British Columbia (BC, Canada) and the BC Innovation Council (BCIC), a crown corporation formerly known as the Science Council of British Columbia, launched on Jan. 18, 2016. I have written a preview (Jan. 17, 2016 posting) and a commentary on the new #BCTECH strategy (Jan. 19, 2016 posting) announced by British Columbia Premier Christy Clark on the opening day (Jan. 18, 2016) of the summit.

I was primarily interested in the trade show/research row/technology showcase aspect of the summit focusing (but not exclusively) on nanotechnology. Here’s what I found,

Nano at the Summit

  • Precision NanoSystems: fabricates equipment that allows researchers to create polymer nanoparticles for delivering medications.

One of the major problems with creating nanoparticles is ensuring a consistent size and rapid production. According to Shell Ip, a Precision NanoSystems field application scientist, their NanoAssemblr Platform has solved the consistency problem and a single microfluidic cartridge can produce 15 ml in two minutes. Cartridges can run in parallel for maximum efficiency when producing nanoparticles in greater quantity.

The NanoAssemblr Platform is in use in laboratories around the world (I think the number is 70) and you can find out more on the company’s About our technology webpage,

The NanoAssemblr™ Platform

The microfluidic approach to particle formulation is at the heart of the NanoAssemblr Platform. This well-controlled process mediates bottom-up self-assembly of nanoparticles with reproducible sizes and low polydispersity. Users can control size by process and composition, and adjust parameters such as mixing ratios, flow rate and lipid composition in order to fine-tune nanoparticle size, encapsulation efficiency and much more. The system technology enables manufacturing scale-up through microfluidic reactor parallelization similar to the arraying of transistors on an integrated chip. Superior design ensures that the platform is fast and easy to use with a software controlled manufacturing process. This usability allows for the simplified transfer of manufacturing protocols between sites, which accelerates development, reduces waste and ultimately saves money. Precision NanoSystems’ flagship product is the NanoAssemblr™ Benchtop Instrument, designed for rapid prototyping of novel nanoparticles. Preparation time on the system is streamlined to approximately one minute, with the ability to complete 30 formulations per day in the hands of any user.

The company is located on property known as the Endowment Lands or, more familiarly, the University of British Columbia (UBC).

A few comments before moving on: being able to standardize the production of medicine-bearing nanoparticles is a tremendous step forward, one that is going to help scientists deal with other issues. Despite all the talk in the media about delivering nanoparticles with medication directly to diseased cells, there are transport issues: (1) getting the medicine to the right location/organ and (2) getting the medicine into the cell. My Jan. 12, 2016 posting featured a project with Malaysian scientists and a team at Harvard University who are tackling the transport (and other nanomedicine) issues as they relate to the lung. As well, I have a Nov. 26, 2015 posting which explores a controversy about nanoparticles getting past the ‘cell walls’ into the nucleus of the cell.

The next ‘nano’ booths were,

  • 4D Labs located at Simon Fraser University (SFU) was initially hailed as a nanotechnology facility but these days they’re touting themselves as an ‘advanced materials’ facility. Same thing, different branding.

They advertise services including hands-on training for technology companies and academics. There is a nanoimaging facility and nanofabrication facility, amongst others.

I spoke with their operations manager, Nathaniel Sieb, who mentioned a few of the local companies that use their facilities. (1) Nanotech Security (featured here most recently in a Dec. 29, 2015 post), an SFU spinoff company, does some of its anticounterfeiting research work at 4D Labs. (2) Switch Materials (a smart window company, electrochromic windows if memory serves) also uses the facilities. It is Neil Branda’s (4D Labs Executive Director) company and I have been waiting impatiently (my May 14, 2010 post was my first one about Switch) for either his or someone else’s electrochromic windows to come to market; they could eliminate or reduce the need for air conditioning during hotter periods and reduce the need for heat in colder periods. Sieb tells me I’ll have to wait longer for Switch. (3) A graduate student was presenting his work at the booth: a handheld diagnostic device that can be attached to a smartphone to transmit data to the cloud. While the first application is for diabetics, there are many other possibilities. Unfortunately, glucose testing means you need to produce blood; when I suggested my preference for saliva, the student explained some of the difficulties. Apparently, your saliva changes dynamically and frequently, and something as simple as taking a sip of orange juice could result in a false reading. Our conversation (mine, Sieb’s and the student’s) also drifted into the difficulties of bringing products to market. Sadly, we were not able to solve that problem in our 10-minute conversation.

  • FPInnovations is a scientific research centre and network for the forestry sector. They had a display near their booth which was like walking into a peculiar forest (I was charmed). The contrast with the less imaginative approaches all around was striking.

FPInnovations helped to develop cellulose nanocrystals (CNC), then called nanocrystalline cellulose (NCC), and I was hoping to be updated about CNC and about the spinoff company CelluForce. The researcher I spoke to was from Sweden and his specialty was business development. He didn’t know much about CNC in Canada and, when I commented on how active Sweden has been in its pursuit of a CNC application, he noted Finland has been the most active. The researcher noted that making the new materials derived from the forest, such as CNC, affordable and easy to produce, for applications that have yet to be developed, is both a necessity and a challenge. He mentioned that cultural changes also need to take place: Canadians are accustomed to slicing away and discarding most of the tree instead of using as much of it as possible, and we also need to move beyond the construction and pulp & paper sectors (my Feb. 15, 2012 posting featured nanocellulose research in Sweden where sludge was the base material).

Other interests at the Summit

I visited:

  • “The Wearable Lower Limb Anthropomorphic Exoskeleton (WLLAE) – a lightweight, battery-operated and ergonomic robotic system to help those with mobility issues improve their lives. The exoskeleton features joints and links that correspond to those of a human body and sync with motion. SFU has designed, manufactured and tested a proof-of-concept prototype and the current version can mimic all the motions of hip joints.” The researchers (Siamak Arzanpour and Edward Park) pointed out that the ability to mimic all the motions of the hip is a big difference between their system and others which only allow the leg to move forward or back. They rushed the last couple of months to get this system ready for the Summit. In fact, they received their patent for the system the night before (Jan. 17, 2016) the Summit opened.

It’s the least imposing of the exoskeletons I’ve seen (there’s a description of one of the first successful exoskeletons in a May 20, 2014 posting; if you scroll down to the end you’ll see an update about the device’s unveiling at the 2014 World Cup [soccer/football] in Brazil).

Unfortunately, there aren’t any pictures of WLLAE yet and the proof-of-concept version may differ significantly from the final version. This system could be used to help people regain movement (paralysis/frail seniors) and I believe there’s a possibility it could be used to enhance human performance (soldiers/athletes). The researchers still have some significant hoops to jump through before getting to the human clinical trial stage. They need to refine their apparatus, ensure that it can be safely operated, and further develop the interface between human and machine. I believe WLLAE is considered a neuroprosthetic device. While it’s not a fake leg or arm, it enables movement (prosthetic) and it operates on brain waves (neuro). It’s a very exciting area of research; consequently, there’s a lot of international competition.

  • Delightfully, after losing contact for a while, I reestablished it with the folks (Sean Lee, Head External Relations and Jim Hanlon, Chief Administrative Officer) at TRIUMF (Canada’s national laboratory for particle and nuclear physics). It’s a consortium of 19 Canadian research institutions (12 full members and seven associate members).

It’s a little disappointing that TRIUMF wasn’t featured in the opening for the Summit since the institution houses theoretical, experimental, and applied science work. It’s a major BC (and Canada) science and technology success story. My latest post (July 16, 2015) about their work featured researchers from California (US) using the TRIUMF cyclotron for imaging nanoscale materials and, on the more practical side, there’s a Mar. 6, 2015 posting about their breakthrough for producing nuclear material-free medical isotopes. Plus, Maclean’s Magazine ran a Jan. 3, 2016 article by Kate Lunau profiling an ‘art/science’ project that took place at TRIUMF (Note: Links have been removed),

It’s not every day that most people get to peek inside a world-class particle physics lab, where scientists probe deep mysteries of the universe. In September [2015], Vancouver’s TRIUMF—home to the world’s biggest cyclotron, a type of particle accelerator—opened its doors to professional and amateur photographers, part of an event called Global Physics Photowalk 2015. (Eight labs around the world participated, including CERN [European particle physics laboratory], in Geneva, where the Higgs boson particle was famously discovered.)

Here’s the local (Vancouver) jury’s pick for the winning image (from the Nov. 4, 2015 posting [Winning Photographs Revealed] by Alexis Fong on the TRIUMF website),

Caption: DESCANT (at TRIUMF) neutron detector array composed of 70 hexagonal detectors Credit: Pamela Joe McFarlane

With all those hexagons and a spherical shape, the DESCANT looks like a ‘buckyball’ or buckminsterfullerene or C60 to me.

I hope the next Summit features TRIUMF and/or some other endeavours which exemplify Science, Technology, and Creativity in British Columbia and Canada.

Onto the last booth,

  • MITACS was originally one of the Canadian federal government’s Networks of Centres of Excellence projects. It was focused on mathematics, networking, and innovation but once the money ran out, the organization took a new direction. These days, it describes itself as (from their About page) “a national, not-for-profit organization that has designed and delivered research and training programs in Canada for 15 years. Working with 60 universities, thousands of companies, and both federal and provincial governments, we build partnerships that support industrial and social innovation in Canada.” Their Jan. 19, 2016 news release (coincidental with the #BCTECH Summit, Jan. 18 – 19, 2016?) features a new report about improving international investment in Canada,

    Opportunities to improve Canada’s attractiveness for R&D investment were identified:

    1. Canada needs to better incentivize R&D by rebalancing direct and indirect support measures

    2. Canada requires a coordinated, client-centric approach to incentivizing R&D

    3. Canada needs to invest in training programs that grow the knowledge economy”

    Oddly, entrepreneurial/corporate/business types never have a problem with government spending when the money is coming to them; it’s only a problem when it’s social services.

    Back to MITACS, one of their more interesting (to me) projects was announced at the 2015 Canadian Science Policy Conference. MITACS has inaugurated a Canadian Science Policy Fellowships programme which in its first year (a pilot) will see up to 10 academics applying their expertise to policy-making while embedded in various federal government agencies. I don’t believe anything similar has occurred here in Canada although, if memory serves, the Brits have a similar programme.

    Finally, I offer kudos to Sherry Zhao, MITACS Business Development Specialist, the only person to ask me how her organization might benefit my business. Admittedly, I didn’t talk to a lot of people but it’s striking to me that at an ‘innovation and business’ tech summit, only one person approached me about doing business. Of course, I’m not a male aged between 25 and 55. So, extra kudos to Sherry Zhao and MITACS.

Christy Clark (Premier of British Columbia), in her opening comments, stated 2800 (they were expecting about 1000) had signed up for the #BCTECH Summit. I haven’t been able to verify that number or get other additional information, e.g., business deals, research breakthroughs, etc. announced at the Summit. Regardless, it was exciting to attend and find out about the latest and greatest on the BC scene.

I wish all the participants great and good luck and look forward to next year’s Summit, where perhaps we’ll hear about how the province plans to help with the ‘manufacturing middle’ issue. For new products, you need facilities capable of reproducing your devices at a speed that satisfies your customers; see my Feb. 10, 2014 post featuring a report on this and other similar issues from the US Government Accountability Office.

*’BCTECH Summit 2016′ link added Jan. 21, 2016.

Mathematics, music, art, architecture, culture: Bridges 2015

Thanks to Alex Bellos and Tash Reith-Banks for their July 30, 2015 posting on the Guardian science blog network for pointing towards the Bridges 2015 conference,

The Bridges Conference is an annual event that explores the connections between art and mathematics. Here is a selection of the work being exhibited this year, from a Pi pie which vibrates the number pi onto your hand to delicate paper structures demonstrating number sequences. This year’s conference runs until Sunday in Baltimore (Maryland, US).

To whet your appetite, here’s the Pi pie (from the Bellos/Reith-Banks posting),

Pi Pie by Evan Daniel Smith
Arduino, vibration motors, tinted silicone, pie tin
“This pie buzzes the number pi onto your hand. I typed pi from memory into a computer while using a program I wrote to record it and send it to motors in the pie. The placement of the vibrations on the five fingers uses the structure of the Japanese soroban abacus, and bears a resemblance to Asian hand mnemonics.”
Photograph: The Bridges Organisation
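Smith’s description of placing vibrations on the five fingers “using the structure of the Japanese soroban abacus” suggests a natural digit encoding: on a soroban, each digit is represented by one “heaven” bead worth five and four “earth” beads worth one each. Here’s a minimal sketch of how that might map to fingers (thumb as the heaven bead, the other four fingers as earth beads); the actual motor layout in the Pi Pie isn’t documented in the posting, so this mapping is an illustrative assumption.

```python
# Hypothetical soroban-style mapping of pi's digits to five-finger
# vibration patterns: the thumb plays the "heaven" bead (worth 5),
# the four remaining fingers play "earth" beads (worth 1 each).

PI_DIGITS = "31415926535"

def soroban_fingers(digit: int) -> list[str]:
    """Return which of the five fingers buzz for a digit 0-9."""
    fingers = []
    if digit >= 5:
        fingers.append("thumb")          # heaven bead: contributes 5
        digit -= 5
    earth = ["index", "middle", "ring", "pinky"]
    fingers.extend(earth[:digit])        # earth beads: contribute 1 each
    return fingers

for d in PI_DIGITS:
    print(d, soroban_fingers(int(d)))
```

For example, the digit 7 would buzz the thumb (5) plus the index and middle fingers (1 + 1), mirroring how the digit is set on an abacus rod.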

You can find out more about Bridges 2015 here and, should you be in the vicinity of Baltimore, Maryland, as a member of the public, you are invited to view the artworks on July 31, 2015,

July 29 – August 1, 2015 (Wednesday – Saturday)
Excursion Day: Sunday, August 2
A Collaborative Effort by
The University of Baltimore and Bridges Organization

A Five-Day Conference and Excursion
Wednesday, July 29 – Saturday, August 1
(Excursion Day on Sunday, August 2)

The Bridges Baltimore Family Day on Friday afternoon July 31 will be open to the Public to visit the BB Art Exhibition and participate in a series of events such as BB Movie Festival, and a series of workshops.

I believe the conference is being held at the University of Baltimore. Presumably, that’s where you’ll find the art show, etc.

Wilkinson Prize for numerical software: call for 2015 submissions

The Wilkinson Prize is not meant to recognize a nice, shiny new algorithm; rather, it’s meant for the implementation phase and, as anyone who has ever been involved in that phase of a project can tell you, that phase is often sadly neglected. So, bravo for the Wilkinson Prize!

From the March 27, 2014 Numerical Algorithms Group (NAG) news release, here’s a brief history of the Wilkinson Prize,

Every four years the Numerical Algorithms Group (NAG), the National Physical Laboratory (NPL) and Argonne National Laboratory award the prestigious Wilkinson Prize in honour of the outstanding contributions of Dr James Hardy Wilkinson to the field of numerical software. The next Wilkinson Prize will be awarded at the [2015] International Congress on Industrial and Applied Mathematics in Beijing, and will consist of a $3000 cash prize.

NAG, NPL [UK National Physical Laboratory] and Argonne [US Dept. of Energy, Argonne National Laboratory] are committed to encouraging innovative, insightful and original work in numerical software in the same way that Wilkinson inspired many throughout his career. Wilkinson worked on the Automatic Computing Engine (ACE) while at NPL and later authored numerous papers on his speciality, numerical analysis. He also authored many of the routines for matrix computation in the early marks of the NAG Library.

The most recent Wilkinson Prize was awarded in 2011 to Andreas Waechter and Carl D. Laird for IPOPT. Commenting on winning the Wilkinson Prize Carl D. Laird, Associate Professor at the School of Chemical Engineering, Purdue University, said “I love writing software, and working with Andreas on IPOPT was a highlight of my career. From the beginning, our goal was to produce great software that would be used by other researchers and provide solutions to real engineering and scientific problems.

The Wilkinson Prize is one of the few awards that recognises the importance of implementation – that you need more than a great algorithm to produce high-impact numerical software. It rewards the tremendous effort required to ensure reliability, efficiency, and usability of the software.”

Here’s more about the prize (list of previous winners, eligibility, etc.), from the Wilkinson Prize for Numerical Software call for submissions webpage,

Previous Prize winners:

  • 2011: Andreas Waechter and Carl D. Laird for Ipopt
  • 2007: Wolfgang Bangerth for deal.II
  • 2003: Jonathan Shewchuk for Triangle
  • 1999: Matteo Frigo and Steven Johnson for FFTW
  • 1995: Chris Bischof and Alan Carle for ADIFOR 2.0
  • 1991: Linda Petzold for DASSL


The prize will be awarded to the authors of an outstanding piece of numerical software, or to individuals who have made an outstanding contribution to an existing piece of numerical software. In the latter case, applicants must clearly be able to distinguish their personal contribution and have that contribution authenticated, and the submission must be written in terms of that personal contribution and not of the software as a whole. To encourage researchers in the earlier stages of their careers, all applicants must be at most 40 years of age on January 1, 2014.
Rules for Submission

Each entry must contain the following:

  • Software written in a widely available high-level programming language.
  • A two-page summary of the main features of the algorithm and software implementation.
  • A paper describing the algorithm and the software implementation. The paper should give an analysis of the algorithm and indicate any special programming features.
  • Documentation of the software which describes its purpose and method of use.
  • Examples of use of the software, including a test program and data.


The preferred format for submissions is a gzipped, tar archive or a zip file. Please contact us if you would like to use a different submission mechanism. Submissions should include a README file describing the contents of the archive and scripts for executing the test programs. Submissions can be sent by email to wilkinson-prize@nag.co.uk. Contact this address for further information.

The closing date for submissions is July 1, 2014.

Good luck to you all!

Visualizing beautiful math

Two artists, Yann Pineill and Nicolas Lefaucheux, associated with Parachutes, a video production and graphic design studio located in Paris, France, have produced a video demonstrating this quote from Bertrand Russell, which is in the opening frame,

“Mathematics, rightly viewed, possesses not only truth, but supreme beauty — a beauty cold and austere, without the gorgeous trappings of painting or music.” — Bertrand Russell

H/t Mark Wilson’s Nov. 6, 2013 article for Fast Company,

One viewing note: the screen is arranged as a triptych with the mathematical equation on the left, a schematic in the centre, and the real-life manifestation on the right. Enjoy!

Monkey Tales games better than class exercises for teaching maths

Publicizing an unpublished academic paper, which claims that a series of math games, Monkey Tales, is more effective than classroom exercises for teaching maths, while trumpeting a series of unsubstantiated statistics, seems a little questionable. The paper featured in a July 8, 2013 news item on ScienceDaily reads less like an academic piece and more like an undercover sales document,

To measure the effectiveness of Monkey Tales, a study was carried out with 88 second grade pupils divided into three groups. One group was asked to play the game for a period of three weeks while the second group had to solve similar math exercises on paper and a third group received no assignment. The math performance of the children was measured using an electronic arithmetic test before and after the test period. When results were compared, the children who had played the game provided significantly more correct answers: 6% more than before, compared to only 4% for the group that made traditional exercises and 2% for the control group. In addition, both the group that played the game and that which did the exercises were able to solve the test 30% faster while the group without assignment was only 10% faster.

Ordinarily, this excerpt wouldn’t be a big problem since one would have the opportunity to read the paper and analyse the methodology by asking questions such as: How were the students chosen? Were the students with higher grades given the game? There’s another issue: percentages can be misleading when one doesn’t have the underlying numbers, e.g., if there’s an increase from one to two, it’s perfectly valid to claim a 100% increase even if it is misleading. Finally, how were they able to measure speed? The control group, i.e., the group without an assignment, was 10% faster than whom?
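The point about percentages hiding the underlying counts is easy to show with a toy calculation (the numbers here are hypothetical, not from the study):

```python
# Toy illustration of why a percentage change is hard to interpret
# without the underlying counts: going from 1 correct answer to 2 is
# a 100% improvement, while going from 50 to 53 is only a 6% improvement,
# even though the second gain involves more correct answers.

def percent_increase(before: float, after: float) -> float:
    return (after - before) / before * 100

print(percent_increase(1, 2))    # 100.0 -- huge percentage, tiny absolute gain
print(percent_increase(50, 53))  # ~6.0  -- modest percentage, larger absolute gain
```

Without the raw counts behind the study’s 6%, 4%, and 2% figures, there is no way to tell which kind of change is being reported.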

The University of Ghent July 8, 2013 news release, which originated the news item, also includes a business case in what is supposed to be a news release about a study on maths education,

Serious or educational games are becoming increasingly important. Market research company iDate estimates that the global turnover was €2.3 billion in 2012 and expects it to rise to €6.6 billion in 2015.* A first important sector in which serious games are being used, is defence. The U.S. Army, for example, uses games to attract recruits and to teach various skills, from tactical combat training to ways of communicating with local people. Serious games are also increasingly used in companies and organizations to train staff. The Flemish company U&I Learning, for example, developed games for Audi in Vorst to teach personnel the safety instructions, for Carrefour to teach student employees how to operate the check-out system and for DHL to optimise the loading and unloading of air freight containers.

Reservations about the study aside, Monkey Tales (for PC only) looks quite charming.

[downloaded from http://www.monkeytalesgames.com/demo.php]

In addition to a demo which can be downloaded, the site’s FAQs (Frequently Asked Questions) provides some information about the games’ backers and the games,

Who created Monkey Tales?
Developed by European schoolbook publisher Die Keure and award winning game developer Larian Studios, Monkey Tales is based on years of research and was developed with the active participation of teachers, schools, universities and educational method-makers.

What does years of research mean?
Exactly that. The technology behind Monkey Tales has been in development for over 4 years and has been field-tested with over 30,000 children across several schools, with very active engagement from both teachers and educational method-makers. Additionally, a two-year research project is underway in which the universities of Ghent & Leuven are participating to measure the efficiency of the methods used within Monkey Tales.

What is the educational goal behind Monkey Tales?
Monkey Tales’ aim is not to instruct, that’s what teachers and schools are for. Instead it aims to help children rehearse and improve skills they should have, by motivating them to do drill exercises with increasing time pressure.

Because the abilities of children are very diverse, the algorithm behind the game first tries to establish where a child is on the learning curve, and then stimulates the child to make progress. This way frustration is avoided, and the child makes progress without realizing that it’s being pushed forward.

There’s a demonstrable effect that playing the game helps mastery of arithmetic. Parents can experience this themselves by trying out the games.

What can my child learn from Monkey Tales?
Currently there are five games available, covering grades 2 to 6, covering the field of mathematics in line with state standards (Common Core Standards and the 2009 DoDEA standards). Future games in the series will cover language and science.

What’s special about Monkey Tales?
A key feature of Monkey Tales is its unique algorithm that allows the game to automatically adapt to the level of children so that they feel comfortable with their ability to complete the exercises, removing any stress they might feel. From there, the game then presents progressively more difficult exercises, all the time monitoring how the child is performing and adapting if necessary. One of the most remarkable achievements of Monkey Tales is its ability to put children under time pressure to complete exercises without them complaining about it!

Hopefully this Monkey Tales study, or a new one, will be published so that a news release, which by its nature offers only skimpy information, won’t be the sole basis for judging the validity of the work.

When twice as much (algebra) is good for you

“We find positive and substantial longer-run impacts of double-dose algebra on college entrance exam scores, high school graduation rates and college enrollment rates, suggesting that the policy had significant benefits that were not easily observable in the first couple of years of its existence,” wrote the article’s authors.

The Mar. 21, 2013 news release on EurekAlert, which includes the preceding quote, recounts an extraordinary story about an approach to teaching algebra that was not enthusiastically adopted at first, but for some reason administrators and teachers persisted with it. From Chelsey Leu’s Mar. 21, 2013 article (which originated the news release) for UChicago (University of Chicago) News (Note: Links have been removed),

Martin Gartzman sat in his dentist’s waiting room last fall when he read a study in Education Next that nearly brought him to tears.

A decade ago, in his former position as chief math and science officer for Chicago Public Schools [CPS], Gartzman spearheaded an attempt to decrease ninth-grade algebra failure rates, an issue he calls “an incredibly vexing problem.” His idea was to provide extra time for struggling students by having them take two consecutive periods of algebra.

In high schools, ninth-grade algebra is typically the class with the highest failure rate. This presents a barrier to graduation, because high schools usually require three to four years of math to graduate.

Students have about a 20 percent chance of passing the next math level if they don’t first pass algebra, Gartzman said, versus 80 percent for those who do pass. The data are clear: If students fail ninth-grade algebra, the likelihood of passing later years of math, and ultimately of graduating, is slim.

Gartzman’s work to decrease algebra failure rates at CPS was motivated by a study of Melissa Roderick, the Hermon Dunlap Smith Professor at UChicago’s School of Social Service Administration. The study emphasized the importance of keeping students academically on track in their freshman year to increase the graduation rate.

Some administrators and teachers resisted the new policy. Teachers called these sessions “double-period hell” because they gathered, in one class, the most unmotivated students who had the biggest problems with math.

Principals and counselors sometimes saw the double periods as punishment for the students, depriving them of courses they may have enjoyed taking and replacing them with courses they disliked.

It seemed to Gartzman that double-period students were learning more math, though he had no supporting data. He gauged students’ progress by class grades, not by standardized tests. The CPS educators had no way of fully assessing their double-period idea. All they knew was that failure rates didn’t budge.

Unfortunately, Leu does not explain why the administrators and teachers continued with the program but it’s a good thing they did (Note: Links have been removed),

“Double-dosing had an immediate impact on student performance in algebra, increasing the proportion of students earning at least a B by 9.4 percentage points, or more than 65 percent,” noted the Education Next article. Although ninth-grade algebra passing rates remained mostly unaffected, “The mean GPA across all math courses taken after freshman year increased by 0.14 grade points on a 4.0 scale.”

They also found significantly increased graduation rates. The researchers concluded on an encouraging note: “Although the intervention was not particularly effective for the average affected student, the fact that it improved high school graduation and college enrollment rates for even a subset of low-performing and at-risk students is extraordinarily promising when targeted at the appropriate students.” [emphasis mine]

Gartzman recalled that reading the article “was mind-blowing for me. I had no idea that the researchers were continuing to study these kids.”

The study had followed a set of students from eighth grade through graduation, while Gartzman’s team could only follow them for a year after the program began. The improvements appeared five years after launching double-dose algebra, hiding them from the CPS team, which had focused on short-term student performance. [emphasis mine]

Gartzman stressed the importance of education policy research. “Nomi and Allensworth did some really sophisticated modeling that only researchers could do, that school districts really can’t do. It validates school districts all over the country who had been investing in double-period strategies.”

I’m not sure I understand the numbers very well (maybe I need a double-dose of numbers). The 9.4 percentage-point increase for students earning a B sounds good but a mean increase of 0.14 grade points doesn’t sound as impressive. As for the bit about the program being “not particularly effective for the average affected student,” what kind of student is helped by this program? As for the improvements being seen five years after the program launch, does this mean that students in the program showed improvement five years later (in first-year university) or that researchers weren’t able to effectively measure any impact in the grade nine classroom until five years after the program began?
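The quoted “9.4 percentage points, or more than 65 percent” mixes an absolute change (percentage points) with a relative one (percent), which is part of what makes the numbers hard to read. The baseline isn’t stated in the excerpts, but it can be inferred from the two figures together, as this small worked calculation shows (the inferred baseline is my arithmetic, not a number from the study):

```python
# Unpacking "9.4 percentage points, or more than 65 percent":
# if an absolute gain of 9.4 points amounts to a 65% relative gain,
# the baseline share of students earning at least a B must have been
# roughly 9.4 / 0.65, i.e. about 14.5%, rising to about 23.9%.

absolute_gain_pp = 9.4   # percentage points (absolute change)
relative_gain = 0.65     # 65% (relative change)

implied_baseline = absolute_gain_pp / relative_gain
implied_new_rate = implied_baseline + absolute_gain_pp

print(round(implied_baseline, 1))  # ~14.5
print(round(implied_new_rate, 1))  # ~23.9
```

So both figures describe the same improvement; the relative version just sounds larger because the starting point was low.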

Regardless, it seems there is an improvement and, having suffered through my share of algebra classes, I applaud the educators for finding a way to help some students, if not all.