Tag Archives: mathematics

Nanotechnology, math, cancer, and a boxing metaphor

Violent metaphors in medicine are not unusual, although the reference is usually to war rather than boxing, as it is in this news from the University of Waterloo (Canada). Still, it seems counter-intuitive to link violence so closely with healing, but the practice is well entrenched and attempts to counteract it seem to be a ‘losing battle’ (pun intended).

Credit: Gabriel Picolo “2-in-1 punch.” Courtesy: University of Waterloo

A June 23, 2016 news item on ScienceDaily describes a new approach to cancer therapy,

Math, biology and nanotechnology are becoming strange, yet effective bed-fellows in the fight against cancer treatment resistance. Researchers at the University of Waterloo and Harvard Medical School have engineered a revolutionary new approach to cancer treatment that pits a lethal combination of drugs together into a single nanoparticle.

Their work, published online on June 3, 2016 in the leading nanotechnology journal ACS Nano, finds a new method of shrinking tumors and prevents resistance in aggressive cancers by activating two drugs within the same cell at the same time.

A June 23, 2016 University of Waterloo news release (also on EurekAlert), which originated the news item, provides more information,

Every year thousands of patients die from recurrent cancers that have become resistant to therapy, resulting in one of the greatest unsolved challenges in cancer treatment. By tracking the fate of individual cancer cells under pressure of chemotherapy, biologists and bioengineers at Harvard Medical School studied a network of signals and molecular pathways that allow the cells to generate resistance over the course of treatment.

Using this information, a team of applied mathematicians led by Professor Mohammad Kohandel at the University of Waterloo, developed a mathematical model that incorporated algorithms that define the phenotypic cell state transitions of cancer cells in real-time while under attack by an anticancer agent. The mathematical simulations enabled them to define the exact molecular behavior and pathway of signals, which allow cancer cells to survive treatment over time.

They discovered that the PI3K/AKT kinase, which is often over-activated in cancers, enables cells to undergo a resistance program when pressured with the cytotoxic chemotherapy known as Taxanes, which are conventionally used to treat aggressive breast cancers. This revolutionary window into the life of a cell reveals that vulnerabilities to small molecule PI3K/AKT kinase inhibitors exist, and can be targeted if they are applied in the right sequence with combinations of other drugs.

Previously, theories of drug resistance have relied on the hypothesis that only certain, “privileged” cells can overcome therapy. The mathematical simulations demonstrate that, under the right conditions and signaling events, any cell can develop a resistance program.

“Only recently have we begun to appreciate how important mathematics and physics are to understanding the biology and evolution of cancer,” said Professor Kohandel. “In fact, there is now increasing synergy between these disciplines, and we are beginning to appreciate how critical this information can be to create the right recipes to treat cancer.”

Although previous studies explored the use of drug combinations to treat cancer, the one-two punch approach is not always successful. In the new study, led by Professor Aaron Goldman, a faculty member in the division of Engineering in Medicine at Brigham and Women’s Hospital, the scientists realized a major shortcoming of the combination therapy approach is that both drugs need to be active in the same cell, something that current delivery methods can’t guarantee.

“We were inspired by the mathematical understanding that a cancer cell rewires the mechanisms of resistance in a very specific order and time-sensitive manner,” said Professor Goldman. “By developing a 2-in-1 nanomedicine, we could ensure the cell that was acquiring this new resistance saw the lethal drug combination, shutting down the survival program and eliminating the evidence of resistance. This approach could redefine how clinicians deliver combinations of drugs in the clinic.”

The approach the bioengineers took was to build a single nanoparticle, inspired by computer models, that exploits a technique known as supramolecular chemistry. This nanotechnology enables scientists to build cholesterol-tethered drugs together from “tetris-like” building blocks that self-assemble, incorporating multiple drugs into stable, individual nano-vehicles that target tumors through the leaky vasculature. This 2-in-1 strategy ensures that resistance to therapy never has a chance to develop, bringing together the right recipe to destroy surviving cancer cells.

Using mouse models of aggressive breast cancer, the scientists confirmed the predictions from the mathematical model that both drugs must be deterministically delivered to the same cell.
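
The news release doesn’t spell out the model’s equations, but the “any cell can develop resistance” idea can be made concrete with a toy simulation. The sketch below is entirely my own construction, not the authors’ model, and the rate constants are illustrative placeholders rather than fitted values: it simply tracks a drug-sensitive population that can switch into a drug-tolerant state while under treatment,

```python
# A minimal sketch (my own, not the authors' model): a two-state
# phenotypic-switching simulation in which any drug-sensitive cell can
# transition into a drug-tolerant state under treatment. All rate constants
# are made-up placeholders for illustration only.

def simulate(hours=120.0, dt=0.1,
             k_on=0.05,     # sensitive -> tolerant switching rate under drug
             k_off=0.01,    # tolerant -> sensitive relaxation rate
             kill_s=0.08,   # drug kill rate of sensitive cells
             kill_t=0.005,  # residual kill rate of tolerant cells
             growth=0.02):  # baseline proliferation rate
    S, T = 1.0, 0.0        # start with an entirely drug-sensitive population
    for _ in range(int(hours / dt)):
        dS = (growth - kill_s - k_on) * S + k_off * T
        dT = (growth - kill_t - k_off) * T + k_on * S
        S, T = max(S + dS * dt, 0.0), max(T + dT * dt, 0.0)
    return S, T

if __name__ == "__main__":
    S, T = simulate()
    print(f"sensitive: {S:.3f}  tolerant: {T:.3f}  tolerant fraction: {T / (S + T):.1%}")
```

Even starting from a purely sensitive population, the tolerant fraction grows steadily under treatment, which is the qualitative behaviour the release describes and the reason the researchers want both drugs active in the same cell at the same time.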

Here’s a link to and a citation for the paper,

Rationally Designed 2-in-1 Nanoparticles Can Overcome Adaptive Resistance in Cancer by Aaron Goldman, Ashish Kulkarni, Mohammad Kohandel, Prithvi Pandey, Poornima Rao, Siva Kumar Natarajan, Venkata Sabbisetti, and Shiladitya Sengupta. ACS Nano, Article ASAP DOI: 10.1021/acsnano.6b00320 Publication Date (Web): June 03, 2016

Copyright © 2016 American Chemical Society

This paper is behind a paywall.

The researchers have made this illustration of their work available,

Courtesy: American Chemical Society

Banksy and the mathematicians

Assuming you’ve heard of Banksy (if not, he’s an internationally known graffiti artist), you’ll know that no one is certain of his real name, although there have been strong suspicions, since 2008, that he is Robin Gunningham. It seems the puzzle has aroused scientific curiosity, according to a March 4, 2016 article by Jill Lawless on CBC (Canadian Broadcasting Corporation) News online,

Elusive street artist Banksy may have been unmasked — by mathematics.

Scientists have applied a type of modelling used to track down criminals and map disease outbreaks to identify the graffiti artist, whose real name has never been confirmed.

The technique, known as geographic profiling, is used by police forces to narrow down lists of suspects by calculating from multiple crime sites where the offender most likely lives.

The March 3, 2016 article in The Economist about the Banksy project describes the model used to derive his identity in more detail,

Their system, Dirichlet process mixture modelling, is more sophisticated than the criminal geographic targeting (CGT) currently favoured by crime-fighters. CGT is based on a simple assumption: that crimes happen near to where those responsible reside. Plot out an incident map and the points should surround the criminal like a doughnut (malefactors tend not to offend on their own doorsteps, but nor do they stray too far). The Dirichlet model allows for more than one “source”—a place relevant to a suspect such as home, work or a frequent pit stop on a commute—but makes no assumptions about their number; it automatically parses the mess of crime sites into clusters of activity.

Then, for each site, it calculates the probability that the given array of activity, and the way it is clustered, would result from any given source. Through a monumental summing of probabilities across each and every possible combination of sources, the model spits out the most likely ones, with considerable precision—down to 50 metres or so in some cases.
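
The article doesn’t reproduce the model itself, but the general idea is easy to approximate. The sketch below is my own illustration, not the authors’ code: it uses scikit-learn’s BayesianGaussianMixture with a Dirichlet-process prior on entirely made-up site coordinates, letting the data choose the number of clusters and then scoring a grid of candidate “source” locations,

```python
# A rough sketch of the idea (not the authors' implementation): fit a
# Dirichlet-process mixture to the coordinates of known sites, then score a
# grid of candidate "source" locations. The site coordinates are hypothetical.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Two made-up clusters of sites (km), standing in for mapped artwork locations.
sites = np.vstack([
    rng.normal(loc=[1.5, 3.0], scale=0.4, size=(20, 2)),
    rng.normal(loc=[5.0, 1.0], scale=0.3, size=(15, 2)),
])

dpm = BayesianGaussianMixture(
    n_components=10,  # upper bound; unneeded components get near-zero weight
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    random_state=0,
).fit(sites)

# Score a grid of candidate anchor points; high-scoring cells are the model's
# best guesses for places tied to the suspect (home, work, a regular stop).
xs, ys = np.meshgrid(np.linspace(0.0, 6.0, 121), np.linspace(0.0, 5.0, 101))
grid = np.column_stack([xs.ravel(), ys.ravel()])
surface = dpm.score_samples(grid).reshape(xs.shape)
best = grid[np.argmax(surface)]
print("highest-scoring candidate source:", best.round(2))
```

A real geoprofile also builds in the distance-decay, “doughnut”-shaped offending behaviour described above; this sketch only shows the mixture-modelling half of the idea.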

While this seems like harmless mathematical modeling, Banksy’s lawyers were sufficiently concerned about how the work would be promoted that they contacted the publisher, according to Jonathan Webb’s March 3, 2016 article for BBC (British Broadcasting Corporation) News online,

A study that tests the method of geographical profiling on Banksy has appeared, after a delay caused by an intervention from the artist’s lawyers.

Scientists at Queen Mary University of London found that the distribution of Banksy’s famous graffiti supported a previously suggested real identity.

The study was due to appear in the Journal of Spatial Science a week ago.

The BBC understands that Banksy’s legal team contacted QMUL staff with concerns about how the study was to be promoted.

Those concerns apparently centred on the wording of a press release, which has now been withdrawn.

Taylor and Francis, which publishes the journal, said that the research paper itself had not been questioned. It appeared online on Thursday [March 3, 2016] unchanged, after being placed “on hold” while conversations between lawyers took place.

The scientists conducted the study to demonstrate the wide applicability of geoprofiling – but also out of interest, said biologist Steve Le Comber, “to see whether it would work”.

The criminologist and former detective who pioneered geoprofiling, Canadian Dr Kim Rossmo [emphasis mine] – now at Texas State University in the US – is a co-author on the paper.

The researchers say their findings support the use of such profiling in counter-terrorism, based on the idea that minor “terrorism-related acts” – like graffiti – could help locate bases before more serious incidents unfold.

I believe the biologist Steve Le Comber is interested in applying the technique to epidemiology (study of patterns in health and disease in various populations). As for Dr. Rossmo, he featured in one of the more bizarre incidents in Vancouver Police Department (VPD) history as described in the Kim Rossmo entry on Wikipedia (Note: Links have been removed),

D. Kim Rossmo is a Canadian criminologist specializing in geographic profiling. He joined the Vancouver Police Department as a civilian employee in 1978 and became a sworn officer in 1980. In 1987 he received a master’s degree in criminology from Simon Fraser University and in 1995 became the first police officer in Canada to obtain a doctorate in criminology.[1] His dissertation research resulted in a new criminal investigative methodology called geographic profiling, based on Rossmo’s formula. This technology was integrated into a specialized crime analysis software product called Rigel. The Rigel product is developed by the software company Environmental Criminology Research Inc. (ECRI), which Rossmo co-founded.[2]

In 1995, he was promoted to detective inspector and founded a geographic profiling section within the Vancouver Police Department. In 1998, his analysis of cases of missing sex trade workers determined that a serial killer was at work, a conclusion ultimately vindicated by the arrest and conviction of Robert Pickton in 2002. A retired Vancouver police staff sergeant has claimed that animosity toward Rossmo delayed the arrest of Pickton, leaving him free to carry out additional murders.[3] His analytic results were not accepted at the time and after a dispute with senior members of the department he left in 2001. His unsuccessful lawsuit against the Vancouver Police Board for wrongful dismissal exposed considerable apparent dysfunction within that department.[1]

It still boggles my mind, and the minds of the reporters who covered the story, that the VPD would dismiss someone who was being lauded internationally for his work and had helped the department solve a very nasty case. In any event, Dr. Rossmo is now at Texas State University.

Getting back to Banksy and geographic profiling, here’s a link to and a citation for the paper,

Tagging Banksy: using geographic profiling to investigate a modern art mystery by Michelle V. Hauge, Mark D. Stevenson, D. Kim Rossmo & Steven C. Le Comber. Journal of Spatial Science DOI:  10.1080/14498596.2016.1138246 Published online: 03 Mar 2016

This paper is behind a paywall.

For anyone curious about Banksy’s work, here’s an image from this Wikipedia entry,

Stencil on the waterline of The Thekla, an entertainment boat in central Bristol – (wider view). The section of the hull with this picture has now been removed and is on display at the M Shed museum. The image of Death is based on a nineteenth-century etching illustrating the pestilence of The Great Stink.[19] Artist: Banksy – Photographed by Adrian Pingstone

#BCTECH: being at the Summit (Jan. 18-19, 2016)

#BCTECH Summit 2016*, a joint event between the province of British Columbia (BC, Canada) and the BC Innovation Council (BCIC), a crown corporation formerly known as the Science Council of British Columbia, launched on Jan. 18, 2016. I have written a preview (Jan. 17, 2016 post) and a commentary on the new #BCTECH strategy (Jan. 19, 2016 posting) announced by British Columbia Premier, Christy Clark, on the opening day (Jan. 18, 2016) of the summit.

I was primarily interested in the trade show/research row/technology showcase aspect of the summit focusing (but not exclusively) on nanotechnology. Here’s what I found,

Nano at the Summit

  • Precision NanoSystems: fabricates equipment which allows researchers to create polymer nanoparticles for delivering medications.

One of the major problems with creating nanoparticles is ensuring a consistent size and rapid production. According to Shell Ip, a Precision NanoSystems field application scientist, their NanoAssemblr Platform has solved the consistency problem and a single microfluidic cartridge can produce 15 ml in two minutes. Cartridges can run in parallel for maximum efficiency when producing nanoparticles in greater quantity.

The NanoAssemblr Platform is in use in laboratories around the world (I think the number is 70) and you can find out more on the company’s About our technology webpage,

The NanoAssemblr™ Platform

The microfluidic approach to particle formulation is at the heart of the NanoAssemblr Platform. This well-controlled process mediates bottom-up self-assembly of nanoparticles with reproducible sizes and low polydispersity. Users can control size by process and composition, and adjust parameters such as mixing ratios, flow rate and lipid composition in order to fine-tune nanoparticle size, encapsulation efficiency and much more. The system technology enables manufacturing scale-up through microfluidic reactor parallelization similar to the arraying of transistors on an integrated chip. Superior design ensures that the platform is fast and easy to use with a software controlled manufacturing process. This usability allows for the simplified transfer of manufacturing protocols between sites, which accelerates development, reduces waste and ultimately saves money. Precision NanoSystems’ flagship product is the NanoAssemblr™ Benchtop Instrument, designed for rapid prototyping of novel nanoparticles. Preparation time on the system is streamlined to approximately one minute, with the ability to complete 30 formulations per day in the hands of any user.

The company is located on property known as the Endowment Lands or, more familiarly, the University of British Columbia (UBC).

A few comments before moving on: being able to standardize the production of medicine-bearing nanoparticles is a tremendous step forward, one that is going to help scientists dealing with other issues. Despite all the talk in the media about delivering nanoparticles with medication directly to diseased cells, there are transport issues: (1) getting the medicine to the right location/organ and (2) getting the medicine into the cell. My Jan. 12, 2016 posting featured a project with Malaysian scientists and a team at Harvard University who are tackling the transport and other nanomedicine issues as they relate to the lung. As well, I have a Nov. 26, 2015 posting which explores a controversy about nanoparticles getting past the ‘cell walls’ into the nucleus of the cell.

The next ‘nano’ booths were,

  • 4D Labs, located at Simon Fraser University (SFU), was initially hailed as a nanotechnology facility, but these days it touts itself as an ‘advanced materials’ facility. Same thing, different branding.

They advertise services including hands-on training for technology companies and academics. There is a nanoimaging facility and nanofabrication facility, amongst others.

I spoke with their operations manager, Nathaniel Sieb, who mentioned a few of the local companies that use their facilities. (1) Nanotech Security (featured here most recently in a Dec. 29, 2015 post), an SFU spinoff company, does some of its anticounterfeiting research work at 4D Labs. (2) Switch Materials (a smart window company, electrochromic windows if memory serves) also uses the facilities. It is the company of Neil Branda, 4D Labs’ Executive Director, and I have been waiting impatiently (my May 14, 2010 post was my first one about Switch) for his or anyone else’s electrochromic windows to come to market; they could eliminate or reduce the need for air conditioning during hotter periods and reduce the need for heat in colder ones. Sieb tells me I’ll have to wait longer for Switch. (3) A graduate student was presenting his work at the booth: a handheld diagnostic device that can be attached to a smartphone to transmit data to the cloud. While the first application is for diabetics, there are many other possibilities. Unfortunately, testing glucose means producing blood; when I suggested my preference for saliva, the student explained some of the difficulties. Apparently, your saliva changes dynamically and frequently, and something as simple as taking a sip of orange juice could result in a false reading. Our conversation (mine, Sieb’s and the student’s) also drifted into the difficulties of bringing products to market. Sadly, we were not able to solve that problem in our 10-minute conversation.

  • FPInnovations is a scientific research centre and network for the forestry sector. They had a display near their booth which was like walking into a peculiar forest (I was charmed). The contrast with the less imaginative approaches all around was striking.

FPInnovations helped to develop cellulose nanocrystals (CNC), then called nanocrystalline cellulose (NCC), and I was hoping to be updated about CNC and about the spinoff company CelluForce. The researcher I spoke to was from Sweden and his specialty was business development. He didn’t know much about CNC in Canada, and when I commented on how active Sweden has been in its pursuit of a CNC application, he noted that Finland has been the most active. The researcher noted that making the new materials derived from the forest, such as CNC, affordable and easy to produce for applications that have yet to be developed is both a necessity and a challenge. He mentioned that cultural changes also need to take place: Canadians are accustomed to slicing away and discarding most of the tree instead of using as much of it as possible. We also need to move beyond the construction and pulp & paper sectors (my Feb. 15, 2012 posting featured nanocellulose research in Sweden where sludge was the base material).

Other interests at the Summit

I visited:

  • “The Wearable Lower Limb Anthropomorphic Exoskeleton (WLLAE) – a lightweight, battery-operated and ergonomic robotic system to help those with mobility issues improve their lives. The exoskeleton features joints and links that correspond to those of a human body and sync with motion. SFU has designed, manufactured and tested a proof-of-concept prototype and the current version can mimic all the motions of hip joints.” The researchers (Siamak Arzanpour and Edward Park) pointed out that the ability to mimic all the motions of the hip is a big difference between their system and others which only allow the leg to move forward or back. They rushed the last couple of months to get this system ready for the Summit. In fact, they received their patent for the system the night before (Jan. 17, 2016) the Summit opened.

It’s the least imposing of the exoskeletons I’ve seen (there’s a description of one of the first successful exoskeletons in a May 20, 2014 posting; if you scroll down to the end you’ll see an update about the device’s unveiling at the 2014 World Cup [soccer/football] in Brazil).

Unfortunately, there aren’t any pictures of WLLAE yet and the proof-of-concept version may differ significantly from the final version. This system could be used to help people regain movement (paralysis/frail seniors) and I believe there’s a possibility it could be used to enhance human performance (soldiers/athletes). The researchers still have some significant hoops to jump through before getting to the human clinical trial stage. They need to refine their apparatus, ensure that it can be operated safely, and further develop the interface between human and machine. I believe WLLAE is considered a neuroprosthetic device. While it’s not a fake leg or arm, it enables movement (prosthetic) and it operates on brain waves (neuro). It’s a very exciting area of research; consequently, there’s a lot of international competition.

  • Delightfully, after losing contact for a while, I reestablished it with the folks (Sean Lee, Head External Relations and Jim Hanlon, Chief Administrative Officer) at TRIUMF (Canada’s national laboratory for particle and nuclear physics). It’s a consortium of 19 Canadian research institutions (12 full members and seven associate members).

It’s a little disappointing that TRIUMF wasn’t featured in the opening for the Summit since the institution houses theoretical, experimental, and applied science work. It’s a major BC (and Canada) science and technology success story. My latest post (July 16, 2015) about their work featured researchers from California (US) using the TRIUMF cyclotron for imaging nanoscale materials and, on the more practical side, there’s a Mar. 6, 2015 posting about their breakthrough for producing nuclear material-free medical isotopes. Plus, Maclean’s Magazine ran a Jan. 3, 2016 article by Kate Lunau profiling an ‘art/science’ project that took place at TRIUMF (Note: Links have been removed),

It’s not every day that most people get to peek inside a world-class particle physics lab, where scientists probe deep mysteries of the universe. In September [2015], Vancouver’s TRIUMF—home to the world’s biggest cyclotron, a type of particle accelerator—opened its doors to professional and amateur photographers, part of an event called Global Physics Photowalk 2015. (Eight labs around the world participated, including CERN [European particle physics laboratory], in Geneva, where the Higgs boson particle was famously discovered.)

Here’s the local (Vancouver) jury’s pick for the winning image (from the Nov. 4, 2015 posting [Winning Photographs Revealed] by Alexis Fong on the TRIUMF website),

Caption: DESCANT (at TRIUMF) neutron detector array composed of 70 hexagonal detectors Credit: Pamela Joe McFarlane

With all those hexagons and a spherical shape, the DESCANT looks like a ‘buckyball’ or buckminsterfullerene or C60 to me.

I hope the next Summit features TRIUMF and/or some other endeavours which exemplify Science, Technology, and Creativity in British Columbia and Canada.

Onto the last booth,

  • MITACS was originally one of the Canadian federal government’s Networks of Centres of Excellence projects. It was focused on mathematics, networking, and innovation but once the money ran out the organization took a turn. These days, it describes itself as (from its About page) “a national, not-for-profit organization that has designed and delivered research and training programs in Canada for 15 years. Working with 60 universities, thousands of companies, and both federal and provincial governments, we build partnerships that support industrial and social innovation in Canada.” Their Jan. 19, 2016 news release (coincidental with the #BCTECH Summit, Jan. 18 – 19, 2016?) features a new report about improving international investment in Canada,

    Opportunities to improve Canada’s attractiveness for R&D investment were identified:

    1. Canada needs to better incentivize R&D by rebalancing direct and indirect support measures

    2. Canada requires a coordinated, client-centric approach to incentivizing R&D

    3. Canada needs to invest in training programs that grow the knowledge economy

    Oddly, entrepreneurial/corporate/business types never have a problem with government spending when the money is coming to them; it’s only a problem when it’s social services.

    Back to MITACS, one of their more interesting (to me) projects was announced at the 2015 Canadian Science Policy Conference. MITACS has inaugurated a Canadian Science Policy Fellowships programme which in its first (pilot) year will see up to 10 academics applying their expertise to policy-making while embedded in various federal government agencies. I don’t believe anything similar has occurred here in Canada although, if memory serves, the Brits have a similar programme.

    Finally, I offer kudos to Sherry Zhao, MITACS Business Development Specialist, the only person to ask me how her organization might benefit my business. Admittedly I didn’t talk to a lot of people but it’s striking to me that at an ‘innovation and business’ tech summit, only one person approached me about doing business.  Of course, I’m not a male aged between 25 and 55. So, extra kudos to Sherry Zhao and MITACS.

Christy Clark (Premier of British Columbia), in her opening comments, stated that 2,800 people had signed up for the #BCTECH Summit (organizers were expecting about 1,000). I haven’t been able to verify that number or get other additional information, e.g., business deals, research breakthroughs, etc. announced at the Summit. Regardless, it was exciting to attend and find out about the latest and greatest on the BC scene.

I wish all the participants great and good luck and look forward to next year’s summit, where perhaps we’ll hear about how the province plans to help with the ‘manufacturing middle’ issue. For new products you need facilities capable of reproducing your devices at a speed that satisfies your customers; see my Feb. 10, 2014 post featuring a report on this and other similar issues from the US Government Accountability Office.

*’BCTECH Summit 2016′ link added Jan. 21, 2016.

Mathematics, music, art, architecture, culture: Bridges 2015

Thanks to Alex Bellos and Tash Reith-Banks for their July 30, 2015 posting on the Guardian science blog network for pointing towards the Bridges 2015 conference,

The Bridges Conference is an annual event that explores the connections between art and mathematics. Here is a selection of the work being exhibited this year, from a Pi pie which vibrates the number pi onto your hand to delicate paper structures demonstrating number sequences. This year’s conference runs until Sunday in Baltimore (Maryland, US).

To whet your appetite, here’s the Pi pie (from the Bellos/Reith-Banks posting),

Pi Pie by Evan Daniel Smith
Arduino, vibration motors, tinted silicone, pie tin
“This pie buzzes the number pi onto your hand. I typed pi from memory into a computer while using a program I wrote to record it and send it to motors in the pie. The placement of the vibrations on the five fingers uses the structure of the Japanese soroban abacus, and bears a resemblance to Asian hand mnemonics.”
Photograph: The Bridges Organisation

You can find out more about Bridges 2015 here and, should you be in the vicinity of Baltimore, Maryland, as a member of the public you are invited to view the artworks on July 31, 2015,

July 29 – August 1, 2015 (Wednesday – Saturday)
Excursion Day: Sunday, August 2
A Collaborative Effort by
The University of Baltimore and Bridges Organization

A Five-Day Conference and Excursion
Wednesday, July 29 – Saturday, August 1
(Excursion Day on Sunday, August 2)

The Bridges Baltimore Family Day on Friday afternoon July 31 will be open to the Public to visit the BB Art Exhibition and participate in a series of events such as BB Movie Festival, and a series of workshops.

I believe the conference is being held at the University of Baltimore. Presumably, that’s where you’ll find the art show, etc.

Wilkinson Prize for numerical software: call for 2015 submissions

The Wilkinson Prize is not meant to recognize a nice, shiny new algorithm; rather, it’s meant for the implementation phase and, as anyone who has ever been involved in that phase of a project can tell you, that phase is often sadly neglected. So, bravo for the Wilkinson Prize!

From the March 27, 2014 Numerical Algorithms Group (NAG) news release, here’s a brief history of the Wilkinson Prize,

Every four years the Numerical Algorithms Group (NAG), the National Physical Laboratory (NPL) and Argonne National Laboratory award the prestigious Wilkinson Prize in honour of the outstanding contributions of Dr James Hardy Wilkinson to the field of numerical software. The next Wilkinson Prize will be awarded at the [2015] International Congress on Industrial and Applied Mathematics in Beijing, and will consist of a $3000 cash prize.

NAG, NPL [UK National Physical Laboratory] and Argonne [US Dept. of Energy, Argonne National Laboratory] are committed to encouraging innovative, insightful and original work in numerical software in the same way that Wilkinson inspired many throughout his career. Wilkinson worked on the Automatic Computing Engine (ACE) while at NPL and later authored numerous papers on his speciality, numerical analysis. He also authored many of the routines for matrix computation in the early marks of the NAG Library.

The most recent Wilkinson Prize was awarded in 2011 to Andreas Waechter and Carl D. Laird for IPOPT. Commenting on winning the Wilkinson Prize Carl D. Laird, Associate Professor at the School of Chemical Engineering, Purdue University, said “I love writing software, and working with Andreas on IPOPT was a highlight of my career. From the beginning, our goal was to produce great software that would be used by other researchers and provide solutions to real engineering and scientific problems.

The Wilkinson Prize is one of the few awards that recognises the importance of implementation – that you need more than a great algorithm to produce high-impact numerical software. It rewards the tremendous effort required to ensure reliability, efficiency, and usability of the software.

Here’s more about the prize (list of previous winners, eligibility, etc.), from the Wilkinson Prize for Numerical Software call for submissions webpage,

Previous Prize winners:

  • 2011: Andreas Waechter and Carl D. Laird for Ipopt
  • 2007: Wolfgang Bangerth for deal.II
  • 2003: Jonathan Shewchuk for Triangle
  • 1999: Matteo Frigo and Steven Johnson for FFTW
  • 1995: Chris Bischof and Alan Carle for ADIFOR 2.0
  • 1991: Linda Petzold for DASSL


The prize will be awarded to the authors of an outstanding piece of numerical software, or to individuals who have made an outstanding contribution to an existing piece of numerical software. In the latter case applicants must clearly be able to distinguish their personal contribution and to have that contribution authenticated, and the submission must be written in terms of that personal contribution and not of the software as a whole. To encourage researchers in the earlier stages of their career all applicants must be at most 40 years of age on January 1, 2014.
Rules for Submission

Each entry must contain the following:

  • Software written in a widely available high-level programming language.
  • A two-page summary of the main features of the algorithm and software implementation.
  • A paper describing the algorithm and the software implementation. The paper should give an analysis of the algorithm and indicate any special programming features.
  • Documentation of the software which describes its purpose and method of use.
  • Examples of use of the software, including a test program and data.


The preferred format for submissions is a gzipped tar archive or a zip file. Please contact us if you would like to use a different submission mechanism. Submissions should include a README file describing the contents of the archive and scripts for executing the test programs. Submissions can be sent by email to wilkinson-prize@nag.co.uk. Contact this address for further information.

The closing date for submissions is July 1, 2014.

Good luck to you all!

Visualizing beautiful math

Two artists, Yann Pineill and Nicolas Lefaucheux, associated with Parachutes, a video production and graphic design studio located in Paris, France, have produced a video demonstrating this quote from Bertrand Russell, which appears in the opening frame,

“Mathematics, rightly viewed, possesses not only truth, but supreme beauty — a beauty cold and austere, without the gorgeous trappings of painting or music.” — Bertrand Russell

H/t Mark Wilson’s Nov. 6, 2013 article for Fast Company,

One viewing note: the screen is arranged as a triptych with the mathematical equation on the left, a schematic in the centre, and the real-life manifestation on the right. Enjoy!

Monkey Tales games better than class exercises for teaching maths

Publicizing an unpublished academic paper that claims a series of math games, Monkey Tales, is more effective than classroom exercises for teaching maths, while trumpeting a series of unsubstantiated statistics, seems a little questionable. The paper featured in a July 8, 2013 news item on ScienceDaily reads less like an academic piece and more like an undercover sales document,

To measure the effectiveness of Monkey Tales, a study was carried out with 88 second grade pupils divided into three groups. One group was asked to play the game for a period of three weeks while the second group had to solve similar math exercises on paper and a third group received no assignment. The math performance of the children was measured using an electronic arithmetic test before and after the test period. When results were compared, the children who had played the game provided significantly more correct answers: 6% more than before, compared to only 4% for the group that made traditional exercises and 2% for the control group. In addition, both the group that played the game and that which did the exercises were able to solve the test 30% faster while the group without assignment was only 10% faster.

Ordinarily, this excerpt wouldn’t be a big problem since one would have the opportunity to read the paper and analyse the methodology by asking questions such as these: How were the students chosen? Were the students with higher grades given the game? There’s another issue: percentages can be misleading when one doesn’t have the numbers, e.g., if there’s an increase from one to two, it’s perfectly valid to claim a 100% increase even if it is misleading. Finally, how were they able to measure speed? The control group, i.e., the group without an assignment, was 10% faster than whom?
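
To make the point about percentages concrete, here is a quick illustration with made-up numbers (not data from the study): the same absolute gain of a few correct answers reads very differently depending on the baseline,

```python
# Made-up baselines, not figures from the Monkey Tales study: the same small
# absolute gain can read as 6%, 30% or 100% depending on where you start,
# which is why raw counts matter when judging the reported improvements.
for baseline, after in [(50, 53), (10, 13), (1, 2)]:
    gain = after - baseline
    print(f"{baseline} -> {after}: +{gain} answers = {gain / baseline:.0%} increase")
```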

The University of Ghent’s July 8, 2013 news release, which originated the news item, also includes a business case in what is supposed to be a news release about a study on maths education,

Serious or educational games are becoming increasingly important. Market research company iDate estimates that the global turnover was €2.3 billion in 2012 and expects it to rise to €6.6 billion in 2015.* A first important sector in which serious games are being used, is defence. The U.S. Army, for example, uses games to attract recruits and to teach various skills, from tactical combat training to ways of communicating with local people. Serious games are also increasingly used in companies and organizations to train staff. The Flemish company U&I Learning, for example, developed games for Audi in Vorst to teach personnel the safety instructions, for Carrefour to teach student employees how to operate the check-out system and for DHL to optimise the loading and unloading of air freight containers.

Reservations about the study aside, Monkey Tales (for PC only) looks quite charming.

[downloaded from http://www.monkeytalesgames.com/demo.php]

In addition to a demo which can be downloaded, the site’s FAQs (Frequently Asked Questions) provide some information about the games’ backers and the games,

Who created Monkey Tales?
Developed by European schoolbook publisher Die Keure and award winning game developer Larian Studios, Monkey Tales is based on years of research and was developed with the active participation of teachers, schools, universities and educational method-makers.

What does years of research mean ?
Exactly that. The technology behind Monkey Tales has been in development for over 4 years, and has been field tested with over 30 000 children and across several schools, with very active engagement from both teachers and educational method-makers. Additionally, a two years research project is underway in which the universities of Ghent & Leuven are participating to measure the efficiency of the methods used within Monkey Tales.

What is the educational goal behind Monkey Tales?
Monkey Tales’ aim is not to instruct, that’s what teachers and schools are for. Instead it aims to help children rehearse and improve skills they should have, by motivating them to do drill exercises with increasing time pressure.

Because the abilities of children are very diverse, the algorithm behind the game first tries to establish where a child is on the learning curve, and then stimulates the child to make progress. This way frustration is avoided, and the child makes progress without realizing that it’s being pushed forward.

There’s a demonstrable effect that playing the game helps mastery of arithmetic. Parents can experience this themselves by trying out the games.

What can my child learn from Monkey Tales?
Currently there are five games available, covering grades 2 to 6, covering the field of mathematics in line with state standards (Common Core Standards and the 2009 DoDEA standards). Future games in the series will cover language and science.

What’s special about Monkey Tales?
A key feature of Monkey Tales is its unique algorithm that allows the game to automatically adapt to the level of children so that they feel comfortable with their ability to complete the exercises, removing any stress they might feel. From there, the game then presents progressively more difficult exercises, all the time monitoring how the child is performing and adapting if necessary. One of the most remarkable achievements of Monkey Tales is its ability to put children under time pressure to complete exercises without them complaining about it!

Hopefully this Monkey Tales study or a new study will be published, and a news release, which by its nature offers skimpy information, won’t provoke any doubts about the validity of the work.

When twice as much (algebra) is good for you

“We find positive and substantial longer-run impacts of double-dose algebra on college entrance exam scores, high school graduation rates and college enrollment rates, suggesting that the policy had significant benefits that were not easily observable in the first couple of years of its existence,” wrote the article’s authors.

The Mar. 21, 2013 news release on EurekAlert, which includes the preceding quote, recounts an extraordinary story about an approach to teaching algebra that was not enthusiastically adopted at first, yet for some reason administrators and teachers persisted with it. From Chelsey Leu’s Mar. 21, 2013 article (which originated the news release) for UChicago (University of Chicago) News (Note: Links have been removed),

Martin Gartzman sat in his dentist’s waiting room last fall when he read a study in Education Next that nearly brought him to tears.

A decade ago, in his former position as chief math and science officer for Chicago Public Schools [CPS], Gartzman spearheaded an attempt to decrease ninth-grade algebra failure rates, an issue he calls “an incredibly vexing problem.” His idea was to provide extra time for struggling students by having them take two consecutive periods of algebra.

In high schools, ninth-grade algebra is typically the class with the highest failure rate. This presents a barrier to graduation, because high schools usually require three to four years of math to graduate.

Students have about a 20 percent chance of passing the next math level if they don’t first pass algebra, Gartzman said, versus 80 percent for those who do pass. The data are clear: If students fail ninth-grade algebra, the likelihood of passing later years of math, and ultimately of graduating, is slim

Gartzman’s work to decrease algebra failure rates at CPS was motivated by a study of Melissa Roderick, the Hermon Dunlap Smith Professor at UChicago’s School of Social Service Administration. The study emphasized the importance of keeping students academically on track in their freshman year to increase the graduation rate.

Some administrators and teachers resisted the new policy. Teachers called these sessions “double-period hell” because they gathered, in one class, the most unmotivated students who had the biggest problems with math.

Principals and counselors sometimes saw the double periods as punishment for the students, depriving them of courses they may have enjoyed taking and replacing them with courses they disliked.

It seemed to Gartzman that double-period students were learning more math, though he had no supporting data. He gauged students’ progress by class grades, not by standardized tests. The CPS educators had no way of fully assessing their double-period idea. All they knew was that failure rates didn’t budge.

Unfortunately, Leu does not explain why the administrators and teachers continued with the program but it’s a good thing they did (Note: Links have been removed),

“Double-dosing had an immediate impact on student performance in algebra, increasing the proportion of students earning at least a B by 9.4 percentage points, or more than 65 percent,” noted the Education Next article. Although ninth-grade algebra passing rates remained mostly unaffected, “The mean GPA across all math courses taken after freshman year increased by 0.14 grade points on a 4.0 scale.”

They also found significantly increased graduation rates. The researchers concluded on an encouraging note: “Although the intervention was not particularly effective for the average affected student, the fact that it improved high school graduation and college enrollment rates for even a subset of low-performing and at-risk students is extraordinarily promising when targeted at the appropriate students.” [emphasis mine]

Gartzman recalled that reading the article “was mind-blowing for me. I had no idea that the researchers were continuing to study these kids.”

The study had followed a set of students from eighth grade through graduation, while Gartzman’s team could only follow them for a year after the program began. The improvements appeared five years after launching double-dose algebra, hiding them from the CPS team, which had focused on short-term student performance. [emphasis mine]

Gartzman stressed the importance of education policy research. “Nomi and Allensworth did some really sophisticated modeling that only researchers could do, that school districts really can’t do. It validates school districts all over the country who had been investing in double-period strategies.”

I’m not sure I understand the numbers very well (maybe I need a double-dose of numbers). The 9.4-percentage-point increase in students earning at least a B sounds good, but a mean increase of 0.14 grade points doesn’t sound as impressive. As for the bit about the program being “not particularly effective for the average affected student,” what kind of student is helped by this program? As for the improvements being seen five years after the program launch, does this mean that students in the program showed improvement five years later (in first-year university) or that researchers weren’t able to effectively measure any impact in the grade nine classroom until five years after the program began?

Regardless, it seems there is an improvement and, having suffered through my share of algebra classes, I applaud the educators for finding a way to help some students, if not all.

Ramanujan—a math genius who left behind math formulas that took 90 years to decode

1920, the year mathematician Srinivasa Ramanujan died, is also the year he left behind mathematical formulas that may help unlock the secrets of black holes (from the Dec. 11, 2012 posting by Carol Clark for Emory University’s e-science commons blog),

“No one was talking about black holes back in the 1920s when Ramanujan first came up with mock modular forms, and yet, his work may unlock secrets about them,” Ono [Emory University mathematician Ken Ono] says.

Expansion of modular forms is one of the fundamental tools for computing the entropy of a modular black hole. Some black holes, however, are not modular, but the new formula based on Ramanujan’s vision may allow physicists to compute their entropy as though they were.

Ramanujan was on his death bed (at the age of 32) when he devised his last formulas (from the Clark posting),

Accessed from http://esciencecommons.blogspot.ca/2012/12/math-formula-gives-new-glimpse-into.html

… A devout Hindu, Ramanujan said that his findings were divine, revealed to him in dreams by the goddess Namagiri.

While on his death-bed in 1920, Ramanujan wrote a letter to his mentor, English mathematician G. H. Hardy. The letter described several new functions that behaved differently from known theta functions, or modular forms, and yet closely mimicked them. Ramanujan conjectured that his mock modular forms corresponded to the ordinary modular forms earlier identified by Carl Jacobi, and that both would wind up with similar outputs for roots of 1.

No one at the time understood what Ramanujan was talking about. “It wasn’t until 2002, through the work of Sander Zwegers, that we had a description of the functions that Ramanujan was writing about in 1920,” Ono says.
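
For readers wondering what Ramanujan’s letter actually contained, one of its best-known examples (my addition for illustration; it isn’t singled out in the Clark posting) is the so-called third-order mock theta function,

$$ f(q) \;=\; \sum_{n=0}^{\infty} \frac{q^{n^{2}}}{(1+q)^{2}(1+q^{2})^{2}\cdots(1+q^{n})^{2}} \;=\; 1 + \frac{q}{(1+q)^{2}} + \frac{q^{4}}{(1+q)^{2}(1+q^{2})^{2}} + \cdots $$

Term by term it looks like a classical theta series, yet it fails to transform the way a modular form should, which is what made Ramanujan’s claims so hard to pin down.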

This year (2012) a number of special events have been held to commemorate Ramanujan’s accomplishments (Note: I have removed links), from the Clark posting,

December 22 [2012] marks the 125th anniversary of the birth of Srinivasa Ramanujan, an Indian mathematician renowned for somehow intuiting extraordinary numerical patterns and connections without the use of proofs or modern mathematical tools. …

“I wanted to do something special, in the spirit of Ramanujan, to mark the anniversary,” says Emory mathematician Ken Ono. “It’s fascinating to me to explore his writings and imagine how his brain may have worked. It’s like being a mathematical anthropologist.”

Ono, a number theorist whose work has previously uncovered hidden meanings in the notebooks of Ramanujan, set to work on the 125th-anniversary project with two colleagues and former students: Amanda Folsom, from Yale, and Rob Rhoades, from Stanford.

The result is a formula for mock modular forms that may prove useful to physicists who study black holes. The work, which Ono recently presented at the Ramanujan 125 conference at the University of Florida, also solves one of the greatest puzzles left behind by the enigmatic Indian genius.

Here’s a trailer for the forthcoming movie (a docu-drama) about Ramanujan, from the Clark posting,

Here’s a description of Ramanujan from Wikipedia, which gives some insight into the nature of his genius (Note: I have removed links and a footnote),

Srinivasa Ramanujan FRS (…) (22 December 1887 – 26 April 1920) was an Indian mathematician and autodidact who, with almost no formal training in pure mathematics, made extraordinary contributions to mathematical analysis, number theory, infinite series, and continued fractions. Living in India with no access to the larger mathematical community, which was centered in Europe at the time, Ramanujan developed his own mathematical research in isolation. As a result, he sometimes rediscovered known theorems in addition to producing new work. Ramanujan was said to be a natural genius by the English mathematician G.H. Hardy, in the same league as mathematicians like Euler and Gauss.

There is a little more to Ono’s latest work concerning Ramanujan’s deathbed math functions (from the Clark posting),

After coming up with the formula for computing a mock modular form, Ono wanted to put some icing on the cake for the 125th-anniversary celebration. He and Emory graduate students Michael Griffin and Larry Rolen revisited the paragraph in Ramanujan’s last letter that gave a vague description for how he arrived at the functions. That one paragraph has inspired hundreds of papers by mathematicians, who have pondered its hidden meaning for eight decades.

“So much of what Ramanujan offers comes from mysterious words and strange formulas that seem to defy mathematical sense,” Ono says. “Although we had a definition from 2002 for Ramanujan’s functions, it was still unclear how it related to Ramanujan’s awkward and imprecise definition.”

Ono and his students finally saw the meaning behind the puzzling paragraph, and a way to link it to the modern definition. “We developed a theorem that shows that the bizarre methodology he used to construct his examples is correct,” Ono says. “For the first time, we can prove that the exotic functions that Ramanujan conjured in his death-bed letter behave exactly as he said they would, in every case.”

Ono is now on a mathematicians’ tour in India (from the Clark posting),

Ono will spend much of December in India, taking overnight trains to Mysore, Bangalore, Chennai and New Delhi, as part of a group of distinguished mathematicians giving talks about Ramanujan in the lead-up to the anniversary date.

“Ramanujan is a hero in India so it’s kind of like a math rock tour,” Ono says, adding, “I’m his biggest fan. My professional life is inescapably intertwined with Ramanujan. Many of the mathematical objects that I think about so profoundly were anticipated by him. I’m so glad that he existed.”

Between this and the series developed by Alex Bellos about mathematics in Japan (my Oct. 17, 2012 posting), it seems that attention is turning eastward where the study and development of mathematics is concerned. H/T to EurekAlert’s Dec. 17, 2012 news release and do read Clark’s article if you want more information about Ono and Ramanujan.

Take control of a 17th century scientific genius (Newton, Galileo, Kepler, Leibniz, or Kircher) in The New Science board game

Thank you to David Bruggeman (Pasco Phronesis) for the Sept. 16, 2012 posting (by way of Twitter and @JeanLucPiquant) about The New Science Game currently listed on the Kickstarter crowdfunding site. From the description of The New Science board game on Kickstarter,

The New Science gives you control of one of five legendary geniuses from the scientific revolution in a race to research, successfully experiment on, and finally publish some of the critical early advances that shaped modern science.

This fun, fast, easy-to-learn worker placement game for 2-5 players is ideal for casual and serious gamers alike. The rules are easy to learn and teach, but the many layers of shifting strategy make each game a new challenge that tests your mind and gets your competitive juices flowing.

Each scientist has their own unique strengths and weaknesses. No two scientists play the same way, so each time you try someone new it provides a different and satisfying play experience. Your scientist’s mat also serves as a player aid, repeating all of the key technology information from the game board for your easy reference.

The “five legendary geniuses” are Isaac Newton, Galileo Galilei, Johannes Kepler, Gottfried Leibniz, and Athanasius Kircher. The Kickstarter campaign to take control of the five has raised $5,058 US of the $16,000 requested and it ends on Oct. 17, 2012.

The game is listed on boardgamegeek.com with additional details such as this,

Designer: Dirk Knemeyer

Artist: Heiko Günther

Publisher: Conquistador Games

# of players: 2-5

User suggested ages: 12 and up


Players control one of the great scientists during the 17th century Scientific Revolution in Europe. Use your limited time and energy to make discoveries, test hypotheses, publish papers, correspond with other famous scientists, hire assistants into your laboratory and network with other people who can help your progress. [emphasis mine] Discoveries follow historical tech trees in the key sciences of the age: Astronomy, Mathematics, Physics, Biology and Chemistry. The scientist who accumulates the most prestige will be appointed the first President of the Royal Society.

The activities listed in the game description “make discoveries, test hypotheses,” etc. must sound very familiar to a contemporary scientist.

There’s also an explanatory video as seen on the Kickstarter campaign page and embedded here below,

David notes this about game quality in his Sept. 16, 2012 posting (Note: I have removed a link),

The game was heavily tested by the folks at Game Salute, and comes with the kind of quality details you might expect from games like Ticket to Ride or the various versions of Catan. If you’re interested in getting a copy of the game, it will run $49 U.S., plus shipping for destinations outside the U.S. See the Kickstarter page for more details.

You can find out more about Conquistador Games here.