Category Archives: science

Beatrix Potter and her science on her 150th birthday

July 28, 2016 was the 150th anniversary of Beatrix Potter’s birthday. Known to most through her children’s books, she has left an indelible mark on many of us. Hop-skip-jump.com offers this description of an extraordinary woman on its Beatrix Potter 150 years page,

An artist, storyteller, botanist, environmentalist, farmer and impeccable businesswoman, Potter was a visionary and a trailblazer. Single-mindedly determined and ambitious, she overcame professional rejection, academic humiliation, and personal heartbreak, going on to earn her fortune and a formidable reputation.

A July 27, 2016 posting by Alex Jackson on the Guardian science blogs provides more information about Potter’s science (Note: Links have been removed),

Influenced by family holidays in Scotland, Potter was fascinated by the natural world from a young age. Encouraged to follow her interests, she explored the outdoors with sketchbook and camera, honing her skills as an artist by drawing and sketching her school room pets: mice, rabbits and hedgehogs. Led first by her imagination, she developed a broad interest in the natural sciences: particularly archaeology, entomology and mycology, producing accurate watercolour drawings of unusual fossils, fungi, and archaeological artefacts.

Potter’s uncle, Sir Henry Enfield Roscoe FRS, an eminent nineteenth-century chemist, recognised her artistic talent and encouraged her scientific interests. By the 1890s, Potter’s skills in mycology drew Roscoe’s attention when he learned she had successfully germinated spores of a class of fungi, and had ideas on how they reproduced. He used his scientific connections with botanists at Kew’s Royal Botanic Gardens to gain a student card for his niece and to introduce her to Kew botanists interested in mycology.

Although Potter had good reason to think that her success might break some new ground, the botanists at Kew were sceptical. One Kew scientist, George Massee, however, was sufficiently interested in Potter’s drawings to encourage her to continue experimenting. Although the director of Kew, William Thistleton-Dyer, refused to give Potter’s theories or her drawings much attention, both because she was an amateur and a female, Roscoe encouraged his niece to write up her investigations and offer her drawings in a paper to the Linnean Society.

In 1897, Potter put forward her paper, which Massee presented to the Linnean Society, since women could not be members or attend a meeting. Her paper, On the Germination of the Spores of the Agaricineae, was not given much notice and she quickly withdrew it, recognising that her samples were likely contaminated. Sadly, her paper has since been lost, so we can only speculate on what Potter actually concluded.

Until quite recently, Potter’s accomplishments and her experiments in natural science went unrecognised. Upon her death in 1943, Potter left hundreds of her mycological drawings and paintings to the Armitt Museum and Library in Ambleside, where she and her husband had been active members. Today, they are valued not only for their beauty and precision, but also for the assistance they provide modern mycologists in identifying a variety of fungi.

In 1997, the Linnean Society issued a posthumous apology to Potter, noting the sexism displayed in the handling of her research and its policy toward the contributions of women.

A rarely seen very early Beatrix Potter drawing, A Dream of Toasted Cheese was drawn to celebrate the publication of Henry Roscoe’s chemistry textbook in 1899. Illustration: Beatrix Potter/reproduced courtesy of the Lord Clwyd collection (image by way of The Guardian newspaper)

I’m sure you recognized the Bunsen burner. From the Jackson posting (Note: A link has been removed),

London-born Henry Roscoe, whose family roots were in Liverpool, studied at University College London, before moving to Heidelberg, Germany, where he worked under Robert Bunsen, inventor of the new-fangled apparatus that inspired Potter’s drawing. Together, using magnesium as a light source, Roscoe and Bunsen reputedly carried out the first flashlight photography in 1864. Their research laid the foundations of comparative photochemistry.

These excerpts do not do full justice to Jackson’s piece, which I encourage you to read in its entirety.

Should you be going to the UK and inclined to follow up further, there’s a listing of 2016 events being held to honour Potter on the UK National Trust’s Celebrating Beatrix Potter’s anniversary in the Lake District webpage.

A couple of Frankenstein dares from The Frankenstein Bicentennial project

Drat! I’ve gotten the information about the first Frankenstein dare (a short story challenge) a little late in the game since the deadline is 11:59 pm PDT on July 31, 2016. In any event, here’s more about the two dares,

And for those who like their information in written form, here are the details from Arizona State University’s (ASU) Frankenstein Bicentennial Dare (on The Frankenstein Bicentennial Project website),

Two centuries ago, on a dare to tell the best scary story, 19-year-old Mary Shelley imagined an idea that became the basis for Frankenstein. Mary’s original concept became the novel that arguably kick-started the genres of science fiction and Gothic horror, but also provided an enduring myth that shapes how we grapple with creativity, science, technology, and their consequences.
Two hundred years later, inspired by that classic dare, we’re challenging you to create new myths for the 21st century along with our partners National Novel Writing Month (NaNoWriMo), Chabot Space and Science Center, and Creative Nonfiction magazine.

FRANKENSTEIN 200

Presented by NaNoWriMo and the Chabot Space and Science Center

Frankenstein is a classic of Gothic literature – a gripping, tragic story about Victor Frankenstein’s failure to accept responsibility for the consequences of bringing new life into the world. In this dare, we’re challenging you to write a scary story that explores the relationship between creators and the “monsters” they create.

Almost anything that we create can become monstrous: a misinterpreted piece of architecture; a song whose meaning has been misappropriated; a big, but misunderstood idea; or, of course, an actual creature. And in Frankenstein, Shelley teaches us that monstrous does not always mean evil – in fact, creators can prove to be more destructive and inhuman than the things they bring into being.

Tell us your story in 1,000 – 1,800 words on Medium.com and use the hashtag #Frankenstein200. Read other #Frankenstein200 stories, and use the recommend button at the bottom of each post for the stories you like. Winners in the short fiction contest will receive personal feedback from Hugo and Sturgeon Award-winning science fiction and fantasy author Elizabeth Bear, as well as a curated selection of classic and contemporary science fiction books and Frankenstein goodies, courtesy of the NaNoWriMo team.

Rules and Mechanics

  • There are no restrictions on content. Entry is limited to one submission per author. Submissions must be in English and between 1,000 and 1,800 words. You must follow all Medium Terms of Service, including the Rules.
  • All entries submitted and tagged as #Frankenstein200 and in compliance with the rules outlined here will be considered.
  • The deadline for submissions is 11:59 PM on July 31, 2016.
  • Three winners will be selected at random on August 1, 2016.
  • Each winner receives a prize package including a curated selection of classic and contemporary science fiction books and Frankenstein goodies, courtesy of the NaNoWriMo team.
  • Additionally, one of the three winners, chosen at random, will receive written coaching/feedback from Elizabeth Bear on his or her entry.
  • Select stories will be featured on Frankenscape, a public geo-storytelling project hosted by ASU’s Frankenstein Bicentennial Project. Stories may also be featured in National Novel Writing Month communications and social media platforms.
  • U.S. residents only [emphasis mine]; void where prohibited by law. No purchase is necessary to enter or win.

Dangerous Creations: Real-life Frankenstein Stories

Presented by Creative Nonfiction magazine

Creative Nonfiction magazine is daring writers to write original and true stories that explore humans’ efforts to control and redirect nature, the evolving relationships between humanity and science/technology, and contemporary interpretations of monstrosity.

Essays must be vivid and dramatic; they should combine a strong and compelling narrative with an informative or reflective element and reach beyond a strictly personal experience for some universal or deeper meaning. We’re open to a broad range of interpretations of the “Frankenstein” theme, with the understanding that all works submitted must tell true stories and be factually accurate. Above all, we’re looking for well-written prose, rich with detail and a distinctive voice.

Creative Nonfiction editors and a judge (to be announced) will award $10,000 and publication for Best Essay and two $2,500 prizes and publication for runners-up. All essays submitted will be considered for publication in the winter 2018 issue of the magazine.

Deadline for submissions: March 20, 2017.
For complete guidelines: www.creativenonfiction.org/submissions

[Note: There is a submission fee for the nonfiction dare and no indication as to whether or not there are residency requirements.]

A July 27, 2016 email received from The Frankenstein Bicentennial Project (which is how I learned about the dares somewhat belatedly) has this about the first dare,

Planetary Design, Transhumanism, and Pork Products
Our #Frankenstein200 Contest Took Us in Some Unexpected Directions

Last month [June 2016], we partnered with National Novel Writing Month (NaNoWriMo) and The Chabot Space and Science Center to dare the world to create stories in the spirit of Mary Shelley’s Frankenstein, to celebrate the 200th anniversary of the novel’s conception.

We received a bevy of intriguing and sometimes frightening submissions that explore the complex relationships between creators and their “monsters.” Here are a few tales that caught our eye:

The Man Who Harnessed the Sun
By Sandra Knisely
Eliza has to choose between protecting the scientist who once gave her the world and punishing him for letting it all slip away. Read the story…

The Mortality Complex
By Brandon Miller
When the boogeyman of medical students reflects on life. Read the story…

Bacon Man
By Corey Pressman
A Frankenstein story in celebration of ASU’s Frankenstein Bicentennial Project. And bacon. Read the story… 

You can find the stories that have been submitted to date for the creative short story dare at Medium.com.

Good luck! And, don’t forget to tag your short story with #Frankenstein200 and submit it by July 31, 2016 (if you are a US resident). There’s still lots of time to enter a submission for a creative nonfiction piece.

Connecting chaos and entanglement

Researchers seem to have stumbled across a link between classical and quantum physics. A July 12, 2016 University of California at Santa Barbara (UCSB) news release (also on EurekAlert) by Sonia Fernandez provides a description of both classical and quantum physics, as well as the research that connects the two,

Using a small quantum system consisting of three superconducting qubits, researchers at UC Santa Barbara and Google have uncovered a link between aspects of classical and quantum physics thought to be unrelated: classical chaos and quantum entanglement. Their findings suggest that it would be possible to use controllable quantum systems to investigate certain fundamental aspects of nature.

“It’s kind of surprising because chaos is this totally classical concept — there’s no idea of chaos in a quantum system,” said Charles Neill, a researcher in the UCSB Department of Physics and lead author of a paper that appears in Nature Physics. “Similarly, there’s no concept of entanglement within classical systems. And yet it turns out that chaos and entanglement are really very strongly and clearly related.”

Initiated in the 15th century, classical physics generally examines and describes systems larger than atoms and molecules. It consists of hundreds of years’ worth of study including Newton’s laws of motion, electrodynamics, relativity, thermodynamics as well as chaos theory — the field that studies the behavior of highly sensitive and unpredictable systems. One classic example of chaos theory is the weather, in which a relatively small change in one part of the system is enough to foil predictions — and vacation plans — anywhere on the globe.
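
To see how that sensitivity plays out numerically, here is a toy Python sketch of my own (not anything from the study): two trajectories of the chaotic logistic map that start a millionth apart quickly become unrelated.

```python
# Sensitivity to initial conditions in the logistic map x -> r*x*(1-x),
# a textbook example of classical chaos.
r = 3.9                      # parameter value in the chaotic regime
x, y = 0.500000, 0.500001    # two nearly identical starting points

for step in range(1, 31):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: |x - y| = {abs(x - y):.2e}")
```

Within about 30 steps the tiny initial difference has grown to the size of the trajectories themselves — the same reason weather forecasts degrade so quickly.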

At smaller size and length scales in nature, however, such as those involving atoms and photons and their behaviors, classical physics falls short. In the early 20th century quantum physics emerged, with its seemingly counterintuitive and sometimes controversial science, including the notions of superposition (the theory that a particle can be located in several places at once) and entanglement (particles that are deeply linked behave as such despite physical distance from one another).

And so began the continuing search for connections between the two fields.

All systems are fundamentally quantum systems, according [to] Neill, but the means of describing in a quantum sense the chaotic behavior of, say, air molecules in an evacuated room, remains limited.

Imagine taking a balloon full of air molecules, somehow tagging them so you could see them and then releasing them into a room with no air molecules, noted co-author and UCSB/Google researcher Pedram Roushan. One possible outcome is that the air molecules remain clumped together in a little cloud following the same trajectory around the room. And yet, he continued, as we can probably intuit, the molecules will more likely take off in a variety of velocities and directions, bouncing off walls and interacting with each other, resting after the room is sufficiently saturated with them.

“The underlying physics is chaos, essentially,” he said. The molecules coming to rest — at least on the macroscopic level — is the result of thermalization, or of reaching equilibrium after they have achieved uniform saturation within the system. But in the infinitesimal world of quantum physics, there is still little to describe that behavior. The mathematics of quantum mechanics, Roushan said, do not allow for the chaos described by Newtonian laws of motion.

To investigate, the researchers devised an experiment using three quantum bits, the basic computational units of the quantum computer. Unlike classical computer bits, which utilize a binary system of two possible states (e.g., zero/one), a qubit can also use a superposition of both states (zero and one) as a single state. Additionally, multiple qubits can entangle, or link so closely that their measurements will automatically correlate. By manipulating these qubits with electronic pulses, Neill caused them to interact, rotate and evolve in the quantum analog of a highly sensitive classical system.

The result is a map of entanglement entropy of a qubit that, over time, comes to strongly resemble that of classical dynamics — the regions of entanglement in the quantum map resemble the regions of chaos on the classical map. The islands of low entanglement in the quantum map are located in the places of low chaos on the classical map.
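
To make “entanglement entropy” concrete, here is a minimal Python/NumPy sketch (my own toy example, not the team’s code — their three-qubit experiment is far richer): it computes the von Neumann entropy of one qubit after tracing out its partner, giving zero for an unentangled product state and ln 2 for a maximally entangled Bell state.

```python
import numpy as np

def entanglement_entropy(state):
    """Von Neumann entropy of qubit A for a pure two-qubit state vector."""
    # Reshape the 4-component state into a 2x2 matrix: rows = qubit A, cols = qubit B
    psi = np.asarray(state, dtype=complex).reshape(2, 2)
    # Reduced density matrix of qubit A (trace out qubit B)
    rho_a = psi @ psi.conj().T
    # S = -sum(p * ln p) over the eigenvalues of rho_A
    probs = np.linalg.eigvalsh(rho_a)
    probs = probs[probs > 1e-12]
    return float(-np.sum(probs * np.log(probs)))

# Product state |00>: no entanglement, entropy 0
print(entanglement_entropy([1, 0, 0, 0]))                         # 0.0
# Bell state (|00> + |11>)/sqrt(2): maximal entanglement, ln 2
print(entanglement_entropy(np.array([1, 0, 0, 1]) / np.sqrt(2)))  # ~0.693
```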

“There’s a very clear connection between entanglement and chaos in these two pictures,” said Neill. “And, it turns out that thermalization is the thing that connects chaos and entanglement. It turns out that they are actually the driving forces behind thermalization.

“What we realize is that in almost any quantum system, including on quantum computers, if you just let it evolve and you start to study what happens as a function of time, it’s going to thermalize,” added Neill, referring to the quantum-level equilibration. “And this really ties together the intuition between classical thermalization and chaos and how it occurs in quantum systems that entangle.”

The study’s findings have fundamental implications for quantum computing. At the level of three qubits, the computation is relatively simple, said Roushan, but as researchers push to build increasingly sophisticated and powerful quantum computers that incorporate more qubits to study highly complex problems that are beyond the ability of classical computing — such as those in the realms of machine learning, artificial intelligence, fluid dynamics or chemistry — a quantum processor optimized for such calculations will be a very powerful tool.

“It means we can study things that are completely impossible to study right now, once we get to bigger systems,” said Neill.

Experimental link between quantum entanglement (left) and classical chaos (right) found using a small quantum computer. Photo Credit: Courtesy Image (Courtesy: UCSB)

Here’s a link to and a citation for the paper,

Ergodic dynamics and thermalization in an isolated quantum system by C. Neill, P. Roushan, M. Fang, Y. Chen, M. Kolodrubetz, Z. Chen, A. Megrant, R. Barends, B. Campbell, B. Chiaro, A. Dunsworth, E. Jeffrey, J. Kelly, J. Mutus, P. J. J. O’Malley, C. Quintana, D. Sank, A. Vainsencher, J. Wenner, T. C. White, A. Polkovnikov, & J. M. Martinis. Nature Physics (2016). doi:10.1038/nphys3830. Published online 11 July 2016.

This paper is behind a paywall.

A selection of science songs for summer

Canada’s Perimeter Institute for Theoretical Physics (PI) has compiled a list of science songs and it includes a few Canadian surprises. Here’s more from the July 21, 2016 PI notice received via email.

Ah, summer.

School’s out, the outdoors beckon, and with every passing second a 4.5-billion-year-old nuclear fireball fuses 620 million tons of hydrogen so brightly you’ve gotta wear shades.

Who says you have to stop learning science over the summer?

All you need is the right soundtrack to your next road trip, backyard barbeque, or day at the beach.

Did we miss your favourite science song? Tweet us @Perimeter with the hashtag #SciencePlaylist.
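
Incidentally, that “620 million tons” figure survives a back-of-envelope check (my arithmetic, not PI’s): the Sun fuses roughly 620 million tonnes of hydrogen per second into about 616 million tonnes of helium, and converting the missing mass via E = mc² gives approximately the Sun’s measured luminosity:

```latex
E = \Delta m \, c^{2} \approx \left(4.3\times10^{9}\,\mathrm{kg\,s^{-1}}\right)
    \times \left(3\times10^{8}\,\mathrm{m\,s^{-1}}\right)^{2}
  \approx 3.9\times10^{26}\,\mathrm{W} \approx L_{\odot}
```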

You can find the list and accompanying videos on The Ultimate Science Playlist webpage on the PI website. Here are a few samples,

“History of Everything” – Barenaked Ladies (The Big Bang Theory theme)

You probably know this one as the theme song of The Big Bang Theory. But here’s something you might not know. The tune began as an improvised ditty Barenaked Ladies’ singer Ed Robertson performed one night in Los Angeles after reading Simon Singh’s book Big Bang: The Most Important Scientific Discovery of All Time and Why You Need to Know About It. Lo and behold, in the audience that night were Chuck Lorre and Bill Prady, creators of The Big Bang Theory. The rest is history (of everything).

“Bohemian Gravity” – A Capella Science (Tim Blais)

Tim Blais, the one-man choir behind A Capella Science, is a master at conveying complex science in fun musical parodies. “Bohemian Gravity” is his most famous, but be sure to also check out our collaboration with him about gravitational waves, “LIGO: Feel That Space.”

“NaCl” – Kate and Anna McGarrigle

“NaCl” is a romantic tale of the courtship of a chlorine atom and a sodium atom, who marry and become sodium chloride. “Think of the love you eat,” sings Kate McGarrigle, “when you salt your meat.”

This is just a sampling. At this point, there are 15 science songs on the webpage. Surprisingly, rap is not represented. One other note: you’ll notice all of my samples are Canadian. (Sadly, I had other videos embedded as well, but every time I saved a draft I lost at least half of them. It seems the maximum allowed to me is three.)

Here are the others I wanted to include:

“Mandelbrot Set” – Jonathan Coulton

Singer-songwriter Jonathan Coulton (JoCo, to fans) is arguably the patron saint of geek-pop, having penned the uber-catchy credits songs of the Portal games, as well as this loving tribute to a particular set of complex numbers that has a highly convoluted fractal boundary when plotted.

“Higgs Boson Sonification” – Traq 

CERN physicist Piotr Traczyk (a.k.a. Traq) “sonified” data from the experiment that uncovered the Higgs boson, turning the discovery into a high-energy metal riff.

“Why Does the Sun Shine?” – They Might Be Giants

Choosing just one song for this playlist by They Might Be Giants is a tricky task, since They Definitely Are Nerdy. But this one celebrates physics, chemistry, and astronomy while also being absurdly catchy, so it made the list. Honourable mention goes to their entire album for kids, Here Comes Science.

In any event, the PI list is a great introduction to science songs and The Ultimate Science Playlist includes embedded videos for all 15 of the songs selected so far. Happy Summer!

Cornwall (UK) connects with University of Southern California for performance by a quantum computer (D-Wave) and mezzo soprano Juliette Pochin

The upcoming performance featuring a quantum computer built by D-Wave Systems (a Canadian company) and Welsh mezzo soprano Juliette Pochin is the première of “Superposition” by Alexis Kirke. A July 13, 2016 news item on phys.org provides more detail,

What happens when you combine the pure tones of an internationally renowned mezzo soprano and the complex technology of a $15 million quantum supercomputer?

The answer will be exclusively revealed to audiences at the Port Eliot Festival [Cornwall, UK] when Superposition, created by Plymouth University composer Alexis Kirke, receives its world premiere later this summer.

A D-Wave 1000 Qubit Quantum Processor. Credit: D-Wave Systems Inc

A July 13, 2016 Plymouth University press release, which originated the news item, expands on the theme,

Combining the arts and sciences, as Dr Kirke has done with many of his previous works, the 15-minute piece will begin dark and mysterious with celebrated performer Juliette Pochin singing a low-pitched slow theme.

But gradually the quiet sounds of electronic ambience will emerge over or beneath her voice, as the sounds of her singing are picked up by a microphone and sent over the internet to the D-Wave quantum computer at the University of Southern California.

It then reacts with behaviours in the quantum realm that are turned into sounds back in the performance venue, the Round Room at Port Eliot, creating a unique and ground-breaking duet.

And when the singer ends, the quantum processes are left to slowly fade away naturally, making their final sounds as the lights go to black.

Dr Kirke, a member of the Interdisciplinary Centre for Computer Music Research at Plymouth University, said:

“There are only a handful of these computers accessible in the world, and this is the first time one has been used as part of a creative performance. So while it is a great privilege to be able to put this together, it is an incredibly complex area of computing and science and it has taken almost two years to get to this stage. For most people, this will be the first time they have seen a quantum computer in action and I hope it will give them a better understanding of how it works in a creative and innovative way.”

Plymouth University is the official Creative and Cultural Partner of the Port Eliot Festival, taking place in South East Cornwall from July 28 to 31, 2016 [emphasis mine].

And Superposition will be one of a number of showcases of University talent and expertise as part of the first Port Eliot Science Lab. Being staged in the Round Room at Port Eliot, it will give festival goers the chance to explore science, see performances and take part in a range of experiments.

The three-part performance will tell the story of Niobe, one of the more tragic figures in Greek mythology, but in this case a nod to the fact the heart of the quantum computer contains the metal named after her, niobium. It will also feature a monologue from Hamlet, interspersed with terms from quantum computing.

This is the latest of Dr Kirke’s pioneering performance works, with previous productions including an opera based on the financial crisis and a piece using a cutting edge wave-testing facility as an instrument of percussion.

Geordie Rose, CTO and Founder, D-Wave Systems, said:

“D-Wave’s quantum computing technology has been investigated in many areas such as image recognition, machine learning and finance. We are excited to see Dr Kirke, a pioneer in the field of quantum physics and the arts, utilising a D-Wave 2X in his next performance. Quantum computing is positioned to have a tremendous social impact, and Dr Kirke’s work serves not only as a piece of innovative computer arts research, but also as a way of educating the public about these new types of exotic computing machines.”

Professor Daniel Lidar, Director of the USC Center for Quantum Information Science and Technology, said:

“This is an exciting time to be in the field of quantum computing. This is a field that was purely theoretical until the 1990s and now is making huge leaps forward every year. We have been researching the D-Wave machines for four years now, and have recently upgraded to the D-Wave 2X – the world’s most advanced commercially available quantum optimisation processor. We were very happy to welcome Dr Kirke on a short training residence here at the University of Southern California recently; and are excited to be collaborating with him on this performance, which we see as a great opportunity for education and public awareness.”

Since I can’t be there, I’m hoping they will be able to successfully livestream the performance. According to Kirke, who very kindly responded to my query, the festival’s remote location can make livecasting a challenge. He did note that a post-performance documentary is planned and that it will include footage from the performance.

He has also provided more information about the singer and the technical/computer aspects of the performance (from a July 18, 2016 email),

Juliette Pochin: I’ve worked with her before, a couple of years ago. She has an amazing voice and style, is musically adventurous (she is a music producer herself), and brings great grace and charisma to a performance. She can be heard in the Harry Potter and Lord of the Rings soundtracks and has performed at venues such as the Royal Albert Hall, Proms in the Park, and Meatloaf!

Score: The score is in 3 parts of about 5 minutes each. There is a traditional score for parts 1 and 3 that Juliette will sing from. I wrote these manually in traditional music notation. However she can sing in free time and wait for the computer to respond. It is a very dramatic score, almost operatic. The computer’s responses are based on two algorithms: a superposition chord system, and a pitch-loudness entanglement system. The superposition chord system sends a harmony problem to the D-Wave in response to Juliette’s approximate pitch amongst other elements. The D-Wave uses an 8-qubit optimizer to return potential chords. Each potential chord has an energy associated with it. In theory the lowest energy chord is that preferred by the algorithm. However in the performance I will combine the chord solutions to create superposition chords. These are chords which represent, in a very loose way, the superposed solutions which existed in the D-Wave before collapse of the qubits. Technically they are the results of multiple collapses, but metaphorically I can’t think of a more beautiful representation of superposition: chords. These will accompany Juliette, sometimes clashing with her. Sometimes giving way to her.

The second subsystem generates non-pitched noises of different lengths, roughnesses and loudness. These are responses to Juliette, but also a result of a simple D-Wave entanglement. We know the D-Wave can entangle in 8-qubit groups. I send a binary representation of Juliette’s loudness to 4 qubits and one of approximate pitch to another 4, then entangle the two. The chosen entanglement weights are selected for their variety of solutions amongst the qubits, rather than by a particular musical logic. So the non-pitched subsystem is more of a sonification of entanglement than a musical algorithm.
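
To give a flavour of the encoding Kirke describes, here is a speculative Python sketch of my own (all names, scalings and weights are hypothetical; the real system talks to D-Wave’s hardware and uses Kirke’s unpublished mappings): the singer’s loudness and approximate pitch are each quantized to four bits, those bits bias an eight-variable QUBO problem, and couplings between the two four-bit groups stand in for the chosen “entanglement weights.”

```python
import itertools

def to_4bit(value, lo, hi):
    """Quantize a measurement into four bits (hypothetical scaling)."""
    level = max(0, min(15, round(15 * (value - lo) / (hi - lo))))
    return [(level >> i) & 1 for i in range(4)]

def energy(bits, biases, couplings):
    """QUBO energy: linear biases plus pairwise couplings."""
    e = sum(h * b for h, b in zip(biases, bits))
    e += sum(w * bits[i] * bits[j] for (i, j), w in couplings.items())
    return e

# 4 bits for loudness, 4 for approximate pitch (illustrative input values)
loudness_bits = to_4bit(0.7, 0.0, 1.0)
pitch_bits = to_4bit(440.0, 80.0, 1000.0)

# Bias each variable toward the measured bit, and couple loudness bit i
# to pitch bit i -- a stand-in for the "entanglement weights" Kirke says
# were chosen for variety of solutions rather than musical logic.
biases = [(-0.3 if b else 0.3) for b in loudness_bits + pitch_bits]
couplings = {(i, i + 4): ((-1) ** i) * 0.5 for i in range(4)}

# A real D-Wave samples low-energy configurations of such a problem;
# with only 8 bits we can simply enumerate all 256 of them.
best = min(itertools.product([0, 1], repeat=8),
           key=lambda b: energy(b, biases, couplings))
print(best, energy(best, biases, couplings))
```

In the piece itself, note, Kirke combines several low-energy solutions into “superposition chords” rather than keeping only the minimum-energy one.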

Thank you Dr. Kirke for a fascinating technical description and for a description of Juliette Pochin that makes one long to hear her in performance.

For anyone who’s thinking of attending the performance or curious, you can find out more about the Port Eliot festival here, Juliette Pochin here, and Alexis Kirke here.

For anyone wondering about data sonification, I also have a Feb. 7, 2014 post featuring a data sonification project by Dr. Domenico Vicinanza, which includes a sound clip of his Voyager 1 & 2 spacecraft duet.

Exploring the fundamental limits of invisibility cloaks

There’s some interesting work on invisibility cloaks coming from the University of Texas at Austin, according to a July 6, 2016 news item on Nanowerk,

Researchers in the Cockrell School of Engineering at The University of Texas at Austin have been able to quantify fundamental physical limitations on the performance of cloaking devices, a technology that allows objects to become invisible or undetectable to electromagnetic waves including radio waves, microwaves, infrared and visible light.

A July 5, 2016 University of Texas at Austin news release (also on EurekAlert), which originated the news item, expands on the theme,

The researchers’ theory confirms that it is possible to use cloaks to perfectly hide an object for a specific wavelength, but hiding an object from an illumination containing different wavelengths becomes more challenging as the size of the object increases.

Andrea Alù, an electrical and computer engineering professor and a leading researcher in the area of cloaking technology, along with graduate student Francesco Monticone, created a quantitative framework that now establishes boundaries on the bandwidth capabilities of electromagnetic cloaks for objects of different sizes and composition. As a result, researchers can calculate the expected optimal performance of invisibility devices before designing and developing a specific cloak for an object of interest. …

Cloaks are made from artificial materials, called metamaterials, that have special properties enabling a better control of the incoming wave, and can make an object invisible or transparent. The newly established boundaries apply to cloaks made of passive metamaterials — those that do not draw energy from an external power source.

Understanding the bandwidth and size limitations of cloaking is important to assess the potential of cloaking devices for real-world applications such as communication antennas, biomedical devices and military radars, Alù said. The researchers’ framework shows that the performance of a passive cloak is largely determined by the size of the object to be hidden compared with the wavelength of the incoming wave, and it quantifies how, for shorter wavelengths, cloaking gets drastically more difficult.

For example, it is possible to cloak a medium-size antenna from radio waves over relatively broad bandwidths for clearer communications, but it is essentially impossible to cloak large objects, such as a human body or a military tank, from visible light waves, which are much shorter than radio waves.

“We have shown that it will not be possible to drastically suppress the light scattering of a tank or an airplane for visible frequencies with currently available techniques based on passive materials,” Monticone said. “But for objects comparable in size to the wavelength that excites them (a typical radio-wave antenna, for example, or the tip of some optical microscopy tools), the derived bounds show that you can do something useful, the restrictions become looser, and we can quantify them.”
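
The flavour of such bandwidth bounds can be seen in the classic Bode-Fano limit from matching-network theory, which I quote here only as an analogy (the paper derives its own cloaking-specific bounds; this is not the authors’ formula): for matching to an RC load, the reflection coefficient Γ(ω) obeys

```latex
\int_{0}^{\infty} \ln\frac{1}{\left|\Gamma(\omega)\right|}\, d\omega \;\le\; \frac{\pi}{RC}
```

In other words, reflection — or, in the cloaking analogy, scattering — can be driven very low only over a limited bandwidth; suppressing it further in one band costs bandwidth elsewhere.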

In addition to providing a practical guide for research on cloaking devices, the researchers believe that the proposed framework can help dispel some of the myths that have been developed around cloaking and its potential to make large objects invisible.

“The question is, ‘Can we make a passive cloak that makes human-scale objects invisible?’ ” Alù said. “It turns out that there are stringent constraints in coating an object with a passive material and making it look as if the object were not there, for an arbitrary incoming wave and observation point.”

Now that bandwidth limits on cloaking are available, researchers can focus on developing practical applications with this technology that get close to these limits.

“If we want to go beyond the performance of passive cloaks, there are other options,” Monticone said. “Our group and others have been exploring active and nonlinear cloaking techniques, for which these limits do not apply. Alternatively, we can aim for looser forms of invisibility, as in cloaking devices that introduce phase delays as light is transmitted through, camouflaging techniques, or other optical tricks that give the impression of transparency, without actually reducing the overall scattering of light.”

Alù’s lab is working on the design of active cloaks that use metamaterials plugged to an external energy source to achieve broader transparency bandwidths.

“Even with active cloaks, Einstein’s theory of relativity fundamentally limits the ultimate performance for invisibility,” Alù said. “Yet, with new concepts and designs, such as active and nonlinear metamaterials, it is possible to move forward in the quest for transparency and invisibility.”

The researchers have prepared a diagram illustrating their work,

The graph shows the trade-off between how much an object can be made transparent (scattering reduction; vertical axis) and the color span (bandwidth; horizontal axis) over which this phenomenon can be achieved. Courtesy: University of Texas at Austin

Here’s a link to and a citation for the paper,

Invisibility exposed: physical bounds on passive cloaking by Francesco Monticone and Andrea Alù. Optica Vol. 3, Issue 7, pp. 718-724 (2016). doi: 10.1364/OPTICA.3.000718

This paper is open access.

Re-envisioning the laboratory: an art/sci or sci-art (take your pick) symposium

DFA186 Hades. 2012. Unique digital C-print on watercolor paper. Artist: Brandon Ballengée

Artist (work seen above)/Biologist/Environmental Activist Brandon Ballengée will be a keynote speaker at the Re-envisioning the Laboratory: Sci-Art Symposium being held at the University of Wyoming. The evening reception takes place Thursday, Sept. 8, 2016, and the symposium itself runs Friday, Sept. 9 – Saturday, Sept. 10, 2016. You can read more about the symposium (the schedule is not yet complete) in a July 12, 2016 posting by CommNatural (Bethann G. Merkle) on her CommNatural blog,

I’m super excited to invite you to register for a Sci-Art Symposium I’ve been co-planning for the past year. The big idea is to bring together a wide-ranging set of ideas, examples, and thinkers/do-ers to build a powerful foundation for on-going SciArt synergy on the University of Wyoming campus, in Wyoming communities, and beyond. We’re organizing sessions around not just beautiful examples and great ideas, but also challenges and funding opportunities, with the intent to address not just what works, but how it works, what gets in the way, and how to move ahead with the SciArt initiatives you envision.

The rest of this blog post provides essential information about the symposium. If you have any questions, don’t hesitate to contact me or any of the other organizers – there’s a slew of us from art and science disciplines across campus!

Hope to see you there!

SYMPOSIUM INFORMATION

The 2016 Sci-Art Symposium will provide a forum for inspiration, scholarly research, networking and opportunities to get the tools, methods and momentum to take on innovative interdisciplinary work across community, disciplinary, and topical boundaries. Sessions will be organized into five thematic categories: influences and opportunities, processes and methods, outcomes and products, challenges and opportunities, and next steps and future applications. The keynote address will feature artist-biologist Brandon Ballengée, and other sessions will feature presenters from throughout the nation.

Registration Fees:

$75  General Admission
$0    Full-time Student Admission (Only applicable to students enrolled in full-time schedule, may be asked for verification)

Click here for transportation and lodging information on the event website.

CONTACT INFORMATION

If you have questions about your registration or if you need to cancel your attendance, please contact Katie Christensen, Curator of Education and Statewide Engagement, at katie.christensen@uwyo.edu or 307-766-3496.

Re-envisioning the Lab: 2016 Sci-Art Symposium is made possible by University of Wyoming Art Museum, in partnership with the Biodiversity Institute, Haub School of Environment and Natural Resources, Department of Art and Art History, Science and Math Teaching Center and MFA in Creative Writing.

I’m a little surprised that the US National Science Foundation is not one of the funders. In fact, most, if not all, of the funders are part of the University of Wyoming.

As to whether there is a correct form: artsci or sciart; art/sci or sci/art; sci-art or art-sci; SciArt or ArtSci, and whether the terms refer to the same thing or two different approaches to bringing together art and science in a project, I have no idea. Perhaps they’ll discuss terminology at the symposium.

One final thought, since they don’t have the final schedule nailed down, perhaps it’s possible to submit a proposal for a talk or entry for a sciart piece. Good luck!

Deep learning and some history from the Swiss National Science Foundation (SNSF)

A June 27, 2016 news item on phys.org provides a measured analysis of deep learning and its current state of development (from a Swiss perspective),

In March 2016, the world Go champion Lee Sedol lost 1-4 against the artificial intelligence AlphaGo. For many, this was yet another defeat for humanity at the hands of the machines. Indeed, the success of the AlphaGo software was forged in an area of artificial intelligence that has seen huge progress over the last decade. Deep learning, as it’s called, uses artificial neural networks to process algorithmic calculations. This software architecture therefore mimics biological neural networks.

Much of the progress in deep learning is thanks to the work of Jürgen Schmidhuber, director of the IDSIA (Istituto Dalle Molle di Studi sull’Intelligenza Artificiale) which is located in the suburbs of Lugano. The IDSIA doctoral student Shane Legg and a group of former colleagues went on to found DeepMind, the startup acquired by Google in early 2014 for USD 500 million. The DeepMind algorithms eventually wound up in AlphaGo.

“Schmidhuber is one of the best at deep learning,” says Boi Faltings of the EPFL Artificial Intelligence Lab. “He never let go of the need to keep working at it.” According to Stéphane Marchand-Maillet of the University of Geneva computing department, “he’s been in the race since the very beginning.”

A June 27, 2016 SNSF news release (first published as a story in Horizons no. 109 June 2016) by Fabien Goubet, which originated the news item, goes on to provide a brief history,

The real strength of deep learning is structural recognition, and winning at Go is just an illustration of this, albeit a rather resounding one. Elsewhere, and for some years now, we have seen it applied to an entire spectrum of areas, such as visual and vocal recognition, online translation tools and smartphone personal assistants. One underlying principle of machine learning is that algorithms must first be trained using copious examples. Naturally, this has been helped by the deluge of user-generated content spawned by smartphones and web 2.0, stretching from Facebook photo comments to official translations published on the Internet. By feeding a machine thousands of accurately tagged images of cats, for example, it learns first to recognise those cats and later any image of a cat, including those it hasn’t been fed.

Deep learning isn’t new; it just needed modern computers to come of age. As far back as the early 1950s, biologists tried to lay out formal principles to explain the working of the brain’s cells. In 1958, the psychologist Frank Rosenblatt of the Cornell Aeronautical Laboratory published a numerical model based on these concepts, thereby creating the very first artificial neural network. Once integrated into a calculator, it learned to recognise rudimentary images.

“This network only contained eight neurones organised in a single layer. It could only recognise simple characters”, says Claude Touzet of the Adaptive and Integrative Neuroscience Laboratory of Aix-Marseille University. “It wasn’t until 1985 that we saw the second generation of artificial neural networks featuring multiple layers and much greater performance”. This breakthrough was made simultaneously by three researchers: Yann LeCun in Paris, Geoffrey Hinton in Toronto and Terrence Sejnowski in Baltimore.

Byte-size learning

In multilayer networks, each layer learns to recognise the precise visual characteristics of a shape. The deeper the layer, the more abstract the characteristics. With cat photos, the first layer analyses pixel colour, and the following layer recognises the general form of the cat. This layered design can support calculations across thousands of layers, and it was this aspect of the architecture that gave rise to the name ‘deep learning’.

Marchand-Maillet explains: “Each artificial neurone is assigned an input value, which it computes using a mathematical function, only firing if the output exceeds a pre-defined threshold”. In this way, it reproduces the behaviour of real neurones, which only fire and transmit information when the input signal (the potential difference across the entire neural circuit) reaches a certain level. In the artificial model, the results of a single layer are weighted, added up and then sent as the input signal to the following layer, which processes that input using different functions, and so on and so forth.

For example, if a system is trained with great quantities of photos of apples and watermelons, it will progressively learn to distinguish them on the basis of diameter, says Marchand-Maillet. If it cannot decide (e.g., when processing a picture of a tiny watermelon), the subsequent layers take over by analysing the colours or textures of the fruit in the photo, and so on. In this way, every step in the process further refines the assessment.
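
Marchand-Maillet’s description translates almost line for line into code. Here is a minimal NumPy sketch (a toy illustration with random weights, not a trained system): each layer computes a weighted sum of its inputs and applies a smooth nonlinearity standing in for the firing threshold, and the output of one layer becomes the input of the next.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, weights, biases):
    """One layer: weighted sum of the inputs, then a smooth 'firing' nonlinearity."""
    z = weights @ x + biases          # weight the inputs and add them up
    return 1.0 / (1.0 + np.exp(-z))   # sigmoid in place of a hard threshold

# A tiny 2-layer network: 4 inputs -> 3 hidden neurons -> 1 output
w1, b1 = rng.normal(size=(3, 4)), np.zeros(3)
w2, b2 = rng.normal(size=(1, 3)), np.zeros(1)

x = np.array([0.9, 0.1, 0.4, 0.7])   # e.g., crude features of an image
hidden = layer(x, w1, b1)            # first layer: low-level characteristics
output = layer(hidden, w2, b2)       # deeper layer: more abstract decision
print(hidden, output)
```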

Video games to the rescue

For decades, the frontier of computing held back more complex applications, even at the cutting edge. Industry walked away, and deep learning only survived thanks to the video games sector, which eventually began producing graphics chips, or GPUs, with an unprecedented power at accessible prices: up to 6 teraflops (i.e., 6 trillion calculations per second) for a few hundred dollars. “There’s no doubt that it was this calculating power that laid the ground for the quantum leap in deep learning”, says Touzet. GPUs are also very good at parallel calculations, a useful function for executing the innumerable simultaneous operations required by neural networks.

Although image analysis is getting great results, things are more complicated for sequential data objects such as natural spoken language and video footage. This has formed part of Schmidhuber’s work since 1989, and his response has been to develop recurrent neural networks in which neurones communicate with each other in loops, feeding processed data back into the initial layers.

Such sequential data analysis is highly dependent on context and precursory data. In Lugano, networks have been instructed to memorise the order of a chain of events. Long Short Term Memory (LSTM) networks can distinguish ‘boat’ from ‘float’ by recalling the sound that preceded ‘oat’ (i.e., either ‘b’ or ‘fl’). “Recurrent neural networks are more powerful than other approaches such as the Hidden Markov models”, says Schmidhuber, who also notes that Google Voice integrated LSTMs in 2015. “With looped networks, the number of layers is potentially infinite”, says Faltings [?].
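
For reference, here is the standard modern LSTM cell (introduced by Hochreiter & Schmidhuber in 1997 and later extended with a forget gate) — the textbook formulation, not anything specific to Google Voice’s deployment. A memory cell c_t is managed by input, forget and output gates:

```latex
\begin{aligned}
i_t &= \sigma\!\left(W_i x_t + U_i h_{t-1} + b_i\right) && \text{(input gate)}\\
f_t &= \sigma\!\left(W_f x_t + U_f h_{t-1} + b_f\right) && \text{(forget gate)}\\
o_t &= \sigma\!\left(W_o x_t + U_o h_{t-1} + b_o\right) && \text{(output gate)}\\
c_t &= f_t \odot c_{t-1} + i_t \odot \tanh\!\left(W_c x_t + U_c h_{t-1} + b_c\right)\\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
```

It is the gated memory cell that lets the network carry the ‘b’ or ‘fl’ sound across the intervening ‘oa’ in the boat/float example above.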

For Schmidhuber, deep learning is just one aspect of artificial intelligence; the real thing will lead to “the most important change in the history of our civilisation”. But Marchand-Maillet sees deep learning as “a bit of hype, leading us to believe that artificial intelligence can learn anything provided there’s data. But it’s still an open question as to whether deep learning can really be applied to every last domain”.

It’s nice to get an historical perspective and eye-opening to realize that scientists have been working on these concepts since the 1950s.

Simulating elementary physics in a quantum simulation (particle zoo in a quantum computer?)

Whoever wrote the news release used a very catchy title “Particle zoo in a quantum computer”; I just wish they’d explained it. Looking up the definition for a ‘particle zoo’ didn’t help as much as I’d hoped. From the particle zoo entry on Wikipedia (Note: Links have been removed),

In particle physics, the term particle zoo[1][2] is used colloquially to describe a relatively extensive list of the then known “elementary particles” that almost look like hundreds of species in the zoo.

In the history of particle physics, the situation was particularly confusing in the late 1960s. Before the discovery of quarks, hundreds of strongly interacting particles (hadrons) were known, and believed to be distinct elementary particles in their own right. It was later discovered that they were not elementary particles, but rather composites of the quarks. The set of particles believed today to be elementary is known as the Standard Model, and includes quarks, bosons and leptons.

I believe the writer used the term to indicate that the simulation undertaken involved elementary particles. If you have a better explanation, please feel free to add it to the comments for this post.

Here’s the news from a June 22, 2016 news item on ScienceDaily,

Elementary particles are the fundamental building blocks of matter, and their properties are described by the Standard Model of particle physics. The discovery of the Higgs boson at CERN in 2012 constitutes a further step towards the confirmation of the Standard Model. However, many aspects of this theory are still not understood because their complexity makes it hard to investigate them with classical computers. Quantum computers may provide a way to overcome this obstacle as they can simulate certain aspects of elementary particle physics in a well-controlled quantum system. Physicists from the University of Innsbruck and the Institute for Quantum Optics and Quantum Information (IQOQI) at the Austrian Academy of Sciences have now done exactly that: In an international first, Rainer Blatt’s and Peter Zoller’s research teams have simulated lattice gauge theories in a quantum computer. …

A June 23, 2016 University of Innsbruck (Universität Innsbruck) press release, which seems to have originated the news item, provides more detail,

Gauge theories describe the interaction between elementary particles, such as quarks and gluons, and they are the basis for our understanding of fundamental processes. “Dynamical processes, for example, the collision of elementary particles or the spontaneous creation of particle-antiparticle pairs, are extremely difficult to investigate,” explains Christine Muschik, theoretical physicist at the IQOQI. “However, scientists quickly reach a limit when processing numerical calculations on classical computers. For this reason, it has been proposed to simulate these processes by using a programmable quantum system.” In recent years, many interesting concepts have been proposed, but until now it was impossible to realize them. “We have now developed a new concept that allows us to simulate the spontaneous creation of electron-positron pairs out of the vacuum by using a quantum computer,” says Muschik. The quantum system consists of four electromagnetically trapped calcium ions that are controlled by laser pulses. “Each pair of ions represents a particle and an antiparticle,” explains experimental physicist Esteban A. Martinez. “We use laser pulses to simulate the electromagnetic field in a vacuum. Then we are able to observe how particle pairs are created by quantum fluctuations from the energy of this field. By looking at the ion’s fluorescence, we see whether particles and antiparticles were created. We are able to modify the parameters of the quantum system, which allows us to observe and study the dynamic process of pair creation.”
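
For readers who want the model behind this description: the experiment simulates the lattice Schwinger model — quantum electrodynamics in one spatial dimension. In spin language, after the gauge field is eliminated via Gauss’s law, the Hamiltonian takes roughly the following form (my paraphrase of the standard formulation; see the paper for the exact version used):

```latex
H = w \sum_{n}\left(\sigma^{+}_{n}\sigma^{-}_{n+1} + \mathrm{h.c.}\right)
  + \frac{m}{2}\sum_{n}(-1)^{n}\sigma^{z}_{n}
  + J\sum_{n} L_{n}^{2}
```

Here w sets the rate of particle-antiparticle pair creation, m is the fermion mass, and J weights the electric-field energy L_n² on each link, which becomes a long-range spin-spin interaction once the gauge field is eliminated.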

Combining different fields of physics

With this experiment, the physicists in Innsbruck have built a bridge between two different fields in physics: They have used atomic physics experiments to study questions in high-energy physics. While hundreds of theoretical physicists work on the highly complex theories of the Standard Model and experiments are carried out at extremely expensive facilities, such as the Large Hadron Collider at CERN, quantum simulations may be carried out by small groups in tabletop experiments. “These two approaches complement one another perfectly,” says theoretical physicist Peter Zoller. “We cannot replace the experiments that are done with particle colliders. However, by developing quantum simulators, we may be able to understand these experiments better one day.” Experimental physicist Rainer Blatt adds: “Moreover, we can study new processes by using quantum simulation. For example, in our experiment we also investigated particle entanglement produced during pair creation, which is not possible in a particle collider.” The physicists are convinced that future quantum simulators will potentially be able to solve important questions in high-energy physics that cannot be tackled by conventional methods.

Foundation for a new research field

It was only a few years ago that the idea to combine high-energy and atomic physics was proposed. With this work it has been implemented experimentally for the first time. “This approach is conceptually very different from previous quantum simulation experiments studying many-body physics or quantum chemistry. The simulation of elementary particle processes is theoretically very complex and, therefore, has to satisfy very specific requirements. For this reason it is difficult to develop a suitable protocol,” underlines Zoller. The conditions for the experimental physicists were equally demanding: “This is one of the most complex experiments that has ever been carried out in a trapped-ion quantum computer,” says Blatt. “We are still figuring out how these quantum simulations work and will only gradually be able to apply them to more challenging phenomena.” The great theoretical as well as experimental expertise of the physicists in Innsbruck was crucial for the breakthrough. Both Blatt and Zoller emphasize that they have been doing research on quantum computers for many years now and have gained a lot of experience in their implementation. Innsbruck has become one of the leading centers for research in quantum physics; here, the theoretical and experimental branches work together at an extremely high level, which enables them to gain novel insights into fundamental phenomena.

Here’s a link to and a citation for the paper,

Real-time dynamics of lattice gauge theories with a few-qubit quantum computer by Esteban A. Martinez, Christine A. Muschik, Philipp Schindler, Daniel Nigg, Alexander Erhard, Markus Heyl, Philipp Hauke, Marcello Dalmonte, Thomas Monz, Peter Zoller, & Rainer Blatt. Nature 534, 516–519 (23 June 2016). doi:10.1038/nature18318. Published online 22 June 2016.

This paper is behind a paywall.

There is a SoundCloud audio file featuring an explanation of the work from the lead author, Esteban A. Martinez.

Third assessment of The State of Science and Technology and Industrial Research and Development in Canada announced

The last State of Science and Technology and Industrial Research and Development in Canada assessments were delivered in 2012 and 2013 respectively, which seems a shortish gap between assessments, as these things go. On a positive note, this may mean that the government has seen the importance of a more agile approach as the pace of new discoveries quickens. Here’s more from a June 29, 2016 announcement from the Council of Canadian Academies (CCA; received via email),

CCA to undertake third assessment on the State of S&T and IR&D

June 29, 2016 (Ottawa, ON) – The Council of Canadian Academies (CCA) is pleased to announce the launch of a new assessment on the state of science and technology (S&T) and industrial research and development (IR&D) in Canada. This assessment, referred by Innovation, Science and Economic Development Canada (ISED), will be the third installment in the state of S&T and IR&D series by the CCA.

“I’m delighted the government continues to recognize the value of the CCA’s state of S&T and IR&D reports,” said Eric M. Meslin, President and CEO of the Council of Canadian Academies. “An updated assessment will enable policy makers, and others, such as industry leaders, universities, and the private sector, to draw on current Canadian S&T and IR&D data to make evidence-informed decisions.”

The CCA’s reports on the state of S&T and state of IR&D provide valuable data and analysis documenting Canada’s S&T and IR&D strengths and weaknesses. New data will help identify trends that have emerged in the Canadian S&T and IR&D environment in the past four to five years.

Under the guidance of the CCA’s Scientific Advisory Committee, a multidisciplinary, multi-sectoral expert panel is being assembled. It is anticipated that the final report will be released in a two-part sequence, with an interim report released in late 2016 and a final report released in 2017.

To learn more about this and the CCA’s other active assessments, visit Assessments in Progress.

The announcement offers information about the series of assessments,

About the State of S&T and IR&D Assessment Series

Current charge: What is the current state of science and technology and industrial research and development in Canada?

Sponsor: Innovation, Science and Economic Development Canada (ISED)

This assessment will be the third edition in the State of S&T and Industrial R&D assessment series.

Background on the Series

  • In 2006, the CCA completed its first report on The State of Science and Technology in Canada. The findings were integral to the identification of S&T priority areas in the federal government’s 2007 S&T strategy,  Mobilizing Science and Technology to Canada’s Advantage [the original link was not functional; I found the report on an archived page].
  • In 2010 the CCA was again asked to assess the state of S&T in Canada.  The State of Science and Technology in Canada, 2012 updated the 2006 report and provided a thorough analysis of the scientific disciplines and technological applications where Canada excelled in a global context. It also identified Canada’s S&T strengths, regional specializations, and emerging research areas.
  • In 2013, the CCA published The State of Industrial R&D in Canada. This report provided an in-depth analysis of research and development activities in Canadian industries and is one of the most detailed and systematic studies of the state of IR&D ever undertaken in Canada.

I wrote three posts after the second assessment was delivered in 2012. My Sept. 27, 2012 posting was an announcement of its launch and then I offered a two-part critique: part 1 was in a Dec. 28, 2012 posting and part 2 was in a second Dec. 28, 2012 posting. I did not write about the 2013 report on Canada’s industrial research and development efforts.

Given the size of the 2012 assessment of science and technology at 232 pp. (PDF) and the 2013 assessment of industrial research and development at 220 pp. (PDF) with two expert panels, the imagination boggles at the potential size of the 2016 expert panel and of the 2016 assessment combining the two areas.

Given the timing for the interim report (late 2016), I wonder if they are planning to release at the 2016 Canadian Science Policy Conference, which is being held in Ottawa from Nov. 8 – 10, 2016 (for the second year in a row and, I believe, the third time in eight conferences).