It’s the 150th anniversary of a series of equations describing the relationships between electric charges and electric and magnetic fields, equations that are still being explored. Jon Butterworth, in a Nov. 22, 2015 posting on the Guardian science blog network, explains (Note: A link has been removed),
The chances are that you are reading this article on some kind of electronic technology. You are definitely seeing it via visible light, unless you have a braille or audio converter. And it probably got to you via wifi or a mobile phone signal. All of those things are understood in terms of the relationships between electric charges and electric and magnetic fields summarised in Maxwell’s [James Clerk Maxwell] equations, published by the Royal Society in 1865, 150 years ago.
Verbally, the equations can be summarised as something like:
Electric and magnetic fields make electric charges move. Electric charges cause electric fields, but there are no magnetic charges. Changes in magnetic fields cause electric fields, and vice versa.
The equations specify precisely how it all happens, but that is the gist of it.
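For readers who want the symbols behind that verbal summary, the standard modern differential form of the equations (a textbook formulation in SI units, not Maxwell’s original set of twenty) lines up with Butterworth’s gist as follows:

```latex
\begin{align}
\nabla \cdot \mathbf{E} &= \frac{\rho}{\varepsilon_0}
  && \text{(electric charges cause electric fields)} \\
\nabla \cdot \mathbf{B} &= 0
  && \text{(there are no magnetic charges)} \\
\nabla \times \mathbf{E} &= -\frac{\partial \mathbf{B}}{\partial t}
  && \text{(changing magnetic fields cause electric fields)} \\
\nabla \times \mathbf{B} &= \mu_0 \mathbf{J}
  + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}
  && \text{(currents and changing electric fields cause magnetic fields)}
\end{align}
```

The remaining piece of the verbal summary, “fields make charges move,” is supplied by the Lorentz force law, $\mathbf{F} = q(\mathbf{E} + \mathbf{v} \times \mathbf{B})$, which is usually stated alongside the four equations above.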
Butterworth got a rare opportunity to see the original manuscript,
Original manuscript of Maxwell’s seminal paper Photograph: Jon Butterworth/Royal Society [downloaded from http://www.theguardian.com/science/life-and-physics/2015/nov/22/maxwells-equations-150-years-of-light]
I love this description from Butterworth,
It was submitted in 1864 but, in a situation familiar to scientists everywhere, was held up in peer review. There’s a letter, dated March 1865, from William Thomson (later Lord Kelvin) saying he was sorry for being slow, that he’d read most of it and it seemed pretty good (“decidedly suitable for publication”).
Then, there’s this,
The equations seem to have been very much a bottom-up affair, in that Maxwell collected together a number of known laws which were used to describe various experimental results, and (with a little extra ingredient of his own) fitted them into a unified framework. What is amazing is how much that framework then reveals, both in terms of deep physical principles, and rich physical phenomena.
I’m not excerpting any part of Butterworth’s description of how Maxwell fit these equations together for his unification theory as I think it should be read in its totality.
The section on quantum mechanics is surprising,
Now, one thing Maxwell’s equations don’t contain is quantum mechanics [emphasis mine]. They are classical equations. But if you take the quantum mechanical description of an electron, and you enforce the same charge conservation law/voltage symmetry that was contained in the classical Maxwell’s equations, something marvellous happens [emphasis mine]. The symmetry is denoted “U(1)”, and if you enforce it locally – that is, you say that you have to be allowed to make different U(1) type changes to electrons at different points in space – you actually generate the quantum mechanical version of Maxwell’s equations out of nowhere [emphasis mine]. You produce the equations that describe the photon, and the whole of quantum electrodynamics.
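For the mathematically inclined, here is a compressed textbook sketch of the gauge argument Butterworth is describing (this is the standard quantum electrodynamics construction, not drawn from his post). Demanding that physics be unchanged when the electron field $\psi$ gets a position-dependent U(1) phase forces the introduction of a compensating field $A_\mu$, which turns out to be the photon:

```latex
\psi(x) \;\to\; e^{i\alpha(x)}\,\psi(x), \qquad
\partial_\mu \;\to\; D_\mu = \partial_\mu + ieA_\mu, \qquad
A_\mu \;\to\; A_\mu - \tfrac{1}{e}\,\partial_\mu \alpha(x)
```

The ordinary derivative $\partial_\mu \psi$ is not invariant under the local phase change (it picks up an unwanted $\partial_\mu \alpha$ term); replacing it with the covariant derivative $D_\mu \psi$ cancels that term, but only if the new field $A_\mu$ exists and shifts as shown. That field’s own dynamics are the quantum version of Maxwell’s equations, which is the “out of nowhere” appearance Butterworth marvels at.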
I encourage you to read Butterworth’s Nov. 22, 2015 posting where he also mentions two related art/science projects and has embedded a video animation of the principles discussed in his posting.
For anyone unfamiliar with Butterworth, there’s this description at the Guardian,
Despite all the excitement and claims for nanoparticles as vehicles for drug delivery to ‘sick’ cells there is at least one substantive problem: the drug-laden nanoparticles don’t actually enter the interior of the cell. They are held in a kind of cellular ‘waiting room’.
Leonid Schneider in a Nov. 20, 2015 posting on his For Better Science blog describes the process in more detail,
A large body of scientific nanotechnology literature is dedicated to the biomedical aspect of nanoparticle delivery into cells and tissues. The functionalization of the nanoparticle surface is designed to ensure their specificity at targeting only a certain type of cells, such as cancer cells. Other technological approaches aim at the cargo design, in order to ensure the targeted release of various biologically active agents: small pharmacological substances, peptides or entire enzymes, or nucleotides such as regulatory small RNAs or even genes. There is however a main limitation to this approach: though cells do readily take up nanoparticles through specific membrane-bound receptor interaction (endocytosis) or randomly (pinocytosis), these nanoparticles hardly ever truly reach the inside of the cell, namely its nucleocytoplasmic space. Solid nanoparticles are namely continuously surrounded by the very same membrane barrier they first interacted with when entering the cell. These outer-cell membrane compartments mature into endosomal and then lysosomal vesicles, where their cargo is subjected to low pH and enzymatic digestion. The nanoparticles, though seemingly inside the cell, remain actually outside. …
What follows is a stellar piece featuring counterclaims about and including Schneider’s own journalistic research into scientific claims that the problem of gaining entry to a cell’s true interior has been addressed by technologies developed in two different labs.
Having featured one of the technologies here in a July 24, 2015 posting titled: Sticky-flares nanotechnology to track and observe RNA (ribonucleic acid) regulation and having been contacted a couple of times by one of the scientists, Raphaël Lévy from the University of Liverpool (UK), challenging the claims made (Lévy’s responses can be found in the comments section of the July 2015 posting), I thought a followup of sorts was in order.
Leviathan and the Air-Pump: Hobbes, Boyle, and the Experimental Life (published 1985) is a book by Steven Shapin and Simon Schaffer. It examines the debate between Robert Boyle and Thomas Hobbes over Boyle’s air-pump experiments in the 1660s.
The style seems more genteel than what a contemporary Canadian or US audience is accustomed to but Hobbes and Boyle (and proponents of both sides) engaged in bruising communication.
There was a lot at stake then and now. It’s not just the power, prestige, and money, as powerfully motivating as they are, it’s the research itself. Scientists work for years to achieve breakthroughs or to add more to our common store of knowledge. It’s painstaking and if you work at something for a long time, you tend to be invested in it. Saying you’ve wasted ten years of your life looking at the problem the wrong way or have misunderstood your data is not easy.
As for the current debate, Schneider’s description gives no indication that there is rancour between any of the parties but it does provide a fascinating view of two scientists challenging one of the US’s nanomedicine rockstars, Chad Mirkin. The following excerpt traces the claimed breakthroughs in reaching the interior of the cell through three generations of the technology’s naming (Nano-Flares, commercialized under the trade name SmartFlares, and their successor, Sticky-Flares), Note: Links have been removed,
The next family of allegedly nucleocytoplasmic nanoparticles which Lévy turned his attention to, was that of the so called “spherical nucleic acids”, developed in the lab of Chad Mirkin, multiple professor and director of the International Institute for Nanotechnology at the Northwestern University, USA. These so called “Nano-Flares” are gold nanoparticles, functionalized with fluorophore-coupled oligonucleotides matching the messenger RNA (mRNA) of interest (Prigodich et al., ACS Nano 3:2147-2152, 2009; Seferos et al., J Am. Chem.Soc. 129:15477-15479, 2007). The mRNA detection method is such that the fluorescence is initially quenched by the gold nanoparticle proximity. Yet when the oligonucleotide is displaced by the specific binding of the mRNA molecules present inside the cell, the fluorescence becomes detectable and serves thus as quantitative read-out for the intracellular mRNA abundance. Exactly this is where concerns arise. To find and bind mRNA, spherical nucleic acids must leave the endosomal compartments. Is there any evidence that Nano-Flares ever achieve this and reach intact the nucleocytoplasmatic space, where their target mRNA is?
Lévy’s lab has focused its research on the commercially available analogue of the Nano-Flares, based on the patent to Mirkin and Northwestern University and sold by Merck Millipore under the trade name of SmartFlares. These were described by Mirkin as “a powerful and prolific tool in biology and medical diagnostics, with ∼ 1,600 unique forms commercially available today”. The work, led by Lévy’s postdoctoral scientist David Mason, now available in post-publication process at ScienceOpen and on Figshare, found no experimental evidence for SmartFlares to be ever found outside the endosomal membrane vesicles. On the contrary, the analysis by several complementary approaches, i.e., electron, fluorescence and photothermal microscopy, revealed that the probes are retained exclusively within the endosomal compartments.
In fact, even Merck Millipore was apparently well aware of this problem when the product was developed for the market. As I learned, Merck performed a number of assays to address the specificity issue. Multiple hundred-fold induction of mRNA by biological cell stimulation (confirmed by quantitative RT-PCR) led to no significant changes in the corresponding SmartFlare signal. Similarly, biological gene downregulation or experimental siRNA knock-down had no effect on the corresponding SmartFlare fluorescence. Cell lines confirmed as negative for a certain biomarker proved highly positive in a SmartFlare assay. Live cell imaging showed the SmartFlare signal to be almost entirely mitochondrial, inconsistent with reported patterns of the respective mRNA distributions. Elsewhere however, cyanine dye-labelled oligonucleotides were found to unspecifically localise to mitochondria (Orio et al., J. RNAi Gene Silencing 9:479-485, 2013), which might account to the often observed punctate Smart Flare signal.
More recently, Mirkin lab has developed a novel version of spherical nucleic acids, named Sticky-Flares (Briley et al., PNAS 112:9591-9595, 2015), which has also been patented for commercial use. The claim is that “the Sticky-flare is capable of entering live cells without the need for transfection agents and recognizing target RNA transcripts in a sequence-specific manner”. To confirm this, Lévy used the same approach as for the striped nanoparticles [not excerpted here]: he approached Mirkin by email and in person, requesting the original microscopy data from this publication. As Mirkin appeared reluctant, Lévy invoked the rules for data sharing by the journal PNAS, the funder NSF as well as the Northwestern University. After finally receiving Mirkin’s thin-optical microscopy data by air mail, Lévy and Mason re-analyzed it and determined the absence of any evidence for endosomal escape, while all Sticky-Flare particles appeared to be localized exclusively inside vesicular membrane compartments, i.e., endosomes (Mason & Levy, bioRxiv 2015).
I encourage you to read Schneider’s Nov. 20, 2015 posting in its entirety as these excerpts can’t do justice to it.
The PNAS surprise
PNAS (Proceedings of the National Academy of Sciences) published one of Mirkin’s papers on ‘Sticky-flares’ and is where scientists Raphaël Lévy and David Mason submitted a letter outlining their concerns with the ‘Sticky-flares’ research. Here’s the response as reproduced in Lévy’s Nov. 16, 2015 posting on his Rapha-Z-Lab blog,
Dear Dr. Levy,
I regret to inform you that the PNAS Editorial Board has declined to publish your Letter to the Editor. After careful consideration, the Board has decided that your letter does not contribute significantly to the discussion of this paper.
Thank you for submitting your comments to PNAS.
Judge for yourself, Lévy’s and Mason’s letter can be found here (pdf) and here.
My primary interest in this story is in the view it provides of the scientific process and the importance of and difficulty associated with the debates.
I can’t venture an opinion about the research or the counterarguments other than to say that Lévy’s and Mason’s thoughtful challenge bears more examination than PNAS is inclined to accord. If their conclusions or Chad Mirkin’s are wrong, let that be determined in an open process.
I’ll leave the very last comment to Schneider who is both writer and cartoonist, from his Nov. 20, 2015 posting,
On Tuesday, November 24, 2015 at 7:30 pm in the back room of The Railway Club (2nd floor of 579 Dunsmuir St. [at Seymour St.]), Café Scientifique will be hosting a talk about climate change and the rise of complex life (from the Nov. 12, 2015 announcement),
Our speaker for the evening will be Dr. Mark Jellinek. The title of his talk is:
The Formation and Breakup of Earth’s Supercontinents and the Remarkable Link to Earth’s Climate and the Rise of Complex Life
Earth history is marked by the intermittent formation and breakup of “supercontinents”, where all the land mass is organized much like a completed jigsaw puzzle centered at the equator or pole of the planet. Such events disrupt the mantle convective motions that cool our planet, affecting, in turn, the volcanic and weathering processes that maintain Earth’s remarkably hospitable climate. In this talk I will explore how the last two supercontinental cycles impelled Earth into profoundly different climate extremes: a ~150 million year long cold period involving protracted global glaciations beginning about 800 million years ago and a ~100 million year long period of extreme warming beginning about 170 million years ago. One of the most provocative features of the last period of global glaciation is the rapid emergence of complex, multicellular animals about 650 million years ago. Why global glaciation might stimulate such an evolutionary bifurcation is, however, unclear. Predictable environmental stresses related to effects of the formation and breakup of the supercontinent Rodinia on ocean chemistry and Earth’s surface climate may play a crucial and unexpected role that I will discuss.
A professor in the Dept. of Earth, Ocean and Atmospheric Sciences at the University of British Columbia, Dr. Jellinek’s research interests include Volcanology, Geodynamics, Planetary Science, Geological Fluid Mechanics. You can find out more about Dr. Jellinek and his work here.
Joyce Murray and the Paris Climate Conference (sold out)
Joyce Murray is a Canadian Member of Parliament (Liberal) for the riding of Vancouver Quadra who hosts a regular breakfast meeting where topics of interest (child care, seniors, transportation, the arts, big data, etc.) are discussed. From a Nov. 13, 2015 email announcement,
You are invited to our first post-election Vancouver Quadra MP Breakfast Connections on November 27th at Enigma Restaurant, for a discussion with Dr. Mark Jaccard on why the heat will be on world leaders in Paris, in the days leading to December 12th, at the Paris Climate Conference (COP 21).
After 20 years of UN negotiations, the world expects a legally binding universal agreement on climate to keep temperature increases below 2°C! The climate heat will especially be on laggards like Canada and Australia’s new Prime Ministers. What might be expected of the Right Honorable Justin Trudeau and his provincial premiers? What are the possible outcomes of COP21?
Dr. Jaccard has worked with leadership in countries like China and the United States, and helped develop British Columbia’s innovative Climate Action Plan and Carbon Tax.
Join us for this unique opportunity to engage with a climate policy expert who has participated in this critical global journey. From the occasion of the 1992 Rio Earth Summit resulting in the UN Framework Convention on Climate Change (UNFCCC), through the third Conference of Parties’ (COP3) Kyoto Protocol, to COP21 today, the building blocks for a binding international solution have been assembled. What’s still missing?
Mark has been a professor in the School of Resource and Environmental Management at Simon Fraser University since 1986 and is a global leader and consultant on structuring climate mitigation solutions. Former Chair and CEO of the British Columbia Utilities Commission, he has published over 100 academic papers, most of these related to his principal research focus: the design and application of energy-economy models that assess the effectiveness of sustainable energy and climate policies.
When: Friday November 27th 7:30 to 9:00AM
Where: Enigma Restaurant 4397 west 10th Avenue (at Trimble)
Cost: $20 includes a hot buffet breakfast; $10 for students (cash only please)
RSVP by emailing email@example.com or call 604-664-9220
D-Wave Systems, a Canadian quantum computing company, seems to be making new business announcements on a weekly basis. After last week’s US Los Alamos National Laboratory announcement (Nov. 12, 2015 posting), there’s a Nov. 16, 2015 news item on Nanotechnology Now,
Harris & Harris Group, Inc. (NASDAQ:TINY), an investor in transformative companies enabled by disruptive science, notes that its portfolio company, D-Wave Systems, Inc., announced that it has entered into a multi-year agreement with Lockheed Martin to upgrade the company’s 512-qubit D-Wave Two™ quantum computer to the new D-Wave 2X™ system with 1,000+ qubits.
D-Wave Systems Inc., the world’s first quantum computing company, today announced that it has entered into a multi-year agreement with Lockheed Martin (NYSE: LMT) to upgrade the company’s 512-qubit D-Wave Two™ quantum computer to the new D-Wave 2X™ system with 1,000+ qubits. This represents the second system upgrade since Lockheed Martin became D-Wave’s first customer in 2011 with the purchase of a 128 qubit D-Wave One™ system. The agreement includes the system, maintenance and associated professional services.
“Our mission is to solve complex challenges, advance scientific discovery and deliver innovative solutions to our customers, which requires expertise in the most advanced technologies,” said Greg Tallant, Lockheed Martin fellow and lead for the University of Southern California-Lockheed Martin Quantum Computation Center (QCC). “Through our continued investment in D-Wave technology, we are able to push the boundaries of quantum computing and apply the latest technologies to address the real-world problems being faced by our customers.”
For quantum computing, the performance gain over traditional computing is most evident in exceedingly complex computational problems. This could be in areas such as validating the performance of software or vehicle planning and scheduling. With the new D-Wave system, Lockheed Martin researchers will be able to explore solutions for significantly larger computational problems with improved accuracy and execution time.
The new system will be hosted at the University of Southern California-Lockheed Martin Quantum Computation Center, which first began exploring the power of quantum computing with the D-Wave One, the world’s first quantum computer.
The installation of the D-Wave 2X system will be completed in January 2016.
Who knows what next week will bring for D-Wave, which by the way is located in Vancouver, Canada or, more accurately, Burnaby?
Just hours prior to the terrorist bombings in Paris (Friday, Nov. 13, 2015), Tash Reith-Banks published a Nov. 13, 2015 essay (one of a series) in the Guardian about science, technology, engineering, and mathematics (STEM) as those specialties apply to humanitarian aid, with a special emphasis on the Syrian refugee crisis.
This first essay focuses on how engineering and mathematics are essential when dealing with crises (from Reith-Banks’s Nov. 13, 2015 essay), Note: Links have been removed,
Engineering is a clear starting point: sanitation, shelter and supply lines are all essential in any crisis. As Martin McCann, CEO at RedR, which trains humanitarian NGO workers says: “There is the obvious work in providing water and sanitation and shelter. By shelter, we mean not only shelter or housing for disaster-affected people or refugees, but also structures to store both food and non-food items. Access is always critical, so once again engineers are needed to build roads or in some cases temporary landing strips.”
Emergency structures need to be light and fast to transport and erect, but tend not to be durable. One recent development comes from engineers Peter Brewin and Will Crawford of Concrete Canvas. The pair have developed a rapid-setting concrete-impregnated fabric that requires only air and water to harden into a water-proof, fire-resistant construction. This has been used to create rapidly deployable concrete shelters that can be carried in a bag and set up in an hour.
Here’s what one of the concrete shelters looks like,
A Concrete Canvas shelter. Once erected the structure takes 24 hours to harden, and then can be further insulated with earth or snow if necessary. Photograph: Gareth Phillips/Gareth Phillips for the Guardian
There are many kinds of crises which can lead to a loss of shelter, access to water and food, and diminished safety and health as Reith-Banks also notes in a passage featuring mathematics (Note: A link has been removed),
Maths might seem a far cry from the sort of practical innovation described above, but of course it’s the root of great logistics. Alistair Clark from the University of the West of England is using advanced mathematical modelling to improve humanitarian supply chains to ensure aid is sent exactly where it is needed. Part of the Newton Mobility scheme, Clark’s project will partner with Brazilian disaster relief agencies and develop ways of modelling everything from landslides to torrential downpours in order to create sophisticated humanitarian supply chains that can rapidly adapt to a range of possible disaster scenarios and changing circumstances.
In a similar vein, Professor Amr Elnashai, founder and co-editor of the Journal of Earthquake Engineering, works in earthquake-hit areas to plan humanitarian relief for future earthquakes. He recently headed a large research and development effort funded by the Federal Emergency Management Agency in the USA (FEMA), to develop a computer model of the impact of earthquakes on the central eight states in the USA. This included social impact, temporary housing allocation, disaster relief, medical and educational care, as well as engineering damage and its economic impact.
Reith-Banks also references nanotechnology (Note: A link has been removed),
… Up to 115 people die every hour in Africa from diseases linked to contaminated drinking water and poor sanitation, particularly in the wake of conflicts and environmental disasters. Dr Askwar Hilonga recently won the Royal Academy of Engineering Africa Prize, which is dedicated to African inventions with the potential to bring major social and economic benefits to the continent. Hilonga has invented a low cost, sand-based water filter. The filter combines nanotechnology with traditional sand-filtering methods to provide safe drinking water without expensive treatment facilities. …
Dr. Hilonga who is based in Tanzania was featured here in a June 16, 2015 posting about the Royal Academy of Engineering Prize, his research, and his entrepreneurial efforts.
Reith-Banks’s essay provides a valuable and unexpected perspective on the humanitarian crises which afflict this planet and I’m looking forward to the rest of the series.
It can be a euphoric experience to make a major technical breakthrough (June 2015), sell to a new large customer (Nov. 2015), and impress your important customers so much that they upgrade to the new system (Oct. 2015), all within a few short months. D-Wave Systems (a Vancouver-based quantum computer company) certainly has cause to experience it given the events of the last six weeks or so. Yesterday, in a Nov. 11, 2015 D-Wave news release, the company trumpeted its sale of a 1,000+ qubit system (Note: Links have been removed),
D-Wave Systems Inc., the world’s first quantum computing company, announced that Los Alamos National Laboratory will acquire and install the latest D-Wave quantum computer, the 1000+ qubit D-Wave 2X™ system. Los Alamos, a multidisciplinary research institution engaged in strategic science on behalf of national security, will lead a collaboration within the Department of Energy and with select university partners to explore the capabilities and applications of quantum annealing technology, consistent with the goals of the government-wide National Strategic Computing Initiative. The National Strategic Computing Initiative, created by executive order of President Obama in late July, is intended “to maximize [the] benefits of high-performance computing (HPC) research, development, and deployment.”
“Los Alamos is a global leader in high performance computing and a pioneer in the application of new architectures to solve critical problems related to national security, energy, the environment, materials, health and earth science,” said Robert “Bo” Ewald, president of D-Wave U.S. “As we work jointly with scientists and engineers at Los Alamos we expect to be able to accelerate the pace of quantum software development to advance the state of algorithms, applications and software tools for quantum computing.”
Harris & Harris Group, Inc. (NASDAQ:TINY), an investor in transformative companies enabled by disruptive science, notes that its portfolio company, D-Wave Systems, Inc., announced that Los Alamos National Laboratory will acquire and install the latest D-Wave quantum computer, the 1000+ qubit D-Wave 2X™ system.
The news about the Los Alamos sale comes only weeks after D-Wave announced renewed agreements with Google, NASA (US National Aeronautics and Space Administration), and the Universities Space Research Association (USRA) in the aftermath of a technical breakthrough. See my Oct. 5, 2015 posting for more details about the agreements, the type of quantum computer D-Wave sells, and news of interesting and related research in Australia. Cracking the 512 qubit barrier also occasioned a posting here (June 26, 2015) where I described the breakthrough, the company, and included excerpts from an Economist article which mentioned D-Wave in its review of research in the field of quantum computing.
Congratulations to D-Wave!
There’s some nanotechnology in the new James Bond movie, Spectre, according to Johnny Brayson in his Nov. 5, 2015 (?) article for Bustle (Note: A link has been removed),
James Bond has always been known for his gadgets, and although Daniel Craig’s version of the character has been considerably less doohickey-heavy than past iterations, he’s still managed to make use of a few over the years, from his in-car defibrillator in Casino Royale to his biometric-coded gun in Skyfall. But Spectre, the newest Bond film, changes up the formula and brings more gadgets than fans have seen in years. There are returning favorites like a tricked out Aston Martin and an exploding watch, but there’s also a new twist on an old gadget that allows Bond to be tracked by his bosses, an injected microchip that records his every move. …
To Bond fans, though, the technology isn’t totally new. In Casino Royale, Bond is injected with a microchip that tracks his location and monitors his vital signs. However, when he’s captured by the bad guys, the device is cut out of his arm, rendering it useless. MI6 seems to have learned their lesson in Spectre, because this time around Bond is injected with Smart Blood, consisting of nanotechnology that does the same thing while flowing microscopically through his veins. As for whether it could really happen, the answer is not yet, but someday it could be.
Brayson provides an introduction to some of the exciting scientific developments taking place by relating them to a James Bond movie in an intriguing way. Unfortunately, some of his details are wrong. For example, he describes a single microchip introduced subcutaneously (under the skin) as if it were synonymous with ‘smart blood’, which would actually consist of many, many microscopic devices prowling your bloodstream.
So, enjoy the article but exercise some caution. For example, this part in his article is mostly right (Note: Links have been removed),
However, there does actually exist nanotechnology that has been safely inserted into a human body — just not for the purposes of tracking. Some “nanobots”, microscopic robots, have been used within the human eye to deliver drugs directly to the area that needs them [emphasis mine], and the idea is that one day similar nanobots will be able to be injected into one’s bloodstream to administer medication or even perform surgery. Some scientists even believe that a swarm of nanobots in the bloodstream could eventually make humans immune to disease, as the bots would simply destroy or fix any issues as soon as they arrive.
According to a Jan. 30, 2015 article by Jacopo Prisco for CNN, scientists at ETH Zurich were planning to start human clinical trials to test ‘micro or nanobots’ in the human eye. I cannot find any additional information about the proposed trials. Similarly, Israeli researcher Ido Bachelet announced a clinical trial of DNA nanobots on one patient to cure their leukemia (my Jan. 7, 2015 posting). An unsuccessful attempt to get updated information can be found in a May 2015 Reddit Futurology posting.
The Martian has been doing very well and, for the most part, seems to have gotten kudos for its science. However, for those who like to dig down for more information, Jeffrey Kluger’s Sept. 30, 2015 article for Time magazine expresses some reservations about the science while enthusing over the movie’s quality as a film,
… Go see The Martian. But still: Don’t expect all of the science to be what it should be. The hard part about good science fiction has always been the fiction part. How many liberties can you take and how big should they be before you lose credibility? In the case of The Martian, the answer is mixed.
The story’s least honest device is also its most important one: the massive windstorm that sweeps astronaut Mark Watney (Matt Damon) away, causing his crew mates to abandon him on the planet, assuming he has been killed. That sets the entire castaway tale into motion, but on a false note, because while Mars does have winds, its atmosphere is barely 1% of the density of Earth’s, meaning it could never whip up anything like the fury it does in the story.
“I needed a way to force the astronauts off the planet, so I allowed myself some leeway,” Weir conceded in a statement accompanying the movie’s release. …
It was exceedingly cool actually, and for that reason Weir’s liberty could almost be forgiven, but then the story tries to have it both ways with the same bit of science. When a pressure leak causes an entire pod on Watney’s habitat to blow up, he patches a yawning opening in what’s left of the dwelling with plastic tarp and duct tape. That might actually be enough to do the job in the tenuous atmosphere that does exist on Mars. But in the violent one Weir invents for his story, the fix wouldn’t last a day.
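Kluger’s point about the thin Martian atmosphere is easy to check with a back-of-the-envelope dynamic-pressure estimate. The density figures below are rough, commonly cited surface values rather than numbers from the article, and the 150 km/h storm speed is simply an illustrative choice:

```python
# Rough comparison of wind force ("dynamic pressure", q = 1/2 * rho * v^2)
# on Mars vs. Earth, using approximate surface air densities.

RHO_EARTH = 1.225   # kg/m^3, sea-level air density on Earth
RHO_MARS = 0.020    # kg/m^3, approximate surface density on Mars (~1.6% of Earth's)

def dynamic_pressure(rho, v):
    """Force per unit area (Pa) exerted by a wind of speed v (m/s) in air of density rho."""
    return 0.5 * rho * v ** 2

v = 150 / 3.6  # a ferocious 150 km/h storm, converted to m/s

q_mars = dynamic_pressure(RHO_MARS, v)
q_earth = dynamic_pressure(RHO_EARTH, v)
print(f"Same 150 km/h wind: {q_mars:.0f} Pa on Mars vs {q_earth:.0f} Pa on Earth")

# Earth wind speed that would exert the same pressure as the Martian storm:
v_equiv = v * (RHO_MARS / RHO_EARTH) ** 0.5
print(f"A 150 km/h Martian wind pushes like a ~{v_equiv * 3.6:.0f} km/h breeze on Earth")
```

Because pressure scales linearly with density, a hurricane-speed Martian wind exerts only about 1.6% of the force the same wind would on Earth, roughly equivalent to a gentle breeze, which is exactly the liberty Kluger flags in the storm that strands Watney.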
There’s more to this entertaining and educational article including embedded images and a video.
The story of science in the Muslim world is extraordinary, influencing science to this day, and is not well known even within its own community. The days when Muslim or Islamic scientists led the world are long gone and that is cause for concern. An Oct. 29, 2015 Malaysian Industry-Government Group for High Technology press release on EurekAlert argues that universities in Muslim countries must reinvent themselves to transform society and achieve scientific excellence,
A Task Force of international experts, formed by the Muslim World Science Initiative, today released a report [Science at Universities of the Muslim World] on the state of science at universities of the Muslim world.
To assess the state of science at universities of the Muslim world, the Task Force reviewed the global rankings of the Muslim world’s universities, scientific production (number of papers published and citations), the level of spending on research and development (R&D), female participation in the scientific workforce, and other indicators.
The results were compared to those of countries deemed comparable in terms of gross domestic product (GDP) per capita, e.g. Brazil, Israel, Spain, South Africa, and South Korea.
The Task Force noted recent improvements in scientific publishing across a number of countries and a relatively healthy gender ratio among university students, even though the overall state of science in the Muslim World remains ‘poor,’ as depicted by
the disproportionately small number of Nobel Laureates
the small number of universities in top global rankings
the low spending on R&D, and
the abysmal performance of pre-university students on math and science tests
Seeking to assess if universities were the ‘main culprits’ in this sorry state of affairs, the Task Force highlighted significant challenges at the Universities of the Muslim World.
In particular, the Task Force lamented the fact that science education in most Organization of Islamic Cooperation (OIC) member countries was extremely narrow in focus and did little to enable students to think critically, especially beyond their respective domains of specialty.
The Task Force calls for broad liberal education for scientists and engineers to enable them to function effectively in addressing complex multi-disciplinary challenges that the world faces today.
The Task Force also noted that self-censorship was often practiced in the selection of topics to be taught, particularly regarding controversial subjects such as the theory of evolution.
The Task Force called for the introduction and systematic study of philosophy of science and history of the sciences of the Muslim ‘Golden Age’ and beyond for students to navigate and develop a perspective on these difficult disciplinary boundaries and overlaps. The language of instruction also created significant challenges.
Faculty members were also ill-trained to teach using cutting-edge methods such as inquiry-based science education and had little autonomy to innovate.
While the Task Force called for greater autonomy for the universities, it also emphasized that they must become meritocracies and aspire to true scientific excellence rather than playing for temporary gains in numbers or rankings. It also called for zero tolerance on plagiarism and other forms of academic misconduct.
The Report of the Task Force includes: a foreword by the Chair, Tan Sri Zakri Abdul Hamid, the main assessment and recommendations, and individual essays written by the Task Force members on issues, including
Science, Society & the University
Are universities of the Muslim world helping spread a culture of science through society?
Should Religion Be Kept Out of the Science Classroom?
STEM Education and the Muslim Gender Divide and
The Need of Liberal Education for Science and Engineering
The Task Force is putting out an open call for universities across the Muslim world to join a voluntary Network of Excellence of Universities for Science (NEXUS), to be launched early next year.
This peer group will be managed by the task force and housed in Tan Sri Zakri’s office. NEXUS will run summer schools for university administrators, monitor the progress of reforms at participating universities, and issue a peer report card that will assess the performance of the universities in meeting milestones, thus recognizing and inspiring further improvements. True transformation will require much broader action from ministries, regulators and funding agencies, and these may be the most resistant to change.
Releasing the Report of the Task Force, Tan Sri Zakri Abdul Hamid stressed that “universities must reinvent themselves to lead the scientific reforms in the Muslim World, and as they do so they must embrace key ideas of merit and transparency, engagement with society, and pedagogical and curricular innovation.”
Professor Nidhal Guessoum, the Task Force’s Convenor, noted that “Task Force members strongly believe that the most appropriate venue for action on our recommendations is the university itself. The most essential ingredient in creating excellence in science and science teaching at a university is a realization, within a university’s highest leadership and its faculty, of the need to give up the old and dated ways, renew the purpose, and re-write the genetic code of their university.”
Dr. Athar Osama, the Director of the Project noted that “the purpose of Muslim World Science Initiative is to jumpstart a dialogue within the society on critical issues at the intersection of science, society, and Islam. The Task Force has done a commendable job in laying the groundwork for a very important conversation about our universities.”
The divide between science/technology/engineering/mathematics (STEM) education and other fields of interest such as the social sciences, the arts, and the humanities may be larger in the Islamic world (and to some extent reversed, with the humanities looking down on science), but it is a problem elsewhere, often expressed as a form of snobbery, as I alluded to in my Aug. 7, 2015 posting titled: Science snobbery and the problem of accessibility.
An Oct. 28, 2015 Nature essay about Islam, science, and the report by Nidhal Guessoum and Athar Osama (two members of the Task Force; Note: Links have been removed) provides more context,
The Islamic civilization lays claim to the world’s oldest continually operational university. The University of Qarawiyyin was founded in Fes, Morocco, in AD 859, at the beginning of an Islamic Golden Age. Despite such auspicious beginnings, universities in the region are now in dire straits, as demonstrated by a report we have authored, released this week (see go.nature.com/korli3).
The 57 countries of the Muslim world — those with a Muslim-majority population, and part of the Organisation of Islamic Cooperation (OIC) — are home to nearly 25% of the world’s people. But as of 2012, they had contributed only 1.6% of the world’s patents, 6% of its academic publications, and 2.4% of the global research expenditure [1, 2].
The authors note problems and at least one success with regard to curriculum (from the Nature essay; Note: Links have been removed),
Science classes themselves have serious problems. The textbooks used in OIC universities are often imported from the United States or Europe. Although the content is of a high standard, they assume a Western experience and use English or French as the language of instruction. This disadvantages many students, and creates a disconnect between their education and culture. To encourage the production of higher-quality, local textbooks and other academic material, universities need to reward staff for producing these at least as much as they do for research publication.
Some basic facts are seen as controversial, and marginalized. Evolution, for example, is usually taught only to biology students, often as “a theory”, and is rarely connected to the rest of the body of knowledge. One ongoing study has found, for example, that most Malaysian physicians and medical students reject evolution (see go.nature.com/38cswo). Evolution needs to be taught widely and shown to be compatible with Islam and its culture [6]. Teaching the philosophy and history of science would help, too.
The global consensus is that enquiry-based science education fosters the deepest understanding of scientific concepts and laws. But in most OIC universities, lecture-based teaching still prevails. Exceptions are rare. One is the Petroleum Institute, an engineering university in Abu Dhabi, UAE, where the faculty has created a hands-on experience with positive results on student interest and enrolment, particularly of women.
For anyone interested in the full report, it can be requested from the Muslim Science website.
One final comment, here’s the list of task force members in the Oct. 29, 2015 news release which includes someone from Mauritius (my father was born there),
Tan Sri Zakri Abdul Hamid, Science Advisor to Prime Minister of Malaysia, Chair of the Task Force on Science at the Universities of the Muslim World
Prof. Nidhal Guessoum, American University of Sharjah, UAE, Convenor of the Task Force on Science at Universities of the Muslim World
Dr. Mohammad Yusoff Sulaiman, President and CEO, MiGHT, Malaysia, Co-Convenor of the Task Force on Science at Universities of the Muslim World.
Dr. Moneef Zou’bi, Executive Director, Islamic World Academy of Science (IAS)
Prof. Adil Najam, Dean Frederick S. Pardee School of Global Studies, Boston University and former Vice Chancellor, Lahore University of Management Sciences (LUMS)
Prof. Ameenah Gurib-Fakim, Fellow of IAS, President of the Republic of Mauritius, and Professor at University of Mauritius
Prof. Mustafa El-Tayeb, President, Future University, Khartoum, Sudan
Prof. Abdur Razak Dzulkifli, President of International Association of Universities (IAU), and former Vice Chancellor USM, Malaysia
Dr. Nadia Alhasani, Dean of Student Life (formerly Dean of Women in Science and Engineering (WiSE)), The Petroleum Institute, Abu Dhabi, UAE
Prof. Jamal Mimouni, Professor, University of Constantine-1, Algeria
Dr. Dato Lee Yee Cheong, Chair ISTIC Governing Board / Chair IAP SEP Global Council
Prof. Michael Reiss, Professor of Science Education, UCL Institute of Education, University College, London, Expert Advisor to the Muslim-Science.Com Task Force on Science at Universities of the Muslim World
Prof. Bruce Alberts, Professor of Biochemistry, University of California, San Francisco; President Emeritus, National Academy of Sciences, and Recipient, 2014 US Presidential Medal of Science, Expert Advisor to the Muslim-Science.Com Task Force on Science at Universities of the Muslim World
Professor Shoaib S. H. Zaidi, Professor and Dean of School of Sciences and Engineering, Habib University, Karachi
Dr. Athar Osama, Founder Muslim World Science Initiative, and Project Director of the Task Forces Project.
The 1001 Inventions exhibition is still making its way around the world, with the latest stop, as of Oct. 20, 2015, at the Library of Alexandria in Egypt.
A Jan. 21, 2010 article by Nick Higham and Margaret Ryan for BBC (British Broadcasting Corporation) news online describes some of the exhibit highlights,
From about 700 to 1700, many of history’s finest scientists and technologists were to be found in the Muslim world.
In Christian Europe the light of scientific inquiry had largely been extinguished with the collapse of the Roman empire. But it survived, and indeed blazed brightly, elsewhere.
From Moorish Spain across North Africa to Damascus, Baghdad, Persia and all the way to India, scientists in the Muslim world were at the forefront of developments in medicine, astronomy, engineering, hydraulics, mathematics, chemistry, map-making and exploration.
Salim Al-Hassani, a former professor of engineering at Umist (University of Manchester Institute of Science and Technology) is a moving force behind the exhibition, 1001 Inventions.
Visitors to the exhibition will be greeted by a 20 ft high replica of a spectacular clock designed in 1206 by the inventor Al-Jazari.
It incorporates elements from many cultures, representing the different cultural and scientific traditions which combined and flowed through the Muslim world.
The clock’s base is an elephant, representing India; inside the elephant the water-driven works of the clock derive from ancient Greece.
A Chinese dragon swings down from the top of the clock to mark the hours. At the top is a phoenix, representing ancient Egypt.
Sitting astride the elephant and inside the framework of the clock are automata, or puppets, wearing Arab turbans.
Elsewhere in the exhibition are displays devoted to water power, the spread of education (one of the world’s first universities was founded by a Muslim woman, Fatima al-Fihri), Muslim architecture and its influence on the modern world and Muslim explorers and geographers.
There is a display of 10th-century surgeons’ instruments, a life-size model of a man called Abbas ibn Firnas, allegedly the first person to have flown with wings, and a model of the vast 100-yard-long junk commanded by the Muslim Chinese navigator, Zheng He.
The description of the exhibition items is compelling.
Science and the modern world debate (Humanism and Islam)
Yasmin Khan has written up a transcript of sorts in a Nov. 6, 2015 posting on the Guardian science blogs about a science debate (which took place Wednesday, Oct. 28, 2015 in London, UK) where Humanist and Islamic perspectives were being discussed (Note: Links have been removed),
Two important figures came head-to-head at Conway Hall, to discuss Islamic versus Humanist perspectives on science and the modern world. Jim Al-Khalili made the final public appearance of his term as president of the British Humanist Association during this stimulating, and at times provoking, debate with Ziauddin Sardar, chair of the Muslim Institute.
Al-Khalili advocated the values of the European Enlightenment, arguing that ever since the “Age of Reason” took hold during the 18th century, Humanists have looked to science instead of religion to explore and comprehend the world. Sardar upheld the view that it is the combination of faith and reason that offers a fuller understanding of the world, maintaining that it was this worldview that enabled the development of science in the Islamic Golden Age.
A practising Muslim, Sardar is on an independent mission to promote rational, considered thought in interpreting the Qur’an. He explained that when he came to the UK from Pakistan, he found comfort in the familiar language of mathematics, which set him on a trajectory to train as a physicist: “God doesn’t need me, I need him. It makes me a better person and a better scientist”, he said.
In short, Sardar’s view is that although human knowledge at times converges with the Qur’an, the text should certainly not be treated as a scientific encyclopaedia. In support of this view, Sardar lamented the emergence of the I’jaz movement, which insists the Qur’an contains descriptions of modern scientific phenomena ranging from quantum mechanics to accurate descriptions of the stages of embryology and geology. In Sardar’s opinion, this stems from insecurity and a personal need to vindicate Islam to others.
Jim Al-Khalili agreed that ascribing literal meanings to religious texts can be perilous and that these verses should be interpreted more metaphorically. Likewise, when Einstein famously said “God does not play dice” he was using a figure of speech to acknowledge that there are things we don’t yet understand but this shouldn’t stop us from trying to find out more.
Whilst Al-Khalili is a staunch atheist, he adopts what he describes as an “accommodationist” approach in his interactions with people of religious faith: “I don’t think people who believe in God are irrational, I just don’t see a need to believe there is a purpose for why things are the way they are.” Born in Baghdad, Al-Khalili grew up in Iraq. His mother was Christian and his father was Shia, but he never heard them quarrel about religion. By the time he reached his teens he felt that he had distanced himself from needing any form of spirituality and his subsequent scientific training cemented this worldview. He asserted that his core values are empathy, humility and respect, without being driven by a reward in an afterlife: “It’s not just people of religious faith that have a moral compass – morality is what makes us human.”
I encourage you to read Khan’s piece (Nov. 6, 2015 posting) in its entirety as she provides historical and contemporary context for what seems to have been a fascinating and nuanced debate. Plus, there’s a bit of a bonus at the end where Khan is described as the producer of Sindbad Sci-Fi, a website devoted to “Reimagining Arab Science Fiction.” From the website’s About page,
Sindbad Sci-Fi is an initiative for spurring the discovery of and engagement with Arab Science Fiction through dialogue. Our aim is to sustain a growing community of interest through brokering face-to-face and online discussion, building new partnerships and project collaborations along the way.
Many of us know and love Sindbad the sailor as the fictional sailor from the Arabian Book of One Thousand and One Nights, considered an early composite work of proto-science fiction and fantasy. His extraordinary voyages led him to adventures in magical places whilst meeting monsters and encountering supernatural phenomena.
Sindbad Sci-Fi is reviving Sindbad’s adventurous spirit for exploration and discovery. Join us as we continue star trekking across the Middle East, North Africa, South Asia and beyond. Together, we will boldly go where no one else has gone before!
I’m pretty sure somebody associated with this site is a Star Trek fan.
“Industry City Distillery has been a beautiful accident from the start,” so begins Robb Todd’s Oct. 23, 2015 article for Fast Company about a remarkable vodka distillery situated in New York City,
Cofounders David Kyrejko and Zachary Bruner didn’t decide to make vodka because they love vodka. The distillery came about as the byproduct of a byproduct, faced challenges most distilleries don’t face, and had a goal very different from others in the drinking game.
“We make booze to pay for art and science,” Kyrejko says. [emphasis mine]
It all started with experiments focused on aquatic ecosystems and carbon dioxide production,
He [Kyrejko] used fermentation to create CO2 [carbon dioxide] and the byproduct was alcohol. That byproduct made Kyrejko think about its applications and implications. Now, that thinking has manifested as a liquid that more and more people in New York City are coveting in the form of Industry Standard vodka.
At least part of the reason this vodka is so coveted (Note: A link has been removed),
“Vodka is one of the easiest things to make if you don’t care,” Kyrejko says, “and one of the hardest if you do.”
Vodka is difficult because there’s no way to mask the imperfections as with other liquors. To make a spirit there are usually three “cuts” made during distillation: heads, hearts, and tails. What most people drink comes from the hearts. But Kyrejko and Bruner cut theirs 30 times.
“The art is knowing how to blend cuts,” Kyrejko says, adding that other makers do not blend their vodka. “It’s a giant pain in the ass.”
Thought has been put into reducing the company’s footprint,
They say they’ve considered the waste they produce from business and environmental standpoints, as well as the energy they use to create their burning water. So they lean on beet sugar instead of grain, and sacrifice the aesthetics of their stills by insulating them rather than polishing the copper to impress tour groups. And even with about 10,000 square feet of space, they use very little of it for equipment.
“The truth is, running a distillery in an urban setting using ‘traditional’ technology just doesn’t make any sense at all,” Kyrejko says.
This is why their initial goal was to build machines that were three times more efficient than what is commercially available, he says. Now, though, he says their machines and processes are up to six times more efficient, and take up a fraction of the space and resources of traditional methods.
It’s an interesting story although I do have one quibble; I would have liked to have learned more about their art and science, or art/science, efforts. Maybe next story, eh?
An Oct. 20, 2015 posting by Lynn Bergeson on Nanotechnology Now announces a US White House challenge incorporating nanotechnology, computing, and brain research (Note: A link has been removed),
On October 20, 2015, the White House announced a grand challenge to develop transformational computing capabilities by combining innovations in multiple scientific disciplines (see https://www.whitehouse.gov/blog/2015/10/15/nanotechnology-inspired-grand-challenge-future-computing). The Office of Science and Technology Policy (OSTP) states that, after considering over 100 responses to its June 17, 2015, request for information, it “is excited to announce the following grand challenge that addresses three Administration priorities — the National Nanotechnology Initiative, the National Strategic Computing Initiative (NSCI), and the BRAIN initiative.” The grand challenge is to “[c]reate a new type of computer that can proactively interpret and learn from data, solve unfamiliar problems using what it has learned, and operate with the energy efficiency of the human brain.”
Here’s where the Oct. 20, 2015 posting, which originated the news item, by Lloyd Whitman, Randy Bryant, and Tom Kalil for the US White House blog gets interesting,
While it continues to be a national priority to advance conventional digital computing—which has been the engine of the information technology revolution—current technology falls far short of the human brain in terms of both the brain’s sensing and problem-solving abilities and its low power consumption. Many experts predict that fundamental physical limitations will prevent transistor technology from ever matching these twin characteristics. We are therefore challenging the nanotechnology and computer science communities to look beyond the decades-old approach to computing based on the Von Neumann architecture as implemented with transistor-based processors, and chart a new path that will continue the rapid pace of innovation beyond the next decade.
There are growing problems facing the Nation that the new computing capabilities envisioned in this challenge might address, from delivering individualized treatments for disease, to allowing advanced robots to work safely alongside people, to proactively identifying and blocking cyber intrusions. To meet this challenge, major breakthroughs are needed not only in the basic devices that store and process information and the amount of energy they require, but in the way a computer analyzes images, sounds, and patterns; interprets and learns from data; and identifies and solves problems. [emphases mine]
Many of these breakthroughs will require new kinds of nanoscale devices and materials integrated into three-dimensional systems and may take a decade or more to achieve. These nanotechnology innovations will have to be developed in close coordination with new computer architectures, and will likely be informed by our growing understanding of the brain—a remarkable, fault-tolerant system that consumes less power than an incandescent light bulb.
Recent progress in developing novel, low-power methods of sensing and computation—including neuromorphic, magneto-electronic, and analog systems—combined with dramatic advances in neuroscience and cognitive sciences, lead us to believe that this ambitious challenge is now within our reach. …
This is the first time I’ve come across anything that publicly links the BRAIN initiative to computing, artificial intelligence, and artificial brains. (For my own sake, I make an arbitrary distinction between algorithms [artificial intelligence] and devices that simulate neural plasticity [artificial brains].) The emphasis in the past has always been on new strategies for dealing with Parkinson’s and other neurological diseases and conditions.