What better way to say ‘Happy Canada Day’ than to highlight a data sonification project from HotPopRobot. Here is a partial list of the awards won by the HotPopRobot team (based in Toronto, Canada), from the hotpoprobot.com homepage,
Micro:bit Challenge North America Runners Up 2020.
NASA SpaceApps 2019, 2018, 2017, 2014.
Imagining the Skies 2019.
Jesse Ketchum Astronomy Award 2018.
Hon. Mention at 2019 NASA Planetary Defense Conference.
Emerald Code Grand Prize 2018.
Canadian Space Apps 2017.
Here’s more about this intriguing team from the site’s About Us page,
HotPopRobot is a maker-family enterprise co-founded in 2014 by Artash […], Arushi […], Rati, and Vikas to bring discussions on Science, Space Exploration, Astronomy, and Technology in our everyday conversation. It encourages families, kids, and youths to become creators (and not consumers), scientists, artists, or whatever they want to be by undertaking projects on space, robotics, coding, and science.
We started this enterprise after winning the NASA Space Apps Toronto 2014 Award for our Mars Rover: CuriousBot. We ended up among the top 5 NASA Space Apps Winners (people’s choice) globally! We won the NASA SpaceApps Challenge Toronto again in 2019, 2018, and 2017 as well as the Canadian Space Agency’s Space Apps Challenge 2017 for our project – “Yes I Can” which used RadarSat-2 satellite data to recreate the #Canada150 logo. We ended up getting invited to the Canadian Space Agency to present our project and meet the new Canadian Astronauts.
The latest project is a musical based on sonification of data on COVID-19 impacts in Toronto, Canada. Here’s a video of the ‘Toronto COVID-19 Lockdown Musical’ or, more formally, the ‘Musical Scales Project’,
As of June 2020, Artash and Arushi are in grade eight and grade five, respectively, which means they are likely 13 and 10 years old now and were seven and four years old, respectively, when they and their parents started the HotPopRobot enterprise in 2014.
Definitely visit their website if you’re interested in artificial intelligence, robots, machine learning, and their other topics.
Regarding their latest project, here’s more about the Musical Scales Project from a June 19 (?), 2020 posting on their website,
The beauty of the human mind is that once you set it free, it soars high. Our mind too was teeming with big questions that we wanted to find the answers about. Would the COVID19 lockdown have increased the bird density in the city skies, would the closure of all economic activities have affected the rotation of the Earth, would an Alien civilization be able to figure out that something drastic must have happened on Earth?
All questions are good questions. But from our previous experiences of making projects, we knew we had to limit our imagination for the time being and focus on practicality to come up with a workable project design. Once we have made something and it works, we could always keep improving it or make newer versions of the same.
So between the two of us [Artash and Arushi], we limited our questions to:
Have the noise levels on our streets gone down?
Has the air we breathe become cleaner?
Have the traffic levels on our streets gone down?
Has the lockdown affected the vibration of the Earth due to the stopping of businesses, economic, and construction activities?
We often have to dismantle some of our older projects to get the components for our newer projects. It is not a good feeling as we often use our older projects to give demonstrations at various public events. So where possible we try to make our projects modular so that we can use the same components for more than one project.
We ended up collecting the following sensors and cameras for this project.
Light Sensor: It measures the light around us. It has a photo-resistor whose value decreases when light falls on it. It is the base sensor that will help us visualize separate daily data readings as well as changes in data collected during day and night.
Sound Sensor: To listen to street noise around us. It is similar to a microphone but gives analog values of sound levels. This raw data then has to be calibrated to understand how it changes with the change in sound levels.
PM 2.5 Dust Sensor: It is a sensor to measure particulate matter of 2.5 microns in the air. There is a small heater in the sensor which directs the flow of air upward through the sensor (convection current). The flow of air passes through infrared light, which bounces off the particles. The more the light bounces, the more particulate matter there is and the more polluted the air.
Temperature Sensor: We wanted to see how much the temperature was changing around us. The sensor is just like a digital thermometer but it prints out the readings.
Humidity Sensor: It measures how damp the air around us is. We measure humidity and temperature because they both affect pollution levels.
Intel RealSense Camera: To get a wide overview of the traffic on King Street. Its high resolution allows us to apply machine learning for object identification and tracking.
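The calibration step mentioned in the sound-sensor description is worth making concrete. Here is a minimal Python sketch of the idea, with entirely made-up calibration points (the team’s actual code and sensor values are not published in the excerpt): raw analog readings recorded under known quiet and loud conditions anchor a linear mapping onto an approximate decibel scale.

```python
def calibrate(raw, raw_lo, raw_hi, out_lo, out_hi):
    """Linearly map a raw analog reading onto a calibrated scale.

    raw_lo/raw_hi are raw sensor values recorded in known quiet and
    loud conditions; out_lo/out_hi are the matching reference levels.
    """
    fraction = (raw - raw_lo) / (raw_hi - raw_lo)
    return out_lo + fraction * (out_hi - out_lo)

# Hypothetical calibration: raw value 120 in a quiet room (~40 dB),
# raw value 880 next to street traffic (~80 dB).
readings = [120, 500, 880]
levels = [calibrate(r, 120, 880, 40.0, 80.0) for r in readings]
print(levels)
```

The same two-point mapping works for any of the analog sensors listed above; only the calibration pairs change.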
In addition to getting data from our sensors, we had to rely on external databases to get some other information.
Covid19 Infection Rates in Toronto: from City of Toronto Public Health website
The intensity of Night Lights Over Toronto: Using NASA Night Light Data to understand changes in night lights over Toronto during different weeks.
Seismic Vibrations in Toronto: We got the displacement data of Earth along the vertical direction from the Leslie Spit Seismic Station in Toronto.
We used the free Musical Algorithm software (www.musicalgorithms.org) to bring all the data together and create the COVID19 Lockdown Musical.
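The pitch-mapping idea behind such tools can be sketched in a few lines. The sketch below is my own illustration, not Musical Algorithms’ actual code: it linearly scales a data series onto the notes of a C major scale, so higher readings become higher pitches.

```python
# Map a numeric data series onto notes of a scale -- the core idea
# behind data-sonification tools. The note names and the linear
# scaling rule here are illustrative choices.
C_MAJOR = ["C4", "D4", "E4", "F4", "G4", "A4", "B4", "C5"]

def sonify(series, scale=C_MAJOR):
    lo, hi = min(series), max(series)
    span = (hi - lo) or 1  # guard against flat data
    notes = []
    for value in series:
        index = round((value - lo) / span * (len(scale) - 1))
        notes.append(scale[index])
    return notes

# e.g. hypothetical weekly case counts rising then falling
cases = [2, 10, 45, 80, 60, 30, 5]
print(sonify(cases))
```

Feeding the note list into any MIDI or synthesizer library would then produce audible output, which is essentially what the ‘Musical Scales Project’ does with its sensor and public-health data.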
The descriptions and instructions are comprehensive, which is very helpful if you’re planning your own project.
A June 1, 2020 essay by Maywa Montenegro (Postdoctoral Fellow, University of California at Davis) for The Conversation posits that new regulations (which in fact result in deregulation) are likely to create problems,
In May, federal regulators finalized a new biotechnology policy that will bring sweeping changes to the U.S. food system. Dubbed “SECURE,” the rule revises U.S. Department of Agriculture regulations over genetically engineered plants, automatically exempting many gene-edited crops from government oversight. Companies and labs will be allowed to “self-determine” whether or not a crop should undergo regulatory review or environmental risk assessment.
Initial responses to this new policy have followed familiar fault lines in the food community. Seed industry trade groups and biotech firms hailed the rule as “important to support continuing innovation.” Environmental and small farmer NGOs called the USDA’s decision “shameful” and less attentive to public well-being than to agribusiness’s bottom line.
But the gene-editing tool CRISPR was supposed to break the impasse in old GM wars by making biotechnology more widely affordable, accessible and thus democratic.
In my research, I study how biotechnology affects transitions to sustainable food systems. It’s clear that since 2012 the swelling R&D pipeline of gene-edited grains, fruits and vegetables, fish and livestock has forced U.S. agencies to respond to the so-called CRISPR revolution.
Yet this rule change has a number of people in the food and scientific communities concerned. To me, it reflects the lack of accountability and trust between the public and government agencies setting policies.
Is there a better way?
… I have developed a set of principles and practices for governing CRISPR based on dialogue with front-line communities who are most affected by the technologies others usher in. Communities don’t just have to adopt or refuse technology – they can co-create [emphasis mine] it.
One way to move forward in the U.S. is to take advantage of common ground between sustainable agriculture movements and CRISPR scientists. The struggle over USDA rules suggests that few outside of industry believe self-regulation is fair, wise or scientific.
If you have the time and the inclination, do read the essay in its entirety.
Anyone who has read my COVID-19 op-ed for the Canadian Science Policy Centre may see some similarity between Montenegro’s “co-create” and this excerpt from my May 15, 2020 posting (which included my reference materials) or this version on the Canadian Science Policy Centre website (where you can find many other COVID-19 op-eds),
In addition to engaging experts as we navigate our way into the future, we can look to artists, writers, citizen scientists, elders, indigenous communities, rural and urban communities, politicians, philosophers, ethicists, religious leaders, and bureaucrats of all stripes for more insight into the potential for collateral and unintended consequences.
To be clear, I think times of crises are when a lot of people call for more co-creation and input. Here’s more about Montenegro’s work on her profile page (which includes her academic credentials, research interests and publications) on the University of California at Berkeley’s Department of Environmental Science, Policy, and Management webspace. She seems to have been making the call for years.
I am a US-Dutch-Peruvian citizen who grew up in Appalachia, studied molecular biology in the Northeast, worked as a journalist in New York City, and then migrated to the left coast to pursue a PhD. My indigenous ancestry, smallholder family history, and the colonizing/decolonizing experiences of both the Netherlands and Peru informs my personal and professional interests in seeds and agrobiodiversity. My background engenders a strong desire to explore synergies between western science and the indigenous/traditional knowledge systems that have historically been devalued and marginalized.
Trained in molecular biology, science writing, and now, a range of critical social and ecological theory, I incorporate these perspectives into research on seeds.
I am particularly interested in the relationship between formal seed systems – characterized by professional breeding, certification, intellectual property – and commercial sale and informal seed systems through which farmers traditionally save, exchange, and sell seeds. …
You can find more on her Twitter feed, which is where I discovered a call for papers for a “Special Feature: Gene Editing the Food System” in the journal, Elementa: Science of the Anthropocene. They have a rolling deadline, which started in February 2020. At this time, there is one paper in the series,
One of them looks to be screaming (Edvard Munch, anyone?) and none of them looks the way I imagined an oceanic ‘living dinosaur’ might. While the news is not in my main area of interest (emerging technology), it is close to home. A June 1, 2020 University of British Columbia news release (also on EurekAlert) describes the glass sponge reefs (living dinosaurs) in the Pacific Northwest and current concerns about their welfare,
Warming ocean temperatures and acidification drastically reduce the skeletal strength and filter-feeding capacity of glass sponges, according to new UBC research.
The findings, published in Scientific Reports, indicate that ongoing climate change could have serious, irreversible impacts on the sprawling glass sponge reefs of the Pacific Northwest and their associated marine life – the only known reefs of their kind in the world.
Ranging from the Alaska-Canada border and down through the Strait of Georgia, the reefs play an essential role in water quality by filtering microbes and cycling nutrients through food chains. They also provide critical habitat for many fish and invertebrates, including rockfish, spot prawns, herring, halibut and sharks.
“Glass sponge reefs are ‘living dinosaurs’ thought to have been extinct for 40 million years before they were re-discovered in B.C. in 1986,” said Angela Stevenson, who led the study as a postdoctoral fellow at UBC Zoology. “Their sheer size and tremendous filtration capacity put them at the heart of a lush and productive underwater system, so we wanted to examine how climate change might impact their survival.”
Although the reefs are subject to strong, ongoing conservation efforts focused on limiting damage to their delicate glass structures, scientists know little about how these sponges respond to environmental changes.
For the study, Stevenson harvested Aphrocallistes vastus, one of three types of reef-building glass sponges, from Howe Sound and brought them to UBC where she ran the first successful long-term lab experiment involving live sponges by simulating their natural environment as closely as possible.
She then tested their resilience by placing them in warmer and more acidic waters that mimicked future projected ocean conditions.
Over a period of four months, Stevenson measured changes to their pumping capacity, body condition and skeletal strength, which are critical indicators of their ability to feed and build reefs.
Within one month, ocean acidification and warming, alone and in combination, reduced the sponges’ pumping capacity by more than 50 per cent and caused tissue losses of 10 to 25 per cent, which could starve the sponges.
“Most worryingly, pumping began to slow within two weeks of exposure to elevated temperatures,” said Stevenson.
The combination of acidification and warming also made their bodies weaker and more elastic by half. That could curtail reef formation and cause brittle reefs to collapse under the weight of growing sponges or animals walking and swimming among them.
Year-long temperature data collected from Howe Sound reefs in 2016 suggest it’s only a matter of time before sponges are exposed to conditions which exceed these thresholds.
“In Howe Sound, we want to figure out a way to track changes in sponge growth, size and area in the field so we can better understand potential climate implications at a larger scale,” said co-author Jeff Marliave, senior research scientist at the Ocean Wise Research Institute. “We also want to understand the microbial food webs that support sponges and how they might be influenced by climate cycles.”
Stevenson credits bottom-up community-led efforts and strong collaborations with government for the healthy, viable state of the B.C. reefs today. Added support for such community efforts and educational programs will be key to relieving future pressures.
“When most people think about reefs, they think of tropical shallow-water reefs like the beautiful Great Barrier Reef in Australia,” added Stevenson. “But we have these incredible deep-water reefs in our own backyard in Canada. If we don’t do our best to stand up for them, it will be like discovering a herd of dinosaurs and then immediately dropping dynamite on them.”
The colossal reefs can grow to 19 metres in height and are built by larval sponges settling atop the fused dead skeletons of previous generations. In northern B.C. the reefs are found at depths of 90 to 300 metres, while in southern B.C., they can be found as shallow as 22 metres.
The sponges feed by pumping sea water through their delicate bodies, filtering almost 80 per cent of microbes and particles and expelling clean water.
It’s estimated that the 19 known reefs in the Salish Sea can filter 100 billion litres of water every day, equivalent to one per cent of the total water volume in the Strait of Georgia and Howe Sound combined.
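As a quick sanity check on those figures (my own arithmetic, not part of the release): if 100 billion litres per day is one per cent of the combined water volume, the implied total is about 10 trillion litres, or roughly 10 cubic kilometres.

```python
filtered_per_day = 100e9  # litres filtered daily, per the release
# "one per cent of the total water volume" implies total = 100x that
total_litres = filtered_per_day * 100
total_km3 = total_litres / 1e12  # 1 cubic kilometre = 1e12 litres
print(f"implied total volume: {total_litres:.0e} L = {total_km3:.0f} km^3")
```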
Munch’s The Scream is an icon of modern art, the Mona Lisa for our time. As Leonardo da Vinci evoked a Renaissance ideal of serenity and self-control, Munch defined how we see our own age – wracked with anxiety and uncertainty.
Essentially The Scream is autobiographical, an expressionistic construction based on Munch’s actual experience of a scream piercing through nature while on a walk, after his two companions, seen in the background, had left him. …
For all the times I’ve seen the image, I had no idea the inspiration was acoustic.
In any event, the image seems sadly à propos both for the glass sponge reefs (and nature generally) and with regard to Black Lives Matter (BLM). A worldwide conflagration was ignited by George Floyd’s death in Minneapolis on May 25, 2020. This African-American man died while saying, “I can’t breathe,” as a police officer held Floyd down with a knee on his neck. RIP (rest in peace) George Floyd while the rest of us make the changes necessary, no matter how difficult, to create a just and respectful world for all. Black Lives Matter.
The Woodrow Wilson International Center for Scholars (or Wilson Center; located in Washington, DC) has a new initiative, the ‘Thing Tank’ (am enjoying the word play). It’s all about low cost science tools and their possible impact on the practice of science. Here’s more from a May 27, 2020 email notice,
From a foldable microscope made primarily from paper, to low cost and open microprocessors supporting research from cognitive neuroscience to oceanography, to low cost sensors measuring air quality in communities around the world, the things of science — that is, the physical tools that generate data or contribute to scientific processes — are changing the way that science happens.
The nature of tool design is changing, as more and more people share designs openly, create do-it-yourself (DIY) tools as a substitute for expensive, proprietary equipment, or design for mass production. The nature of tool access and use is changing too, as more tools become available at a price point that is do-able for non-professionals. This may be breaking down our reliance on expensive, proprietary designs traditionally needed to make scientific progress. This may also be building new audiences for tools, and making science more accessible to those traditionally limited by cost, geography, or infrastructure. But questions remain: will low cost and/or open tools become ubiquitous, replacing expensive, proprietary designs? Will the use of these tools fundamentally change how we generate data and knowledge, and apply it to global problems? Will the result be more, and better, science? And if so, what is standing in the way of widespread adoption and use?
In the Science and Technology Innovation Program at the Wilson Center, we often consider how new approaches to science are changing the way that science happens. Over the last five years, we’ve investigated how emerging enthusiasm in citizen science — the involvement of the public in scientific research — has changed the way that the public sees science, and contributes to data-driven decision-making. We have explored crowdsourcing and citizen science as two important paradigms of interest within and beyond US federal agencies, and investigated associated legal issues. We’ve documented how innovations in open science, especially open and FAIR data, can make information more shareable and impactful. Across our efforts, we explore and evaluate emerging technology and governance models with the goal of understanding how to maximize benefit and minimize risk. In the process, we convene scientists, practitioners, and policy makers to maximize the value of new approaches to science.
Now, we are expanding our attention to explore how innovation in the physical tools of science accelerate science, support decision-making, and broaden participation. We want to understand the current and potential value of these tools and approaches, and how they are changing the way we do science — now, and in the future.
THING Tank, our new initiative, fits well within the overall mission of the Wilson Center. As a think tank associated with the United States federal government, the Wilson Center is a boundary organization linking academia and the public policy community to create actionable research while bringing stakeholders together. Innovative and accessible tools for science are important to academia and policy alike. We hope to also bridge these perspectives with critical, on the ground activities, and understand and elevate the individuals, non-profits, community groups, and others working in this space.
The notice was in fact an excerpt from a May 19, 2020 article by Alison Parker and Anne Bowser on the Wilson Center website. I believe Bowser and Parker are the organizers behind the THING Tank initiative.
There are big plans for future activities such as workshops, a member directory and other outreach efforts. There’s also this,
We want to hear from you!
This space touches many communities, networks and stakeholders, from those advancing science, those working together to promote ideals of openness, to those developing solutions in a commercial context. No matter your interest, we want to hear from you! We’re looking for contributions to this effort, that can take a variety of forms:
Help us catch up to speed. We recognize that there are decades of foundational work and ongoing activities, and are eager to learn more.
Help us connect to broader communities, networks, and stakeholders. What is the best way to get broad input? Who isn’t in our network, that should be?
Introduce your communities and stakeholders to public policy audiences by contributing blog posts and social media messaging – more information on this coming soon!
Explore converging communities and accelerators and barriers by participating in workshops and events – definitely virtually, and hopefully in person as well.
Contribute and review content about case studies, definitions, and accelerators and barriers.
Share our products with your networks if you think they are useful.
To start, we will host a series of virtual happy hours exploring the role of openness, authority, and community in open science and innovation for crisis and disaster response. How have tools for science impacted the response to COVID-19, and how is the governance of those devices, and their data, evolving in emergency use?
How to contact the organizers is not immediately clear to me. They’ve not included any contact details on that webpage, but you can subscribe to the newsletter.
More and more, this resembles a public relations campaign. First, CRISPR (clustered regularly interspaced short palindromic repeats) gene editing is going to be helpful with COVID-19 and now it can help us to deal with conservation issues. (See my May 26, 2020 posting about the latest CRISPR doings as of May 7, 2020; included is a brief description of the patent dispute between the Broad Institute and UC Berkeley and musings about a public relations campaign.)
The gene-editing technology CRISPR has been used for a variety of agricultural and public health purposes — from growing disease-resistant crops to, more recently, a diagnostic test for the virus that causes COVID-19. Now a study involving fish that look nearly identical to the endangered Delta smelt finds that CRISPR can be a conservation and resource management tool, as well. The researchers think its ability to rapidly detect and differentiate among species could revolutionize environmental monitoring.
The study, published in the journal Molecular Ecology Resources, was led by scientists at the University of California, Davis, and the California Department of Water Resources in collaboration with MIT Broad Institute [emphasis mine].
As a proof of concept, it found that the CRISPR-based detection platform SHERLOCK (Specific High-sensitivity Enzymatic Reporter Unlocking) [emphasis mine] was able to genetically distinguish threatened fish species from similar-looking nonnative species in nearly real time, with no need to extract DNA.
“CRISPR can do a lot more than edit genomes,” said co-author Andrea Schreier, an adjunct assistant professor in the UC Davis animal science department. “It can be used for some really cool ecological applications, and we’re just now exploring that.”
WHEN GETTING IT WRONG IS A BIG DEAL
The scientists focused on three fish species of management concern in the San Francisco Estuary: the U.S. threatened and California endangered Delta smelt, the California threatened longfin smelt and the nonnative wakasagi. These three species are notoriously difficult to visually identify, particularly in their younger stages.
Hundreds of thousands of Delta smelt once lived in the Sacramento-San Joaquin Delta before the population crashed in the 1980s. Only a few thousand are estimated to remain in the wild.
“When you’re trying to identify an endangered species, getting it wrong is a big deal,” said lead author Melinda Baerwald, a project scientist at UC Davis at the time the study was conceived and currently an environmental program manager with California Department of Water Resources.
For example, state and federal water pumping projects have to reduce water exports if enough endangered species, like Delta smelt or winter-run chinook salmon, get sucked into the pumps. Rapid identification makes real-time decision making about water operations feasible.
FROM HOURS TO MINUTES
Typically to accurately identify the species, researchers rub a swab over the fish to collect a mucus sample or take a fin clip for a tissue sample. Then they drive or ship it to a lab for a genetic identification test and await the results. Not counting travel time, that can take, at best, about four hours.
SHERLOCK shortens this process from hours to minutes. Researchers can identify the species within about 20 minutes, at remote locations, noninvasively, with no specialized lab equipment. Instead, they use either a handheld fluorescence reader or a flow strip that works much like a pregnancy test — a band on the strip shows if the target species is present.
“Anyone working anywhere could use this tool to quickly come up with a species identification,” Schreier said.
OTHER CRYPTIC CRITTERS
While the three fish species were the only animals tested for this study, the researchers expect the method could be used for other species, though more research is needed to confirm. If so, this sort of onsite, real-time capability may be useful for confirming species at crime scenes, in the animal trade at border crossings, for monitoring poaching, and for other animal and human health applications.
“There are a lot of cryptic species we can’t accurately identify with our naked eye,” Baerwald said. “Our partners at MIT are really interested in pathogen detection for humans. We’re interested in pathogen detection for animals as well as using the tool for other conservation issues.”
SHERLOCK is an evolution of CRISPR technology, which others use to make precise edits in genetic code. SHERLOCK can detect the unique genetic fingerprints of virtually any DNA or RNA sequence in any organism or pathogen. Developed by our founders and licensed exclusively from the Broad Institute, SHERLOCK is a method for single molecule detection of nucleic acid targets and stands for Specific High Sensitivity Enzymatic Reporter unLOCKing. It works by amplifying genetic sequences and programming a CRISPR molecule to detect the presence of a specific genetic signature in a sample, which can also be quantified. When it finds those signatures, the CRISPR enzyme is activated and releases a robust signal. This signal can be adapted to work on a simple paper strip test, in laboratory equipment, or to provide an electrochemical readout that can be read with a mobile phone.
SHERLOCK (Specific High-sensitivity Enzymatic Reporter unLOCKing) is a CRISPR-based diagnostic tool that is rapid, inexpensive, and highly sensitive, with the potential to have a transformative effect on research and global public health. The SHERLOCK platform can detect viruses, bacteria, or other targets in clinical samples such as urine or blood, and reveal results on a paper strip — without the need for extensive specialized equipment. This technology could potentially be used to aid the response to infectious disease outbreaks, monitor antibiotic resistance, detect cancer, and more. SHERLOCK tools are freely available [emphasis mine] for academic research worldwide, and the Broad Institute’s licensing framework [emphasis mine] ensures that the SHERLOCK diagnostic platform is easily accessible in the developing world, where inexpensive, reliable, field-based diagnostics are urgently needed.
Here’s what I suspect: as stated, the Broad Institute offers free SHERLOCK licenses to academic institutions and not-for-profit organizations, but Sherlock Biosciences, a Broad Institute spinoff company, is for-profit and has trademarked SHERLOCK for commercial purposes.
This looks like a relatively subtle campaign to influence public perceptions. Genetic modification or genetic engineering as exemplified by the CRISPR gene editing technique is a force for the good of all. It will help us in our hour of need (COVID-19 pandemic) and it can help us save various species and better manage our resources.
This contrasts greatly with the publicity generated by the CRISPR twins situation, where a scientist claimed to have successfully edited the germline for twins, Lulu and Nana. This was done despite a voluntary, worldwide moratorium on germline editing of viable embryos. (Search the terms [either here or on a standard search engine] ‘CRISPR twins’, ‘Lulu and Nana’, and/or ‘He Jiankui’ for details about the scandal.)
In addition to presenting CRISPR as beneficial in the short term rather than the distant future, this publicity also subtly positions the Broad Institute as CRISPR’s owner.
Clustered regularly interspaced short palindromic repeats (CRISPR) gene editing has been largely confined to laboratory use or tested in agricultural trials. I believe that is true worldwide, excepting the CRISPR twin scandal. (There are numerous postings about the CRISPR twins here, including a Nov. 28, 2018 post, a May 17, 2019 post, and a June 20, 2019 post. Update: It was reported (3rd para.) in December 2019 that He had been sentenced to three years of jail time.)
Connie Lin in a May 7, 2020 article for Fast Company reports on this surprising decision by the US Food and Drug Administration (FDA) (Note: A link has been removed),
The U.S. Food and Drug Administration has granted Emergency Use Authorization to a COVID-19 test that uses controversial gene-editing technology CRISPR.
This marks the first time CRISPR has been authorized by the FDA, although only for the purpose of detecting the coronavirus, and not for its far more contentious applications. The new test kit, developed by Cambridge, Massachusetts-based Sherlock Biosciences, will be deployed in laboratories certified to carry out high-complexity procedures and is “rapid,” returning results in about an hour, as opposed to tests that rely on the standard polymerase chain reaction method, which typically requires six hours.
The announcement was made in the FDA’s Coronavirus (COVID-19) Update: May 7, 2020 Daily Roundup (4th item in the bulleted list). Or, you can read the May 6, 2020 letter (PDF) sent to John Vozella of Sherlock Biosciences by the FDA.
Sherlock Biosciences, an Engineering Biology company dedicated to making diagnostic testing better, faster and more affordable, today announced the company has received Emergency Use Authorization (EUA) from the U.S. Food and Drug Administration (FDA) for its Sherlock™ CRISPR SARS-CoV-2 kit for the detection of the virus that causes COVID-19, providing results in approximately one hour.
“While it has only been a little over a year since the launch of Sherlock Biosciences, today we have made history with the very first FDA-authorized use of CRISPR technology, which will be used to rapidly identify the virus that causes COVID-19,” said Rahul Dhanda, co-founder, president and CEO of Sherlock Biosciences. “We are committed to providing this initial wave of testing kits to physicians, laboratory experts and researchers worldwide to enable them to assist frontline workers leading the charge against this pandemic.”
The Sherlock™ CRISPR SARS-CoV-2 test kit is designed for use in laboratories certified under the Clinical Laboratory Improvement Amendments of 1988 (CLIA), 42 U.S.C. §263a, to perform high complexity tests. Based on the SHERLOCK method, which stands for Specific High-sensitivity Enzymatic Reporter unLOCKing, the kit works by programming a CRISPR molecule to detect the presence of a specific genetic signature – in this case, the genetic signature for SARS-CoV-2 – in a nasal swab, nasopharyngeal swab, oropharyngeal swab or bronchoalveolar lavage (BAL) specimen. When the signature is found, the CRISPR enzyme is activated and releases a detectable signal. In addition to SHERLOCK, the company is also developing its INSPECTR™ platform to create an instrument-free, handheld test – similar to that of an at-home pregnancy test – that utilizes Sherlock Biosciences’ Synthetic Biology platform to provide rapid detection of a genetic match of the SARS-CoV-2 virus.
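The detection logic described above (a programmed guide recognizes a specific genetic signature, and recognition triggers a detectable signal) can be caricatured in a few lines of code. This is only an illustrative sketch of the matching idea, not the SHERLOCK biochemistry, and the signature string below is an invented placeholder rather than a real SARS-CoV-2 sequence.

```python
# Illustrative sketch only (not the actual biochemistry): SHERLOCK programs
# a CRISPR guide to recognize a specific genetic signature; recognition
# activates an enzyme that releases a detectable signal. Here the "guide"
# is simply a short subsequence searched for in a sample sequence.

GUIDE_SIGNATURE = "ATTAAAGGTTTATACCTTCC"  # hypothetical target motif, not the real sequence

def crispr_style_detect(sample_sequence: str, guide: str = GUIDE_SIGNATURE) -> bool:
    """Return True (a 'signal') if the guide signature occurs in the sample."""
    return guide in sample_sequence

positive_sample = "GGCA" + GUIDE_SIGNATURE + "TTGC"
negative_sample = "GGCATTGCAACGATCGATCG"

print(crispr_style_detect(positive_sample))  # True: signal released
print(crispr_style_detect(negative_sample))  # False: no signal
```

The real test, of course, gains its specificity from enzymatic recognition of RNA in a patient specimen, not from string matching.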
“When our lab collaborated with Dr. Feng Zhang’s team to develop SHERLOCK, we believed that this CRISPR-based diagnostic method would have a significant impact on global health,” said James J. Collins, co-founder and board member of Sherlock Biosciences and Termeer Professor of Medical Engineering and Science for MIT’s Institute for Medical Engineering and Science (IMES) and Department of Biological Engineering. “During what is a major healthcare crisis across the globe, we are heartened that the first FDA-authorized use of CRISPR will aid in the fight against this global COVID-19 pandemic.”
Access to rapid diagnostics is critical for combating this pandemic and is a primary focus for Sherlock Biosciences co-founder and board member, David R. Walt, Ph.D., who co-leads the Mass [Massachusetts] General Brigham Center for COVID Innovation.
“SHERLOCK enables rapid identification of a single alteration in a DNA or RNA sequence in a single molecule,” said Dr. Walt. “That precision, coupled with its capability to be deployed to multiplex over 100 targets or as a simple point-of-care system, will make it a critical addition to the arsenal of rapid diagnostics already being used to detect COVID-19.”
This development is particularly interesting since there was a major intellectual property dispute over CRISPR between the Broad Institute (a Harvard University and Massachusetts Institute of Technology [MIT] joint initiative) and the University of California at Berkeley (UC Berkeley). The Broad Institute mostly won in the first round of the patent fight, as I noted in a March 15, 2017 post, but, as far as I’m aware, UC Berkeley is still disputing that decision.
In the period before receiving authorization, it appears that Sherlock Biosciences was doing a little public relations and ‘consciousness raising’ work. Here’s a sample from a May 5, 2020 article by Sharon Begley for STAT (Note: Links have been removed),
The revolutionary genetic technique better known for its potential to cure thousands of inherited diseases could also solve the challenge of Covid-19 diagnostic testing, scientists announced on Tuesday. A team headed by biologist Feng Zhang of the McGovern Institute at MIT and the Broad Institute has repurposed the genome-editing tool CRISPR into a test able to quickly detect as few as 100 coronavirus particles in a swab or saliva sample.
Crucially, the technique, dubbed a “one pot” protocol, works in a single test tube and does not require the many specialty chemicals, or reagents, whose shortage has hampered the rollout of widespread Covid-19 testing in the U.S. It takes about an hour to get results, requires minimal handling, and in preliminary studies has been highly accurate, Zhang told STAT. He and his colleagues, led by the McGovern’s Jonathan Gootenberg and Omar Abudayyeh, released the protocol on their STOPCovid.science website.
Because the test has not been approved by the Food and Drug Administration, it is only for research purposes for now. But minutes before speaking to STAT on Monday, Zhang and his colleagues were on a conference call with FDA officials about what they needed to do to receive an “emergency use authorization” that would allow clinical use of the test. The FDA has used EUAs to fast-track Covid-19 diagnostics as well as experimental therapies, including remdesivir, after less extensive testing than usually required.
For an EUA, the agency will require the scientists to validate the test, which they call STOPCovid, on dozens to hundreds of samples. Although “it is still early in the process,” Zhang said, he and his colleagues are confident enough in its accuracy that they are conferring with potential commercial partners who could turn the test into a cartridge-like device, similar to a pregnancy test, enabling Covid-19 testing at doctor offices and other point-of-care sites.
“It could potentially even be used at home or at workplaces,” Zhang said. “It’s inexpensive, does not require a lab, and can return results within an hour using a paper strip, not unlike a pregnancy test. This helps address the urgent need for widespread, accurate, inexpensive, and accessible Covid-19 testing.” Public health experts say the availability of such a test is one of the keys to safely reopening society, which will require widespread testing, and then tracing and possibly isolating the contacts of those who test positive.
A May 15, 2020 news item on Nanowerk provides context for an announcement of a research breakthrough on quantum entanglement,
Quantum entanglement is a process by which microscopic objects like electrons or atoms lose their individuality to become better coordinated with each other. Entanglement is at the heart of quantum technologies that promise large advances in computing, communications and sensing, for example detecting gravitational waves.
Entangled states are famously fragile: in most cases even a tiny disturbance will undo the entanglement. For this reason, current quantum technologies take great pains to isolate the microscopic systems they work with, and typically operate at temperatures close to absolute zero.
The ICFO [Institute of Photonic Sciences; Spain] team, in contrast, heated a collection of atoms to 450 Kelvin, millions of times hotter than most atoms used for quantum technology. Moreover, the individual atoms were anything but isolated; they collided with each other every few microseconds, and each collision set their electrons spinning in random directions.
The researchers used a laser to monitor the magnetization of this hot, chaotic gas. The magnetization is caused by the spinning electrons in the atoms, and provides a way to study the effect of the collisions and to detect entanglement. What the researchers observed was an enormous number of entangled atoms – about 100 times more than ever before observed. They also saw that the entanglement is non-local – it involves atoms that are not close to each other. Between any two entangled atoms there are thousands of other atoms, many of which are entangled with still other atoms, in a giant, hot and messy entangled state.
What they also saw, as Jia Kong, first author of the study, recalls, “is that if we stop the measurement, the entanglement remains for about 1 millisecond, which means that 1000 times per second a new batch of 15 trillion atoms is being entangled. And you must think that 1 ms is a very long time for the atoms, long enough for about fifty random collisions to occur. This clearly shows that the entanglement is not destroyed by these random events. This is maybe the most surprising result of the work”.
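The quoted figures lend themselves to a quick back-of-the-envelope check. The sketch below uses only numbers stated in the article (a 1 millisecond lifetime, 15 trillion atoms per batch, roughly fifty collisions per lifetime) to recover the re-entanglement rate and the implied mean time between collisions.

```python
# Back-of-the-envelope check of the numbers quoted above, using only
# figures stated in the article.
entanglement_lifetime_s = 1e-3       # "the entanglement remains for about 1 millisecond"
atoms_per_batch = 15e12              # "15 trillion atoms"
collisions_per_lifetime = 50         # "about fifty random collisions"

batches_per_second = 1 / entanglement_lifetime_s
atoms_entangled_per_second = batches_per_second * atoms_per_batch
mean_collision_interval_s = entanglement_lifetime_s / collisions_per_lifetime

print(batches_per_second)            # 1000.0 batches per second
print(atoms_entangled_per_second)    # 1.5e+16 atom-entanglement events per second
print(mean_collision_interval_s)     # 2e-05 s, i.e. about 20 microseconds per collision
```

The numbers are mutually consistent: a fresh batch of 15 trillion atoms every millisecond, with a collision roughly every 20 microseconds surviving within each batch’s lifetime.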
The observation of this hot and messy entangled state paves the way for ultra-sensitive magnetic field detection. For example, in magnetoencephalography (magnetic brain imaging), a new generation of sensors uses these same hot, high-density atomic gases to detect the magnetic fields produced by brain activity. The new results show that entanglement can improve the sensitivity of this technique, which has applications in fundamental brain science and neurosurgery.
As ICREA [Catalan Institution for Research and Advanced Studies] Prof. at ICFO Morgan Mitchell states, “this result is surprising, a real departure from what everyone expects of entanglement.” He adds, “we hope that this kind of giant entangled state will lead to better sensor performance in applications ranging from brain imaging to self-driving cars to searches for dark matter.”
A Spin Singlet and QND
A spin singlet is one form of entanglement where the multiple particles’ spins–their intrinsic angular momentum–add up to 0, meaning the system has zero total angular momentum. In this study, the researchers applied quantum non-demolition (QND) measurement to extract the information of the spin of trillions of atoms. The technique passes laser photons with a specific energy through the gas of atoms. These photons with this precise energy do not excite the atoms but they themselves are affected by the encounter. The atoms’ spins act as magnets to rotate the polarization of the light. By measuring how much the photons’ polarization has changed after passing through the cloud, the researchers are able to determine the total spin of the gas of atoms.
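As a rough illustration of the QND idea, the toy model below treats the polarization rotation of the probe light as proportional to the gas’s total spin projection (a Faraday-rotation picture). The coupling constant and spin values are invented for illustration; the real measurement is, of course, a quantum-optical one.

```python
# Toy model of the quantum non-demolition (QND) idea described above:
# the collective spin of the atomic gas rotates the polarization of the
# probe light by an angle proportional to the total spin projection.
# The coupling constant and spin values below are invented for illustration.
import random

COUPLING = 1e-13  # hypothetical polarization rotation (radians) per unit of net spin

def polarization_rotation(spins):
    """Net polarization rotation produced by a collection of +/-1/2 spins."""
    return COUPLING * sum(spins)

# A perfect spin singlet has zero total spin, hence zero rotation.
singlet_like = [0.5, -0.5] * 500
print(polarization_rotation(singlet_like))  # 0.0

# A hot, random gas fluctuates around zero; those fluctuations carry the signal.
random.seed(0)
gas = [random.choice((-0.5, 0.5)) for _ in range(1_000_000)]
print(polarization_rotation(gas))  # small value, of order sqrt(N)/2 * COUPLING
```

The singlet case is the point of the experiment: detecting that the measured rotation fluctuates *less* than a fully random gas would allow is evidence that the spins are entangled into singlet-like pairs.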
The SERF regime
Current magnetometers operate in a regime that is called SERF, far away from the near absolute zero temperatures that researchers typically employ to study entangled atoms. In this regime, any atom experiences many random collisions with other neighbouring atoms, making collisions the most important effect on the state of the atom. In addition, because they are in a hot medium rather than an ultracold one, the collisions rapidly randomize the spin of the electrons in any given atom. The experiment shows, surprisingly, that this kind of disturbance does not break the entangled states, it merely passes the entanglement from one atom to another.
I’ve bookended information about physicist Katie Mack’s May 6, 2020 talk at Canada’s Perimeter Institute with two items on visual art, mathematics, and the sciences.
You’ll find this image and a few more in a fascinating 2017 paper (see link and citation below) about mathematical sculpture,
Ferguson [Helaman Ferguson], who holds a doctorate in mathematics, never chose between art and science: now nearly 77 years old, he’s a mathematical sculptor. Working in stone and bronze, Ferguson creates sculptures, often placed on college campuses, that turn deep mathematical ideas into solid objects that anyone—seasoned professors, curious children, wayward mathophobes—can experience for themselves.
Mathematics has an intrinsic aesthetic—proofs are often described as “beautiful” or “elegant”—that can be difficult for mathematicians to communicate to outsiders, says Ferguson. “It isn’t something you can tell somebody about on the street,” he says. “But if I hand them a sculpture, they’re immediately relating to it.” Sculpture, he says, can tell a story about math in an accessible language.
Live webcast: theoretical cosmologist & science communicator Katie Mack
The live webcast will take place at 4 pm PT (1600 hours) on Wednesday, May 6, 2020. Here’s more about Katie Mack and the webcast from the event webpage (click through to the event page to get to the webcast) on the Perimeter Institute for Theoretical Physics (PI) website,
In a special live webcast on May 6  at 7 pm ET [4 pm PT], theoretical cosmologist and science communicator Katie Mack — known to her many Twitter followers as @astrokatie — will answer questions about her favourite subject: the end of the universe.
Mack, who holds a Simons Emmy Noether Visiting Fellowship at Perimeter, will give viewers a sneak peek at her upcoming book, The End of Everything (Astrophysically Speaking). She will then participate in a live “ask me anything” session, answering questions submitted via social media using the hashtag #piLIVE.
Mack is an Assistant Professor at North Carolina State University whose research investigates dark matter, vacuum decay, and the epoch of reionization. Mack is a popular science communicator on social media, and has contributed to Scientific American, Slate, Sky & Telescope, Time, and Cosmos.
Uniting quantum theory with Einstein’s Theory of General Relativity via a drawing about light
The article by Stephon Alexander was originally published March 16, 2017 for Nautilus. My excerpts are from a getpocket.com selection,
My aim as a theoretical physicist is to unite quantum theory with Einstein’s Theory of General Relativity. While there are a few proposals for this unification, such as string theory and loop quantum gravity, many roadblocks to a complete unification remain.
Einstein’s theory tells us the gravitational force is a direct manifestation of space and time bending. The sun bends the fabric of space, much like a sleeping person bends a mattress. Planetary orbits, including Earth’s, are motion along the contours of the bent space created by the sun. This theory provides some critical insights into the nature of light.
… one summer, I had the most unexpected breakthrough. Beth Jacobs, a member of the New York Academy of Sciences’ Board of Governors, invited me and some friends to her New York City apartment to meet the Oakes twins, artists who have gained attention in recent years for their drawings as well as the innovative technique and inventions they deploy to create them. An Oakes work, Irwin Gardens at the Getty in Winter (2011), an intricate drawing of the famous gardens designed by Robert Irwin at The Getty Museum in Los Angeles, was displayed on the balcony of Jacobs’ apartment overlooking Central Park, with the backdrop of the New York City skyline lit with a warm orange sky moments before sunset.
As I gazed at the drawing, I could feel the artists challenging me to reconsider the nature of light. I began to realize I should consider not only the physics of light, but also how light information is perceived by observers, when theorizing and conceiving new principles to unify quantum mechanics and general relativity. …
Ryan and Trevor Oakes, 35, have been exploring the impact and intersection of visual perception and the physics of light since they were kids. After attending The Cooper Union for the Advancement of Science and Art in New York City, and years of experimentation and inventing new techniques, the twins exploited the notion that light information is better described when originating from a spherical surface.
Artist Joseph Nechvatal has a longstanding interest in viruses (computer viruses, specifically), and that work seems strangely apt as we cope with the COVID-19 pandemic. He very kindly sent me some à propos information (received via an April 5, 2020 email),
I wanted to let you know that _viral symphOny_ (2006-2008), my 1 hour 40 minute collaborative electronic noise music symphony, created using custom artificial life C++ software based on the viral phenomenon model, is available to the world for free here:
Before you click the link and dive in, you might find these bits of information interesting. BTW, I do provide the link again at the end of this post.
Origin of and concept behind the term ‘computer virus’
As I’ve learned to expect, there are two, and possibly more, origin stories for the term ‘computer virus’. Refreshingly, there is near-universal agreement in the material I’ve consulted about John von Neumann’s role as the originator of the concept. After that, it gets more complicated; Wikipedia credits a writer with christening the term (Note: Links have been removed),
The first academic work on the theory of self-replicating computer programs was done in 1949 by John von Neumann who gave lectures at the University of Illinois about the “Theory and Organization of Complicated Automata”. The work of von Neumann was later published as the “Theory of self-reproducing automata”. In his essay von Neumann described how a computer program could be designed to reproduce itself. Von Neumann’s design for a self-reproducing computer program is considered the world’s first computer virus, and he is considered to be the theoretical “father” of computer virology. In 1972, Veith Risak directly building on von Neumann’s work on self-replication, published his article “Selbstreproduzierende Automaten mit minimaler Informationsübertragung” (Self-reproducing automata with minimal information exchange). The article describes a fully functional virus written in assembler programming language for a SIEMENS 4004/35 computer system. In 1980 Jürgen Kraus wrote his diplom thesis “Selbstreproduktion bei Programmen” (Self-reproduction of programs) at the University of Dortmund. In his work Kraus postulated that computer programs can behave in a way similar to biological viruses.
The first known description of a self-reproducing program in a short story occurs in 1970 in The Scarred Man by Gregory Benford [emphasis mine] which describes a computer program called VIRUS which, when installed on a computer with telephone modem dialing capability, randomly dials phone numbers until it hit a modem that is answered by another computer. It then attempts to program the answering computer with its own program, so that the second computer will also begin dialing random numbers, in search of yet another computer to program. The program rapidly spreads exponentially through susceptible computers and can only be countered by a second program called VACCINE.
The idea was explored further in two 1972 novels, When HARLIE Was One by David Gerrold and The Terminal Man by Michael Crichton, and became a major theme of the 1975 novel The Shockwave Rider by John Brunner.
The 1973 Michael Crichton sci-fi movie Westworld made an early mention of the concept of a computer virus, being a central plot theme that causes androids to run amok. Alan Oppenheimer’s character summarizes the problem by stating that “…there’s a clear pattern here which suggests an analogy to an infectious disease process, spreading from one…area to the next.” To which the replies are stated: “Perhaps there are superficial similarities to disease” and, “I must confess I find it difficult to believe in a disease of machinery.”
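Von Neumann’s self-reproducing programs have a minimal, harmless descendant that is easy to show in code: a quine, a program whose output is its own source. Here is a sketch (the comment lines are annotation and not part of the quine itself; an actual virus would additionally copy itself into other programs or machines):

```python
# A quine: a program that prints its own source code. It is the minimal,
# benign cousin of the self-reproducing programs von Neumann theorized about.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Running the two lines of the quine prints those same two lines: `%r` substitutes the string’s own quoted representation into itself, which is the same data-doubling trick von Neumann described for self-reproducing automata.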
Scientific American has an October 19, 2001 article citing four different experts’ answers to the question “When did the term ‘computer virus’ arise?” Three of the experts cite academics as the source of the term (usually Fred Cohen). One of the experts does mention writers (for the most part, not the same writers cited in the Wikipedia entry quoted above).
One expert discusses the concept behind the term and confirms what most people will suspect. Interestingly, this expert’s origin story varies somewhat from the other three.
The concept behind the first malicious computer programs was described years ago in the Computer Recreations column of Scientific American. The metaphor of the “computer virus” was adopted because of the similarity in form, function and consequence with biological viruses that attack the human system. Computer viruses can insert themselves in another program, taking over control or adversely affecting the function of the program.
Like their biological counterparts, computer viruses can spread rapidly and self-replicate systematically. They also mimic living viruses in the way they must adapt through mutation [emphases mine] to the development of resistance within a system: the author of a computer virus must upgrade his creation in order to overcome the resistance (antiviral programs) or to take advantage of new weakness or loophole within the system.
Computer viruses also act like biologics [emphasis mine] in the way they can be set off: they can be virulent from the outset of the infection, or they can be activated by a specific event (logic bomb). But computer viruses can also be triggered at a specific time (time bomb). Most viruses act innocuous towards a system until their specific condition is met.
The computer industry has expanded the metaphor to now include terms like inoculation, disinfection, quarantine and sanitation [emphases mine]. Now if your system gets infected by a computer virus you can quarantine it until you can call the “virus doctor” who can direct you to the appropriate “virus clinic” where your system can be inoculated and disinfected and an anti-virus program can be prescribed.
More about Joseph Nechvatal and his work on viruses
The similarities between computer and biological viruses are striking and, with that in mind, here’s a clip featuring part of viral symphOny,
Before giving you a second link to Nechvatal’s entire viral symphOny, here’s some context about him and his work, from the Joseph Nechvatal Wikipedia entry (Note: Links have been removed),
He began using computers to make “paintings” in 1986  and later, in his signature work, began to employ computer viruses. These “collaborations” with viral systems positioned his work as an early contribution to what is increasingly referred to as a post-human aesthetic.
From 1991–1993 he was artist-in-residence at the Louis Pasteur Atelier in Arbois, France and at the Saline Royale/Ledoux Foundation’s computer lab. There he worked on The Computer Virus Project, which was an artistic experiment with computer viruses and computer animation. He exhibited at Documenta 8 in 1987.
In 1999 Nechvatal obtained his Ph.D. in the philosophy of art and new technology concerning immersive virtual reality at Roy Ascott’s Centre for Advanced Inquiry in the Interactive Arts (CAiiA), University of Wales College, Newport, UK (now the Planetary Collegium at the University of Plymouth). There he developed his concept of viractualism, a conceptual art idea that strives “to create an interface between the biological and the technological.” According to Nechvatal, this is a new topological space.
In 2002 he extended his experimentation into viral artificial life through a collaboration with the programmer Stephane Sikora of music2eye in a work called the Computer Virus Project II, inspired by the a-life work of John Horton Conway (particularly Conway’s Game of Life), by the general cellular automata work of John von Neumann, by the genetic programming algorithms of John Koza and the auto-destructive art of Gustav Metzger.
In 2005 he exhibited Computer Virus Project II works (digital paintings, digital prints, a digital audio installation and two live electronic virus-attack art installations) in a solo show called cOntaminatiOns at Château de Linardié in Senouillac, France. In 2006 Nechvatal received a retrospective exhibition entitled Contaminations at the Butler Institute of American Art’s Beecher Center for Arts and Technology.
Dr. Nechvatal has also contributed to digital audio work with his noise music viral symphOny [emphasis mine], a collaborative sound symphony created by using his computer virus software at the Institute for Electronic Arts at Alfred University. viral symphOny was presented as a part of nOise anusmOs in New York in 2012.
Gold stars for everyone who recognized the loose paraphrasing of the title of Gabriel García Márquez’s 1985 novel, Love in the Time of Cholera.
I wrote my headline and first paragraph yesterday; this morning, I found the following in my email box, from a March 25, 2020 University of British Columbia news release, which compares the times, diseases, and scares of the past with today’s COVID-19 (Perhaps politicians and others could read this piece and stop using the word ‘unprecedented’ when discussing COVID-19?),
How globalization stoked fear of disease during the Romantic era
In the late 18th and early 19th centuries, the word “communication” had several meanings. People used it to talk about both media and the spread of disease, as we do today, but also to describe transport—via carriages, canals and shipping.
Miranda Burgess, an associate professor in UBC’s English department, is working on a book called Romantic Transport that covers these forms of communication in the Romantic era and invites some interesting comparisons to what the world is going through today.
We spoke with her about the project.
What is your book about?
It’s about global infrastructure at the dawn of globalization—in particular the extension of ocean navigation through man-made inland waterways like canals and ship’s canals. These canals of the late 18th and early 19th century were like today’s airline routes, in that they brought together places that were formerly understood as far apart, and shrunk time because they made it faster to get from one place to another.
This book is about that history, about the fears that ordinary people felt in response to these modernizations, and about the way early 19th-century poets and novelists expressed and responded to those fears.
What connections did those writers make between transportation and disease?
In the 1810s, they don’t have germ theory yet, so there’s all kinds of speculation about how disease happens. Works of tropical medicine, which is rising as a discipline, liken the human body to the surface of the earth. They talk about nerves as canals that convey information from the surface to the depths, and the idea that somehow disease spreads along those pathways.
When the canals were being built, some writers opposed them on the grounds that they could bring “strangers” through the heart of the city, and that standing water would become a breeding ground for disease. Now we worry about people bringing disease on airplanes. It’s very similar to that.
What was the COVID-19 of that time?
Probably epidemic cholera [emphasis mine], from about the 1820s onward. The Quarterly Review, a journal that novelist Walter Scott was involved in editing, ran long articles that sought to trace the map of cholera along rivers from South Asia, to Southeast Asia, across Europe and finally to Britain. And in the way that its spread is described, many of the same fears that people are evincing now about COVID-19 were visible then, like the fear of clothes. Is it in your clothes? Do we have to burn our clothes? People were concerned.
What other comparisons can be drawn between those times and what is going on now?
Now we worry about the internet and “fake news.” In the 19th century, they worried about what William Wordsworth called “the rapid communication of intelligence,” which was the daily newspaper. Not everybody had access to newspapers, but each newspaper was read by multiple families and newspapers were available in taverns and coffee shops. So if you were male and literate, you had access to a newspaper, and quite a lot of women did, too.
Paper was made out of rags—discarded underwear. Because of the French Revolution and Napoleonic Wars that followed, France blockaded Britain’s coast and there was a desperate shortage of rags to make paper, which had formerly come from Europe. And so Britain started to import rags from the Caribbean that had been worn by enslaved people.
Papers of the time are full of descriptions of the high cost of rags, how they’re getting their rags from prisons, from prisoners’ underwear, and fear about the kinds of sweat and germs that would have been harboured in those rags—and also discussions of scarcity, as people stole and hoarded those rags. It rings very well with what the internet is telling us now about a bunch of things around COVID-19.
Pietsch, who is also curator emeritus of fishes at the Burke Museum of Natural History and Culture, has published over 200 articles and a dozen books on the biology and behavior of marine fishes. He wrote this book with Rachel J. Arnold, a faculty member at Northwest Indian College in Bellingham and its Salish Sea Research Center.
These walking fishes have stepped into the spotlight lately, with interest growing in recent decades. And though these predatory fishes “will almost certainly devour anything else that moves in a home aquarium,” Pietsch writes, “a cadre of frogfish aficionados around the world has grown within the dive community and among aquarists.” In fact, Pietsch said, there are three frogfish public groups on Facebook, with more than 6,000 members.
First, what is a frogfish?
Ted Pietsch: A member of a family of bony fishes, containing 52 species, all of which are highly camouflaged and whose feeding strategy consists of mimicking the immobile, inert, and benign appearance of a sponge or an algae-encrusted rock, while wiggling a highly conspicuous lure to attract prey.
This is a fish that “walks” and “hops” across the sea bottom, and clambers about over rocks and coral like a four-legged terrestrial animal but, at the same time, can jet-propel itself through open water. Some lay their eggs encapsulated in a complex, floating, mucus mass, called an “egg raft,” while some employ elaborate forms of parental care, carrying their eggs around until they hatch.
They are among the most colorful of nature’s productions, existing in nearly every imaginable color and color pattern, with an ability to completely alter their color and pattern in a matter of days or seconds. All these attributes combined make them one of the most intriguing groups of aquatic vertebrates for the aquarist, diver, and underwater photographer as well as the professional zoologist.
I couldn’t resist the ‘frog’ reference, and I’m glad I didn’t, since this is a good read with a number of fascinating photographs and illustrations.
A March 24, 2020 news item on phys.org features the future of building construction as perceived by synthetic biologists,
Buildings are not unlike a human body. They have bones and skin; they breathe. Electrified, they consume energy, regulate temperature and generate waste. Buildings are organisms—albeit inanimate ones.
But what if buildings—walls, roofs, floors, windows—were actually alive—grown, maintained and healed by living materials? Imagine architects using genetic tools that encode the architecture of a building right into the DNA of organisms, which then grow buildings that self-repair, interact with their inhabitants and adapt to the environment.
A March 23, 2020 essay by Wil Srubar (Professor of Architectural Engineering and Materials Science, University of Colorado Boulder), which originated the news item, provides more insight,
Living architecture is moving from the realm of science fiction into the laboratory as interdisciplinary teams of researchers turn living cells into microscopic factories. At the University of Colorado Boulder, I lead the Living Materials Laboratory. Together with collaborators in biochemistry, microbiology, materials science and structural engineering, we use synthetic biology toolkits to engineer bacteria to create useful minerals and polymers and form them into living building blocks that could, one day, bring buildings to life.
In our most recent work, published in Matter, we used photosynthetic cyanobacteria to help us grow a structural building material – and we kept it alive. Similar to algae, cyanobacteria are green microorganisms found throughout the environment but best known for growing on the walls in your fish tank. Instead of emitting CO2, cyanobacteria use CO2 and sunlight to grow and, in the right conditions, create a biocement, which we used to help us bind sand particles together to make a living brick.
By keeping the cyanobacteria alive, we were able to manufacture building materials exponentially. We took one living brick, split it in half and grew two full bricks from the halves. The two full bricks grew into four, and four grew into eight. Instead of creating one brick at a time, we harnessed the exponential growth of bacteria to grow many bricks at once – demonstrating a brand new method of manufacturing materials.
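The split-and-regrow arithmetic in the excerpt is a textbook doubling process, and a few lines make the exponential growth concrete. This is an illustration of the quoted numbers only, not of the actual culture conditions or growth rates.

```python
# Sketch of the exponential growth described above: each generation, every
# living brick is split and regrown into two full bricks (1 -> 2 -> 4 -> 8 ...).

def bricks_after(generations: int, starting_bricks: int = 1) -> int:
    """Number of bricks after repeatedly splitting and regrowing each one."""
    return starting_bricks * 2 ** generations

for g in range(4):
    print(g, bricks_after(g))   # 0 1, then 1 2, then 2 4, then 3 8

# How many split-and-regrow cycles to exceed one million bricks from one brick?
g = 0
while bricks_after(g) < 1_000_000:
    g += 1
print(g)  # 20, since 2**20 = 1,048,576
```

That doubling is why the excerpt contrasts this with making one brick at a time: twenty cycles from a single brick, in principle, yields over a million.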
Researchers have only scratched the surface of the potential of engineered living materials. Other organisms could impart other living functions to material building blocks. For example, different bacteria could produce materials that heal themselves, sense and respond to external stimuli like pressure and temperature, or even light up. If nature can do it, living materials can be engineered to do it, too.
It also takes less energy to produce living buildings than standard ones. Making and transporting today’s building materials uses a lot of energy and emits a lot of CO2. For example, limestone is burned to make cement for concrete. Metals and sand are mined and melted to make steel and glass. The manufacture, transport and assembly of building materials account for 11% of global CO2 emissions. Cement production alone accounts for 8%. In contrast, some living materials, like our cyanobacteria bricks, could actually sequester CO2.
The field of engineered living materials is in its infancy, and further research and development is needed to bridge the gap between laboratory research and commercial availability. Challenges include cost, testing, certification and scaling up production. Consumer acceptance is another issue. For example, the construction industry has a negative perception of living organisms. Think mold, mildew, spiders, ants and termites. We’re hoping to shift that perception. Researchers working on living materials also need to address concerns about safety and biocontamination.
The [US] National Science Foundation recently named engineered living materials one of the country’s key research priorities. Synthetic biology and engineered living materials will play a critical role in tackling the challenges humans will face in the 2020s and beyond: climate change, disaster resilience, aging and overburdened infrastructure, and space exploration.
If you have time and interest, this is fascinating. Srubar is a little exuberant and, at this point, I welcome it.
With a significant part of the global population forced to work from home, the occurrence of lower back pain may increase. Lithuanian scientists have devised a spinal stabilisation exercise programme for managing lower back pain in people who perform sedentary jobs. After testing the programme with 70 volunteers, the researchers found that the exercises not only diminish non-specific lower back pain, but that their effect lasts 3 times longer than that of a usual muscle strengthening exercise programme.
According to the World Health Organisation, lower back pain is among the top 10 diseases and injuries that are decreasing the quality of life across the global population. It is estimated that non-specific low back pain is experienced by 60% to 70% of people in industrialised societies. Moreover, it is the leading cause of activity limitation and work absence throughout much of the world. For example, in the United Kingdom, low back pain causes more than 100 million workdays lost per year, in the United States – an estimated 149 million.
Chronic lower back pain, which starts from long-term irritation or nerve injury, affects the emotions of the afflicted. Anxiety, bad mood and even depression, as well as malfunctioning of other bodily systems – nausea, tachycardia, elevated arterial blood pressure – are among the conditions that may be caused by lower back pain.
During the coronavirus disease (COVID-19) outbreak, with a significant part of the global population working from home and not always having a properly designed office space, the occurrence of lower back pain may increase.
“Lower back pain is reaching epidemic proportions. Although it is usually clear what is causing the pain and its chronic nature, people tend to ignore these circumstances and are not willing to change their lifestyle. Lower back pain usually goes away by itself; however, the chances of the pain recurring are very high”, says Dr Irina Klizienė, a researcher at the Kaunas University of Technology (KTU) Faculty of Social Sciences, Humanities and Arts.
Dr Klizienė, together with colleagues from KTU and the Lithuanian Sports University, has designed a set of stabilisation exercises aimed at strengthening the muscles which support the spine in the lower back, i.e. the lumbar area. The exercise programme is based on Pilates methodology.
According to Dr Klizienė, the stability of the lumbar segments is an essential element of body biomechanics. Previous research shows that in order to avoid lower back pain it is crucial to strengthen the deep muscles that stabilise the lumbar area of the spine. One of these is the multifidus muscle.
“The human central nervous system uses several strategies – such as preparing to keep the posture, making preliminary adjustments to the posture, and correcting postural mistakes – which need to be reinforced by specific stabilising exercises. Our aim was to design a set of exercises for this purpose”, explains Dr Klizienė.
The programme, designed by Dr Klizienė and her colleagues, comprises static and dynamic exercises which train muscle strength and endurance. The static positions are held for 6 to 20 seconds; each exercise is repeated 8 to 16 times.
The previous set is a little puzzling but perhaps you’ll find these ones below easier to follow,
I think more pictures of the intervening moves would have been useful. Now, getting back to the press release,
In order to check the efficiency of the programme, 70 female volunteers were randomly assigned either to the lumbar stabilisation exercise programme or to a usual muscle strengthening exercise programme. Both groups exercised twice a week for 45 minutes over 20 weeks. During the experiment, ultrasound scanning of the muscles was carried out.
As soon as 4 weeks into the lumbar stabilisation programme, the cross-sectional area of the multifidus muscle in the stabilisation group was observed to increase; after completion of the programme, this increase was statistically significant (p < 0.05). This change was not observed in the strengthening group.
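As a rough illustration of the kind of within-group comparison reported here, a paired t-statistic compares each subject's muscle cross-sectional area before and after the programme. The measurements below are entirely made up for the sketch (the press release gives no raw data), and only Python's standard library is used:

```python
import math
import statistics

# Hypothetical pre/post multifidus cross-sectional areas (cm^2) for six
# subjects -- illustrative values only, not from the actual study.
pre = [6.1, 5.8, 6.4, 6.0, 5.9, 6.2]
post = [6.6, 6.1, 6.9, 6.5, 6.3, 6.7]

# Paired t-statistic: mean within-subject difference divided by its standard error.
diffs = [b - a for a, b in zip(pre, post)]
t = statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(len(diffs)))
print(f"t = {t:.2f} with {len(diffs) - 1} degrees of freedom")
```

With 5 degrees of freedom, a two-tailed t above roughly 2.571 corresponds to p < 0.05, which is the threshold the researchers report against.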
Moreover, although both sets of exercises were effective in eliminating lower back pain and strengthening the muscles of the lower back area, the effect of the stabilisation exercises lasted 3 times longer – 12 weeks after completion of the stabilisation programme versus 4 weeks after completion of the muscle strengthening programme.
“There are only a handful of studies which have directly compared the efficiency of stabilisation exercises against other exercises in eliminating lower back pain”, says Dr Klizienė, “however, there are studies showing that after a year, lower back pain returned to only 30% of people who had completed a stabilisation exercise programme, and to 84% of people who had not done these exercises. After three years these proportions are 35% and 75%.”
According to her, research shows that spine stabilisation exercises are more effective than medical intervention or usual physical activity in relieving lower back pain and preventing the recurrence of symptoms in the future.