Monthly Archives: December 2023

Shape-changing speaker (aka acoustic swarms) for sound control

To alleviate any concerns, these swarms are not kin to Michael Crichton’s swarms in his 2002 novel, Prey, or his 2011 novel, Micro (published after his death).

A September 21, 2023 news item on ScienceDaily announces this ‘acoustic swarm’ research,

In virtual meetings, it’s easy to keep people from talking over each other. Someone just hits mute. But for the most part, this ability doesn’t translate easily to recording in-person gatherings. In a bustling cafe, there are no buttons to silence the table beside you.

The ability to locate and control sound — isolating one person talking from a specific location in a crowded room, for instance — has challenged researchers, especially without visual cues from cameras.

A team led by researchers at the University of Washington has developed a shape-changing smart speaker, which uses self-deploying microphones to divide rooms into speech zones and track the positions of individual speakers. With the help of the team’s deep-learning algorithms, the system lets users mute certain areas or separate simultaneous conversations, even if two adjacent people have similar voices. Like a fleet of Roombas, each about an inch in diameter, the microphones automatically deploy from, and then return to, a charging station. This allows the system to be moved between environments and set up automatically. In a conference room meeting, for instance, such a system might be deployed instead of a central microphone, allowing better control of in-room audio.

The team published its findings Sept. 21 [2023] in Nature Communications.

A September 21, 2023 University of Washington (state) news release (also on EurekAlert), which originated the news item, delves further into the work, Note: Links have been removed,

“If I close my eyes and there are 10 people talking in a room, I have no idea who’s saying what and where they are in the room exactly. That’s extremely hard for the human brain to process. Until now, it’s also been difficult for technology,” said co-lead author Malek Itani, a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering. “For the first time, using what we’re calling a robotic ‘acoustic swarm,’ we’re able to track the positions of multiple people talking in a room and separate their speech.”

Previous research on robot swarms has required using overhead or on-device cameras, projectors or special surfaces. The UW team’s system is the first to accurately distribute a robot swarm using only sound.

The team’s prototype consists of seven small robots that spread themselves across tables of various sizes. As they move from their charger, each robot emits a high-frequency sound, like a bat navigating, using this sound and other sensors to avoid obstacles and move around without falling off the table. The automatic deployment allows the robots to place themselves for maximum accuracy, permitting greater sound control than if a person set them. The robots disperse as far from each other as possible since greater distances make differentiating and locating people speaking easier. Today’s consumer smart speakers have multiple microphones, but clustered on the same device, they’re too close to allow for this system’s mute and active zones.

“If I have one microphone a foot away from me, and another microphone two feet away, my voice will arrive at the microphone that’s a foot away first. If someone else is closer to the microphone that’s two feet away, their voice will arrive there first,” said co-lead author Tuochao Chen, a UW doctoral student in the Allen School. “We developed neural networks that use these time-delayed signals to separate what each person is saying and track their positions in a space. So you can have four people having two conversations and isolate any of the four voices and locate each of the voices in a room.”
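Chen’s description of time-delayed arrivals is the classic time-difference-of-arrival (TDOA) cue. As a rough illustration of the underlying signal processing (not the team’s neural networks, which are far more sophisticated), the delay between two microphones can be estimated from the peak of a cross-correlation; a minimal Python sketch:

```python
import numpy as np

def estimate_delay(sig_a, sig_b, sample_rate):
    """Estimate the delay (in seconds) of sig_b relative to sig_a
    from the peak of their cross-correlation."""
    corr = np.correlate(sig_b, sig_a, mode="full")
    lag = np.argmax(corr) - (len(sig_a) - 1)  # peak position in samples
    return lag / sample_rate

# A toy "voice" signal arriving at a second microphone 5 samples later.
rate = 16000
rng = np.random.default_rng(0)
voice = rng.standard_normal(1024)
mic_near = voice
mic_far = np.concatenate([np.zeros(5), voice[:-5]])

delay = estimate_delay(mic_near, mic_far, rate)  # ~5 / 16000 seconds
```

With several microphones at known positions, a set of such pairwise delays constrains where a speaker can be, which is the geometric intuition behind the team’s localization.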

The team tested the robots in offices, living rooms and kitchens with groups of three to five people speaking. Across all these environments, the system could discern different voices within 1.6 feet (50 centimeters) of each other 90% of the time, without prior information about the number of speakers. The system was able to process three seconds of audio in 1.82 seconds on average — fast enough for live streaming, though a bit too long for real-time communications such as video calls.

As the technology progresses, researchers say, acoustic swarms might be deployed in smart homes to better differentiate people talking with smart speakers. That could potentially allow only people sitting on a couch, in an “active zone,” to vocally control a TV, for example.

Researchers plan to eventually make microphone robots that can move around rooms, instead of being limited to tables. The team is also investigating whether the speakers can emit sounds that allow for real-world mute and active zones, so people in different parts of a room can hear different audio. The current study is another step toward science fiction technologies, such as the “cone of silence” in “Get Smart” and “Dune,” the authors write.

Of course, any technology that evokes comparison to fictional spy tools will raise questions of privacy. Researchers acknowledge the potential for misuse, so they have included guards against this: The microphones navigate with sound, not an onboard camera like other similar systems. The robots are easily visible and their lights blink when they’re active. Instead of processing the audio in the cloud, as most smart speakers do, the acoustic swarms process all the audio locally, as a privacy constraint. And even though some people’s first thoughts may be about surveillance, the system can be used for the opposite, the team says.

“It has the potential to actually benefit privacy, beyond what current smart speakers allow,” Itani said. “I can say, ‘Don’t record anything around my desk,’ and our system will create a bubble 3 feet around me. Nothing in this bubble would be recorded. Or if two groups are speaking beside each other and one group is having a private conversation, while the other group is recording, one conversation can be in a mute zone, and it will remain private.”

Takuya Yoshioka, a principal research manager at Microsoft, is a co-author on this paper, and Shyam Gollakota, a professor in the Allen School, is a senior author. The research was funded by a Moore Inventor Fellow award.

Two of the paper’s authors, Malek Itani and Tuochao Chen, have written a ‘Behind the Paper’ article for Nature.com’s Electrical and Electronic Engineering Community, from their September 21, 2023 posting,

Sound is a versatile medium. In addition to being one of the primary means of communication for us humans, it serves numerous purposes for organisms across the animal kingdom. Particularly, many animals use sound to localize themselves and navigate in their environment. Bats, for example, emit ultrasonic sound pulses to move around and find food in the dark. Similar behavior can be observed in beluga whales, which use sound to avoid obstacles and locate one another.

Various animals also have a tendency to cluster together into swarms, forming a unit greater than the sum of its parts. Famously, bees agglomerate into swarms to more efficiently search for a new colony. Birds flock to evade predators. These behaviors have caught the attention of scientists for quite some time, inspiring a handful of models for crowd control, optimization and even robotics. 

A key challenge in building robot swarms for practical purposes is the ability for the robots to localize themselves, not just within the swarm, but also relative to other important landmarks. …

Here’s a link to and a citation for the paper,

Creating speech zones with self-distributing acoustic swarms by Malek Itani, Tuochao Chen, Takuya Yoshioka & Shyamnath Gollakota. Nature Communications volume 14, Article number: 5684 (2023) DOI: https://doi.org/10.1038/s41467-023-40869-8 Published: 21 September 2023

This paper is open access.

Science communication perspectives from a documentary filmmaker

Marina Joubert, science communication researcher at Stellenbosch University in South Africa, has written a September 21, 2023 essay about science communication and documentary filmmaker, Sonya Pemberton, on The Conversation (h/t Sept. 21, 2023 news item on phys.org), Note: Links have been removed,

In general, people trust scientists more than they do most other professions. But this isn’t the case universally. Trust in science dropped in sub-Saharan Africa after the pandemic. In other parts of the world, in particular the US, public opinion about science is driven by political ideology and is becoming increasingly polarised.

As multi award-winning Australian filmmaker Sonya Pemberton put it during a plenary address at the 2023 Public Communication of Science and Technology Conference: “We have access to so much information, and yet simultaneously some areas of science are facing walls of doubt, disbelief and distrust.”

So what’s the solution? Communication, Pemberton told attendees at the conference, held in April in Rotterdam, in the Netherlands:

Her assertion, and her approach to making films, is rooted in evidence from science communication research. To build trust with an audience, scientists must demonstrate that they are competent experts. But they must also come across as warm, caring and human.

Pemberton – and we, a group of South African science communication academics who attended the conference – are part of a global movement in our discipline towards using the science of science communication. In essence, this is about building our science engagement efforts on evidence, rather than on a gut feeling.

Pemberton has one guiding principle: know your audience. She also has five golden rules for effective science communication:

  • acknowledge uncertainty
  • avoid polarising messages
  • check for biases
  • incite curiosity
  • embrace complexity.

… where her five rules come in. They are the way, she believes, to engage those who dislike, distrust or dismiss science. Her approach draws on the Yale University-based Cultural Cognition project, which involves an interdisciplinary team of scholars using what they call “empirical methods to examine the impact of group values on perceptions of risk and related facts”.

1. Acknowledge uncertainty

Sometimes scientists are wrong. …

Joubert’s September 21, 2023 essay also has an embedded SWIPE SciComm Issue no. 3 video interview with Pemberton (runtime of almost 47 mins.).

As for SWIPE SciComm, it is a mobile science communication magazine; its launch was announced in Dr. Tullio Rossi’s October 26, 2022 blog post on his ‘Animate Your Science’ website,

I’m SO EXCITED to share a major new project with you!!!

We’ve been keeping this project secret for the past 6 months, and it’s now time to reveal that we created the world’s FIRST science communication magazine!

Wait, what? Yes you heard right!

My team and I realised that there was not a single science communication magazine out there, so we decided to create one!

And since it’s 2022, and we don’t like to cut down trees for paper or burn fuel to ship it, we made it mobile-first 📱. It’s a new kind of magazine that you can read on your phone without downloading any apps simply by swiping and scrolling. 📲

Here’s an overview of what to expect:

  • Interviews with leading personalities
  • Tutorials
  • Reviewing the “science” of science communication
  • Guest articles
  • Case studies

Dr. Rossi’s declaration may be a bit of a surprise to the folks at Sage Journals who publish Science Communication,

Science Communication is an international and highly ranked communication research journal that publishes manuscripts that are of the highest quality, in terms of theory and methods. We define science broadly to include social science, technology, environment, engineering, and health, as well as the physical and natural sciences. However, across all scientific contexts, communication must be at the center of the investigation. We also recognize the critical importance of science communication practice and expect all manuscripts to address the practical implications of their research, as well as theory.

Perhaps Dr. Rossi meant the mobile aspect? In that case, SWIPE SciComm seems to be a first.

Dr. Rossi’s free magazine initiative is part of his larger for-profit venture, Animate Your Science, which was last mentioned here in a July 15, 2019 posting (scroll down to the text immediately following the image of an x-rayed hand followed by an embedded video).

Robot that can maneuver through living lung tissue

Caption: Overview of the semiautonomous medical robot’s three stages in the lungs. Credit: Kuntz et al.

This looks like one robot operating on another robot; I guess the researchers want to emphasize the fact that this autonomous surgical procedure isn’t currently being tested on human beings.

There’s more in a September 21, 2023 news item on ScienceDaily,

Scientists have shown that their steerable lung robot can autonomously maneuver through the intricacies of the lung, while avoiding important lung structures.

Lung cancer is the leading cause of cancer-related deaths in the United States. Some tumors are extremely small and hide deep within lung tissue, making it difficult for surgeons to reach them. To address this challenge, UNC-Chapel Hill and Vanderbilt University researchers have been working on an extremely bendy but sturdy robot capable of traversing lung tissue.

Their research has reached a new milestone. In a new paper, published in Science Robotics, Ron Alterovitz, PhD, in the UNC Department of Computer Science, and Jason Akulian, MD MPH, in the UNC Department of Medicine, have proven that their robot can autonomously go from “Point A” to “Point B” while avoiding important structures, such as tiny airways and blood vessels, in a living laboratory model.

Thankfully there’s a September 21, 2023 University of North Carolina (UNC) news release (also on EurekAlert), which originated the news item, to provide more information, Note: Links have been removed,

“This technology allows us to reach targets we can’t otherwise reach with a standard or even robotic bronchoscope,” said Dr. Akulian, co-author on the paper and Section Chief of Interventional Pulmonology and Pulmonary Oncology in the UNC Division of Pulmonary Disease and Critical Care Medicine. “It gives you that extra few centimeters or few millimeters even, which would help immensely with pursuing small targets in the lungs.”

The development of the autonomous steerable needle robot leveraged UNC’s highly collaborative culture by blending medicine, computer science, and engineering expertise. In addition to Alterovitz and Akulian, the development effort included Yueh Z. Lee, MD, PhD, at the UNC Department of Radiology, as well as Robert J. Webster III at Vanderbilt University and Alan Kuntz at the University of Utah.

The robot is made of several separate components. A mechanical control provides controlled thrust of the needle to go forward and backward and the needle design allows for steering along curved paths. The needle is made from a nickel-titanium alloy and has been laser etched to increase its flexibility, allowing it to move effortlessly through tissue.

As it moves forward, the etching on the needle allows it to steer around obstacles with ease. Other attachments, such as catheters, could be used together with the needle to perform procedures such as lung biopsies.

To drive through tissue, the needle needs to know where it is going. The research team used CT scans of the subject’s thoracic cavity and artificial intelligence to create three-dimensional models of the lung, including the airways, blood vessels, and the chosen target. Using this 3-D model, and once the needle has been positioned for launch, their AI-driven software instructs it to automatically travel from “Point A” to “Point B” while avoiding important structures.
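The news release doesn’t specify the planning algorithm, but the “Point A” to “Point B” step is, conceptually, a search through the 3-D model for a collision-free route. A minimal sketch, using a generic A* search over a voxel occupancy grid (a deliberate simplification: the actual system plans curved, kinematically feasible paths for a steerable needle):

```python
import heapq, itertools

def astar_3d(grid, start, goal):
    """A* over a 3D occupancy grid (True = obstacle), 6-connected moves.
    Returns a list of voxels from start to goal, or None if unreachable."""
    def h(p):  # Manhattan-distance heuristic (admissible for unit-cost moves)
        return sum(abs(a - b) for a, b in zip(p, goal))

    moves = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
    tie = itertools.count()  # tie-breaker so the heap never compares tuples of voxels
    frontier = [(h(start), next(tie), start)]
    came_from = {start: None}
    cost = {start: 0}
    while frontier:
        _, _, node = heapq.heappop(frontier)
        if node == goal:  # reconstruct the path by walking parents back to start
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        for d in moves:
            nxt = tuple(n + dd for n, dd in zip(node, d))
            x, y, z = nxt
            if not (0 <= x < len(grid) and 0 <= y < len(grid[0]) and 0 <= z < len(grid[0][0])):
                continue
            if grid[x][y][z]:
                continue  # voxel occupied: e.g. an airway or blood vessel
            new_cost = cost[node] + 1
            if new_cost < cost.get(nxt, float("inf")):
                cost[nxt] = new_cost
                came_from[nxt] = node
                heapq.heappush(frontier, (new_cost + h(nxt), next(tie), nxt))
    return None

# Toy 4x4x4 volume with a wall of "vessels" at x == 1, except one gap.
grid = [[[False] * 4 for _ in range(4)] for _ in range(4)]
for y in range(4):
    for z in range(4):
        grid[1][y][z] = True
grid[1][0][0] = False  # the only gap through the wall
path = astar_3d(grid, (0, 0, 0), (3, 3, 3))  # must thread through (1, 0, 0)
```

A real needle planner must additionally respect curvature limits and update the plan as the lung deforms with breathing, which is where the “suite of technologies” Alterovitz mentions comes in.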

“The autonomous steerable needle we’ve developed is highly compact, but the system is packed with a suite of technologies that allow the needle to navigate autonomously in real-time,” said Alterovitz, the principal investigator on the project and senior author on the paper. “It’s akin to a self-driving car, but it navigates through lung tissue, avoiding obstacles like significant blood vessels as it travels to its destination.”

The needle can also account for respiratory motion. Unlike other organs, the lungs are constantly expanding and contracting in the chest cavity. This can make targeting especially difficult in a living, breathing subject. According to Akulian, it’s like shooting at a moving target.

The researchers tested their robot while the laboratory model performed intermittent breath holding. Every time the subject’s breath is held, the robot is programmed to move forward.

“There remain some nuances in terms of the robot’s ability to acquire targets and then actually get to them effectively,” said Akulian, who is also a member of the UNC Lineberger Comprehensive Cancer Center, “and while there’s still a lot of work to be done, I’m very excited about continuing to push the boundaries of what we can do for patients with the world-class experts that are here.”

“We plan to continue creating new autonomous medical robots that combine the strengths of robotics and AI to improve medical outcomes for patients facing a variety of health challenges while providing guarantees on patient safety,” added Alterovitz.

Here’s a link to and a citation for the paper,

Autonomous medical needle steering in vivo by Alan Kuntz, Maxwell Emerson, Tayfun Efe Ertop, Inbar Fried, Mengyu Fu, Janine Hoelscher, Margaret Rox, Jason Akulian, Erin A. Gillaspie, Yueh Z. Lee, Fabien Maldonado, Robert J. Webster III, and Ron Alterovitz. Science Robotics 20 Sep 2023 Vol 8, Issue 82 DOI: 10.1126/scirobotics.adf7614

This paper is behind a paywall.

AI-led corporate entities as a new species of legal subject

An AI (artificial intelligence) agent running a business? Not to worry, lawyers are busy figuring out the implications according to this October 26, 2023 American Association for the Advancement of Science (AAAS) news release on EurekAlert,

For the first time in human history, say Daniel Gervais and John Nay in a Policy Forum, nonhuman entities that are not directed by humans – such as artificial intelligence (AI)-operated corporations – should enter the legal system as a new “species” of legal subject. AI has evolved to the point where it could function as a legal subject with rights and obligations, say the authors. As such, before the issue becomes too complex and difficult to disentangle, “interspecific” legal frameworks need to be developed by which AI can be treated as legal subjects, they write. Until now, the legal system has been univocal – it allows only humans to speak to its design and use. Nonhuman legal subjects like animals have necessarily instantiated their rights through human proxies. However, their inclusion is less about defining and protecting the rights and responsibilities of these nonhuman subjects and more a vehicle for addressing human interests and obligations as it relates to them. In the United States, corporations are recognized as “artificial persons” within the legal system. However, the laws of some jurisdictions do not always explicitly require corporate entities to have human owners or managers at their helm. Thus, by law, nothing generally prevents an AI from operating a corporate entity. Here, Gervais and Nay highlight the rapidly realizing concept of AI-operated “zero-member LLCs” – or a corporate entity operating autonomously without any direct human involvement in the process. The authors discuss several pathways in which such AI-operated LLCs and their actions could be handled within the legal system. As the idea of ceasing AI development and use is highly unrealistic, Gervais and Nay discuss other options, including regulating AI by treating the machines as legally inferior to humans or engineering AI systems to be law-abiding and bringing them into the legal fold now before it becomes too complicated to do so.

Gervais and Nay have written an October 26, 2023 essay “AIs could soon run businesses – it’s an opportunity to ensure these ‘artificial persons’ follow the law” for The Conversation, which helps clarify matters, Note: Links have been removed,

Only “persons” can engage with the legal system – for example, by signing contracts or filing lawsuits. There are two main categories of persons: humans, termed “natural persons,” and creations of the law, termed “artificial persons.” These include corporations, nonprofit organizations and limited liability companies (LLCs).

Up to now, artificial persons have served the purpose of helping humans achieve certain goals. For example, people can pool assets in a corporation and limit their liability vis-à-vis customers or other persons who interact with the corporation. But a new type of artificial person is poised to enter the scene – artificial intelligence systems, and they won’t necessarily serve human interests.

As scholars who study AI and law we believe that this moment presents a significant challenge to the legal system: how to regulate AI within existing legal frameworks to reduce undesirable behaviors, and how to assign legal responsibility for autonomous actions of AIs.

One solution is teaching AIs to be law-abiding entities.

This is far from a philosophical question. The laws governing LLCs in several U.S. states do not require that humans oversee the operations of an LLC. In fact, in some states it is possible to have an LLC with no human owner, or “member” [emphasis mine] – for example, in cases where all of the partners have died. Though legislators probably weren’t thinking of AI when they crafted the LLC laws, the possibility for zero-member LLCs opens the door to creating LLCs operated by AIs.

Many functions inside small and large companies have already been delegated to AI in part, including financial operations, human resources and network management, to name just three. AIs can now perform many tasks as well as humans do. For example, AIs can read medical X-rays and do other medical tasks, and carry out tasks that require legal reasoning. This process is likely to accelerate due to innovation and economic interests.

I found the essay illuminating and the abstract for the paper (link and citation for paper at end of this post), a little surprising,

Several experts have warned about artificial intelligence (AI) exceeding human capabilities, a “singularity” [emphasis mine] at which it might evolve beyond human control. Whether this will ever happen is a matter of conjecture. A legal singularity is afoot, however: For the first time, nonhuman entities that are not directed by humans may enter the legal system as a new “species” of legal subjects. This possibility of an “interspecific” legal system provides an opportunity to consider how AI might be built and governed. We argue that the legal system may be more ready for AI agents than many believe. Rather than attempt to ban development of powerful AI, wrapping of AI in legal form could reduce undesired AI behavior by defining targets for legal action and by providing a research agenda to improve AI governance, by embedding law into AI agents, and by training AI compliance agents.

It was a little unexpected to see the ‘singularity’ mentioned; it’s a term I associate with the tech and sci-fi communities. For anyone unfamiliar with the term, here’s a description from the ‘Technological singularity’ Wikipedia entry, Note: Links have been removed,

The technological singularity—or simply the singularity[1]—is a hypothetical future point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable consequences for human civilization.[2][3] According to the most popular version of the singularity hypothesis, I. J. Good’s intelligence explosion model, an upgradable intelligent agent will eventually enter a “runaway reaction” of self-improvement cycles, each new and more intelligent generation appearing more and more rapidly, causing an “explosion” in intelligence and resulting in a powerful superintelligence that qualitatively far surpasses all human intelligence.[4]

The first person to use the concept of a “singularity” in the technological context was the 20th-century Hungarian-American mathematician John von Neumann.[5] Stanislaw Ulam reports in 1958 an earlier discussion with von Neumann “centered on the accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue”.[6] Subsequent authors have echoed this viewpoint.[3][7]

The concept and the term “singularity” were popularized by Vernor Vinge first in 1983 in an article that claimed that once humans create intelligences greater than their own, there will be a technological and social transition similar in some sense to “the knotted space-time at the center of a black hole”,[8] and later in his 1993 essay The Coming Technological Singularity,[4][7] in which he wrote that it would signal the end of the human era, as the new superintelligence would continue to upgrade itself and would advance technologically at an incomprehensible rate. He wrote that he would be surprised if it occurred before 2005 or after 2030.[4] Another significant contributor to wider circulation of the notion was Ray Kurzweil’s 2005 book The Singularity Is Near, predicting singularity by 2045.[7]

Finally, here’s a link to and a citation for the paper,

Law could recognize nonhuman AI-led corporate entities by Daniel J. Gervais and John J. Nay. Science 26 Oct 2023 Vol 382, Issue 6669 pp. 376-378 DOI: 10.1126/science.adi8678

This paper is behind a paywall.

They glow under stress: soft, living materials made with algae

Caption: These soft, living materials glow in response to mechanical stress, such as compression, stretching or twisting. Credit: UC San Diego Jacobs School of Engineering

An October 20, 2023 news item on phys.org describes research into bioluminescent materials, Note: A link has been removed,

A team of researchers led by the University of California San Diego has developed soft yet durable materials that glow in response to mechanical stress, such as compression, stretching or twisting. The materials derive their luminescence from single-celled algae known as dinoflagellates.

The work, inspired by the bioluminescent waves observed during red tide events at San Diego’s beaches, was published Oct. 20 [2023] in Science Advances.

An October 23, 2023 University of California at San Diego news release (also on EurekAlert but published October 20, 2023) by Liezel Labios, which originated the news item, delves further into the research,

“An exciting feature of these materials is their inherent simplicity—they need no electronics, no external power source,” said study senior author Shengqiang Cai, a professor of mechanical and aerospace engineering at the UC San Diego Jacobs School of Engineering. “We demonstrate how we can harness the power of nature to directly convert mechanical stimuli into light emission.”

This study was a multi-disciplinary collaboration involving engineers and materials scientists in Cai’s lab, marine biologist Michael Latz at UC San Diego’s Scripps Institution of Oceanography, and physics professor Maziyar Jalaal at the University of Amsterdam.

The primary ingredients of the bioluminescent materials are dinoflagellates and a seaweed-based polymer called alginate. These elements were mixed to form a solution, which was then processed with a 3D printer to create a diverse array of shapes, such as grids, spirals, spiderwebs, balls, blocks and pyramid-like structures. The 3D-printed structures were then cured as a final step.

When the materials are subjected to compression, stretching or twisting, the dinoflagellates within them respond by emitting light. This response mimics what happens in the ocean, when dinoflagellates produce flashes of light as part of a predator defense strategy. In tests, the materials glowed when the researchers pressed on them and traced patterns on their surface. The materials were even sensitive enough to glow under the weight of a foam ball rolling on their surface.

The greater the applied stress, the brighter the glow. The researchers were able to quantify this behavior and developed a mathematical model that can predict the intensity of the glow based on the magnitude of the mechanical stress applied.
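The release doesn’t reproduce the team’s mathematical model, but the general workflow — fit measured brightness against applied stress, then predict intensity from the fit — can be sketched with a generic power-law fit. The data points and the power-law form below are hypothetical, purely for illustration:

```python
import numpy as np

# Hypothetical (stress, brightness) calibration pairs; NOT the paper's data.
stress = np.array([1.0, 2.0, 4.0, 8.0])          # arbitrary stress units
brightness = np.array([3.0, 12.0, 48.0, 192.0])  # arbitrary intensity units

# Fit brightness = a * stress**b by linear least squares in log-log space.
b, log_a = np.polyfit(np.log(stress), np.log(brightness), 1)
a = np.exp(log_a)

def predict(s):
    """Predicted glow intensity for an applied stress s, from the fitted model."""
    return a * s ** b
```

The paper’s actual model would be calibrated against real photometric measurements; the point here is only that a fitted stress-to-intensity curve is what turns the material into a quantitative sensor.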

The researchers also demonstrated techniques to make these materials resilient in various experimental conditions. To reinforce the materials so that they can bear substantial mechanical loads, a second polymer, poly(ethylene glycol) diacrylate, was added to the original blend. Also, coating the materials with a stretchy rubber-like polymer called Ecoflex provided protection in acidic and basic solutions. With this protective layer, the materials could even be stored in seawater for up to five months without losing their form or bioluminescent properties.

Another beneficial feature of these materials is their minimal maintenance requirements. To keep working, the dinoflagellates within the materials need periodic cycles of light and darkness. During the light phase, they photosynthesize to produce food and energy, which are then used in the dark phase to emit light when mechanical stress is applied. This behavior mirrors the natural processes at play when the dinoflagellates cause bioluminescence in the ocean during red tide events. 

“This current work demonstrates a simple method to combine living organisms with non-living components to fabricate novel materials that are self-sustaining and are sensitive to fundamental mechanical stimuli found in nature,” said study first author Chenghai Li, a mechanical and aerospace engineering Ph.D. candidate in Cai’s lab.

The researchers envision that these materials could potentially be used as mechanical sensors to gauge pressure, strain or stress. Other potential applications include soft robotics and biomedical devices that use light signals to perform treatment or controlled drug release.

However, there is much work to be done before these applications can be realized. The researchers are working on further improving and optimizing the materials.

Here’s a link to and a citation for the paper,

Ultrasensitive and robust mechanoluminescent living composites by Chenghai Li, Nico Schramma, Zijun Wang, Nada F. Qari, Maziyar Jalaal, Michael I. Latz, and Shengqiang Cai. Science Advances 20 Oct 2023 Vol 9, Issue 42 DOI: 10.1126/sciadv.adi8643

This paper is open access.

AI for salmon recovery

Hopefully you won’t be subjected to a commercial prior to this 3 mins. 49 secs. video about salmon and how artificial intelligence (AI) could make a difference in their continued survival, and ours,

Video caption: Wild Salmon Center is partnering with First Nations to pilot the Salmon Vision technology. (Credit: Olivia Leigh Nowak/Le Colibri Studio.)

An October 19, 2023 news item on phys.org announces this research, Note: Links have been removed,

Scientists and natural resource managers from Canadian First Nations, governments, academic institutions, and conservation organizations published the first results of a unique salmon population monitoring tool in Frontiers in Marine Science.

This groundbreaking new technology, dubbed “Salmon Vision,” combines artificial intelligence with age-old fishing weir technology. Early assessments show it to be remarkably adept at identifying and counting fish species, potentially enabling real-time salmon population monitoring for fisheries managers.

An October 19, 2023 Wild Salmon Center news release on EurekAlert, which originated the news item, provides more detail about the work,

“In recent years, we’ve seen the promise of underwater video technology to help us literally see salmon return to rivers,” says lead author Dr. Will Atlas, Senior Watershed Scientist with the Portland-based Wild Salmon Center. “That dovetails with what many of our First Nations partners are telling us: that we need to automate fish counting to make informed decisions while salmon are still running.” 

The Salmon Vision pilot study draws on more than 500,000 annotated video frames captured at two Indigenous-run fish counting weirs on the Kitwanga and Bear Rivers of B.C.’s Central Coast. 

The first-of-its-kind deep learning computer model, developed in data partnership with the Gitanyow Fisheries Authority and Skeena Fisheries Commission, shows promising accuracy in identifying salmon species. It yielded mean average precision rates of 67.6 percent in tracking 12 different fish species passing through custom fish-counting boxes at the two weirs, with scores surpassing 90 and 80 percent, respectively, for coho and sockeye salmon: two of the principal fish species targeted by First Nations, commercial, and recreational fishers.
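For readers unfamiliar with the metric, "mean average precision" (mAP) averages, across species, the area under each species' precision-recall curve as detections are ranked by confidence. The sketch below is purely illustrative (it is not the Salmon Vision code): the detection results are made up, and the bounding-box IoU matching a real object-detection evaluation would perform is omitted.

```python
# Simplified mean average precision (mAP) calculation.
# Hypothetical inputs: per-species detection outcomes (True = correct
# species identification), sorted by descending model confidence, plus
# the number of ground-truth fish of that species in the video.

def average_precision(hits, n_ground_truth):
    """Area under the precision-recall curve for one species."""
    tp = fp = 0
    ap = 0.0
    prev_recall = 0.0
    for hit in hits:                       # already sorted by confidence
        tp += hit
        fp += (not hit)
        precision = tp / (tp + fp)
        recall = tp / n_ground_truth
        ap += precision * (recall - prev_recall)  # rectangle-rule area
        prev_recall = recall
    return ap

def mean_average_precision(per_species):
    """per_species: {name: (hit_flags_sorted_by_confidence, n_truth)}."""
    aps = [average_precision(h, n) for h, n in per_species.values()]
    return sum(aps) / len(aps)

# Made-up demo data for two species
demo = {
    "coho":    ([True, True, False, True], 4),
    "sockeye": ([True, False, True], 3),
}
print(f"mAP = {mean_average_precision(demo):.3f}")
```

A real evaluation like the one in the paper would first match each predicted box to a ground-truth fish (typically by intersection-over-union threshold) before scoring detections as hits or misses.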

“When we envisioned providing fast grants for projects focused on Indigenous futurism and climate resilience, this is the type of project that we hoped would come our way,” says Dr. Keolu Fox, a professor at the University of California-San Diego, and one of several reviewers in an early crowdfunding round for the development of Salmon Vision. 

Collaborators on the model, funded by the British Columbia Salmon Recovery and Innovation Fund, include researchers and fisheries managers with Simon Fraser University and Douglas College computing sciences, the Pacific Salmon Foundation, Gitanyow Fisheries Authority, and the Skeena Fisheries Commission. Following these exciting early results, the next step is to expand the model with partner First Nations into a half-dozen new watersheds on B.C.’s North and Central Coast.

Real-time data on salmon returns is critical on several fronts. According to Dr. Atlas, many fisheries in British Columbia have been data-poor for decades. That leaves fisheries managers to base harvest numbers on early-season catch data, rather than the true number of salmon returning. Meanwhile, changing weather patterns, stream flows, and ocean conditions are creating more variable salmon returns: uncertainty that compounds the ongoing risks of overfishing already-vulnerable populations.

“Without real-time data on salmon returns, it’s extremely difficult to build climate-smart, responsive fisheries,” says Dr. Atlas. “Salmon Vision data collection and analysis can fill that information gap.” 

It’s a tool that he says will be invaluable to First Nation fisheries managers and other organizations both at the decision-making table—in providing better information to manage conservation risks and fishing opportunities—and in remote rivers across salmon country, where on-the-ground data collection is challenging and costly. 

The Salmon Vision team is implementing automated counting on a trial basis in several rivers around the B.C. North and Central Coasts in 2023. The goal is to provide reliable real-time count data by 2024.

This October 18, 2023 article by Ramona DeNies for the Wild Salmon Center (WSC) is nicely written although it does cover some of the same material seen in the news release, Note: A link has been removed,

Right now, in rivers across British Columbia’s Central Coast, we don’t know how many salmon are actually returning. At least, not until fishing seasons are over.

And yet, fisheries managers still have to make decisions. They have to make forecasts, modeled on data from the past. They have to set harvest targets for commercial and recreational fisheries. And increasingly, they have to make the call on emergency closures, when things start looking grim.

“On the north and central coast of BC, we’ve seen really wildly variable returns of salmon over the last decade,” says Dr. Will Atlas, Wild Salmon Center Senior Watershed Scientist. “With accelerating climate change, every year is unprecedented now. Yet from a fisheries management perspective, we’re still going into most seasons assuming that this year will look like the past.”

One answer, Dr. Atlas says, is “Salmon Vision.” Results from this first-of-its-kind technology—developed by WSC in data partnership with the Gitanyow Fisheries Authority and Skeena Fisheries Commission—were recently published in Frontiers in Marine Science.

There are embedded images in DeNies’ October 18, 2023 article; it’s where I found the video.

Here’s a link to and a citation for the paper,

Wild salmon enumeration and monitoring using deep learning empowered detection and tracking by William I. Atlas, Sami Ma, Yi Ching Chou, Katrina Connors, Daniel Scurfield, Brandon Nam, Xiaoqiang Ma, Mark Cleveland, Janvier Doire, Jonathan W. Moore, Ryan Shea, Jiangchuan Liu. Front. Mar. Sci., Volume 10, 20 September 2023. DOI: https://doi.org/10.3389/fmars.2023.1200408

This paper appears to be open access.

The University of British Columbia and its November 28, 2023 Great UBC Bug Bake Off

Last week, I received (via email) this enticing November 27, 2023 University of British Columbia media advisory,

Welcome, baking enthusiasts and insect epicureans, to the Great UBC Bug Bake Off!

On Nov. 28 [2023], media are invited as four teams of Faculty of Land and Food Systems students engage in a six-legged culinary showdown. Students will showcase insect-laden dishes that are delicious, nutritious and environmentally friendly. Esteemed judges, including UBC executive chef David Speight, will weigh in on the taste, texture and insect ingenuity of the creations.

We spoke to course instructor and sessional lecturer Dr. Yasmin Akhtar about the competition, and why she advocates for entomophagy – eating insects and bugs.

WHY DO YOU HOST THIS INSECT DISH COMPETITION?

This competition is the culmination of my applied biology course “Insects as Food and Feed” where we spent the semester learning about the benefits and risks of eating and using insects. One of my goals is to reduce the negative perceptions people may have of eating bugs. This competition is a fun way to raise awareness among students about the nutritional value of insects, their role in sustainable food systems and the importance of considering alternative protein sources.

WHAT ARE THE BENEFITS OF EATING INSECTS?

In addition to being really tasty, there are two main benefits of eating insects.

Many insects are incredibly nutritious: They are high in protein, calcium, good fatty acids and vitamins. For example, a species of grasshopper commonly eaten in Mexico, Sphenarium purpurascens, contains 48 grams of protein per 100 grams, compared to 27 grams of protein per 100 grams of beef. Insect protein is also easily absorbed by humans and some insects contain all the essential amino acids that humans need.

The other benefit is environmental. Rearing insects requires much less space, fewer resources like water and much less feed. They produce much lower greenhouse gas emissions than cattle or pigs, for example. It also encourages the sustainable use of diverse insect species, rather than relying on a small number of traditional livestock species to meet the world’s needs.

It is also relatively cheap to rear insects, which means that small-scale farmers can benefit.

WHAT ARE SOME EASY WAYS TO INCORPORATE BUGS INTO YOUR DIET?

Insect flours and insect powders are an easy way to incorporate bugs into your diet – especially if you are wary of eating insects whole. You can purchase insect flour online and simply replace wheat flour in any recipe with the insect flour for tasty, high-protein baked products like muffins or as filling in samosas.

Barbecuing insects is another great option: they absorb flavour really well, and dry out to become very crunchy. Barbecued crickets are my favourite! I also really like chocolate-covered ants, and adding insect powder to green tea.

WHAT ARE SOME RISKS OF EATING INSECTS THAT PEOPLE SHOULD BE AWARE OF?

Insects live in a lot of different environments, including soil, and can be infested with microorganisms like bacteria, fungi and viruses. Just like other animal proteins, insects should be treated before they are consumed – using heat to boil or cook them, for example.

If capturing insects from the wild, you need to be aware that they may be contaminated with pesticides that were used to spray fruits and vegetables. A better option would be to purchase them from insect farms, where they are safely raised to be used as food.

Lastly, if you’re allergic to seafood, then you’ll likely also be allergic to insects because they share similar protein allergens.

EVENT: GREAT UBC BUG BAKE OFF

Date/time: Tuesday, Nov. 28, 11:15 a.m. – 1 p.m.

Contest will begin promptly at 11:30 a.m. so please arrive early to set up.

Location: Vij’s Kitchen, Room 130, 2205 East Mall

As you might have expected, the media attended. From a November 28, 2023 article by Stefan Labbé for vancouverisawesome.com,

Inside a culinary lab at the University of British Columbia, nine students took turns offering a menu of insect-infused recipes to a panel of judges. 

Beef tacos wrapped in cricket flour-laced tortillas. Mealworm ginger sugar cookies “to add a little protein during the holidays.” And cheesecake with a layer of crushed cricket fudge. Judge and UBC executive chef David Speight snapped off a piece of ginger cookie in his mouth. 

“It doesn’t really taste like mealworm,” he said with a smile. “That’s good.”

The competition, billed as the Great UBC Bug Bake Off, pitted the students against each other to see who could come up with the tastiest, and perhaps least offensive, dish. But for students who had just spent months learning about insects as food and feed, the stakes of eating bugs were much higher. 

“We’re going hungry globally,” said UBC student Rozy Etaghene, after presenting her cheesecake.

By 2050, the global population is expected to hit nine million people [sic; the UN projection is for 9.8 billion]. To feed all those mouths, agricultural production will have to double, according to the UN’s Food and Agricultural Organization. But agriculture already takes up 30 per cent of the planet’s land, with up to 70 per cent of that reserved for livestock like cattle, pigs and chickens.

But substituting fried crickets for chicken wings is not always an easy sell. A decade ago, Vancouver chef Vikram Vij donated $250,000 to renovate UBC’s culinary lab. At the time, the co-owner of Vij’s restaurants, Meeru Dhalwala, was in the midst of experimentation, first putting insects on the menu in 2008.

It all started with roasted crickets, an insect that requires only two kilograms of feed for every one kilogram of body weight gain. Spiced with cayenne, cumin and coriander, Dhalwala said she would treat them like ground almonds. 

“I made a cricket paratha, like a flatbread,” she said. “It was a really big deal at the time.”

Back at the UBC culinary lab, the judges had come to a decision: Etaghene’s cheesecake had lost out to a pound cake and plate of cranberry shortbread cookies — both baked with cricket flour.

A cricket paratha served at Meeru Dhalwala’s restaurant in Seattle sold four times better than in Vancouver, says the restaurateur. Stefan Labbé/Glacier Media

Labbé’s November 28, 2023 article offers a lot of information on insects as food in Canada and in the world, as well as, more about the bake off.

Another November 28, 2023 article, this time written by Cosmin Dzsurdzsa for True North (I have more about True North after the excerpt), highlights other aspects of the event, Note: Links have been removed,

Canadian journalists were so eager to attend the University of British Columbia’s Bug Bake Off on Tuesday [November 28, 2023] to get a taste of edible insect creations that the event was booked to capacity the night before.

Former CBC producer and UBC media relations specialist Sachintha Wickramasinghe told True North on Monday that the event was at capacity.

“There’s been significant interest since this morning and we are already at capacity for media,” said Wickramasinghe. 

There has been growing interest by governments and the private sector to warm consumers up to the idea of edible insects. The Liberal government has lavished edible insect cricket farming companies with hundreds of thousands of dollars worth of subsidies [emphasis mine]. 

For anyone curious about True North, there’s this from the True North Centre for Public Policy Wikipedia entry, Note: Links have been removed,

The True North Centre for Public Policy is a Canadian media outlet that simultaneously describes itself as a “media company”, an “advocacy organization” and as a “registered charity with the government of Canada.”[1][2] It operates a digital media arm known simply as True North [emphasis mine].[3][4]

In 1994, the Independent Immigration Aid Association was started with the goal of helping immigrants from the United Kingdom settle in British Columbia.[2][5] According to Daniel Brown, a former director of the charity, a new board of directors took control of the charity in 2017 and renamed it the True North Centre for Public Policy.[2] Control was handed off to three people:[2]

  • Kaz Nejatian, a former staffer for United Conservative Party leader Jason Kenney, and current COO of Shopify.[6]
  • William McBeath, the director of Training and Marketing for the right-wing Manning Centre for Building Democracy.
  • Erynne Schuster, an Edmonton-based lawyer.

Nejatian’s wife, Candice Malcolm, describes herself as the “founder and Editor-In-Chief” of True North.[7][8]

The political leanings of the people in charge of True North in its various manifestations don’t seem to have influenced Dzsurdzsa’s November 28, 2023 article unduly. However, I’m a little surprised by the stated size of the industry subsidies made by the Liberal government. I found an $8.5-million investment (isn’t that similar to a subsidy?) for one project alone in a June 29, 2022 article by Nicole Kerwin for Pet Food Processing, Note: A link has been removed,

Agriculture and Agri-Food Canada revealed June 27 [2022] an $8.5 million investment to Aspire, an insect agricultural company, to build a new production facility in Canada. The facility will process cricket-based protein, helping to advance the use of insect proteins in human and pet food products.

According to Agriculture and Agri-Food Canada, food-grade processing of insects is relatively new in Canada, however insect-based proteins create an opportunity for the country’s agri-food industry to develop more sustainable products.

“The strength of Canadian agriculture has always been its openness to new ideas and new approaches,” said Peter Fragiskatos, parliamentary secretary to the Minister of National Revenue and member of Parliament for London North Center. “Aspire [Food Group] is helping to re-shape how we think about agriculture and opening the door to new product and market opportunities.”

Founded in 2013, Aspire strives to tackle worldwide food scarcity with a focus on edible insect production, therefore developing highly nutritious foods and lowering its environmental impact. Currently, the company has production facilities in London, Ontario, and Austin, Texas. In 2020, Aspire purchased 12 acres of land in Ontario to construct what it expects to be the largest automated, food-grade cricket processing facility in the world.

“Aspire is re-imagining what it means to sustainably produce food, and how smart technology can turn that vision into a reality,” said Francis Drouin, parliamentary secretary to the Minister of Agriculture and Agri-food Canada. “Aspire’s innovative facility will help further establish London’s reputation as a hub for cutting-edge technology, strongly contributing to Ontario and Canada’s position as an innovator in agriculture and agri-food.”

Apsire [sic] plans to use the investment, as well as smart technology, to build its first commercial insect production facility in Ontario. The facility will boost Aspire’s insect farming capabilities, providing it with the ability to grow and monitor billions of crickets, which will be used to create nutrient-rich protein ingredients for use in the human and pet food industries.

Getting back to the Bake Off, there’s a Canadian Broadcasting Corporation (CBC) video (runtime: 3 mins. 34 secs.),

UBC Bug Bake Off serves up insect dishes

Students at the University of British Columbia have whipped up some protein-rich dishes made with a special ingredient: bugs. Our Science and Climate Specialist Darius Mahdavi tried the insect-laden dishes and brought some for our Dan Burritt as well.

Sadly, you will have to endure a couple of commercials before getting to the ‘main course’.

Scientists at Indian Institute of Science (IISc) created hybrid nanoparticles made of gold and copper sulfide that can kill cancer cells

It’s been a while since there was a theranostic (diagnosis and therapy combined in one treatment) story here.

Caption: Schematic indicating photo-theranostic potential of TSP-CA Credit: Madhavi Tripathi

A September 11, 2023 news item on phys.org announces the research, Note: A link has been removed,

Scientists at the Indian Institute of Science (IISc) have developed a new approach to potentially detect and kill cancer cells, especially those that form a solid tumor mass. They have created hybrid nanoparticles made of gold and copper sulfide that can kill cancer cells using heat and enable their detection using sound waves, according to a study published in ACS Applied Nano Materials.

A September 11, 2023 Indian Institute of Science (IISC) press release (also on EurekAlert), which originated the news item, provides more detail about the research,

Early detection and treatment are key in the battle against cancer. Copper sulphide nanoparticles have previously received attention for their application in cancer diagnosis, while gold nanoparticles, which can be chemically modified to target cancer cells, have shown anticancer effects. In the current study, the IISc team decided to combine these two into hybrid nanoparticles.  

“These particles have photothermal, oxidative stress, and photoacoustic properties,” says Jaya Prakash, Assistant Professor at the Department of Instrumentation and Applied Physics (IAP), IISc, and one of the corresponding authors of the paper. PhD students Madhavi Tripathi and Swathi Padmanabhan are co-first authors.

When light is shined on these hybrid nanoparticles, they absorb the light and generate heat, which can kill cancer cells. These nanoparticles also produce singlet oxygen atoms that are toxic for the cells. “We want both these mechanisms to kill the cancer cell,” Jaya Prakash explains.  

The researchers say that the nanoparticles can also help diagnose certain cancers. Existing methods such as standalone CT and MRI scans require trained radiology professionals to decipher the images. The photoacoustic property of the nanoparticles allows them to absorb light and generate ultrasound waves, which can be used to detect cancer cells with high contrast once the particles reach them. The ultrasound waves generated by the particles allow for more accurate image resolution, since sound waves scatter less than light as they pass through tissue. Scans created from the generated ultrasound waves can also provide better clarity and can be used to measure the oxygen saturation in the tumour, boosting their detection.

“You can integrate this with existing systems of detection or treatment,” says Ashok M Raichur, Professor at the Department of Materials Engineering, and another corresponding author. For example, the nanoparticles can be triggered to produce heat by shining a light on them using an endoscope that is typically used for cancer screening. 

Previously developed nanoparticles have limited applications because of their large size. The IISc team used a novel reduction method to deposit tiny seeds of gold onto the copper sulphide surface. The resulting hybrid nanoparticles – less than 8 nm in size – can potentially travel inside tissues easily and reach tumours. The researchers believe that the nanoparticles’ small size would also allow them to leave the human body naturally without accumulating, although extensive studies have to be carried out to determine if they are safe to use inside the human body.  

In the current study, the researchers have tested their nanoparticles on lung cancer and cervical cancer cell lines in the lab. They now plan to take the results forward for clinical development.  

Here’s a link to and a citation for the paper,

Seed-Mediated Galvanic Synthesis of CuS–Au Nanohybrids for Photo-Theranostic Applications by Madhavi Tripathi, Swathi Padmanabhan, Jaya Prakash, and Ashok M. Raichur. ACS Appl. Nano Mater. 2023, 6, 16, 14861–14875. DOI: https://doi.org/10.1021/acsanm.3c02405 Publication date: August 10, 2023

This paper is behind a paywall.