Monthly Archives: June 2021

2021 version of graphene-enhanced sports shoes/sneakers/running shoes/runners/trainers

My June 21, 2018 posting was the last time these graphene-enhanced sports shoes/sneakers/running shoes/runners/trainers were mentioned here (it was also the first time). The latest version features new graphene-enhanced shoe soles that last twice as long as the industry standard, according to a March 30, 2021 article by Robert Lea for Azonano (Note: A link has been removed),

Thanks to researchers at the University of Manchester and UK-based sportswear manufacturer Inov-8, graphene can now be found at the tips of your toes as well as your fingers.

In 2017 Inov-8 brought to the market the first running shoe that utilizes graphene in its grips, and 4 years later the manufacturer is still innovating, offering a wide range of products that rely on the wonder material. 

Now, as well as finding its way into the grips of the company’s running shoes, graphene is also found in the soles of its latest long-distance running shoe.

Using graphene as part of the cushioning insole in trail running shoes has led to a shoe that lasts twice as long as leading competitors’ footwear, the company says.

When Inov-8 began their quest to use graphene to improve running shoes, the initial goal was to employ the material to create improved rubber grips that would not wear down as quickly as other running shoes and retain grip for longer during this slower wearing process.

The company teamed with the University of Manchester to make this goal a reality, …

The graphene-enhanced grip proved such a hit with consumers that in the four years since its introduction, shoes featuring the outer-sole now account for 50% of overall sales.

Building upon the success of Inov-8’s graphene-gripped running shoe, the company has expanded its use of the material to a midsole foam. The graphene-enhanced foam replaces the standard EVA foams and carbon plates traditionally used in this form of long-distance running shoe.

A March 24, 2021 University of Manchester press release describes the latest use of graphene in Inov-8’s shoes,

Sports footwear firm inov-8 has unveiled the world’s first running shoe to use a graphene-enhanced foam in the sole, bucking the widespread trend for carbon-plate technology and doubling the industry standard for longevity.

Developed in collaboration with graphene experts at The University of Manchester, the cushioned foam, called G-FLY™, features as part of inov-8’s new trail shoe, the TRAILFLY ULTRA G 300 MAX™, designed for ultramarathon and long-distance runners.

Tests have shown the foam delivers 25% greater energy return than standard EVA foams and is far more resistant to compressive wear. It therefore maintains optimum levels of underfoot bounce and comfort for much longer.

This helps runners maintain a faster speed over greater distances, keeps their feet feeling fresher for longer, and prolongs the life of their footwear.

Michael Price, COO of Lake District-based inov-8, said: …

“We’ve worked incredibly hard for the past two years with the university and leading footwear industry veteran Doug Sheridan in developing this innovation. A team of 40 athletes from across the world tested prototype shoes and more than 50 mixes of graphene-enhanced foam. Trail test reports show G-FLY foam still performing well after 1,200km – double the industry standard.”

Dr Aravind Vijayaraghavan, Reader in Nanomaterials at the University, home to both the National Graphene Institute and Graphene Engineering Innovation Centre, said: “As well as on the trail, we also tested extensively in the laboratory, including subjecting the foam to aggressive ageing tests that mimic extensive use. Despite being significantly aged, the G-FLY foam still delivered more energy return than some unaged foams.

The company inov-8 can be found here.

Nanosensors use AI to explore the biomolecular world

EPFL scientists have developed AI-powered nanosensors that let researchers track various kinds of biological molecules without disturbing them. Courtesy: École polytechnique fédérale de Lausanne (EPFL)

If you look at the big orange dot (representing the nanosensors?), you’ll see those purplish/fuchsia objects resemble musical notes (biological molecules?). I think that brainlike object to the left and in light blue is the artificial intelligence (AI) component. (If anyone wants to correct my guesses or identify the bits I can’t, please feel free to add to the Comments for this blog.)

Getting back to my topic, keep the ‘musical notes’ in mind as you read about some of the latest research from l’École polytechnique fédérale de Lausanne (EPFL) in an April 7, 2021 news item on Nanowerk,

The tiny world of biomolecules is rich in fascinating interactions between a plethora of different agents such as intricate nanomachines (proteins), shape-shifting vessels (lipid complexes), chains of vital information (DNA) and energy fuel (carbohydrates). Yet the ways in which biomolecules meet and interact to define the symphony of life is exceedingly complex.

Scientists at the Bionanophotonic Systems Laboratory in EPFL’s School of Engineering have now developed a new biosensor that can be used to observe all major biomolecule classes of the nanoworld without disturbing them. Their innovative technique uses nanotechnology, metasurfaces, infrared light and artificial intelligence.

To each molecule its own melody

In this nano-sized symphony, perfect orchestration makes physiological wonders such as vision and taste possible, while slight dissonances can amplify into horrendous cacophonies leading to pathologies such as cancer and neurodegeneration.

An April 7, 2021 EPFL press release, which originated the news item, provides more detail,

“Tuning into this tiny world and being able to differentiate between proteins, lipids, nucleic acids and carbohydrates without disturbing their interactions is of fundamental importance for understanding life processes and disease mechanisms,” says Hatice Altug, the head of the Bionanophotonic Systems Laboratory. 

Light, and more specifically infrared light, is at the core of the biosensor developed by Altug’s team. Humans cannot see infrared light, which is beyond the visible light spectrum that ranges from blue to red. However, we can feel it in the form of heat in our bodies, as our molecules vibrate under the infrared light excitation.

Molecules consist of atoms bonded to each other and – depending on the mass of the atoms and the arrangement and stiffness of their bonds – vibrate at specific frequencies. This is similar to the strings on a musical instrument that vibrate at specific frequencies depending on their length. These resonant frequencies are molecule-specific, and they mostly occur in the infrared frequency range of the electromagnetic spectrum. 

“If you imagine audio frequencies instead of infrared frequencies, it’s as if each molecule has its own characteristic melody,” says Aurélian John-Herpin, a doctoral assistant at Altug’s lab and the first author of the publication. “However, tuning into these melodies is very challenging because without amplification, they are mere whispers in a sea of sounds. To make matters worse, their melodies can present very similar motifs making it hard to tell them apart.” 
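As an aside from me (not part of the EPFL material), the resonant-frequency picture above can be checked with a textbook harmonic-oscillator estimate: for a diatomic molecule, the vibration frequency is ν = (1/2π)√(k/μ), where k is the bond stiffness and μ the reduced mass. A quick sketch for carbon monoxide,

```python
import math

# Back-of-envelope estimate (my own illustration, not from the study):
# treat the C-O bond as a spring and compute its vibration frequency.
AMU = 1.66054e-27       # kg per atomic mass unit
C_CM_PER_S = 2.998e10   # speed of light in cm/s

k = 1857.0                                  # N/m, literature force constant for CO
mu = (12.0 * 16.0) / (12.0 + 16.0) * AMU    # reduced mass of carbon-oxygen

freq_hz = math.sqrt(k / mu) / (2 * math.pi)  # nu = (1/2pi) * sqrt(k/mu)
wavenumber_cm1 = freq_hz / C_CM_PER_S        # spectroscopists' preferred units

print(f"CO stretch: {freq_hz:.2e} Hz, about {wavenumber_cm1:.0f} cm^-1")
```

That lands at roughly 2,140 cm⁻¹, squarely in the mid-infrared, which is why infrared spectroscopy can ‘hear’ these molecular melodies at all.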

Metasurfaces and artificial intelligence

The scientists solved these two issues using metasurfaces and AI. Metasurfaces are man-made materials with outstanding light manipulation capabilities at the nano scale, thereby enabling functions beyond what is otherwise seen in nature. Here, their precisely engineered meta-atoms made out of gold nanorods act like amplifiers of light-matter interactions by tapping into the plasmonic excitations resulting from the collective oscillations of free electrons in metals. “In our analogy, these enhanced interactions make the whispered molecule melodies more audible,” says John-Herpin.

AI is a powerful tool that can be fed with more data than humans can handle in the same amount of time and that can quickly develop the ability to recognize complex patterns from the data. John-Herpin explains, “AI can be imagined as a complete beginner musician who listens to the different amplified melodies and develops a perfect ear after just a few minutes and can tell the melodies apart, even when they are played together – like in an orchestra featuring many instruments simultaneously.” 
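John-Herpin’s beginner-musician analogy can be sketched in a few lines of code (entirely my own toy illustration; the band positions below only loosely echo textbook infrared assignments, and the team’s actual deep-learning pipeline is far more sophisticated): build reference spectra for each biomolecule class, then match a noisy measurement to its nearest ‘melody’,

```python
import math
import random

def spectrum(peaks, n=200, lo=1000.0, hi=1800.0):
    """Absorbance on a wavenumber grid (cm^-1): a sum of Gaussian bands."""
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    return [sum(a * math.exp(-((x - c) / w) ** 2) for c, a, w in peaks) for x in xs]

# Hypothetical band lists (center, amplitude, width) for the four classes
REFERENCE = {
    "protein":      [(1650, 1.0, 25), (1550, 0.7, 25)],  # amide I / amide II
    "lipid":        [(1740, 0.9, 20), (1465, 0.5, 20)],  # ester C=O, CH2 bend
    "nucleic acid": [(1240, 0.8, 25), (1085, 0.9, 25)],  # phosphate stretches
    "carbohydrate": [(1030, 1.0, 30), (1150, 0.6, 25)],  # C-O stretches
}
TEMPLATES = {name: spectrum(p) for name, p in REFERENCE.items()}

def classify(measured):
    """Nearest-template match: whose 'melody' does this noisy spectrum resemble?"""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(TEMPLATES, key=lambda name: dist(measured, TEMPLATES[name]))

random.seed(0)
trials = correct = 0
for name, peaks in REFERENCE.items():
    for _ in range(25):
        noisy = [v + random.gauss(0, 0.05) for v in spectrum(peaks)]
        correct += classify(noisy) == name
        trials += 1
print(f"classification accuracy: {correct}/{trials}")
```

A real measurement is, of course, a mixture of overlapping signals with drift and noise, which is exactly why the EPFL team reached for deep learning rather than a nearest-neighbour lookup.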

The first biosensor of its kind

When the scientists’ infrared metasurfaces are augmented with AI, the new sensor can be used to analyze biological assays featuring multiple analytes simultaneously from the major biomolecule classes and resolving their dynamic interactions. 

“We looked in particular at lipid vesicle-based nanoparticles and monitored their breakage through the insertion of a toxin peptide and the subsequent release of vesicle cargos of nucleotides and carbohydrates, as well as the formation of supported lipid bilayer patches on the metasurface,” says Altug.

This pioneering AI-powered, metasurface-based biosensor will open up exciting perspectives for studying and unraveling inherently complex biological processes, such as intercellular communication via exosomes and the interaction of nucleic acids and carbohydrates with proteins in gene regulation and neurodegeneration.

“We imagine that our technology will have applications in the fields of biology, bioanalytics and pharmacology – from fundamental research and disease diagnostics to drug development,” says Altug. 

Here’s a link to and a citation for the paper,

Infrared Metasurface Augmented by Deep Learning for Monitoring Dynamics between All Major Classes of Biomolecules by Aurelian John‐Herpin, Deepthy Kavungal, Lea von Mücke, and Hatice Altug. Advanced Materials, Volume 33, Issue 14, April 8, 2021, 2006054. DOI: https://doi.org/10.1002/adma.202006054 First published: 22 February 2021

This paper is open access.

Superstar engineers and fantastic fiction writers podcast series

The ‘Inventive Podcast’ features the superstar engineers and fantastic fiction writers of the headline. The University of Salford (UK) launched the series on Wednesday, June 23, 2021, International Women in Engineering Day. Here’s more about the series from a June 21, 2021 University of Salford press release (Note: I liked the title so much I ‘borrowed’ it),

Superstar engineers and fantastic fiction writers collaborate on the brand-new Inventive Podcast

The University of Salford has announced the launch of the brand-new Inventive Podcast featuring the incredible stories of engineers whose innovative work is transforming the world we live in.

Professor Trevor Cox, Inventive Host and an Acoustical Engineer from the University of Salford, said: “Engineering is so central to our lives, and yet as a subject it’s strangely hidden in plain sight. I came up with the idea of Inventive to explore new ways of telling the story of engineering by mixing fact and fiction.”  He went on to comment, “Given the vast number of podcasts out there, it’s surprising how few shows focus on engineering (beyond tech).”

The project is funded by the Engineering and Physical Sciences [Research] Council (EPSRC) and brings together two Schools at the University: Science, Engineering and Environment, and Arts, Media and Creative Technology.  The series will debut on Wednesday 23 June [2021], International Women in Engineering Day, with a further 6 new episodes dropping across the summer.

Over the course of the eleven-episode series, Professor Cox meets incredible Inventive engineers. In the first episode he interviews electronics engineer Shrouk el Attar, a refugee and campaigner for LGBT rights, recently awarded the Women’s Engineering Society (WES) Prize at the Institution of Engineering and Technology Young Woman Engineer of the Year Awards 2021 for her work in femtech, smart tech that improves the lives of cis women and trans men. Later episodes feature structural engineer Roma Agrawal, who designed the foundation and spire of London’s The Shard, and chemical engineer Askwar Hilonga, who didn’t have access to clean water growing up in his village in Tanzania but has gone on to win the Africa Prize for Engineering Innovation for his water purification nano filter.

This podcast is not just for engineers and techies! Engineering is typically represented in the media by historical narratives or a ‘boy’s toys’ approach – biggest, longest, tallest. We know that has limited appeal, so we set ourselves a challenge to reach a wider audience. Engineering needs to tell better stories with people at the centre. So, we’ve interwoven factual interviews with stories commissioned from fantastic writers: C M Taylor’s piece The Night Builder is inspired by structural engineer Roma Agrawal and includes a Banksy-like figure who works with concrete. Science fiction writer Emma Newman’s Healing the Fractured is inspired by engineer Greg Bowie, who makes trauma plates to treat broken bones, and is set in a dystopian future, reminiscent of The Handmaid’s Tale, with the engineer as an unexpected hero.

For more information and to sign-up for the latest episodes go to: www.inventivepodcast.com

I listened to Trevor Cox’s interview for the first and, so far, only Inventive episode, with engineer Shrouk El-Attar, which includes award-winning writer and poet Tania Hershman performing her piece ‘Human Being As Circuit Board, Human Being as Dictionary’, combining fiction, poetry and non-fiction based on El-Attar’s story. (Check out Shrouk El-Attar’s eponymous website here.)

I recognized one of the upcoming interview subjects, Askwar Hilonga, as his work with water filters in Tanzania has been featured here twice, notably in this June 16, 2015 posting.

Finally, Tania Hershman (Twitter: @taniahershman) has an eponymous website here. (Note: In September 2021 she will be leading a 4-week online Science-Flavoured Writing course for the London Lit Lab. A science background isn’t necessary and, if you’re short on cash, there are some options.)

Future of Being Human: a call for proposals

The Canadian Institute for Advanced Research (CIFAR) is investigating the ‘Future of Being Human’ and has instituted a global call for proposals, but there is one catch: your team has to have one person (with or without citizenship) who’s living and working in Canada. (Note: I am available.)

Here’s more about the call (from the CIFAR Global Call for Ideas: The Future of Being Human webpage),

New program proposals should explore the long term intersection of humans, science and technology, social and cultural systems, and our environment. Our understanding of the world around us, and new insights into individual and societal behaviour, have the potential to provide enormous benefits to humanity and the planet. 

We invite bold proposals from researchers at universities or research institutions that ask new questions about our complex emerging world. We are confronting challenging problems that require a diverse team incorporating multiple disciplines (potentially spanning the humanities, social sciences, arts, physical sciences, and life sciences [emphasis mine]) to engage in a sustained dialogue to develop new insights, and change the conversation on important questions facing science and humanity.

CIFAR is committed to creating a more diverse, equitable, and inclusive environment. We welcome proposals that include individuals from countries and institutions that are not yet represented in our research community.

Here’s a description, albeit, a little repetitive, of what CIFAR is asking researchers to do (from the Program Guide [PDF]),

For CIFAR’s next Global Call for Ideas, we are soliciting proposals related to The Future of Being Human, exploring in the long term the intersection of humans, science and technology, social and cultural systems, and our environment. Our understanding of the natural world around us, and new insights into individual and societal behaviour, have the potential to provide enormous benefits to humanity and the planet. We invite bold proposals that ask new questions about our complex emerging world, where the issues under study are entangled and dynamic. We are confronting challenging problems that necessitate a diverse team incorporating multiple disciplines (potentially spanning the humanities, social sciences, arts, physical sciences, and life sciences) to engage in a sustained dialogue to develop new insights, and change the conversation on important questions facing science and humanity. [p. 2 print; p. 4 PDF]

There is an upcoming information webinar (from the CIFAR Global Call for Ideas: The Future of Being Human webpage),

Monday, June 28, 2021 – 1:00pm – 1:45pm EDT

Webinar Sign-Up

Also from the CIFAR Global Call for Ideas: The Future of Being Human webpage, here are the various deadlines and additional sources of information,

August 17, 2021

Registration deadline

January 26, 2022

LOI [Letter of Intent] deadline

Spring 2022

LOIs invited to Full Proposal

Fall 2022

Full proposals due

March 2023

New program announcement and celebration

Resources

Program Guide [PDF]

Frequently Asked Questions

Good luck!

Mechano-photonic artificial synapse is bio-inspired

The word ‘memristor’ usually pops up when there’s research into artificial synapses, but not in this new piece of research. I didn’t see any mention of the memristor in the paper’s references either, but I did find James Gimzewski from the University of California at Los Angeles (UCLA), whose research into brainlike computing (neuromorphic computing) runs parallel to, but separate from, the memristor research.

Dr. Thamarasee Jeewandara has written a March 25, 2021 article for phys.org about the latest neuromorphic computing research (Note: Links have been removed)

Multifunctional and diverse artificial neural systems can incorporate multimodal plasticity, memory and supervised learning functions to assist neuromorphic computation. In a new report, Jinran Yu and a research team in nanoenergy, nanoscience and materials science in China and the US presented a bioinspired mechano-photonic artificial synapse with synergistic mechanical and optical plasticity. The team used an optoelectronic transistor made of a graphene/molybdenum disulphide (MoS2) heterostructure and an integrated triboelectric nanogenerator to compose the artificial synapse. They controlled the charge transfer/exchange in the heterostructure with triboelectric potential and readily modulated the optoelectronic synapse behaviors, including postsynaptic photocurrents, photosensitivity and photoconductivity. The mechano-photonic artificial synapse is a promising implementation to mimic the complex biological nervous system and promote the development of interactive artificial intelligence. The work is now published in Science Advances.

The human brain can integrate cognition, learning and memory tasks via auditory, visual, olfactory and somatosensory interactions. This process is difficult to mimic using conventional von Neumann architectures that require additional sophisticated functions. Brain-inspired neural networks are made of various synaptic devices that transmit and process information using the synaptic weight. Emerging photonic synapses combine optical and electric neuromorphic modulation and computation to offer a favorable option with high bandwidth, fast speed and low cross-talk to significantly reduce power consumption. Biomechanical motions including touch, eye blinking and arm waving are other ubiquitous triggers or interactive signals for operating electronics during artificial synapse plasticization. In this work, Yu et al. presented a mechano-photonic artificial synapse with synergistic mechanical and optical plasticity. The device contained an optoelectronic transistor and an integrated triboelectric nanogenerator (TENG) in contact-separation mode. The mechano-optical artificial synapses have huge functional potential as interactive optoelectronic interfaces, synthetic retinas and intelligent robots. [emphasis mine]
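To give a feel for the plasticity being described, here is a caricature in code (my own sketch, not the paper’s device physics): each light pulse adds a postsynaptic current response that decays over time, and a mechanical (triboelectric) gain term scales how strongly each pulse counts,

```python
import math

TAU_MS = 50.0  # assumed decay constant of the postsynaptic response, in ms

def postsynaptic_current(spike_times_ms, t_ms, mech_gain=1.0, amp=1.0):
    """Sum of exponentially decaying responses to past light pulses.
    mech_gain stands in for the triboelectric modulation of the synapse."""
    return sum(
        mech_gain * amp * math.exp(-(t_ms - ts) / TAU_MS)
        for ts in spike_times_ms if ts <= t_ms
    )

spikes = [0, 20, 40, 60, 80]  # a train of optical pulses, in ms
low = postsynaptic_current(spikes, 100, mech_gain=0.5)   # weak mechanical input
high = postsynaptic_current(spikes, 100, mech_gain=1.5)  # strong mechanical input
print(f"PSC at t=100 ms: {low:.3f} (low gain) vs {high:.3f} (high gain)")
```

The point of the toy model is only that the same optical spike train can be ‘weighted’ up or down by a mechanical signal, which is the synergy the authors exploit.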

As you can see Jeewandara has written quite a technical summary of the work. Here’s an image from the Science Advances paper,

Fig. 1 Biological tactile/visual neurons and mechano-photonic artificial synapse. (A) Schematic illustrations of biological tactile/visual sensory system. (B) Schematic diagram of the mechano-photonic artificial synapse based on graphene/MoS2 (Gr/MoS2) heterostructure. (i) Top-view scanning electron microscope (SEM) image of the optoelectronic transistor; scale bar, 5 μm. The cyan area indicates the MoS2 flake, while the white strip is graphene. (ii) Illustration of charge transfer/exchange for Gr/MoS2 heterostructure. (iii) Output mechano-photonic signals from the artificial synapse for image recognition.

You can find the paper here,

Bioinspired mechano-photonic artificial synapse based on graphene/MoS2 heterostructure by Jinran Yu, Xixi Yang, Guoyun Gao, Yao Xiong, Yifei Wang, Jing Han, Youhui Chen, Huai Zhang, Qijun Sun and Zhong Lin Wang. Science Advances 17 Mar 2021: Vol. 7, no. 12, eabd9117 DOI: 10.1126/sciadv.abd9117

This appears to be open access.

Gold nanoparticle tattoo changes medical diagnostics?

The tattoos are in fact implantable sensors. Here’s more from an April 6, 2021 news item on ScienceDaily,

The idea of implantable sensors that continuously transmit information on vital values and concentrations of substances or drugs in the body has fascinated physicians and scientists for a long time. Such sensors enable the constant monitoring of disease progression and therapeutic success. However, until now implantable sensors have not been suitable to remain in the body permanently but had to be replaced after a few days or weeks. On the one hand, there is the problem of implant rejection because the body recognizes the sensor as a foreign object. On the other hand, the sensor’s color which indicates concentration changes has been unstable so far and faded over time. Scientists at Johannes Gutenberg University Mainz (JGU) have developed a novel type of implantable sensor which can be operated in the body for several months. The sensor is based on color-stable gold nanoparticles that are modified with receptors for specific molecules. Embedded into an artificial polymeric tissue, the nanogold is implanted under the skin where it reports changes in drug concentrations by changing its color.

An April 6, 2021 Johannes Gutenberg Universitaet Mainz press release (also on EurekAlert), which originated the news item, provides more detail about the proposed tattoo/implantable sensors,

Implant reports information as an “invisible tattoo”

Professor Carsten Sönnichsen’s research group at JGU has been using gold nanoparticles as sensors to detect tiny amounts of proteins in microscopic flow cells for many years. Gold nanoparticles act as small antennas for light: They strongly absorb and scatter it and, therefore, appear colorful. They react to alterations in their surrounding by changing color. Sönnichsen’s team has exploited this concept for implanted medical sensing.

To prevent the tiny particles from swimming away or being degraded by immune cells, they are embedded in a porous hydrogel with a tissue-like consistency. Once implanted under the skin, small blood vessels and cells grow into the pores. The sensor is integrated in the tissue and is not rejected as a foreign body. “Our sensor is like an invisible tattoo, not much bigger than a penny and thinner than one millimeter,” said Professor Carsten Sönnichsen, head of the Nanobiotechnology Group at JGU. Since the gold nanoparticles absorb and scatter light in the infrared range, they are not visible to the eye. However, a special kind of measurement device can detect their color noninvasively through the skin.

In their study published in Nano Letters, the JGU researchers implanted their gold nanoparticle sensors under the skin of hairless rats. Color changes in these sensors were monitored following the administration of various doses of an antibiotic. The drug molecules are transported to the sensor via the bloodstream. By binding to specific receptors on the surface of the gold nanoparticles, they induce a color change that is dependent on drug concentration. Thanks to the color-stable gold nanoparticles and the tissue-integrating hydrogel, the sensor was found to remain mechanically and optically stable over several months.
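The concentration-to-color relationship described above is commonly modeled with a Langmuir binding isotherm. Here is a small sketch (my own illustration with hypothetical numbers; the paper’s actual receptor affinities and peak shifts are not reproduced here): receptor occupancy sets the plasmon peak shift, and inverting the calibration curve recovers the drug concentration,

```python
# Hypothetical calibration parameters, for illustration only
DLAM_MAX_NM = 12.0  # maximum plasmon peak shift at receptor saturation (nm)
KD_UM = 5.0         # receptor dissociation constant (micromolar)

def peak_shift(conc_um):
    """Langmuir isotherm: occupancy theta = c/(Kd + c) sets the peak shift."""
    theta = conc_um / (KD_UM + conc_um)
    return DLAM_MAX_NM * theta

def concentration(shift_nm):
    """Invert the calibration curve to read concentration from the shift."""
    return KD_UM * shift_nm / (DLAM_MAX_NM - shift_nm)

for c in (1.0, 5.0, 20.0):
    s = peak_shift(c)
    print(f"{c:5.1f} uM -> shift {s:5.2f} nm -> recovered {concentration(s):5.1f} uM")
```

Note the curve saturates: near Kd the readout is most sensitive, while at high concentrations large changes produce only small additional shifts, a standard trade-off when choosing receptors for a target drug range.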

Huge potential of gold nanoparticles as long-lasting implantable medical sensors

“We are used to colored objects bleaching over time. Gold nanoparticles, however, do not bleach but keep their color permanently. As they can be easily coated with various different receptors, they are an ideal platform for implantable sensors,” explained Dr. Katharina Kaefer, first author of the study.

The novel concept is generalizable and has the potential to extend the lifetime of implantable sensors. In future, gold nanoparticle-based implantable sensors could be used to observe concentrations of different biomarkers or drugs in the body simultaneously. Such sensors could find application in drug development, medical research, or personalized medicine, such as the management of chronic diseases.

Interdisciplinary team work brought success

Sönnichsen had the idea of using gold nanoparticles as implanted sensors as early as 2004, when he started his research in biophysical chemistry as a junior professor in Mainz. However, the project was not realized until ten years later, in cooperation with Dr. Thies Schroeder and Dr. Katharina Kaefer, both scientists at JGU. Schroeder was experienced in biological research and laboratory animal science and had already completed several years of research work in the USA. Kaefer was looking for an exciting topic for her doctorate and was particularly interested in the complex and interdisciplinary nature of the project. Initial results led to a stipend awarded to Kaefer by the Max Planck Graduate Center (MPGC) as well as financial support from the Stiftung Rheinland-Pfalz für Innovation. “Such a project requires many people with different scientific backgrounds. Step by step we were able to convince more and more people of our idea,” said Sönnichsen happily. Ultimately, it was interdisciplinary teamwork that resulted in the successful development of the first functional implanted sensor with gold nanoparticles.

The researchers have provided an image which illustrates several elements described in the press release,

Caption: Gold nanoparticles embedded in a porous hydrogel can be implanted under the skin and used as medical sensors. The sensor is like an invisible tattoo revealing concentration changes of substances in the blood by color change. Credit: ill./©: Nanobiotechnology Group, JGU Department of Chemistry

Here’s a link to and a citation for the paper,

Implantable Sensors Based on Gold Nanoparticles for Continuous Long-Term Concentration Monitoring in the Body by Katharina Kaefer, Katja Krüger, Felix Schlapp, Hüseyin Uzun, Sirin Celiksoy, Bastian Flietel, Axel Heimann, Thies Schroeder, Oliver Kempski, and Carsten Sönnichsen. Nano Lett. 2021, XXXX, XXX, XXX-XXX DOI: https://doi.org/10.1021/acs.nanolett.1c00887 Publication Date:March 30, 2021 © 2021 The Authors. Published by American Chemical Society

This paper is behind a paywall.

The Internet of Bodies and Ghislaine Boddington

I stumbled across this event on my Twitter feed (h/t @katepullinger; Note: Kate Pullinger is a novelist and Professor of Creative Writing and Digital Media, Director of the Centre for Cultural and Creative Industries [CCCI] at Bath Spa University in the UK).

Anyone who visits here with any frequency will have noticed I have a number of articles on technology and the body (you can find them in the ‘human enhancement’ category and/or search for the machine/flesh tag). Boddington’s view is more expansive than the one I’ve taken, and I welcome it. First, here’s the event information and, then, a link to her open access paper from February 2021.

From the CCCI’s Annual Public Lecture with Ghislaine Boddington eventbrite page,

This year’s CCCI Public Lecture will be given by Ghislaine Boddington. Ghislaine is Creative Director of body>data>space and Reader in Digital Immersion at University of Greenwich. Ghislaine has worked at the intersection of the body, the digital, and spatial research for many years. This will be her first in-person appearance since the start of the pandemic, and she will share with us the many insights she has gathered during this extraordinary pivot to online interfaces much of the world has been forced to undertake.

With a background in performing arts and body technologies, Ghislaine is recognised as a pioneer in the exploration of digital intimacy, telepresence and virtual physical blending since the early 90s. As a curator, keynote speaker and radio presenter she has shared her outlook on the future human into the cultural, academic, creative industries and corporate sectors worldwide, examining topical issues with regards to personal data usage, connected bodies and collective embodiment. Her research led practice, examining the evolution of the body as the interface, is presented under the heading ‘The Internet of Bodies’. Recent direction and curation outputs include “me and my shadow” (Royal National Theatre 2012), FutureFest 2015-18 and Collective Reality (Nesta’s FutureFest / SAT Montreal 2016/17). In 2017 Ghislaine was awarded the international IX Immersion Experience Visionary Pioneer Award. She recently co-founded University of Greenwich Strategic Research Group ‘CLEI – Co-creating Liveness in Embodied Immersion’ and is an Associate Editor for AI & Society (Springer). Ghislaine is a long term advocate for diversity and inclusion, working as a Trustee for Stemette Futures and Spokesperson for Deutsche Bank ‘We in Social Tech’ initiative. She is a team member and presenter with BBC World Service flagship radio show/podcast Digital Planet.

Date and time

Thu, 24 June 2021
08:00 – 09:00 [am] PDT

@GBoddington

@bodydataspace

@ConnectedBodies

Boddington’s paper is what ignited my interest; here’s a link to and a citation for it,

The Internet of Bodies—alive, connected and collective: the virtual physical future of our bodies and our senses by Ghislaine Boddington. AI Soc. 2021 Feb 8: 1–17. DOI: 10.1007/s00146-020-01137-1 PMCID: PMC7868903 PMID: 33584018

Some excerpts from this open access paper,

The Weave—virtual physical presence design—blending processes for the future

Coming from a performing arts background, dance led, in 1989, I became obsessed with the idea that there must be a way for us to be able to create and collaborate in our groups, across time and space, whenever we were not able to be together physically. The focus of my work, as a director, curator and presenter across the last 30 years, has been on our physical bodies and our data selves and how they have, through the extended use of our bodies into digitally created environments, started to merge and converge, shifting our relationship and understanding of our identity and our selfhood.

One of the key methodologies that I have been using since the mid-1990s is inter-authored group creation, a process we called The Weave (Boddington 2013a, b). It uses the simple and universal metaphor of braiding, plaiting or weaving three strands of action and intent, these three strands being:

1. The live body—whether that of the performer, the participant, or the public;

2. The technologies of today—our tools of virtually physical reflection;

3. The content—the theme in exploration.

As with a braid or a plait, the three strands must be weaved simultaneously. What is key to this weave is that in any co-creation between the body and technology, the technology cannot work without the body; hence, there will always be virtual/physical blending. [emphasis mine]

Cyborgs

Cyborg culture is also moving forward apace, with most countries having four or five cyborgs who have reached media status. Manel Munoz is, as such, the weather man: fascinated and affected by cyclones and anticyclones, his implant at the back of his head sends vibrations to different sides of his head linked to weather changes around him.

Neil Harbisson from Northern Ireland calls himself a trans-species rather than a cyborg, because his implant is permanently fused into the crown of his head. He is the first trans-species/cyborg to have his passport photo accepted as he exists with his fixed antenna. Neil has, from birth, an eye condition called greyscale, which means he only sees the world in grey and white. He uses his antenna camera to detect colour, and it sends a vibration with a different frequency for each colour viewed. He learns what colours are within his viewpoint at any given time through the vibrations in his head, a synaesthetic transference of one sense into another. Moon Ribas, a Spanish choreographer and dancer, had two implants placed into the top of her feet, set to sense seismic activity as it occurs worldwide. When a small earthquake occurs somewhere, she receives small vibrations; a bigger one gives her body a more intense vibration. She dances as she receives and reacts to these transferred data. She feels a need to be closer to our earth, a part of nature (Harbisson et al. 2018).

Medical, non medical and sub-dermal implants

Medical implants, embedded into the body or subdermally (nearer the surface), have rapidly advanced in the last 30 years with extensive use of cardiac pacemakers, hip implants, implantable drug pumps and cochlear implants helping partially deaf people to hear.

Deep body and subdermal implants can be personalised to your own needs. They can be set to transmit chosen aspects of your body data outwards, but they also can receive and control data in return. There are about 200 medical implants in use today. Some are complex, like deep brain stimulation for motor neurone disease, and others we are more familiar with, for example, pacemakers. Most medical implants are not digitally linked to the outside world at present, but this is in rapid evolution.

Kevin Warwick, a pioneer in this area, has interconnected himself and his partner with implants for joint use of their personal and home computer systems through their BrainGate (Warwick 2008) implant, an interface between the nervous system and the technology. They are connected bodies. He works onwards with his experiments to feel the shape of distant objects and heat through fingertip implants.

‘Smart’ implants into the brain for deep brain stimulation are in use and advancing rapidly. The ethics of these developments was under constant debate in 2020 and will remain so, as is proved by the mass coverage of Neuralink, Elon Musk’s innovation, which connects to the brain via wires, with the initial aim of curing human diseases such as dementia, depression and insomnia, and onward plans for potential treatment of paraplegia (Musk 2016).

Given how many times I’ve featured art/sci (also known as art/science and/or sciart) and cyborgs and medical implants here, my excitement was a given.

For anyone who wants to pursue Boddington’s work further, her eponymous website is here, the body>data>space is here, and her University of Greenwich profile page is here.

For anyone interested in the Centre for Creative and Cultural Industries (CCCI), their site is here.

Finally, here’s one of my earliest pieces about cyborgs titled ‘My mother is a cyborg‘ from April 20, 2012 and my September 17, 2020 posting titled, ‘Turning brain-controlled wireless electronic prostheses into reality plus some ethical points‘. If you scroll down to the ‘Brain-computer interfaces, symbiosis, and ethical issues’ subhead, you’ll find some article excerpts about a fascinating qualitative study on implants and ethics.

A new generation of xenobots made with frog cells

I meant to feature this work last year when it was first announced so I’m delighted a second chance has come around so soon after. From a March 31, 2021 news item on ScienceDaily,

Last year, a team of biologists and computer scientists from Tufts University and the University of Vermont (UVM) created novel, tiny self-healing biological machines from frog cells called “Xenobots” that could move around, push a payload, and even exhibit collective behavior in the presence of a swarm of other Xenobots.

Get ready for Xenobots 2.0.

Here’s a video of the Xenobot 2.0. It’s amazing but, for anyone who has problems with animal experimentation, this may be disturbing,


The next version of Xenobots has been created – they’re faster, live longer, and can now record information. (Source: Doug Blackiston & Emma Lederer)

A March 31, 2021 Tufts University news release by Mike Silver (also on EurekAlert and adapted and published as Scientists Create the Next Generation of Living Robots on the University of Vermont website as a UVM Today story) provides more detail,

The same team has now created life forms that self-assemble a body from single cells, do not require muscle cells to move, and even demonstrate the capability of recordable memory. The new generation Xenobots also move faster, navigate different environments, and have longer lifespans than the first edition, and they still have the ability to work together in groups and heal themselves if damaged. The results of the new research were published today [March 31, 2021] in Science Robotics.

Compared to Xenobots 1.0, in which the millimeter-sized automatons were constructed in a “top down” approach by manual placement of tissue and surgical shaping of frog skin and cardiac cells to produce motion, the next version of Xenobots takes a “bottom up” approach. The biologists at Tufts took stem cells from embryos of the African frog Xenopus laevis (hence the name “Xenobots”) and allowed them to self-assemble and grow into spheroids, where some of the cells after a few days differentiated to produce cilia – tiny hair-like projections that move back and forth or rotate in a specific way. Instead of using manually sculpted cardiac cells whose natural rhythmic contractions allowed the original Xenobots to scuttle around, cilia give the new spheroidal bots “legs” to move them rapidly across a surface. In a frog, or human for that matter, cilia would normally be found on mucous surfaces, like in the lungs, to help push out pathogens and other foreign material. On the Xenobots, they are repurposed to provide rapid locomotion. 

“We are witnessing the remarkable plasticity of cellular collectives, which build a rudimentary new ‘body’ that is quite distinct from their default – in this case, a frog – despite having a completely normal genome,” said Michael Levin, Distinguished Professor of Biology and director of the Allen Discovery Center at Tufts University, and corresponding author of the study. “In a frog embryo, cells cooperate to create a tadpole. Here, removed from that context, we see that cells can re-purpose their genetically encoded hardware, like cilia, for new functions such as locomotion. It is amazing that cells can spontaneously take on new roles and create new body plans and behaviors without long periods of evolutionary selection for those features.”

“In a way, the Xenobots are constructed much like a traditional robot.  Only we use cells and tissues rather than artificial components to build the shape and create predictable behavior.” said senior scientist Doug Blackiston, who co-first authored the study with research technician Emma Lederer. “On the biology end, this approach is helping us understand how cells communicate as they interact with one another during development, and how we might better control those interactions.”

While the Tufts scientists created the physical organisms, scientists at UVM were busy running computer simulations that modeled different shapes of the Xenobots to see if they might exhibit different behaviors, both individually and in groups. Using the Deep Green supercomputer cluster at UVM’s Vermont Advanced Computing Core, the team, led by computer scientists and robotics experts Josh Bongard and Sam Kriegman, simulated the Xenobots under hundreds of thousands of random environmental conditions using an evolutionary algorithm. These simulations were used to identify Xenobots most able to work together in swarms to gather large piles of debris in a field of particles.

“We know the task, but it’s not at all obvious — for people — what a successful design should look like. That’s where the supercomputer comes in and searches over the space of all possible Xenobot swarms to find the swarm that does the job best,” says Bongard. “We want Xenobots to do useful work. Right now we’re giving them simple tasks, but ultimately we’re aiming for a new kind of living tool that could, for example, clean up microplastics in the ocean or contaminants in soil.” 
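The search Bongard describes is, at its heart, an evolutionary algorithm: score candidate designs, keep the best, mutate them, and repeat. The real work ran a physics simulator on a supercomputer; purely as an illustration, a toy version of that loop (with a stand-in fitness function rather than the actual debris-gathering score) might look like this:

```python
import random

def evolve(fitness, genome_len=10, pop_size=20, generations=50, seed=42):
    """Toy evolutionary search: mutate random bit-string 'designs' and
    keep the fittest, loosely mimicking how candidate Xenobot shapes
    were scored in simulation. Not the researchers' actual code."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)   # rank by simulated score
        survivors = pop[: pop_size // 2]      # keep the top half
        children = []
        for parent in survivors:
            child = parent[:]
            child[rng.randrange(genome_len)] ^= 1  # one point mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

# Stand-in fitness: in the real work this was a physics simulation
# scoring swarm behavior; here it simply counts 1-bits.
best = evolve(fitness=sum)
print(best)
```

The elitist selection (survivors carry over unchanged) guarantees the best score never decreases between generations, which is why even this crude sketch converges quickly.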

It turns out, the new Xenobots are much faster and better at tasks such as garbage collection than last year’s model, working together in a swarm to sweep through a petri dish and gather larger piles of iron oxide particles. They can also cover large flat surfaces, or travel through narrow capillary tubes.

These studies also suggest that the in silico [computer] simulations could in the future optimize additional features of biological bots for more complex behaviors. One important feature added in the Xenobot upgrade is the ability to record information.

Now with memory

A central feature of robotics is the ability to record memory and use that information to modify the robot’s actions and behavior. With that in mind, the Tufts scientists engineered the Xenobots with a read/write capability to record one bit of information, using a fluorescent reporter protein called EosFP, which normally glows green. However, when exposed to light at 390nm wavelength, the protein emits red light instead. 

The cells of the frog embryos were injected with messenger RNA coding for the EosFP protein before stem cells were excised to create the Xenobots. The mature Xenobots now have a built-in fluorescent switch which can record exposure to blue light around 390nm.

The researchers tested the memory function by allowing 10 Xenobots to swim around a surface on which one spot is illuminated with a beam of 390nm light. After two hours, they found that three bots emitted red light. The rest remained their original green, effectively recording the “travel experience” of the bots.

This proof of principle of molecular memory could be extended in the future to detect and record not only light but also the presence of radioactive contamination, chemical pollutants, drugs, or a disease condition. Further engineering of the memory function could enable the recording of multiple stimuli (more bits of information) or allow the bots to release compounds or change behavior upon sensation of stimuli. 
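In software terms, the EosFP reporter behaves like a write-once flag: each bot starts green and switches permanently to red after a light exposure. As a sketch only (the class, method names and wavelength tolerance are invented for illustration, not taken from the paper), the logic of the experiment can be modelled in a few lines:

```python
class Xenobot:
    """One-bit 'memory' analogous to the EosFP reporter: starts green
    and switches permanently to red after a ~390nm light exposure."""
    def __init__(self):
        self.color = "green"

    def expose(self, wavelength_nm):
        if abs(wavelength_nm - 390) < 10:  # write pulse in range
            self.color = "red"             # irreversible switch

bots = [Xenobot() for _ in range(10)]
# Suppose three of the ten bots happen to swim through the lit spot,
# as in the experiment described above.
for bot in bots[:3]:
    bot.expose(390)

print(sum(b.color == "red" for b in bots))  # 3 bots recorded the exposure
```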

“When we bring in more capabilities to the bots, we can use the computer simulations to design them with more complex behaviors and the ability to carry out more elaborate tasks,” said Bongard. “We could potentially design them not only to report conditions in their environment but also to modify and repair conditions in their environment.”

Xenobot, heal thyself

“The biological materials we are using have many features we would like to someday implement in the bots – cells can act like sensors, motors for movement, communication and computation networks, and recording devices to store information,” said Levin. “One thing the Xenobots and future versions of biological bots can do that their metal and plastic counterparts have difficulty doing is constructing their own body plan as the cells grow and mature, and then repairing and restoring themselves if they become damaged. Healing is a natural feature of living organisms, and it is preserved in Xenobot biology.” 

The new Xenobots were remarkably adept at healing and would close the majority of a severe full-length laceration half their thickness within 5 minutes of the injury. All injured bots were able to ultimately heal the wound, restore their shape and continue their work as before. 

Another advantage of a biological robot, Levin adds, is metabolism. Unlike metal and plastic robots, the cells in a biological robot can absorb and break down chemicals and work like tiny factories synthesizing and excreting chemicals and proteins. The whole field of synthetic biology – which has largely focused on reprogramming single celled organisms to produce useful molecules – can now be exploited in these multicellular creatures.

Like the original Xenobots, the upgraded bots can survive up to ten days on their embryonic energy stores and run their tasks without additional energy sources, but they can also carry on at full speed for many months if kept in a “soup” of nutrients. 

What the scientists are really after

An engaging description of the biological bots and what we can learn from them is presented in a TED talk by Michael Levin, in which he describes not only the remarkable potential for tiny biological robots to carry out useful tasks in the environment or potentially in therapeutic applications, but also points out what may be the most valuable benefit of this research – using the bots to understand how individual cells come together, communicate, and specialize to create a larger organism, as they do in nature to create a frog or human. It’s a new model system that can provide a foundation for regenerative medicine.

Xenobots and their successors may also provide insight into how multicellular organisms arose from ancient single celled organisms, and the origins of information processing, decision making and cognition in biological organisms. 

Recognizing the tremendous future for this technology, Tufts University and the University of Vermont have established the Institute for Computer Designed Organisms (ICDO), to be formally launched in the coming months, which will pull together resources from each university and outside sources to create living robots with increasingly sophisticated capabilities.

The ultimate goal for the Tufts and UVM researchers is not only to explore the full scope of biological robots they can make; it is also to understand the relationship between the ‘hardware’ of the genome and the ‘software’ of cellular communications that go into creating organized tissues, organs and limbs. Then we can gain greater control of that morphogenesis for regenerative medicine, and the treatment of cancer and diseases of aging.

Here’s a link to and a citation for the paper,

A cellular platform for the development of synthetic living machines by Douglas Blackiston, Emma Lederer, Sam Kriegman, Simon Garnier, Joshua Bongard, and Michael Levin. Science Robotics 31 Mar 2021: Vol. 6, Issue 52, eabf1571 DOI: 10.1126/scirobotics.abf1571

This paper is behind a paywall.

The Canada Council for the Arts, a digital strategy research report on blockchains and culture, and Vancouver (Canada)

Is the May 17, 2021 “Blockchains & Cultural Padlocks (BACP) Digital Strategy Research Report” discussing a hoped-for future transformative experience? Given the report’s subtitle: “Towards a Digitally Cooperative Culture: Recommoning Land, Data and Objects,” and the various essays included in the 200 pp. document, I say the answer is ‘yes’.

The report was launched by 221A, a Vancouver (Canada)-based arts and culture organization, and funded by the Canada Council for the Arts through their Digital Strategy Fund. Here’s more from the BACP report in the voice of its research leader, Jesse McKee,

… The blockchain is the openly readable and unalterable ledger technology, which is most broadly known for supporting such applications as bitcoin and other cryptocurrencies. This report documents the first research phase in a three-phased approach to establishing our digital strategy [emphasis mine], as we [emphasis mine] learn from the blockchain development communities. This initiative’s approach is an institutional one, not one that is interpreting the technology for individuals, artists and designers alone. The central concept of the blockchain is that exchanges of value need not rely on centralized authentication from institutions such as banks, credit cards or the state, and that this exchange of value is better programmed and tracked with metadata to support the virtues, goals and values of a particular network. This concept relies on a shared, decentralized and trustless ledger. “Trustless” in the blockchain community is an evolution of the term trust, shifting its signification as a contract usually held between individuals, managed and upheld by a centralized social institution, and redistributing it amongst the actors in a blockchain network who uphold the platform’s technical operational codes and can access ledgers of exchange. All parties involved in the system are then able to reach a consensus on what the canonical truth is regarding the holding and exchange of value within the system.

… [from page 6 of the report]
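For readers new to the technology, the “openly readable and unalterable ledger” McKee describes rests on a simple mechanism: each block’s hash covers the previous block’s hash, so past entries cannot be quietly rewritten without breaking the chain. A minimal sketch (omitting the consensus, signatures and networking that real blockchains require):

```python
import hashlib
import json

def add_block(chain, data):
    """Append a block whose hash covers its data and the previous
    block's hash, chaining the entries together."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    chain.append({"data": data, "prev": prev_hash,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})
    return chain

def verify(chain):
    """Recompute every hash; any edit to a past block is detected."""
    for i, block in enumerate(chain):
        prev_hash = chain[i - 1]["hash"] if i else "0" * 64
        payload = json.dumps({"data": block["data"], "prev": prev_hash},
                             sort_keys=True)
        if (block["prev"] != prev_hash or
                block["hash"] != hashlib.sha256(payload.encode()).hexdigest()):
            return False
    return True

chain = []
add_block(chain, "Alice pays Bob 5")
add_block(chain, "Bob pays Carol 2")
print(verify(chain))                       # True: the ledger is intact
chain[0]["data"] = "Alice pays Bob 500"    # tamper with history
print(verify(chain))                       # False: the rewrite is exposed
```

The “trustless” quality comes from every participant being able to run the equivalent of `verify` themselves rather than relying on a central institution to vouch for the ledger.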

McKee manages to keep the report from floating away in a sea of utopian bliss with some cautionary notes. Still, as a writer I’m surprised he didn’t notice that ‘blockchain’, which (in English) is supposed to ‘unlock padlocks’, poses a linguistic conundrum if nothing else.

This looks like an interesting report, but it helps to be familiar with some ‘critical theory’ jargon. That said, the bulk of the report is relatively accessible reading, although some of the essays (at the end) from the artist-researchers are tough going.

One more thought: the report does present many exciting and transformative possibilities, and I would dearly love to see much of this come to pass. But I am more hesitant than McKee and his colleagues, and that hesitation is beautifully described in an essay (The Vampire Problem: Illustrating the Paradox of Transformative Experience) by Maria Popova, first published September 3, 2017 on Brain Pickings,

To be human is to suffer from a peculiar congenital blindness: On the precipice of any great change, we can see with terrifying clarity the familiar firm footing we stand to lose, but we fill the abyss of the unfamiliar before us with dread at the potential loss rather than jubilation over the potential gain of gladnesses and gratifications we fail to envision because we haven’t yet experienced them. …

Arts and blockchain events in Vancouver

The 221A launch event for the report kicked off a series of related events; here’s more from a 221A May 17, 2021 news release (Note: the first and second events have already taken place),

Events Series

Please join us for a live stream events series bringing together key contributors of the Blockchains & Cultural Padlocks Research Report alongside a host of leading figures across academic, urbanism, media and blockchain development communities.

Blockchains & Cultural Padlocks Digital Strategy Launch

May 25, 10 am PDT / 1 pm EDT / 7 pm CEST

With Jesse McKee, BACP Lead Investigator and 221A Head of Strategy; Rosemary Heather, BACP Editorial Director and Principal Researcher; moderated by Svitlana Matviyenko, Assistant Professor and Associate Director of Simon Fraser University’s Digital Democracies Institute.

The Valuation of Necessity: A Cosmological View of our Technologies and Culture

June 4, 10 am PDT / 1 pm EDT / 7pm CEST

With BACP researcher, artist and theorist Patricia Reed; critical geographer Maral Sotoudehnia, and Wassim Alsindi of 0x Salon, Berlin, who conducts research on the legal and ecological externalities of blockchain networks.

Recommoning Territory: Diversifying Housing Tenure Through Platform Cooperatives

June 18, 10 am PDT / 1 pm EDT / 7pm CEST

With 221A Fellows Maksym Rokmaniko and Francis Tseng (DOMA [a nonprofit organization developing a distributed housing platform]); Andy Yan (Simon Fraser University); and BACP researcher and critical geographer Maral Sotoudehnia.

Roundtable: Decentralized Autonomous Organizations (DAOs) & Social Tokens

Released June 25, Pre-recorded

Roundtable co-organized with Daniel Keller of newmodels.io, with participation from development teams and researchers from @albiverse, trust.support, Circles UBI, folia.app, SayDAO, and Blockchain@UBC.

Blockchains & Cultural Padlocks is supported by the Digital Strategy Fund of the Canada Council for the Arts.

For more, contact us hello@221a.ca

Coming up: Vancouver’s Voxel Bridge

The Vancouver Biennale folks first sent me information about Voxel Bridge in 2018 but this new material is the most substantive description yet, even without an opening date. From a June 6, 2021 article by Kevin Griffin for the Vancouver Sun (Note: Links have been removed),

The underside of the Cambie Bridge is about to be transformed into the unique digital world of Voxel Bridge. Part of the Vancouver Biennale, Voxel Bridge will exist both as a physical analogue art work and an online digital one.

The public art installation is by Jessica Angel. When it’s fully operational, Voxel Bridge will have several non-fungible tokens called NFTs that exist in an interactive 3-D world that uses blockchain technology. The intention is to create a fully immersive installation. Voxel Bridge is being described as the largest digital public art installation of its kind.

“To my knowledge, nothing has been done at this scale outdoors that’s fully interactive,” said Sammi Wei, the Vancouver Biennale‘s operations director. “Once the digital world is built in your phone, you’ll be able to walk around objects. When you touch one, it kind of vibrates.”

Just as a pixel refers to a point in a two-dimensional world, voxel refers to a similar unit in a 3-D world.

Voxel Bridge will be about itself: it will tell the story of what it means to use new decentralized technology called blockchain to create Voxel Bridge.

There are a few more Voxel Bridge details in a June 7, 2021 article by Vincent Plana for the Daily Hive,

Voxel Bridge draws parallels between blockchain technology and the structural integrity of the underpass itself. The installation will be created by using adhesive vinyl and augmented reality technology.

Griffin’s description in his June 6, 2021 article gives you a sense of what it will be like to become immersed in Voxel Bridge,

Starting Monday [June 14, 2021], a crew will begin installing a vinyl overlay directly on the architecture on the underside of the bridge deck, around the columns, and underfoot on the sidewalk from West 2nd to the parking-lot road. Enclosing a space of about 18,000 square feet, the vinyl layer will be visible without any digital enhancement. It will look like an off-kilter circuit board.

“It’ll be like you’re standing in the middle of a circuit board,” [emphasis mine] she said. “At the same time, the visual perception will be slightly off. It’s like an optical illusion. You feel the ground is not quite where it’s supposed to be.”

Griffin’s June 6, 2021 article offers good detail and a glossary.

So, Vancouver is offering more than one opportunity to learn about and/or experience blockchain.