Just when I thought I was almost caught up, I found this. The study I will be highlighting is from August 2023 but there are interesting developments all the way into October 2023 and beyond. First, the latest in AI (artificial intelligence) devices from an October 5, 2023 article by Lucas Arender for the Daily Hive, which describes the devices as AI wearables (you could also call them wearable technology), Note: Links have been removed,
- Rewind.ai launched Pendant, a necklace that records your conversations and transfers them to your smartphone, creating an audio database (of sorts) for your life.
- Meta unveiled a pair of Ray-Ban smart glasses that include an AI chatbot that users can communicate with (which might make you look like you’re talking to yourself).
- Sam Altman-backed startup Humane teased its new AI pin at Paris Fashion Week — a screenless lapel device that projects a smartphone-like interface onto users’ hands.
- Microsoft filed a patent for an AI backpack that features GPS, voice command, and cameras that could… help us walk in the right direction?
The second item in the list ‘Ray-Ban Meta Smart Glasses’ is further described in an October 17, 2023 article by Sarah Bartnicka for the Daily Hive, Note: A link has been removed,
It’s a glorious day for tech dads everywhere: Meta and Ray-Ban smart glasses are officially for sale in Canada.
Driving the news: Meta has become the latest billion-dollar company to officially enter the smart glasses market with the second iteration [emphasis mine] of its design with Ray-Bans, now including a built-in Meta AI assistant, hands-free live streaming features, and a personal audio system.
…
This time around, the technology is better, and both Meta and Snap are pitching their smart glasses as a tool for creators to stay connected with their audiences rather than just a sleek piece of hardware that can blend your digital and physical realities [augmented or extended reality?].
…
Yes, but: As smart glasses creep back into the limelight, people are wary about wearing cameras on their faces. Concerns about always-on cameras and microphones that allow users to record their surroundings without the consent of others will likely stick around. [emphasis mine]
So, are these AI or smart or augmented reality (AR) glasses? In my October 22, 2021 post, I explored a number of realities in the context of the metaverse. Yes, it gets confusing. At any rate, I found these definitions,
Happily, I have found a good summarized description of VR/AR/MR/XR in a March 20, 2018 essay by North of 41 on medium.com,
“Summary: VR is immersing people into a completely virtual environment; AR is creating an overlay of virtual content, but can’t interact with the environment; MR is a mixed of virtual reality and the reality, it creates virtual objects that can interact with the actual environment. XR brings all three Reality (AR, VR, MR) together under one term.”
If you have the interest and approximately five spare minutes, read the entire March 20, 2018 essay, which has embedded images illustrating the various realities.
…
This may change over time but for now, in answer to the question, “AI or smart or augmented reality (AR) glasses?” you can say any one or all three.
Research from August 2023 and power imbalance
This August 28, 2023 Cornell University news release (also on EurekAlert) by Patricia Waldron uses the terms ‘smart’ and ‘augmented reality’ interchangeably, Note: Links have been removed,
Someone wearing augmented reality (AR) or “smart” glasses could be Googling your face, turning you into a cat or recording your conversation – and that creates a major power imbalance, said Cornell researchers.
Currently, most work on AR glasses focuses primarily on the experience of the wearer. Researchers from the Cornell Ann S. Bowers College of Computing and Information Science and Brown University teamed up to explore how this technology affects interactions between the wearer and another person. Their explorations showed that, while the device generally made the wearer less anxious, things weren’t so rosy on the other side of the glasses.
Jenny Fu, a doctoral student in the field of information science, presented the findings in a new study, “Negotiating Dyadic Interactions through the Lens of Augmented Reality Glasses,” at the 2023 ACM Designing Interactive Systems Conference in July.
AR glasses superimpose virtual objects and text over the field of view to create a mixed-reality world for the user. Some designs are big and bulky, but as AR technology advances, smart glasses are becoming indistinguishable from regular glasses, raising concerns that a wearer could be secretly recording someone or even generating deepfakes with their likeness.
For the new study, Fu and co-author Malte Jung, associate professor of information science and the Nancy H. ’62 and Philip M. ’62 Young Sesquicentennial Faculty Fellow, worked with Ji Won Chung, a doctoral student, and Jeff Huang, associate professor of computer science, both at Brown, and Zachary Deocadiz-Smith, an independent extended reality designer.
They observed five pairs of individuals – a wearer and a non-wearer – as each pair discussed a desert survival activity. The wearer received Spectacles, an AR glasses prototype on loan from Snap Inc., the company behind Snapchat. The Spectacles look like avant-garde sunglasses and, for the study, came equipped with a video camera and five custom filters that transformed the non-wearer into a deer, cat, bear, clown or pig-bunny.
Following the activity, the pairs engaged in a participatory design session where they discussed how AR glasses could be improved, both for the wearer and the non-wearer. The participants were also interviewed and asked to reflect on their experiences.
According to the wearers, the fun filters reduced their anxiety and put them at ease during the exercise. The non-wearers, however, reported feeling disempowered because they didn’t know what was happening on the other side of the lenses. They were also upset that the filters robbed them of control over their own appearance. The possibility that the wearer could be secretly recording them without consent – especially when they didn’t know what they looked like – also put the non-wearers at a disadvantage.
The non-wearers weren’t completely powerless, however. A few demanded to know what the wearer was seeing, and moved their faces or bodies to evade the filters – giving them some control in negotiating their presence in the invisible mixed-reality world. “I think that’s the biggest takeaway I have from this study: I’m more powerful than I thought I was,” Fu said.
Another issue is that, like many AR glasses, Spectacles have darkened lenses so the wearer can see the projected virtual images. This lack of transparency also degraded the quality of the social interaction, the researchers reported.
“There is no direct eye contact, which makes people very confused, because they don’t know where the person is looking,” Fu said. “That makes their experiences of this conversation less pleasant, because the glasses blocked out all these nonverbal interactions.”
To create more positive experiences for people on both sides of the lenses, the study participants proposed that smart glasses designers add a projection display and a recording indicator light, so people nearby will know what the wearer is seeing and recording.
Fu also suggests designers test out their glasses in a social environment and hold a participatory design process like the one in their study. Additionally, they should consider these video interactions as a data source, she said.
That way, non-wearers can have a voice in the creation of the impending mixed-reality world.
Rina Diane Caballar’s September 25, 2023 article for IEEE (Institute of Electrical and Electronics Engineers) Spectrum magazine provides a few more insights about the research, Note: Links have been removed,
…
“This AR filter interaction is likely to happen in the future with the commercial emergence of AR glasses,” says Jenny Fu, a doctoral student at Cornell University’s Bowers College of Computing and Information Science and one of the two lead authors of the study. “How will that look like, and what are the social and emotional consequences of interacting and communicating through AR glasses?”
…
“When we think about design in HCI [human-computer interface], there is often a tendency to focus on the primary user and design just for them,” Jung says. “Because these technologies are so deeply embedded in social interactions and are used with others and around others, we often forget these ‘onlookers’ and we’re not designing with them in mind.”
…
Moreover, involving nonusers is especially key in developing more equitable tech products and creating more inclusive experiences. “That’s one of the points why previous AR iterations may not have worked—they designed it for the individual and not for the people surrounding them,” says Chung. She adds that a mindset shift is needed to actively make tech that doesn’t exclude people, which could lead to social systems that promote engagement and foster a sense of belonging for everyone.
…
Caballar’s September 25, 2023 article also appears in the January 2024 print version of the IEEE Spectrum with the title “AR Glasses Upset the Social Dynamic.”
Here’s a link to and a citation for the paper,
Negotiating Dyadic Interactions through the Lens of Augmented Reality Glasses by Ji Won Chung, Xiyu Jenny Fu, Zachary Deocadiz-Smith, Malte F. Jung, Jeff Huang. DIS ’23: Proceedings of the 2023 ACM Designing Interactive Systems Conference, July 2023, Pages 493–508 DOI: https://doi.org/10.1145/3563657.3595967
This paper is behind a paywall. For the curious, ACM stands for Association for Computing Machinery.