XoMotion, an exoskeleton developed in Canada, causes commotion

I first stumbled across these researchers in 2016 when their project was known as the “Wearable Lower Limb Anthropomorphic Exoskeleton (WLLAE).” In my January 20, 2016 posting, “#BCTECH: being at the Summit (Jan. 18-19, 2016),” about the #BC TECH Summit, an event put on by the province of British Columbia (BC, Canada) and the BC Innovation Council (BCIC), I described visiting a number of booths and talks and had this to say about WLLAE,

“The Wearable Lower Limb Anthropomorphic Exoskeleton (WLLAE) – a lightweight, battery-operated and ergonomic robotic system to help those with mobility issues improve their lives. The exoskeleton features joints and links that correspond to those of a human body and sync with motion. SFU has designed, manufactured and tested a proof-of-concept prototype and the current version can mimic all the motions of hip joints.” The researchers (Siamak Arzanpour and Edward Park) pointed out that the ability to mimic all the motions of the hip is a big difference between their system and others which only allow the leg to move forward or back. They rushed the last couple of months to get this system ready for the Summit. In fact, they received their patent for the system the night before (Jan. 17, 2016) the Summit opened.

Unfortunately, there aren’t any pictures of WLLAE yet and the proof-of-concept version may differ significantly from the final version. This system could be used to help people regain movement (paralysis/frail seniors) and I believe there’s a possibility it could be used to enhance human performance (soldiers/athletes). The researchers still have some significant hoops to jump through before getting to the human clinical trial stage. They need to refine their apparatus, ensure that it can be safely operated, and further develop the interface between human and machine. I believe WLLAE is considered a neuroprosthetic device. While it’s not a fake leg or arm, it enables movement (prosthetic) and it operates on brain waves (neuro). It’s a very exciting area of research; consequently, there’s a lot of international competition. [ETA January 3, 2024: I’m pretty sure I got the neuroprosthetic part wrong]

Time moved on and there was a name change and then there was this November 10, 2023 article by Jeremy Hainsworth for the Vancouver is Awesome website,

Vancouver-based fashion designer Chloe Angus thought she’d be in a wheelchair for the rest of her life after being diagnosed with an inoperable benign tumour in her spinal cord in 2015, resulting in permanent loss of mobility in her legs.

Now, however, she’s been using a state-of-the-art robotic exoskeleton known as XoMotion that can help physically disabled people self-balance, walk, sidestep, climb stairs and crouch.

“The first time I walked with the exoskeleton was a jaw-dropping experience,” said Angus. “After all these years, the exoskeleton let me stand up and walk on my own without falling. I felt like myself again.”

She added the exoskeleton has the potential to completely change the world for people with motion disabilities.

XoMotion is the result of a decade of research and the product of a Simon Fraser University spinoff company, Human in Motion Robotics (HMR) Inc. It’s the brainchild of professors Siamak Arzanpour and Edward Park.

Arzanpour and Park, both researchers in the Burnaby-based university’s School of Mechatronic Systems Engineering, began work on the device in 2014. They had a vision to enhance exoskeleton technology and empower individuals with mobility challenges to have more options for movement.

“We felt that there was an immediate need to help people with motion disabilities to walk again, with a full range of motion. At the time, exoskeletons could only walk forward. That was the only motion possible,” Arzanpour said.

A November 15, 2023 article (with an embedded video) by Amy Judd & Alissa Thibault for Global News (television) highlights Angus’s story,

SFU professors Siamak Arzanpour and Edward Park wanted to help people with motion disabilities to walk freely, naturally and independently.

The exoskeleton [XoMotion] is now the most advanced of its kind in the world.

Chloe Angus, who lost her mobility in her legs in 2015, now works for the team.

She said the exoskeleton makes her feel like herself again.

She was diagnosed with an inoperable benign tumor in her spinal cord in 2015 which resulted in a sudden and permanent loss of mobility in her legs. At the time, doctors told Angus that she would need a wheelchair to move for the rest of her life.

Now she is part of the project and defying all odds.

“After all these years, the exoskeleton let me stand up and walk on my own without falling. I felt like myself again.”

There’s a bit more information in the November 8, 2023 Simon Fraser University (SFU) news release (which has the same embedded video as the Global News article) by Ray Sharma,

The state-of-the-art robotic exoskeleton known as XoMotion is the result of a decade of research and the product of an SFU spin off company, Human in Motion Robotics (HMR) Inc. The company has recently garnered millions in investment, an overseas partnership and a suite of new offices in Vancouver.

XoMotion allows individuals with mobility challenges to stand up and walk on their own, without additional support. When in use, XoMotion maintains its stability and simultaneously encompasses all the ranges of motion and degrees of freedom needed for users to self-balance, walk, sidestep, climb stairs, crouch, and more. 

Sensors within the lower-limb exoskeleton mimic the human body’s sense of logic to identify structures along the path and, in turn, generate a fully balanced motion.

SFU professors Siamak Arzanpour and Edward Park, both researchers in the School of Mechatronic Systems Engineering, began work on the device in 2014 with a vision to enhance exoskeleton technology and empower individuals with mobility challenges to have more options for movement. 

“We felt that there was an immediate need to help people with motion disabilities to walk again, with a full range of motion. At the time, exoskeletons could only walk forward. That was the only motion possible,” says Arzanpour. 

The SFU professors, who first met in 2001 as graduate students at the University of Toronto, co-founded HMR in 2016, bringing together a group of students, end-users, therapists, and organizations to build upon the exoskeleton. Currently, 70 per cent of HMR employees are SFU graduates. 

In recent years, HMR has garnered multiple streams of investment, including a contract with Innovative Solutions Canada, and $10 million in funding during their Series A round in May, including an $8 million investment and strategic partnership from Beno TNR, a prominent Korean technology investment firm.

I decided to bring the embedded video here; it runs a little over two minutes,

You can find the Human in Motion Robotics (HMR) website here.

Machine decision-making (artificial intelligence) in British Columbia’s government (Canada)

Jeremy Hainsworth’s September 19, 2023 article on the Vancouver is Awesome website was like a dash of cold water. I had no idea that plans for using AI (artificial intelligence) in municipal administration were so far advanced (although I did cover this AI development, “Predictive policing in Vancouver—the first jurisdiction in Canada to employ a machine learning system for property theft reduction” in a November 23, 2017 posting). From Hainsworth’s September 19, 2023 article, Note: A link has been removed,

Human discretion and the ability to follow decision-making must remain top of mind when employing artificial intelligence (AI) to provide public services, Union of BC Municipalities conference delegates heard Sept. 19 [2023].

And, delegates heard from Office of the Ombudsperson of B.C. representatives, decisions made by machines must be fair and transparent.

“This is the way of the future — using AI systems for delivering municipal services,” said Zoë Macmillan, office manager of investigations, health and local services.

The risk in getting it wrong on fairness and privacy issues, said Wendy Byrne, office consultation and training officer, is a loss of trust in government.

It’s an issue the office has addressed itself, due to the impacts automated decision-making could have on British Columbians, in terms of the fairness they receive around public services. The issue has been covered in a June 2021 report, Getting Ahead of the Curve [emphasis mine]. The work was done jointly with B.C.’s Office of the Information and Privacy Commissioner.

And, said office representatives, there also needs to be AI decision-making trails that can be audited when it comes to transparency in decision-making and for people appealing decisions made by machines.

She [Zoë Macmillan] said many B.C. communities are on the verge of implementing AI for providing citizens with services. In Vancouver and Kelowna, AI is already being used [emphasis mine] in some permitting systems.

The public, meanwhile, needs to be aware when an automated decision-making system is assisting them with an issue, she [Wendy Byrne] noted.

It’s not clear from Hainsworth’s article excerpts seen here, but the report, “Getting Ahead of the Curve,” was a joint Yukon and British Columbia (BC) effort. Here’s a link to the report (PDF) and an excerpt, Note: I’d call this an executive summary,

Message from the Officers

With the proliferation of instantaneous and personalized services increasingly being delivered to people in many areas in the private sector, the public is increasingly expecting the same approach when receiving government services. Artificial intelligence (AI) is touted as an effective, efficient and cost-saving solution to these growing expectations. However, ethical and legal concerns are being raised as governments in Canada and abroad are experimenting with AI technologies in decision-making under inadequate regulation and, at times, in a less than transparent manner.

As public service oversight officials upholding the privacy and fairness rights of citizens, it is our responsibility to be closely acquainted with emerging issues that threaten those rights. There is no timelier an issue that intersects with our respective mandates as privacy commissioners and ombudsman, than the increasing use of artificial intelligence by the governments and public bodies we oversee.

The digital era has brought swift and significant change to the delivery of public services. The benefits of providing the public with increasingly convenient and timely service has spurred a range of computer-based platforms, from digital assistants to automated systems of approval for a range of services – building permits, inmate releases, social assistance applications, and car insurance premiums [emphasis mine] to name a few. While this kind of machine-based service delivery was once narrowly applied in the public sector, the use of artificial intelligence by the public sector is gaining a stronger foothold in countries around the world, including here in Canada. As public bodies become larger and more complex, the perceived benefits of efficiency, accessibility and accuracy of algorithms to make decisions once made by humans, can be initially challenging to refute.

Fairness and privacy issues resulting from the use of AI are well documented, with many commercial facial recognition systems and assessment tools demonstrating bias and augmenting the ability to use personal information in ways that infringe privacy interests. Similar privacy and fairness issues are raised by the use of AI in government. People often have no choice but to interact with government and the decisions of government can have serious, long-lasting impacts on our lives. A failure to consider how AI technologies create tension with the fairness and privacy obligations of democratic institutions poses risks for the public and undermines trust in government.

In examining examples of how these algorithms have been used in practice, this report demonstrates that there are serious legal and ethical concerns for public sector administrators. Key privacy concerns relate to the lack of transparency of closed proprietary systems that prove challenging to review, test and monitor. Current privacy laws do not contemplate the use of AI and as such lack obligations for key imperatives around the collection and use of personal information in machine-based systems. From a fairness perspective, the use of AI in the public sector challenges key pillars of administrative fairness. For example, how algorithmic decisions are made, explained, reviewed or appealed, and how bias is prevented all present challenging questions.

As the application of AI in public administration continues to gain momentum, the intent of this report is to provide both important context regarding the challenges AI presents in public sector decision-making, as well as practical recommendations that aim to set consistent parameters for transparency, accountability, legality and procedural fairness for AI’s use by public bodies. The critically important values of privacy protection and administrative fairness cannot be left behind as the field of AI continues to evolve and these principles must be more expressly articulated in legislation, policy and applicable procedural applications moving forward.

This joint report urges governments to respect and fulfill fairness and privacy principles in their adoption of AI technologies. It builds on extensive literature on public sector AI by providing concrete, technology-sensitive, implementable guidance on building fairness and privacy into public sector AI. The report also recommends capacity-building, co-operation and public engagement initiatives government should undertake to promote the public’s trust and buy-in of AI.

This report pinpoints the persistent challenges with AI that merit attention from a fairness and privacy perspective; identifies where existing regulatory measures and instruments for administrative fairness and privacy protection in the age of AI fall short and where they need to be enhanced; and sets out detailed, implementable guidance on incorporating administrative fairness and privacy principles across the various stages of the AI lifecycle, from inception and design, to testing, implementation and mainstreaming.

The final chapter contains our recommendations for the development of a framework to facilitate the responsible use of AI systems by governments. Our recommendations include:

– The need for public authorities to make a public commitment to guiding principles for the use of AI that incorporate transparency, accountability, legality, procedural fairness and the protection of privacy. These principles should apply to all existing and new programs or activities, be included in any tendering documents by public authorities for third-party contracts or AI systems delivered by service providers, and be used to assess legacy projects so they are brought into compliance within a reasonable timeframe.

– The need for public authorities to notify an individual when an AI system is used to make a decision about them and describe to the individual in a way that is understandable how that system operates.

– Government promote capacity building, co-operation, and public engagement on AI. This should be carried out through public education initiatives, building subject-matter knowledge and expertise on AI across government ministries, developing capacity to support knowledge sharing and expertise between government and AI developers and vendors, and establishing or growing the capacity to develop open-source, high-quality data sets for training and testing Automated Decision Systems (ADS).

– Requiring all public authorities to complete and submit an Artificial Intelligence Fairness and Privacy Impact Assessment (AIFPIA) for all existing and future AI programs for review by the relevant oversight body.

– Special rules or restrictions for the use of highly sensitive information by AI.

… [pp. 1-3]

These are the contributors to the report: Alexander Agnello: Policy Analyst, B.C. Office of the Ombudsperson; Ethan Plato: Policy Analyst, B.C. Office of the Information and Privacy Commissioner; and Sebastian Paauwe: Investigator and Compliance Review Officer, Office of the Yukon Ombudsman and Information and Privacy Commissioner.

A bit startling to see how pervasive “… automated systems of approval for a range of services – building permits, inmate releases, social assistance applications, and car insurance premiums …” already are. Not sure I’d call this 60 pp. report “Getting Ahead of the Curve” (PDF). It seems more like it was catching up, even in 2021.

Finally, there’s my October 27, 2023 post about the 2023 Canadian Science Policy Conference highlighting a few of the sessions. Scroll down to the second session, “901 – The new challenges of information in parliaments“, where you’ll find this,

… This panel proposes an overview … including a discussion on emerging issues impacting them, such as the integration of artificial intelligence and the risks of digital interference in democratic processes.

Interesting, eh?