
Smart paint that ‘talks’ to canes for better safety crossing the street

It would be nice if they had some video of people navigating with the help of this ‘smart’ paint. Perhaps one day. Meanwhile, Adele Peters in her March 7, 2018 article for Fast Company provides a vivid description of how a sight-impaired or blind person could navigate more safely and easily,

The crosswalk on a road in front of the Ohio State School for the Blind looks like one that might be found at any intersection. But the white stripes at the edges are made with “smart paint”–and if a student who is visually impaired crosses while using a cane with a new smart tip, the cane will vibrate when it touches the lines.

The paint uses rare-earth nanocrystals that can emit a unique light signature, which a sensor added to the tip of a cane can activate and then read. “If you pulse a laser or LED into these materials, they’ll pulse back at you at a very specific frequency,” says Josh Collins, chief technology officer at Intelligent Materials [sic], the company that manufactures the oxides that can be added to paint.
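Collins’s description boils down to a simple pulse-and-listen cycle: excite the paint, check whether the light coming back matches the expected signature, and buzz the cane if it does. Here is a minimal sketch of that loop in Python. To be clear, every name, frequency and timing below is my own invention for illustration; the emitter, detector and motor objects stand in for whatever hardware interfaces the real cane tip uses, and nothing here comes from Intelligent Material or SRI International.

import time

CROSSWALK_SIGNATURE_HZ = 1250.0   # invented value: frequency the paint is assumed to re-emit
TOLERANCE_HZ = 25.0               # invented matching tolerance

def cane_loop(emitter, detector, motor, sample_rate_hz=20):
    """Pulse the LED/laser, read the returned signal, vibrate on a match."""
    period = 1.0 / sample_rate_hz
    while True:
        emitter.pulse()                          # fire a short excitation pulse at the ground
        response_hz = detector.read_frequency()  # dominant frequency of the returned light, or None
        on_paint = (response_hz is not None
                    and abs(response_hz - CROSSWALK_SIGNATURE_HZ) <= TOLERANCE_HZ)
        if on_paint:
            motor.vibrate(duration_s=0.2)        # haptic feedback: the tip is over smart paint
        time.sleep(period)                       # duty-cycling keeps battery draw (and weight) down

The low sample rate and the sleep between pulses gesture at the battery-weight constraint Collins raises later in the interview below; a real device would presumably push all of this into low-power firmware rather than Python.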

While digging for more information, I unearthed this February 12, 2018 article by Ben Levine for Government Technology Magazine (Note: Links have been removed),

In this installment of the Innovation of the Month series (read last month’s story here), we explore the use of smart technologies to help blind and visually impaired people better navigate the world around them. A team at Ohio State University has been working on a “smart paint” application to do just that.

MetroLab’s Executive Director Ben Levine sat down with John Lannutti, professor of materials science and engineering at Ohio State University; Mary Ball-Swartwout, orientation and mobility specialist at the Ohio State School for the Blind; and Josh Collins, chief technology officer at Intelligent Material, to learn more.

John Lannutti (OSU): The goal of “smart paint for networked smart cities” is to assist people who are blind and visually impaired by implementing a “smart paint” technology that provides accurate location services. You might think, “Can’t GPS do that?” But, surprisingly, current GPS-based solutions actually cannot tell whether somebody is walking on the sidewalk or down the middle of the street. Meanwhile, modern urban intersections are becoming increasingly complex. That means that finding a crosswalk, aligning to cross and maintaining a consistent crossing direction while in motion can be challenging for people who are visually impaired.

And of course, crosswalks aren’t the only challenge. For example, our current mapping technologies are unable to provide the exact location of a building’s entrance. We have a technology solution to those challenges. Smart paint is created by adding exotic light-converting oxides to standard road paints. The paint is detected using a “smart cane,” a modified white cane that detects the smart paint and enables portal-to-portal guidance. The smart cane can also be used to notify vehicles — including autonomous vehicles — of a user’s presence in a crosswalk.

As part of this project, we have a whole team of educational, city and industrial partners, including:

Educational partners: 

  • Ohio State School for the Blind — testing and implementation of smart paint technology in Columbus involving both students and adults
  • Western Michigan University — implementation of smart paint technology with travelers who are blind and visually impaired to maximize orientation and mobility
  • Mississippi State University — assessment of the impacts of smart paint technology on mobility and employment for people who are blind and visually impaired

City partners:  

  • Columbus Smart Cities Initiative — rollout of smart paint within Columbus and the paint’s interaction with the Integrated Data Exchange (IDE), a cloud-based platform that dynamically collects user data to show technological impact
  • The city of Tampa, Fla. — rollout of smart paint at the Lighthouse for the Blind
  • The Hillsborough Area Regional Transit Authority, Hillsborough County, Fla. — integration of smart paint with existing bus lines to enable precise location determination
  • The American Council of the Blind — implementation of smart paint at the annual American Council of the Blind convention
  • MetroLab Network — smart paint implementation in city-university partnerships

Industrial collaborators:  

  • Intelligent Material — manufactures and supplies the unique light-converting oxides that make the paint “smart”
  • Crown Technology — paint manufacturing, product evaluation and technical support
  • SRI International — design and manufacturing of the “smart” white cane hardware

Levine: Can you describe what this project focused on and what motivated you to address this particular challenge?

Lannutti: We have been working with Intelligent Material on integrating light-converting oxides into polymeric matrices for specific applications for several years. Intelligent Material supplies these oxides for highly specialized applications across a variety of industries, and has deep experience in filtering and processing the resulting optical outputs. They were already looking at using this technology for automotive applications when the idea to develop applications for people who are blind was introduced. We were extremely fortunate to have the Ohio State School for the Blind (OSSB) right here in Columbus and even more fortunate to have interested collaborators there who have helped us every step of the way. They even have a room filled with previous white cane technologies; we used those to better understand what works and what doesn’t, helping refine our own product. At about this same time, the National Science Foundation released a call for Smart and Connected Communities proposals, which gave us both a goal and a “home” for this idea.

Levine: How will the tools developed in this project impact planning and the built environment?

Ball-Swartwout: One of the great things about smart paint is that it can be added to the built environment easily at little extra cost. We expect that once smart paint is widely adopted, most sighted users will not notice much difference as smart paint is not visually different from regular road paint. Some intersections might need to have more paint features that enable smart white cane-guided entry from the sidewalk into the crosswalk. Paint that tells users that they have reached their destination may become visible as horizontal stripes along modern sidewalks. These paints could be gray, black, or even invisible to sighted pedestrians, but would still be detectable by “smart” white canes to tell users that they have arrived at their destination.

Levine: Can you tell us about the new technologies that are associated with this project? Can you talk about the status quo versus your vision for the future?

Collins: Beyond incorporating the light-converting ceramics into the paint, placing a highly sensitive excitation source and detector package at the tip of a moving white cane is truly novel. Also challenging is powering this package using minimal battery weight to decrease the likelihood of wrist and upper neck fatigue.

The status quo is that the travel of citizens who are blind and visually impaired can be unpredictable. They need better technologies for routine travel and especially for travel to any new destinations. In addition, we anticipate that this technology could assist in the travel of people who have a variety of physical and cognitive impairments.

Our vision for the future of this technology is that it will be widespread and utilized constantly. Outside the U.S., Japan and Europe have integrated relatively expensive technologies into streets and sidewalks, and we see smart paint replacing that very quickly. Because the “pain” of installing smart paint is very small, we believe that grass-roots pressure will enable rapid introduction of this technology.

Levine: What was the most surprising thing you learned during this process?

Lannutti: In my mind, the most surprising thing was discovering that sound was not necessarily the best means of guiding users who are blind. This is a bias on the part of sighted individuals as we are used to beeping and buzzing noises that guide or inform us throughout our day. Pedestrians who are blind, on the other hand, need to constantly listen to aspects of their environment to successfully navigate it. For example, listening to traffic noise is extremely important to them as a means of avoiding danger. People who are blind or visually impaired cannot see but need to hear their environment. So we had to dial back our expectations regarding the utility of sound. Instead, we now focus on vibration along the white cane as a means of alerting the user.

For those interested, Levine’s article is well worth reading in its entirety.

Thankfully they’ve added some information to the website for Intelligent Material (Solutions) since I first viewed it.

There’s a bit more information on the Intelligent Material (Solutions’) YouTube video webpage,

Intelligent Material Solutions, Inc. is a privately held business headquartered in Princeton, NJ in the SRI/Sarnoff Campus, formerly RCA Labs. Our technology can be traced through scientific discoveries dating back over 50 years. We are dedicated to solving the world’s most challenging problems and in doing so have assembled an innovative, multidisciplinary team of leading scientists from industry and academia to ensure rapid transition from our labs to the world.

The video was published on December 6, 2017. You can find even more details at the company’s LinkedIn page.

Tracking artificial intelligence

Researchers at Stanford University have developed an index for measuring (tracking) the progress made by artificial intelligence (AI) according to a January 9, 2018 news item on phys.org (Note: Links have been removed),

Since the term “artificial intelligence” (AI) was first used in print in 1956, the one-time science fiction fantasy has progressed to the very real prospect of driverless cars, smartphones that recognize complex spoken commands and computers that see.

In an effort to track the progress of this emerging field, a Stanford-led group of leading AI thinkers called the AI100 has launched an index that will provide a comprehensive baseline on the state of artificial intelligence and measure technological progress in the same way the gross domestic product and the S&P 500 index track the U.S. economy and the broader stock market.

For anyone curious about the AI100 initiative, I have a description of it in my Sept. 27, 2016 post highlighting the group’s first report, or you can keep on reading.

Getting back to the matter at hand, a December 21, 2017 Stanford University press release by Andrew Myers, which originated the news item, provides more detail about the AI index,

“The AI100 effort realized that in order to supplement its regular review of AI, a more continuous set of collected metrics would be incredibly useful,” said Russ Altman, a professor of bioengineering and the faculty director of AI100. “We were very happy to seed the AI Index, which will inform the AI100 as we move forward.”

The AI100 was set in motion three years ago when Eric Horvitz, a Stanford alumnus and former president of the Association for the Advancement of Artificial Intelligence, worked with his wife, Mary Horvitz, to define and endow the long-term study. Its first report, released in the fall of 2016, sought to anticipate the likely effects of AI in an urban environment in the year 2030.

Among the key findings in the new index are a dramatic increase in AI startups and investment as well as significant improvements in the technology’s ability to mimic human performance.

Baseline metrics

The AI Index tracks and measures at least 18 independent vectors in academia, industry, open-source software and public interest, plus technical assessments of progress toward what the authors call “human-level performance” in areas such as speech recognition, question-answering and computer vision – algorithms that can identify objects and activities in 2D images. Specific metrics in the index include evaluations of academic papers published, course enrollment, AI-related startups, job openings, search-term frequency and media mentions, among others.

“In many ways, we are flying blind in our discussions about artificial intelligence and lack the data we need to credibly evaluate activity,” said Yoav Shoham, professor emeritus of computer science.

“The goal of the AI Index is to provide a fact-based measuring stick against which we can chart progress and fuel a deeper conversation about the future of the field,” Shoham said.

Shoham conceived of the index and assembled a steering committee including Ray Perrault from SRI International, Erik Brynjolfsson of the Massachusetts Institute of Technology and Jack Clark from OpenAI. The committee subsequently hired Calvin LeGassick as project manager.

“The AI Index will succeed only if it becomes a community effort,” Shoham said.

Although the authors say the AI Index is the first index to track either scientific or technological progress, there are many other non-financial indexes that provide valuable insight into equally hard-to-quantify fields. Examples include the Social Progress Index, the Middle East peace index and the Bangladesh empowerment index, which measure factors as wide-ranging as nutrition, sanitation, workload, leisure time, public sentiment and even public speaking opportunities.

Intriguing findings

Among the findings of this inaugural index is that the number of active AI startups has increased 14-fold since 2000. Venture capital investment has increased six times in the same period. In academia, publishing in AI has increased a similarly impressive nine times in the last 20 years while course enrollment has soared. Enrollment in the introductory AI-related machine learning course at Stanford, for instance, has grown 45-fold in the last 30 years.

In technical metrics, image and speech recognition are both approaching, if not surpassing, human-level performance. The authors noted that AI systems have excelled in such real-world applications as object detection, the ability to understand and answer questions, and classification of photographic images of skin cancer cells.

Shoham noted that the report is still very U.S.-centric and will need a greater international presence as well as a greater diversity of voices. He said he also sees opportunities to fold in government and corporate investment in addition to the venture capital funds that are currently included.

In terms of human-level performance, the AI Index suggests that in some ways AI has already arrived. This is true in game-playing applications including chess, the Jeopardy! game show and, most recently, the game of Go. Nonetheless, the authors note that computers continue to lag considerably in the ability to generalize specific information into deeper meaning.

“AI has made truly amazing strides in the past decade,” Shoham said, “but computers still can’t exhibit the common sense or the general intelligence of even a 5-year-old.”

The AI Index was made possible by funding from AI100, Google, Microsoft and Toutiao. Data supporting the various metrics were provided by Elsevier, TrendKite, Indeed.com, Monster.com, the Google Trends Team, the Google Brain Team, Sand Hill Econometrics, VentureSource, Crunchbase, Electronic Frontier Foundation, EuroMatrix, Geoff Sutcliffe, Kevin Leyton-Brown and Holger Hoos.
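As a rough illustration of what a “comprehensive baseline” means in practice, here is a small Python sketch of baseline-year normalization, the sort of calculation behind statements like “14-fold since 2000.” The function and the placeholder numbers are mine, not the AI Index’s published methodology or data.

def normalize_to_base_year(series: dict[int, float], base_year: int) -> dict[int, float]:
    """Express each year's value as a multiple of the base-year value."""
    base = series[base_year]
    return {year: value / base for year, value in series.items()}

# Placeholder series (invented numbers, shown only to illustrate the shape of the calculation):
startups = {2000: 100, 2008: 400, 2016: 1400}
print(normalize_to_base_year(startups, 2000))
# {2000: 1.0, 2008: 4.0, 2016: 14.0}, i.e. a "14-fold" increase since the 2000 baseline

Expressing every metric as a multiple of its base-year value is what lets quantities as different as paper counts, course enrollment and startup numbers sit on a single chart.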

You can find the AI Index here. They’re featuring their 2017 report, but you can also find data (on the menu bar on the upper right side of your screen), along with a few provisos. I was curious as to whether any AI had been used to analyze the data and/or write the report; a very cursory look at the 2017 report did not answer that question. I’m fascinated by the failure to address what I think is an obvious question. It suggests that even very, very bright people can have blind spots, and I suspect that’s why the group seems quite eager to get others involved. From the 2017 AI Index Report,

As the report’s limitations illustrate, the AI Index will always paint a partial picture. For this reason, we include subjective commentary from a cross-section of AI experts. This Expert Forum helps animate the story behind the data in the report and adds interpretation the report lacks.

Finally, where the experts’ dialogue ends, your opportunity to Get Involved begins [emphasis mine]. We will need the feedback and participation of a larger community to address the issues identified in this report, uncover issues we have omitted, and build a productive process for tracking activity and progress in Artificial Intelligence. (p. 8)

Unfortunately, it’s not clear how one becomes involved. Is there a forum or do you get in touch with one of the team leaders?

I wish them good luck with their project and imagine that these minor hiccups will be dealt with in the near term.