Touchy robots and prosthetics

I have briefly speculated about the importance of touch elsewhere (see my July 19, 2019 posting regarding BlocKit and blockchain; scroll down about 50% of the way) but this upcoming news bit and the one following it put a different spin on the importance of touch.

Exceptional sense of touch

Robots need a sense of touch to perform their tasks and a July 18, 2019 National University of Singapore press release (also on EurekAlert) announces work on an improved sense of touch,

Robots and prosthetic devices may soon have a sense of touch equivalent to, or better than, the human skin with the Asynchronous Coded Electronic Skin (ACES), an artificial nervous system developed by a team of researchers at the National University of Singapore (NUS).

The new electronic skin system achieved ultra-high responsiveness and robustness to damage, and can be paired with any kind of sensor skin layers to function effectively as an electronic skin.

The innovation, achieved by Assistant Professor Benjamin Tee and his team from the Department of Materials Science and Engineering at the NUS Faculty of Engineering, was first reported in prestigious scientific journal Science Robotics on 18 July 2019.

Faster than the human sensory nervous system

“Humans use our sense of touch to accomplish almost every daily task, such as picking up a cup of coffee or making a handshake. Without it, we will even lose our sense of balance when walking. Similarly, robots need to have a sense of touch in order to interact better with humans, but robots today still cannot feel objects very well,” explained Asst Prof Tee, who has been working on electronic skin technologies for over a decade in hope of giving robots and prosthetic devices a better sense of touch.

Drawing inspiration from the human sensory nervous system, the NUS team spent a year and a half developing a sensor system that could potentially perform better. While the ACES electronic nervous system detects signals like the human sensory nervous system, it is made up of a network of sensors connected via a single electrical conductor, unlike the nerve bundles in the human skin. It is also unlike existing electronic skins, which have interlinked wiring systems that can make them sensitive to damage and difficult to scale up.

Elaborating on the inspiration, Asst Prof Tee, who also holds appointments in the NUS Department of Electrical and Computer Engineering, NUS Institute for Health Innovation & Technology (iHealthTech), N.1 Institute for Health and the Hybrid Integrated Flexible Electronic Systems (HiFES) programme, said, “The human sensory nervous system is extremely efficient, and it works all the time to the extent that we often take it for granted. It is also very robust to damage. Our sense of touch, for example, does not get affected when we suffer a cut. If we can mimic how our biological system works and make it even better, we can bring about tremendous advancements in the field of robotics where electronic skins are predominantly applied.”

ACES can detect touches more than 1,000 times faster than the human sensory nervous system. For example, it is capable of differentiating physical contacts between different sensors in less than 60 nanoseconds – the fastest ever achieved for an electronic skin technology – even with large numbers of sensors. ACES-enabled skin can also accurately identify the shape, texture and hardness of objects within 10 milliseconds, ten times faster than the blinking of an eye. This is enabled by the high fidelity and capture speed of the ACES system.

The ACES platform can also be designed to achieve high robustness to physical damage, an important property for electronic skins because they come into frequent physical contact with the environment. Unlike the current system used to interconnect sensors in existing electronic skins, all the sensors in ACES can be connected to a common electrical conductor with each sensor operating independently. This allows ACES-enabled electronic skins to continue functioning as long as there is one connection between the sensor and the conductor, making them less vulnerable to damage.
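The press release doesn't go into implementation details, but the core idea (many sensors sharing one conductor, each transmitting independently, with pulses a receiver can separate) can be sketched as event-driven signaling. Everything below is a hypothetical toy model: the class name, sensor IDs, and event format are my own illustration, not from the paper.

```python
import heapq

class ACESBusSketch:
    """Toy model of the ACES idea: many sensors share one electrical
    conductor, and each pulse carries a unique per-sensor signature
    so a receiver can separate them. Names and event format are
    hypothetical, not from the paper."""

    def __init__(self, sensor_ids):
        self.connected = set(sensor_ids)  # sensors still wired to the conductor
        self.events = []                  # the shared conductor, as a time-ordered queue

    def touch(self, sensor_id, t, pressure):
        # A cut (disconnected) sensor simply stops transmitting;
        # every other sensor keeps working independently.
        if sensor_id in self.connected:
            heapq.heappush(self.events, (t, sensor_id, pressure))

    def damage(self, sensor_id):
        self.connected.discard(sensor_id)

    def decode(self):
        # The receiver demultiplexes the interleaved pulses by signature.
        out = {}
        while self.events:
            t, sid, p = heapq.heappop(self.events)
            out.setdefault(sid, []).append((t, p))
        return out

skin = ACESBusSketch({"s1", "s2", "s3"})
skin.touch("s1", t=1e-6, pressure=0.8)
skin.touch("s2", t=2e-6, pressure=0.3)
skin.damage("s3")                        # simulate a cut in this patch of skin
skin.touch("s3", t=3e-6, pressure=0.5)   # lost, but s1 and s2 are unaffected
decoded = skin.decode()
print(decoded)
```

The point of the sketch is the robustness claim: losing one sensor only silences that sensor, because nothing else depends on it being wired in sequence.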

Smart electronic skins for robots and prosthetics

ACES’ simple wiring system and remarkable responsiveness even with increasing numbers of sensors are key characteristics that will facilitate the scale-up of intelligent electronic skins for Artificial Intelligence (AI) applications in robots, prosthetic devices and other human machine interfaces.

“Scalability is a critical consideration as big pieces of high performing electronic skins are required to cover the relatively large surface areas of robots and prosthetic devices,” explained Asst Prof Tee. “ACES can be easily paired with any kind of sensor skin layers, for example, those designed to sense temperatures and humidity, to create high performance ACES-enabled electronic skin with an exceptional sense of touch that can be used for a wide range of purposes,” he added.

For instance, pairing ACES with the transparent, self-healing and water-resistant sensor skin layer also recently developed by Asst Prof Tee’s team, creates an electronic skin that can self-repair, like the human skin. This type of electronic skin can be used to develop more realistic prosthetic limbs that will help disabled individuals restore their sense of touch.

Other potential applications include developing more intelligent robots that can perform disaster recovery tasks or take over mundane operations such as packing of items in warehouses. The NUS team is therefore looking to further apply the ACES platform on advanced robots and prosthetic devices in the next phase of their research.

For those who like videos, the researchers have prepared this,

Here’s a link to and a citation for the paper,

A neuro-inspired artificial peripheral nervous system for scalable electronic skins by Wang Wei Lee, Yu Jun Tan, Haicheng Yao, Si Li, Hian Hian See, Matthew Hon, Kian Ann Ng, Betty Xiong, John S. Ho and Benjamin C. K. Tee. Science Robotics Vol. 4, Issue 32, eaax2198, 31 July 2019. DOI: 10.1126/scirobotics.aax2198 Published online first: 17 Jul 2019

This paper is behind a paywall.

Picking up a grape and holding his wife’s hand

This story comes from the Canadian Broadcasting Corporation (CBC) Radio, in a July 25, 2019 CBC Radio ‘As It Happens’ article by Sheena Goodyear with a six-minute audio story embedded in the text,

The West Valley City, Utah, real estate agent [Keven Walgamott] lost his left hand in an electrical accident 17 years ago. Since then, he’s tried out a few different prosthetic limbs, but always found them too clunky and uncomfortable.

Then he decided to work with the University of Utah in 2016 to test out new prosthetic technology that mimics the sensation of human touch, allowing Walgamott to perform delicate tasks with precision — including shaking his wife’s hand. 

“I extended my left hand, she came and extended hers, and we were able to feel each other with the left hand for the first time in 13 years, and it was just a marvellous and wonderful experience,” Walgamott told As It Happens guest host Megan Williams. 

Walgamott, one of seven participants in the University of Utah study, was able to use an advanced prosthetic hand called the LUKE Arm to pick up an egg without cracking it, pluck a single grape from a bunch, hammer a nail, take a ring on and off his finger, fit a pillowcase over a pillow and more. 

While performing the tasks, Walgamott was able to actually feel the items he was holding and correctly gauge the amount of pressure he needed to exert — mimicking a process the human brain does automatically.

“I was able to feel something in each of my fingers,” he said. “What I feel, I guess the easiest way to explain it, is little electrical shocks.”

Those shocks — which he describes as a kind of a tingling sensation — intensify as he tightens his grip.

“Different variations of the intensity of the electricity as I move my fingers around and as I touch things,” he said. 

To make that [sense of touch] happen, the researchers implanted electrodes into the nerves on Walgamott’s forearm, allowing his brain to communicate with his prosthetic through a computer outside his body. That means he can move the hand just by thinking about it.

But those signals also work in reverse.

The team attached sensors to the hand of a LUKE Arm. Those sensors detect touch and positioning, and send that information to the electrodes so it can be interpreted by the brain.

For Walgamott, performing a series of menial tasks as a team of scientists recorded his progress was “fun to do.”

“I’d forgotten how well two hands work,” he said. “That was pretty cool.”

But it was also a huge relief from the phantom limb pain he has experienced since the accident, which he describes as a “burning sensation” in the place where his hand used to be.

A July 24, 2019 University of Utah news release (also on EurekAlert) provides more detail about the research,

Keven Walgamott had a good “feeling” about picking up the egg without crushing it.

What seems simple for nearly everyone else can be more of a Herculean task for Walgamott, who lost his left hand and part of his arm in an electrical accident 17 years ago. But he was testing out the prototype of a high-tech prosthetic arm with fingers that not only can move, they can move with his thoughts. And thanks to a biomedical engineering team at the University of Utah, he “felt” the egg well enough so his brain could tell the prosthetic hand not to squeeze too hard.

That’s because the team, led by U biomedical engineering associate professor Gregory Clark, has developed a way for the “LUKE Arm” (so named after the robotic hand that Luke Skywalker got in “The Empire Strikes Back”) to mimic the way a human hand feels objects by sending the appropriate signals to the brain. Their findings were published in a new paper co-authored by U biomedical engineering doctoral student Jacob George, former doctoral student David Kluger, Clark and other colleagues in the latest edition of the journal Science Robotics. A copy of the paper may be obtained by emailing robopak@aaas.org.

“We changed the way we are sending that information to the brain so that it matches the human body. And by matching the human body, we were able to see improved benefits,” George says. “We’re making more biologically realistic signals.”

That means an amputee wearing the prosthetic arm can sense the touch of something soft or hard, understand better how to pick it up and perform delicate tasks that would otherwise be impossible with a standard prosthetic with metal hooks or claws for hands.

“It almost put me to tears,” Walgamott says about using the LUKE Arm for the first time during clinical tests in 2017. “It was really amazing. I never thought I would be able to feel in that hand again.”

Walgamott, a real estate agent from West Valley City, Utah, and one of seven test subjects at the U, was able to pluck grapes without crushing them, pick up an egg without cracking it and hold his wife’s hand with a sensation in the fingers similar to that of an able-bodied person.

“One of the first things he wanted to do was put on his wedding ring. That’s hard to do with one hand,” says Clark. “It was very moving.”

Those things are accomplished through a complex series of mathematical calculations and modeling.

The LUKE Arm

The LUKE Arm has been in development for some 15 years. The arm itself is made of mostly metal motors and parts with a clear silicone “skin” over the hand. It is powered by an external battery and wired to a computer. It was developed by DEKA Research & Development Corp., a New Hampshire-based company founded by Segway inventor Dean Kamen.

Meanwhile, the U’s team has been developing a system that allows the prosthetic arm to tap into the wearer’s nerves, which are like biological wires that send signals to the arm to move. It does that thanks to an invention by U biomedical engineering Emeritus Distinguished Professor Richard A. Normann called the Utah Slanted Electrode Array. The array is a bundle of 100 microelectrodes and wires that are implanted into the amputee’s nerves in the forearm and connected to a computer outside the body. The array interprets the signals from the still-remaining arm nerves, and the computer translates them to digital signals that tell the arm to move.

But it also works the other way. Performing tasks such as picking up objects requires more than just the brain telling the hand to move. The prosthetic hand must also learn how to “feel” the object in order to know how much pressure to exert, because you can’t figure that out just by looking at it.

First, the prosthetic arm has sensors in its hand that send signals to the nerves via the array to mimic the feeling the hand gets upon grabbing something. But equally important is how those signals are sent. It involves understanding how your brain deals with transitions in information when it first touches something. Upon first contact with an object, a burst of impulses runs up the nerves to the brain and then tapers off. Recreating this was a big step.

“Just providing sensation is a big deal, but the way you send that information is also critically important, and if you make it more biologically realistic, the brain will understand it better and the performance of this sensation will also be better,” says Clark.

To achieve that, Clark’s team used mathematical calculations along with recorded impulses from a primate’s arm to create an approximate model of how humans receive these different signal patterns. That model was then implemented into the LUKE Arm system.
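The burst-then-taper behaviour described above can be sketched as a firing-rate model with a sustained component tracking pressure plus a transient component that spikes at contact and decays. This is only a minimal illustration of the general idea; the function name and all constants are hypothetical, and the team's actual model (fitted to recorded primate nerve impulses) is far more elaborate.

```python
import math

def biomimetic_rate(pressure_trace, dt=0.001, gain=40.0,
                    onset_gain=200.0, tau=0.05):
    """Toy onset-transient model: steady firing tracks pressure,
    while a decaying burst marks the moment of first contact.
    All constants here are illustrative, not from the paper."""
    rates, burst, prev = [], 0.0, 0.0
    for p in pressure_trace:
        dp = max(p - prev, 0.0)          # respond to increases in pressure
        burst = burst * math.exp(-dt / tau) + onset_gain * dp
        rates.append(gain * p + burst)   # sustained + transient components
        prev = p
    return rates

# Step contact: pressure jumps to 0.5 and then holds steady.
trace = [0.0] * 10 + [0.5] * 90
r = biomimetic_rate(trace)
print(r[10], r[-1])  # burst at contact, then decay toward the steady rate
```

The design choice mirrors the quote from Clark: the brain interprets a biologically shaped transient better than a flat signal of the same average intensity.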

Future research

In addition to creating a prototype of the LUKE Arm with a sense of touch, the overall team is already developing a version that is completely portable and does not need to be wired to a computer outside the body. Instead, everything would be connected wirelessly, giving the wearer complete freedom.

Clark says the Utah Slanted Electrode Array is also capable of sending signals to the brain for more than just the sense of touch, such as pain and temperature, though the paper primarily addresses touch. And while their work currently has only involved amputees who lost their extremities below the elbow, where the muscles to move the hand are located, Clark says their research could also be applied to those who lost their arms above the elbow.

Clark hopes that in 2020 or 2021, three test subjects will be able to take the arm home to use, pending federal regulatory approval.

The research involves a number of institutions including the U’s Department of Neurosurgery, Department of Physical Medicine and Rehabilitation and Department of Orthopedics, the University of Chicago’s Department of Organismal Biology and Anatomy, the Cleveland Clinic’s Department of Biomedical Engineering and Utah neurotechnology companies Ripple Neuro LLC and Blackrock Microsystems. The project is funded by the Defense Advanced Research Projects Agency and the National Science Foundation.

“This is an incredible interdisciplinary effort,” says Clark. “We could not have done this without the substantial efforts of everybody on that team.”

Here’s a link to and a citation for the paper,

Biomimetic sensory feedback through peripheral nerve stimulation improves dexterous use of a bionic hand by J. A. George, D. T. Kluger, T. S. Davis, S. M. Wendelken, E. V. Okorokova, Q. He, C. C. Duncan, D. T. Hutchinson, Z. C. Thumser, D. T. Beckler, P. D. Marasco, S. J. Bensmaia and G. A. Clark. Science Robotics Vol. 4, Issue 32, eaax2352, 31 July 2019. DOI: 10.1126/scirobotics.aax2352 Published online first: 24 Jul 2019

This paper is definitely behind a paywall.

The University of Utah researchers have produced a video highlighting their work,

American Association for the Advancement of Science (AAAS) 2015 meeting in San Jose, CA from Feb. 12 – 16, 2015

The theme for the 2015 American Association for the Advancement of Science meeting is Innovations, Information, and Imaging and you can find the program here. A few of the talks and presentations caught my eye and I’m starting with the plenary lectures as these reflect, more or less, the interpretation of the theme and set the tone for the meeting.

Plenary lectures

President’s Address
Thursday, 12 February 2015: 6:00 PM-7:30 PM

Dr. Gerald Fink’s work in genetics, biochemistry, and molecular biology has advanced our understanding of gene regulation, mutation, and recombination. He developed a technique for transforming yeast that allowed researchers to introduce a foreign piece of genetic material into yeast cells and study the inheritance and expression of that DNA. [emphasis mine] The technique, fundamental to genetic engineering, laid the groundwork for the commercial use of yeast as biological factories for manufacturing vaccines and other drugs, and set the stage for genetic engineering in all organisms. Fink chaired a National Research Council Committee that produced the 2003 report Biotechnology Research in an Age of Terrorism: Confronting the Dual Use Dilemma, recommending practices to prevent the potentially destructive application of biotechnology research while enabling legitimate research. …

I did not include Dr. Fink’s many, many professional attributes but rest assured Dr. Fink has founded at least one research group, received many professional honours, and has multiple degrees.

Back to the plenary lectures,

Daphne Koller: The Online Revolution: Learning Without Limits
Plenary Lecture
Friday, 13 February 2015: 5:00 PM-6:00 PM

Dr. Daphne Koller is the Rajeev Motwani Professor in the Department of Computer Science at Stanford University and president and co-founder of Coursera, an online education platform. Her research focus is artificial intelligence and its applications in the biomedical sciences. She received her bachelor’s and master’s degrees from Hebrew University of Jerusalem. Koller completed her Ph.D. at Stanford under the supervision of Joseph Halpern and performed postdoctoral research at University of California, Berkeley. She was named a MacArthur Fellow in 2004 and was awarded the first ACM-Infosys Foundation Award in Computing Sciences. She co-authored, with Nir Friedman, a textbook on probabilistic graphical models and offered a free online course on the subject. She and Andrew Ng, a fellow Stanford computer science professor, launched Coursera in 2012. Koller and Ng were recognized on the 2013 Time 100 list of the most influential people in the world.

David Baker: Post-Evolutionary Biology: Design of Novel Protein Structures, Functions, and Assemblies

Plenary Lecture

Saturday, 14 February 2015: 5:00 PM-6:00 PM

Dr. David Baker is a biochemist and computational biologist whose research focuses on the prediction and design of macromolecular structures and functions. He is the director of the Rosetta Commons, a consortium of labs and researchers that develop the Rosetta biomolecular structure prediction and design program, which has been extended to the distributed computing project Rosetta@Home and the online computer game Foldit. He received his Ph.D. in biochemistry at the University of California, Berkeley and completed postdoctoral work in biophysics at University of California, San Francisco. Baker has received numerous awards in recognition of his work, including the AAAS Newcomb Cleveland Prize; the Sackler International Prize in Biophysics; the Overton Prize from the International Society of Computational Biology; the Feynman Prize from the Foresight Institute; and the Centenary Award from the Biochemical Society. He is an investigator of the Howard Hughes Medical Institute, and a member of the National Academy of Sciences and the American Academy of Arts and Sciences.[emphasis mine]

I found the mention of the Foresight Institute (a nanotechnology organization founded by Eric Drexler and Christine Peterson) quite interesting. The title of Baker’s presentation certainly brings to mind synthetic biology.

Back to the plenary lectures,

Neil Shubin: Finding Your Inner Fish
Plenary Lecture
Monday, 16 February 2015: 8:30 AM-9:30 AM

Dr. Neil Shubin is a paleontologist and evolutionary biologist who researches the origin of animal anatomical features. He has done field work in Greenland, Africa, Asia, and North America. One of his discoveries, Tiktaalik roseae, has been described as the “missing link” between fish and land animals. He has also done important work on the developmental biology of limbs, and he uses his diverse fossil findings to devise hypotheses on how anatomical transformations occurred by way of genetic and morphogenetic processes. He is a fellow of the John Simon Guggenheim Memorial Foundation and the American Association for the Advancement of Science and a member of the National Academy of Sciences. He earned a Ph.D. in organismic and evolutionary biology from Harvard University. Shubin’s popular science book Your Inner Fish: A Journey into the 3.5-Billion-Year History of the Human Body was adapted for a PBS documentary series in 2014.

Here are a few presentations from the main program; this first one is a ‘conference within a conference’,

Citizen Science 2015, Day One
Pre-registration required
Wednesday, 11 February 2015: 8:30 AM-5:00 PM

Citizen science is a partnership between everyday people and professional scientists to investigate pressing questions about the world. Citizen Science 2015 invites anyone interested in such collaborations to participate in a two-day pre-conference before the AAAS Annual Meeting. All involved in any aspect of citizen science are welcome, including researchers, project leaders, educators, evaluators, designers and makers, volunteers, and more–representing a wide variety of disciplines. Join people from across the field of citizen science to discuss designing, implementing, sustaining, evaluating, and participating in projects. Share your project innovations and questions. Citizen Science 2015 is the inaugural conference and gathering of the newly formed Citizen Science Association (CSA). For additional information, including Citizen Science Conference registration, visit www.citizenscienceassociation.org.

Revolutionary Vision: Implants, Prosthetics, Smart Glasses, and the Telescopic Contact Lens
Friday, 13 February 2015: 8:00 AM-9:30 AM

According to the World Health Organization, 285 million people are estimated to be visually impaired worldwide. Age-related macular degeneration alone is the leading cause of blindness among older adults in the western world. These facts leave no question as to why the brightest minds in science and engineering are setting their sights on vision through new electronics, retinal prosthesis, wearable technologies, and even telescopic contact lenses. Researchers are bringing into focus novel electronics such as systems on plastic, which are deformable and implantable, zero-power, and wireless and have numerous applications for sight and vision. Retinal prosthesis combined with video goggles pulsing near-infrared light, meanwhile, have restored up to half of normal acuity in rats. This symposium showcases and demos the latest prototypes tackling form as well as function: smart glasses with novel display architecture that make them small and light while maintaining an optimal field of view. These breakthroughs not only help subjects see but also hold promise for noninvasive continuous monitoring of eye health. Scientists will reveal the first-ever telescopic contact lens, which magnifies 2.8 times and offers hope for millions suffering from macular degeneration and seeking alternatives to bulky glasses and invasive surgery. These advances reveal the great promise that science holds for the visually impaired — truly a sight to behold.
Organizer:
Megan Williams, swissnex
Co-organizers:
Christian Simm, swissnex
and Melanie Picard, swissnex
Moderator:
Christian Simm, swissnex
Speakers:
Daniel Palanker, Stanford University
Restoration of Sight with Photovoltaic Subretinal Prosthesis
Eric Tremblay, Swiss Federal Institute of Technology (EPFL)
Smart Glasses and Telescopic Contact Lenses for Macular Degeneration
Giovanni Antonio Salvatore, ETH Zurich
The Next Technological Leap in Electronics

Celebration of 2015: The International Year of Light
Friday, 13 February 2015: 8:30 AM-11:30 AM

In recognition that light-based science and technologies play a critical role in our daily lives, the United Nations passed a resolution declaring 2015 the International Year of Light. The UN resolution states that “applications of light science and technology are vital for existing and future advances in medicine, energy, information and communications, fiber optics, astronomy, agriculture, archaeology, entertainment, and culture.” Hundreds of science and engineering organizations across the globe signed on in support of the International Year of Light 2015 and will be raising awareness of light-based science and technology throughout the year. This symposium brings together speakers from diverse fields to illustrate the many sectors that are influenced by optics and photonics.
Organizer:
Martha Paterson, The Optical Society (OSA)
Co-organizers:
Anthony Johnson, University of Maryland
and Phil Bucksbaum, Stanford University
Moderator:
Anthony Johnson, University of Maryland
Speakers:
Elizabeth Hillman, Columbia University
Optics in Neuroscience
Warren Warren, Duke University
Applying Nonlinear Laser Microscopy to Melanoma Diagnosis and Renaissance Art Imaging
Uwe Bergmann, SLAC National Accelerator Laboratory
X-Ray Laser Research: Lighting Our Future
Alan Eli Willner, University of Southern California
Optical Communications
Christopher Stratas, Flextronics
LED Lighting and Energy Efficiency
R. Rox Anderson, Harvard Medical School
Lasers in Medicine

I last mentioned the upcoming International Year of Light in a Nov. 7, 2014 post about the Nanoscale Informal Science Education Network (NISENet) newsletter. For anyone who has difficulty connecting nano with light, remember the Lycurgus Cup (Sept. 21, 2010 post) infused with gold and silver nanoparticles and which appears either green or red depending on how the light is shone?

Back to the programme,

The Future of the Internet: Meaning and Names or Numbers?
The Future of Computing
Friday, 13 February 2015: 10:00 AM-11:30 AM

Information-centric networking (ICN) is a new, disruptive technology that holds the promise of eliminating many of the internet’s current technical shortcomings. The idea is based on two simple concepts: addressing information by its name rather than by its location, and adding computation and memory to the network, especially at the edge. The implications for network architects are far reaching and offer both elegant solutions and perplexing implementation challenges. The field of ICN research is active, including hundreds of projects at leading academic, industrial, and government laboratories around the world. This session will explore the motivations and current state-of-the-art in ICN research from multiple perspectives and approaches. The speakers in this session have contributed to every facet of the internet’s evolution since its inception.
Organizer:
Glenn T. Edens, PARC Xerox
Co-Organizer:
J.J. Garcia-Luna-Aceves, University of California, Santa Cruz
Speakers:
Vinton Cerf, Google Inc.
Digital Vellum
David Oran, Cisco Systems
Information-Centric Networking: Is It Ready for Prime Time? Will It Ever Be?
Glenn T. Edens, PARC Xerox
Information-Centric Networking: Towards a Reliable and Robust 21st Century Internet

It seems odd that the speakers come from industry/business exclusively.

Comics, Zombies, and Hip-Hop: Leveraging Pop Culture for Science Engagement
Friday, 13 February 2015: 1:00 PM-2:30 PM

Access to quality scientific information is progressively more important in society today. The critical ways information can be used range from increasing scientific literacy and developing the public’s understanding of behaviors that promote health and well-being, to increasing interest in careers in science and success in school — particularly among students traditionally underrepresented in the sciences. Traditional forms of scientific communication — textbooks, talks, and articles in the lay press — succeed at reaching some, but leave many others in the dark. Recent research also indicates that scientists have a narrow view of outreach, mostly considering it as simply giving a talk at a school. However, new forms of culturally relevant engagement for K-12 students are emerging — comic books with rich scientific content that have been demonstrated to increase student engagement, novel workshops (for settings in and out of school) that interweave STEM  exploration with creative writing to build students’ scientific and written literacy, and connecting hip-hop culture and the classroom through rap — while engaging students as co-teachers and translators to help their peers learn science.
Organizer:
Rebecca L. Smith, University of California
Co-Organizer:
Kishore Hari, University of California
Moderator:
Rebecca L. Smith, University of California
Speakers:
Judy Diamond, University of Nebraska State Museum
Engaging Teenagers with Science Through Comics
Julius Diaz Panoriñgan, 826LA
Developing Multiple Literacies with Zombies, Space Exploration, and Superheroes
Tom McFadden, Nueva School
Science Rapping from Auckland to Oakland

Tom McFadden, one of the speakers, has been mentioned here on more than one occasion (most recently in a May 30, 2014 post).

Back to the program,

Citizen Science from the Zooniverse: Cutting-Edge Research with 1 Million Scientists
Friday, 13 February 2015: 1:30 PM-4:30 PM

Citizen science (CS) involves public participation and engagement in scientific research in a way that makes it possible to perform tasks that a small number of researchers could not accomplish alone, makes the research more democratic, and potentially educates the participants. Volunteers simply need access to a computer or tablet to become involved and assist research activities. The presence of massive online datasets and the availability of high-speed internet access provide many opportunities for citizen scientists to work on projects analyzing and interpreting data — especially images — in astronomy, biology, climate science, and other fields. The growing phenomenon of CS has drawn the interest of social scientists who study the efficacy of CS projects, motivations of participants, and applications to industry and policymaking. CS clearly has considerable potential in the era of big data. Galaxy Zoo is an example of a successful CS project; it invites volunteers to visually classify the shapes and structures of galaxies seen in images from optical surveys. The project resulted in catalogs of hundreds of thousands of classified galaxies, allowing for novel statistical analyses and the identification of rare objects. Its popularity led to the Zooniverse, a suite of projects in a diverse and interdisciplinary range of fields. This symposium will demonstrate how CS is becoming a vital tool and highlight the work of a variety of researchers.
Organizer:
Ramin A. Skibba, University of California
Speakers:
Laura Whyte, Adler Planetarium
Introduction to Citizen Science and the Zooniverse
Brooke Simmons, University of Oxford
The Scientific Impact of Galaxy Zoo
Alexandra Swanson, University of Minnesota
Photographing Carnivores with Snapshot Serengeti
Kevin Wood, University of Washington
Old Weather: Studying Historical Weather Patterns with Ship Logbooks
Paul Pharoah, University of Cambridge
Contributing to Cancer Research with Cell Slider
Philip Marshall, Stanford University
Using Space Warps To Find Gravitational Lenses

The Zooniverse has been mentioned here before, most recently in a March 17, 2014 post about the TED 2014 conference held in Vancouver (Canada),

Robert Simpson talked about citizen science, the Zooniverse project, and astronomy. I have mentioned Zooniverse here before (a Jan. 17, 2012 posting titled: Champagne galaxy, drawing bubbles for science and a Sept. 17, 2013 posting titled: Volunteer on the Plankton Portal and help scientists figure out ways to keep the ocean healthy). Simpson says there are 1 million people participating in various Zooniverse projects and he mentioned that in addition to getting clicks and time from people, they’ve also gotten curiosity. That might seem obvious but he went on to describe a project (the Galaxy Zoo project) where the citizen scientists became curious about certain phenomena they were observing and as a consequence of their curiosity an entirely new type of galaxy was discovered, a pea galaxy. From the Pea Galaxy Wikipedia entry (Note: Links have been removed),

A Pea galaxy, also referred to as a Pea or Green Pea, might be a type of Luminous Blue Compact Galaxy which is undergoing very high rates of star formation.[1] Pea galaxies are so-named because of their small size and greenish appearance in the images taken by the Sloan Digital Sky Survey (SDSS).

Pea Galaxies were first discovered in 2007 by the volunteer users within the forum section of the online astronomy project Galaxy Zoo (GZ).[2]

Here’s the last presentation I’m featuring in this post and it has a ‘nano’ flavour,

Beyond Silicon: New Materials for 21st Century Electronics
Saturday, 14 February 2015: 8:00 AM-9:30 AM

Silicon Valley gets its name from the element found at the heart of all microelectronics. For decades, pure silicon single crystals have been the basis for computer chips. But as chips become smaller and faster, doubling the number of transistors on integrated circuits every two years in accordance with Moore’s law, silicon is nearing its practical limits. Scientists are exploring radical new materials and approaches to take over where silicon leaves off — from graphene, a honeycombed sheet of carbon just one atom thick, to topological insulators that conduct electricity perfectly on their surfaces and materials that use the electron’s spin, rather than its charge, to store information. Beyond graphene, scientists are investigating relatively new types of two-dimensional materials that have graphene-like structures and are also semiconducting, making them a natural fit for advanced electronics. This session will describe theoretical and experimental progress in materials beyond silicon that hold promise for continued improvement in computer performance.
Organizer:
Glennda Chui, SLAC National Accelerator Laboratory
Discussant:
Shoucheng Zhang, Stanford University
Speakers:
Stuart S.P. Parkin, IBM Research
Spintronic and Ionitronic Materials and Devices
Joshua Goldberger, Ohio State University
Beyond Graphene: Making New Two-Dimensional Materials for Future Electronics
Elsa Reichmanis, Georgia Institute of Technology
Active Organic and Polymer Materials for Flexible Electronics

There are some very intriguing presentations and one theme not featured here: data visualization (several presentations about visualizing data and/or science can be found). You can explore for yourself; here’s the online program.